
Pretty wild to be living in a time when humans are forced to work just to be considered deserving of life, or even to be considered human at all, while human labor is simultaneously being systematically eliminated from the concept of work.
If I could change one thing about US culture, it would be this unshakeable and absolutely backward belief that life must be earned, and that generating profit is how you earn it.
Life is a human right. Profit is what must be earned, and those who want to earn it must respect human rights to do so.