Dear Americans, Please Stop Acting Like Work Is Everything (It’s Killing You)

How did we get to this idiotic point? One word: workism. What's that? In The Atlantic's words: "It is the belief that work is not only necessary to economic production, but also the centerpiece of one's identity and life's purpose; and the belief that any policy to promote human welfare must always encourage more work."