The most disgusting thing about the United States is that, after all the lies, the bungling, the corruption, and the death, Americans largely believe America is the greatest nation on Earth.
We’re a “first world country” whose homeless people are barred from public places, shunted off to the fringes so those of us blessed with homes can pretend everything is just fucking peachy. We’re raised to hold charity up as a virtue while denigrating those who need it. We hold fast to this myth, the American dream, and yet 90% of us will die with both debt and regrets.
Even after a global pandemic that arguably proved capitalism cannot solve the problems of the 21st century, there are those still arguing that people need to pull themselves up by their bootstraps and try harder. This grit fantasy we Americans like to tell ourselves is so pervasive that people who very obviously need help are refusing it.
What use are any of these observations if we offer no viable solutions to the problem?
What use is it even thinking about these problems we face if Americans are so willing to believe bullshit over what we see with our own eyes?
Why act when we’re never going to implement the obvious solutions?
There’s a solution for climate change.
There’s a solution for poverty.
There’s a solution for homelessness, unemployment, education, healthcare, and the erosion of our values.
So why do we keep doing everything in our power to drive over that cliff when we have plenty of time to slow down and turn the fuck around?