DANIEL VAUGHAN: America Is A Traumatized Country

By Daniel Vaughan
February 8, 2024

Every news outlet is running some version of the same story: "The economy or the world is in a good spot, so why are Americans so negative?" I get the question on a basic level. If you look around and see positive economic news, it's fair to wonder why the American mood is so sour and doesn't reflect it.

There are myriad reasons you can point to, but I'd posit a simpler one. Everyone is more anxious now because they see the world as far more fragile, and more easily taken away, than they did before the COVID-19 pandemic. We're still in a post-pandemic world, and our collective response to what happened then still reverberates throughout the country. It's a national trauma response.

When I say the COVID-19 pandemic, I should probably be broader and say everything that has happened from 2020 to the present. COVID-19 is the glue holding it all together, but the period included multiple events beyond a viral disease from China.

After COVID, we had the George Floyd riots, which burned through multiple cities, including the nation's capital. We're still riding out multi-decade-high inflation that continues to ripple through the economy via sky-high interest rates. And the drumbeats of war seem to grow louder every day.

In short, since 2020, people have been shaken out of a daze. The world doesn't look or feel the same. A recent Wall Street Journal story on this topic concluded, "Americans feel sour about the economy, many say, because their long-term financial security feels fragile and vulnerable to wide-ranging social and political threats."

I have friends, particularly on the left, who like to trace it all back to the election of Donald Trump and say he's the reason everything feels upside down. But that's not quite right, either. People were optimistic then; they still believed in the economy and in their lives. What we're seeing now is a breakdown in that belief.

Pew Research Center conducted a poll last year asking participants whether they believed America was the best country on Earth. It has measured responses to this question over the years and found a trend toward fewer Americans believing America is genuinely great, with a distinct partisan divide.

The poll found, "Today, two-in-ten Americans say the U.S. 'stands above all other countries in the world.' About half (52%) say the U.S. is 'one of the greatest countries, along with some others,' while 27% say 'there are other countries that are better than the U.S.'"

In 2019, 40% of Republicans said America was the best, with only 10% of Democrats saying the same. By 2023, the Republican number had fallen to 31%, while only 9% of Democrats believed America was the best. The election of Biden did not reverse the trend for Democrats.

I understand believing that the country isn't in the best spot. But it's also worth asking some of these people what country they think is objectively better than America. On the far left, I know the answer is a country like Denmark. On the far right, they'd answer Hungary.

Both answers are objectively silly in every conceivable way. Ethnically homogeneous countries living under the umbrella of American defense have the wiggle room to do politically bizarre things. Socialists can exist in countries that don't have to pay for their own defense budgets. Force them to stand on their own, and they'd crumble in a day.

But in terms of raw power, no other country is better than the United States, and it's not even close. The United Kingdom these days compares itself to Mississippi, not to the United States as a whole. Countries like Canada and Australia are even smaller.

China's ambition of toppling the United States economically crumbles with each passing day as its population implodes. Russia is a shadow of itself, both of the USSR era and of its Tsarist past. The European Union tried to position itself as a competitor but is slowly fracturing into its constituent parts.

And then there's the reality of immigration: people want to become Americans. Correction: everyone wants to become an American. There's not a country on Earth where this isn't the case.

That's why it's so bizarre to listen to Americans born here complain so loudly that the United States isn't the best place on Earth. By every objective measure conceivable, the United States is better for human flourishing than any other place on the planet.

And yet, in polling, people are saying exactly that. They say it to pollsters, journalists, and anyone willing to listen. What's hard to miss at this point is that the social cohesion that knitted the country together is vanishing. People once believed America could sustain its greatness for a long time. They're less sure of that now, and much of the blame falls on politicians who overstepped their bounds during the pandemic or allowed cities to burn during the protests.

America is better than all the other options. But Americans sense that this is starting to slip away, and that's making them more anxious. No one has fully recovered from the pandemic years, whether they say it or not, because we're still not free of things like inflation. The repercussions of decisions made in 2020 are still hitting us.

At some point, this negative spell over the country needs to break, both for the country's long-term health and for our collective sanity. Until then, America remains the best country on Earth, but there's a reason everyone is pessimistic. It's not because the press is failing to tell them about the positives in the economy. It's the national trauma everyone experienced starting in 2020. We're still feeling its impact, and we're an unhealed country.

" A free people [claim] their rights, as derived from the laws of nature."
Thomas Jefferson
© 2015 - 2024 Conservative Institute. All Rights Reserved.