
Does the U.S. have a negative self-image? How did this happen to us?

Does the U.S. need to take a trip to the psychologist? Lately we seem to have acquired a horrible image of our very own country. It seems as if the rest of the world just loves to bash us for our way of life. Why is this happening? When did America stop being a country revered by the rest of the world? It just doesn't make sense to me, because I even hear things that contradict each other. Everyone knows the "lazy, obese American" stereotype. But I've also heard the stereotype that Americans work harder and longer than anyone else and don't know how to relax. Is it just me, or do these two stereotypes contradict each other?

So, why is the world bashing us?
Should they be bashing us?
Why/should we be feeling guilty?
What happened to America, the best country on Earth?

My theory is that it's because we've become too tolerant. Yes. TOO TOLERANT. We are the greatest nation in existence and we are bending over backwards for others. The rest of the world no longer has respect for us because they can just walk all over us. I think we need to re-learn how to just say "no"! We have our own problems to deal with!

What do you think?
 
So, why is the world bashing us? We elected Bush twice; the Tea Party; health care rage.
Should they be bashing us? Yes.
Why/should we be feeling guilty? Because we should know better.
What happened to America, the best country on Earth? Ronald Reagan.

America has fallen from grace
 
The U.S. gets bashed partly because it is the one that takes the initiative and sometimes has to make controversial but necessary decisions that other nations refuse to make. In other words, the U.S. does what everyone is thinking should be done, but it ends up taking the fall if something goes wrong. In those instances, I don't think the negative reputation is warranted, and people need to stop scapegoating.

The other part is that U.S. foreign policy is genuinely disastrous and hypocritical in some areas. U.S. intervention has caused a lot of unnecessary harm in some parts of the world, but like any state actor, it is also going to use its power to look out for its own selfish interests. If it weren't the U.S., it would just be some other power, and frankly, for the most part, I would rather it be the U.S. It's the ally of my country, it defends my country, and it has a relatively good relationship with most other nations. That said, in recent years things have taken a turn for the worse, and I really hope change is on the horizon. (And no, I am not trying to quote Obama there :D)
 
Too tolerant? What empirical evidence of tolerance are you basing this conclusion on?
Tolerance in the US? We're not even internally tolerant.
 

Well - what role *did* we play when our country was first being founded?
We were everyone's honey-pot.

Every country that had trade routes, travel, and a money-making idea tried to exploit this entire area of the world (not just North America). They came over by the thousands to barter with the Natives. They took goods, crops, and ideas that were original to this entire part of the world (South and North America) back to China, Africa, and Europe to further their ventures.

After a while, though, we became colonized. When you look at the scope of history, it happened rather quickly.
The more colonized we became, the less "European, Spanish, Portuguese, Irish" we were individually - a new idea of a country began.

Slowly we stopped being *the* cash cow that everyone could take part in - we individualized ourselves and organized, cutting everyone off from a previously uncontrolled source of what they needed.

(bitterness - and lots of it - led to wars. Various countries did not let us go easily)

And so fast-forward to more modern times. Suddenly this new country that *was* founded with help from a huge number of other countries is now independent, with its own government - which is new. And new is always troublesome to many whose beliefs have been held as superior for centuries.

And quite quickly after that, we didn't just become a new country, we became *the* country.

A country that had been around for not even 500 years - that didn't even exist while Plato, Aristotle, Constantine, and the Roman Catholic Church were making their stamp on the world - was now the "end-all of all disputes," telling people what to do differently.

I can understand, now, given everyone else's extensive and deep history - and culture - how it would sting that a country which had its beginnings in slavery, disadvantage, racism, and inequality - built on their governments' funds, ideology, religion, and experience - would suddenly be *important*.

Honestly - I can hear people's thoughts now: "we helped found them, those bastards."
Some people just never got over it.

You can't really blame them :shrug: Looking at it that way it makes sense.
 
America lost its good image in the Cold War: because the Soviet Union fell, the blame went to America. That, combined with the export of American culture (McDonald's, bad TV shows, etc.), led to a sense in the rest of the world that America, the 'evil' behind the Cold War, was encroaching upon their cultures. And that led to the anti-American sentiments that we see today - or at least that's my theory.
 
Yes - things like that amount to the "American hypocrisy" (as someone else from Britain explained to me some time ago).

To many people we just don't make sense. When Hurricane Katrina happened, people saw some of our citizens living on the bottom rung. I guess this was shocking and very new to them, because up until then (this one person, at least) thought we were 'better to our citizens' than that - and she couldn't imagine *how* we could have the richest of the rich AND the poorest of the poor living in one country.

From that I learned that some people simply lack knowledge of America's overall structure. Their governments function differently; they don't have semi-autonomous states that can make their *own* regulations, nor are things like electricity and transportation run by proxy.

We're so different in some ways it's hard to understand how we function - and why we function differently than they do.

Just like, for me, it's hard to imagine how someone could want to live in Singapore... or in countries like Iran or Afghanistan.
 
Foreign policy should be decided by pragmatic national interest, not to appease the pathetic egos of whiners who need their country validated to feel better about themselves. Every country in the world has some unflattering stereotypes; get over it.
 

Sometimes self-interest means getting along with the rest of the world.
 
Before WW II, the UK and France were the superpowers. During WW II, Germany and Japan were the superpowers. When WW II ended, the USA and USSR were the superpowers. Being on the winning side of WW II + being a superpower + being a free country (unlike the USSR) + Hollywood, blue jeans, Rock & Roll, and Harley-Davidsons = the envy of the world. During the 1950s, we thought we were perfect and the world believed us.

Then the 1960s came and we finally realized that we weren't so perfect. We finally started thinking outside the box and took a second look at everything. We realized that we needed improvements. That's when the civil rights movement gained a lot of support. We also realized that it was possible for the US to be on the wrong side of a war, and many people started to oppose the Vietnam War.

Thinking outside the box is generally a good thing, but people got carried away with it. It made people feel open-minded to think of their own country as a bad country, and this kind of thinking became very hip. Because of the popularity and proliferation of the US's mass media around the world, hating the US became hip everywhere, not just in the US. This was largely because they were copying us, and they were copying us because they envied us. However, if they really wanted to copy us correctly, they would've taken a second look at themselves and their own countries instead of criticizing the US. This wasn't done purely by mistake: of course there was a reluctance to be self-critical, but popularity also tends to swing like a pendulum. Envy leads to jealousy, and jealousy leads to dislike. Aside from that, people tend to get tired of things like Coca-Cola and McDonald's.

Then Ronald Reagan came along and convinced Americans that America was a great country. That improved the US's self image which made us less popular around the world. They didn't like us liking ourselves. Then the Cold War ended and the US became the ONLY superpower. This made people around the world more envious and more jealous, so the hatred increased. Of course we had enemies who hated us because we supported Israel and we had other foreign policies which hurt people (Iran and Vietnam are examples), but I'm talking about being hated by the world in general, including by our allies.

During the last decade, partisan rancor increased in the US. The Democrats voted for the war in Iraq and then quickly used the power of the media to portray it as Bush's war. Then, for partisan political gain, they tried to make it look like an evil war that was worse than Vietnam. This revived the US's self-loathing and increased hatred of the US around the world.
 
Nobody ever envied the agricultural implements known as Hardly Dangerous.
 
I was beaten to it, but Reagan happened.
 

I think you are speculating without any real data on the subject.
 
The US, a negative self-image? HAHAH, good one. If anything it has an overtly positive, deceptive self-image. The whole "USA USA, we built and invented everything, won WW1 and WW2, and are the greatest nation on earth ever" BS is a tad... Some humility would do wonders for the US.
 

Because everyone always hates on whoever is in first place and pretty much running the show.

Don't worry, the way things are going, I don't think it'll last much longer. Then we can all hate on the Chinese or whoever. :lol:
 
The US, a negative self-image? HAHAH, good one. If anything it has an overtly positive, deceptive self-image. The whole "USA USA, we built and invented everything, won WW1 and WW2, and are the greatest nation on earth ever" BS is a tad... Some humility would do wonders for the US.

He said "self-image" in the title, but he discussed other people's image of the USA in the post - so I followed along with that.

Yes - self-image-wise, we're quite proud (most are).
 
Too tolerant? What empirical evidence of tolerance are you basing this conclusion on?
Tolerance in the US? We're not even internally tolerant.

By consistently upholding our Constitution, whether in our own interests or not, we are the most tolerant country on the planet.
 

Yes - even when the majority of people DON'T support something, if it's a guaranteed right, it's still permitted (like burning the flag).

Tolerant - to a fault.

That *does not mean* that *individual people* are *that* tolerant - but the construct of our nation is.
 

By lying to yourselves left and right, claiming you're the greatest nation on Earth, you eventually start to believe it. And then, when someone proves you wrong, or something happens that shows America is NOT the greatest country on Earth, it lowers your self-confidence a bit. For a while, you've been able to just clap on some blinders, stare straight ahead, and ignore anything that didn't spontaneously start chanting USA! USA! USA! But the evidence has piled up so much that your willful ignorance can no longer stand - and thus many of the smarter Americans are disillusioned with a quarter-century's worth of lies from their government and their people about their place in the world.

In short, Americans are finally realising that they're not special, contrary to what their government tells them -- and it's a painful thing to come to terms with.
 