
Has Wall Street had a mostly negative influence on America?

• Total voters: 15

The Giant Noodle (DP Veteran, Northern Illinois, Independent) wrote:
It is used to make money for the company so they can grow and HOPEFULLY improve themselves for their customers and their employees.

What I see happen is that because companies are so tied to KEEP growing, or at least to keep growing monetarily, they cut costs, and those cuts are NOT good for the employees, or the environment, or their suppliers and their bottom lines. The pharma and healthcare companies abide by the same thing. Insurance companies... same thing. And screwing their clients by any legal means possible is in their deck of cards. OH!!!!! And bribing the Feds... errrr... lobbying the government to win business... to win sales no matter WHAT the cost... just so the price of their stock rises. :roll:

Stockholders expect the company to earn more and MORE and MORE each quarter of each year so they get wealthy. They don't care what or how that company makes a profit.

Bottom line... has Wall Street had a mostly negative influence on America?
 
Wall St. is the big culprit behind the off-shoring of American jobs. This will be the downfall of the USA.
 
I think it's greed, but it's greed on the part of the CEOs and other officers within a company. When your salary is boosted by stock options and your performance is measured by how well your stock is doing, these heads of companies don't care about the company itself -- only the price of the stock. The decisions they make are political ones instead of hard-headed business decisions.

I think the clearest example of this is in our car industry -- caving to completely outrageous union demands to avoid strikes -- and the pharma industry -- get that drug approved at all costs. Maybe Wall Street is where it plays out, but, in my opinion, the greed starts in the boardroom.
 
It's not Wall Street in and of itself, not in the beginning anyway, but what the government has allowed them to get away with over the years, and the amount of power and influence in politics they've been allowed to have. This goes back to the Panic of 1907. It goes back to the Federal Reserve. Between Wall Street and the international bankers, they have brought our country to its knees.

Yes, Wall Street is corrupt. But it is the government's fault for bailing them out every time they make a stupid decision. They have no incentive to pull their heads out of their asses.
 
Wall Street is the main reason I'm not a huge fan of capitalism (I'd still choose it over anything else, though). Capitalism should benefit those directly working for a company... not make fat cats fatter. I understand the purpose of investments and feel they are extremely important, but at what point do people say, "Alright, we're making enough money... let's give our employees a bonus (or increase their salaries) for making this company what it has become"? Investors, IMO, make bonuses unattractive, because that's less money for those in charge.
 
Do you mean the stock market, big business, our capitalist-ish system, or what?

I think that each of them has been a fairly strong positive.
 
Why would it be a negative influence? It is simply a good route between Broadway and South St. :roll:

 
The Giant Noodle said: "Bottom line... has Wall Street had a mostly negative influence on America?"

It has done some good, obviously - but recently, without proper oversight, unchecked greed has turned it into a potentially dangerous force.
 
The Giant Noodle said: "Bottom line... has Wall Street had a mostly negative influence on America?"

In the last 20 years - Yes. Too much "financial innovation." Let's save the innovation for things that actually help people, like medicine and computers.
In the long term - No. Wall Street provides vital funding for businesses that would not otherwise be able to operate.
 