Is the US a center-right country?


It has long been a mantra, especially in the media, that the US is fundamentally a center-right country. This assertion is routinely made by conservatives, who argue that any ostensibly liberal policy goes against this consensus and should be viewed as an aberration, and that Democratic electoral wins are statistical fluctuations that do not change this underlying reality. But is this statement true?

When compared to political parties in democracies around the globe, the answer is obviously yes, since American politics is skewed heavily to the right. The Democratic party can rightly be labeled a center-right party and the Republicans a far-right party, because both parties are resolutely pro-oligarchy, pro-war, and pro-big business.

But within the narrow spectrum of US politics and political labels, the phrase ‘center-right’ has been associated with anti-government and anti-regulatory stances, regressive tax, economic, and fiscal policies, and a conservative social agenda. This view is most closely represented by those who favor the Republican party.

The term ‘center-left’ has been associated with those who see government as an important counterbalance to the excesses of the private sector, who believe that regulatory intervention is needed to protect ordinary people and the environment, who favor a progressive tax code, and who hold more liberal social values. Such people tend to call themselves Democrats.

The last election produced data that cast serious doubt on the claim that the US is a center-right country in this local sense. Exit polls revealed these findings:

65% say that illegal immigrants should be offered a chance to apply for legal status, and only 28% say they should be deported.

59% say that abortion should be legal in all or most cases, and only 36% say it should be illegal in all or most cases.

By a margin of 49% to 46%, people say that their state should legally recognize same-sex marriage.

Only 35% say that income taxes should not be increased on anyone.

55% think that the US economy generally favors the wealthy, while only 39% think it is fair to most Americans.

Even Fox News analyst Brit Hume, while digesting Mitt Romney’s defeat on election night and losses by Republicans in both houses of Congress, suggested that the elections revealed that the US is more liberal than previously thought.

However, he insisted that the US remains a center-right country, and that basic assumption will not be reconsidered. It is essential to the oligarchy's strategy that their view of how the country should be run be seen as one shared by the majority of people, and so the narrative of the US as a center-right country will continue to be propagated by the media, whatever the facts.

Comments

  1. jamessweet says

    Wait, so I’m confused… it’s manifestly obvious that, in relation to similar developed nations, the US is “center-right”, i.e. the US center is somewhat to the right of the international center.

    Were the Fox News crew previously trying to argue that the US was center-right in relation to itself? As in, the American center was to the right of, the, uh, American center? Like I say, I’m just confused now..

  2. Mano Singham says

    Yes, that is exactly right. They argue that the US is to the right of the US center, which itself is to the right of the global center.

  3. Nick Gotts (formerly KG) says

    A comment on your link gives a further link showing that Americans on average select a wealth distribution close to Sweden’s as the ideal, and are systematically misinformed about the wealth distribution in the USA, believing it to be much more equal than is actually the case. This research, along with polls like this one showing that over 1/3 of Americans view socialism positively, suggests that Americans appear more right-wing than Europeans in large part because they are systematically lied to, and denied a chance to vote for what many of them would prefer.

  4. Robofish says

    It’s a mistake to think the American people have any fixed, long-term political affiliation, IMO. They voted for Republicans convincingly in 2010 and for Democrats slightly less convincingly in 2008 and 2012. And many of those percentages have varied a great deal over time. In short, on most issues, the majority of the American people are open to being persuaded either way. That’s probably true of most other countries too.

  5. left0ver1under says

    The democrats are centre-right on a good day; usually they’re just right wing. The republicans are much further right than that.

    Many outside the US -- especially those in democracies -- would label some of the republican party as fascist, and a few of the democratic party, too. What else would one label a country that commits unilateral acts of aggression against other nations without UN backing and without evidence of the allegations it makes?

    http://www.bendib.com/newones/2006/september/small/9-12-Islamofascism.jpg

  6. mnb0 says

    Being an outsider, I think Nick G is correct. As soon as the USA adopts a political system like the Dutch or German one, a left-wing party will arise capable of gaining, say, 30% of the votes -- and possibly a majority in cities like San Francisco.
