I wonder, do most foreigners perceive the United States to be a neo-fascist, theocratic, right-wing state? It seems to me that a lot of people in our own country think so, and I find that view incorrect for a number of reasons...
*In 2006, the Democrats (our major left-of-center political party) won control of our government's legislative branch. They appear poised to win the presidency this year, and, even if they don't, the alternative is a liberal Republican (our major right-of-center political party).
*At this time, the United States has the second-highest percentage of foreign-born citizens it has ever had. Immigrants have a significant tendency to vote Democrat. In addition, the fastest-growing segment of our population (Latinos) votes Democrat by an overwhelming margin. Our higher education institutions continue to be hotbeds of left-of-center thinking, and it is remarkable how lopsidedly our young people support Democrats over Republicans.
*While the United States may have been a Republican-dominated country for the past 28 years, the Republican grip has grown ever weaker in the last 8 years. At no point during the "neo-Conservative Era" has the United States been anything like a fascist country. During the Republicans' time in power, religion has seen a declining importance in American society. Ironically, the "Reagan Revolution" sought to return the United States to some of the ideas of our founding fathers (freer markets, reduction in the size of government, etc.).
In my opinion, it is not that the United States has moved farther to the right. On the contrary, the political parties of the United States have moved farther to the left. The fact that we have generally elected governments from the right of our own political spectrum does not mean that we have elected governments from the far right of the world's political spectrum.
One other thing that I find interesting is that Europe has been moving away from its neo-socialist policies of the 1990s. Instead, much of Europe has swung back toward more conservative policies. Here in the United States, however, we seem to be behind the times (despite what most Americans would tell you). In fact, it seems as if we are following in the footsteps of our European cousins at the very time that they have begun heading in a different direction.
What do non-Americans think of our government? Are we right-wing fascists in your eyes? Are we left-wing socialists? Are we level-headed moderates? Put your dislike of our president aside and tell me what you really think about our government.