The Conventional Wisdom, repeated ad nauseam, is that America is a center-right nation. The corollary is that if those lefty Democrats want to win elections they have to "move to the right," and that if Obama ("the most extreme radical leftist President ever") is to have a prayer of passing any kind of legislation, he had better drop this socialist garbage and start acting like a Republican. You know, that Republican Party representing the majority of Americans, ahem. This CW is displayed front and center day after day in the mass media, repeated so often that most national Establishment Democrats believe it themselves. It looks as though even President Obama swallowed the CW hook, line, and sinker, as he strove to re-create a bipartisanship in Congress that had in reality frayed apart long ago, and that most certainly became a dead letter upon his election.
How, then, is it that Barack Obama, that "radical leftist," ever got elected in the first place? How, in heaven's name, did he carry so many so-called Red States, including Virginia, which promptly turned around two years later and elected a hard-right conservative (masquerading as a moderate) as Governor, and an Attorney General so openly far over the cliff on the right that he is almost certifiable? Why is there such a powerful, reactionary movement like the Tea Party dominating the public square and, it seems, the Republican Party? And why do I have the gall to imagine that America is not center-right, but center-left?