Can anyone explain why, when Democrats win elections, it is viewed as a mandate, the will of the people to adopt their agenda? Yet when Republicans win, it is just... well, whatever the opposite of a mandate is. It is portrayed as extreme and looked upon as an aberration by the media.
Why should the Republicans compromise on Obama's policies? The people have spoken. It is the President and the Democrats who must compromise.
Elections have consequences, right?