I know I am wasting time, but I cannot help myself.
Liberals are emphatic that the Government has no right to be in the bedrooms of Americans; thus sodomy laws are unconstitutional.
Liberals are emphatic that the Government has no right to tell them what to do with their bodies; thus we have Roe v. Wade.
Based on these positions, how can a liberal believe the Government has the right to force me to buy insurance or to be part of a Government plan that provides insurance? The current bill in the House forces us to be part of such a plan.
Now some of my progressive friends will argue that it is in the best interest of the nation as a whole to make us all part of an insurance plan. Shouldn't we then insist that homosexuality be a crime, since the vast majority of AIDS cases in the US are contained within the homosexual community? (Note: I am not advocating such laws, only making an argument about a bigger issue.)
Should we mandate that we all walk two miles a day and banish Whoppers, Big Macs, and cigarettes?
Some will say that unless I have the ability to pay for a catastrophic health crisis, I will harm the community by having to file for bankruptcy, adding costs for the rest of us. Will we then mandate that I carry life insurance? What if I buy a car I cannot afford and then lose my job? Will that be against the law? The same for a house? Where do Government mandates end?
Does the Government have the right to tell you what to do, or not? Or is Government intrusion into your personal choices limited to your own agenda?