Most conservatives in Europe believe health care is a right. Center and center-right conservatives in the United States don't. The view that not everyone should be guaranteed care is a far-right position that would be considered deeply immoral and, frankly, insane almost anywhere else, even among conservatives. Can you imagine holding the view that people who can't afford health care should be forced into bankruptcy or left to die? That is the conservative Republican policy approach. It's not hyperbole or exaggeration; it just is.
Health care is a right. Of course it is, especially in the richest nation in the history of the world, one with tremendous medical resources. We shouldn't allow poor people to die from treatable ailments. You'd think most Christians could be convinced of the only sane, sensible, moral approach. You'd think fiscal conservatives could be convinced of the only cost-effective approach. No such luck. While conservatives outside the United States believe health care is a right, here, unfortunately, it's a view shared only by the Left. In the United States, conservative and right-wing Christian ideology has long been wholly controlled by Corporate America.