The phrase, that is. A researcher found that this label, employed by conservatives, the AMA, and the insurance lobby at various times in our history, doesn't carry the negative connotation it once did.
From the article:
• Of the respondents, 67 percent said they understood what "socialized medicine" meant. Of those, 79 percent said the term means that the government makes sure everyone has health insurance. Only 32 percent said it means that the government tells doctors what to do.
• Of those who said they understand the term, 45 percent said that if America had socialized medicine, the health care system would be better, while 39 percent said it would be worse.
• Not surprisingly, opinions differed according to respondents' politics. Among Republicans, 70 percent thought socialized medicine would make the health care system worse. Among Democrats, 70 percent thought it would make things better. Independents were split more evenly, with 45 percent saying that socialized medicine would be an improvement, and 38 percent saying it would be worse than the country's current health care system.
"It's still an emotionally charged term for Republicans. The phrase itself gets them very angry," Blendon says. "But Democrats and independents don't see it as a term that drives them away."
Do you find this to be good news or bad? If bad, what new phrase would you nominate as a more effective replacement?