Link: [On ABC News]
Be assured that this is a hard subject for me. For I was raised as an unapologetic capitalist, and what fuels capitalism is profit.
On the other hand, can good healthcare really be run as a business? For it is a standard business practice to abandon whatever is no longer profitable in favor of whatever is.
Alas, back when I was a devoted follower, I heard Rush Limbaugh say on many an occasion about business in general, “They’re not in business for their health. They’re in business to make money!” Is this the kind of doctor and/or hospital you want to go to when it is a matter of life or death, and aren’t the pharmaceutical companies in the same boat?
No, I am not saying that the people who work in those fields should not be allowed to make a good (even great!) living. For I recognize all of the effort involved. Nonetheless, they really should be in business for our health, along with their own, of course.
2 comments:
I recently heard a chiropractor cite statistics claiming that the leading cause of death in America is the health care industry itself, with hospitals, doctors, and drugs being the leading killers of Americans. I think there needs to be a radical redesign of the system, and people need to snap out of their 'Reader's Digest' mentality that all doctors are honest and all drugs are good.
It may be grossly unfair on my part, but it really bothers me that so many in the healthcare industry now look at patients as customers without feeling any obligation to provide customer service. Since it is all so profit-oriented these days, maybe the system should be changed so that they get paid in the same way a mechanic does? For when a mechanic doesn't fix your car, he doesn't get paid. Well, at least if he wants to stay in business, that is. Whereas it is an accepted practice to pay for everything a doctor does, regardless of whether it was right or wrong, and the only time you might get some of it back is when something catastrophic goes wrong.