Do you trust your doctor? Do you assume that he or she has your best interests in mind, all the time?

Most of us do, but some experts in the medical community think that trust is a bit naive. As one doctor bluntly put it: “Don’t trust your doctor. There’s no question in my mind that today most doctors are businessmen first and doctors second.”

The problem, he said, is that medicine is a business, and doctors make money by treating patients and writing prescriptions, often in close partnership with the drug companies.

He feels that this attitude puts patients in potential danger. If a doctor knows that a drug may be dangerous, but the risk hasn’t technically been proven yet, will that doctor keep prescribing the pills so the prescriptions keep flowing? Or will the doctor warn patients to be careful and put their health first?

He talked about a friend of his who was taking a medication later shown to increase the risk of heart attack. He claims that doctors did not tell their patients about the danger; his friend discovered the risks on his own and stopped taking the drug. Other patients later suffered serious heart complications and ended up in court. His friend avoided that outcome only by learning what the doctors could have told their patients all along — had they not wanted to keep selling those dangerous drugs.

This story paints doctors in a negative light, and it’s worth noting that many genuinely are trying to help. Even so, you need to know what risks you face and how the system works. If you suffer injuries or complications as a result of a dangerous drug, be sure you know what legal options you have.