Recently I saw a press release from the U.S. Attorney's Office reporting that a chiropractor had been sentenced in an insurance fraud case. Now this isn't a unique situation; through various sources, I hear about cases all over the country of chiropractors' misdeeds. I was distressed because the doctor involved is a former student. Unfortunately, this isn't unique either, as I have had a few former students run afoul of the law.
There are a few things that do make this particular case unique. He had been a patient of a good friend of mine since he was a pre-teen, and neither my friend nor I saw "criminal" in his makeup. I have had other students I didn't anticipate such problems with, but in every one of those cases I can point to a bad influence - the chiropractor they worked for - as the reason they lost their moral bearings. In this case, however, the doctor isn't an employee; he owns his own practice.
Over the years, I have thought about what causes an honest person to abandon their moral compass. One explanation comes from the experiments of Yale psychologist Stanley Milgram. He was interested in discovering how the Nazis got so many people to participate in the murder of millions in the Holocaust. Some question the research ethics of his study: the basic setup was that the subjects were deceived into thinking they were helping researchers conduct an experiment, when in fact they were the actual subjects. The experiment they thought they were conducting was designed to determine whether electrical shocks helped "learners" (who were really part of the research team) learn better.
The real subjects were told to give increasing levels of electrical shocks to the "learners." No actual shocks were given to the "learners," but they acted as if they had been shocked. In the first experiment, 65 percent of subjects administered what they believed were lethal shocks of 450 volts!
This series of studies showed that people tend to obey those in authority. Having researchers in lab coats, or from a prestigious Ivy League school, was apparently a big motivator. Additionally, people generally do what their peers do; thus, placing people who were "conducting" the same experiment near the subject, and having them report that they had given the maximum electrical charge to their "learners," proved effective. Finally, the subjects found it easier to comply when there was more distance (literal and figurative) between themselves and the victim. For example, in one experiment the subjects ordered someone else to administer the shocks.
In clinical practice, this might explain an associate doctor committing fraud under the orders of their boss. The fact that other associates also commit fraud would make it easier to go against one's own moral standard of honesty. And the fact that the insurance company makes the person harmed (the stockholder) so distant and faceless makes it easier still. This doesn't explain my former student's case, however, as no one in authority pushed him to commit fraud. I do not know whether he has friends or peers who committed the same kind of fraud but have not yet been indicted.
Let's look at other reasons why someone might commit insurance fraud. Of course, there is out-and-out greed. The ancient Greek dramatist Antiphanes wrote: "The quest for riches darkens the sense of right and wrong." However, I doubt this case is truly one of moral blindness. A person who is morally blind is a sociopath, one who judges right and wrong only by how behaviors affect themselves. They are consummate actors, able to deceive most people most of the time. It could instead be a moral blind spot - moral shortsightedness. The person with a moral blind spot is generally a moral person, but just doesn't see that a particular action is inconsistent with their basic ethical beliefs. This is similar to the person who takes food from a supermarket to "test" it, even when that is not appropriate. They simply don't see that what they are doing is wrong.
Someone with a moral blind spot can often see the problem with their actions once it is pointed out to them in a nonthreatening way; for example, by explaining the behavior as if it were a hypothetical situation. This can be a bit of moral self-deception/rationalization. If you asked the doctor about his actions before the FBI caught him, he might have said, "Well, the insurance companies are ripping my patients and me off, so it's OK to skim a little from them." As Milgram found, once one gets started on the path of doing something wrong, it is easier to do it again and with more intensity.
This was also the basis of what is called the "broken windows" theory of crime prevention. Former NYC Mayor Rudy Giuliani's crime prevention program included stopping the little crimes, the so-called "quality-of-life crimes," so that the petty criminal did not progress to bigger ones.
Fraud could be a defensive response to what doctors believe is the constant attack on our profession. This is sort of like saying, "If people weren't attacking us all the time, we would earn a decent living without having to resort to a little cheating."
In the end, until this doctor or someone else convicted in an insurance fraud case is willing to come clean and explain it all, I really don't know why. But I do know it's not worth it. Looking at the details of this particular case, it seems this doctor netted, after taxes (yes, money defrauded from insurance companies is still taxable income), about $100,000 a year for three years. Now $300,000 is a fairly large sum of money, but if one weighs the risks against the benefits, it really isn't that much.
Just think about it: This doctor now has to pay nearly double that amount in restitution, is selling his office and home, and has a jail sentence and a criminal record. The consequences of this crime on him and his family are far-reaching, to say the least. I say the risk was far greater than the reward. It's a trite aphorism, but one really has to think along the lines of, "Don't do the crime if you can't do the time."
Stephen M. Perle, DC, MS