Your brain on lying

Some scientists are claiming that fMRIs are the key to lie detection. Is it true?

What does your brain look like when you lie? [Image Credit: digitalbob8]
Posted January 30, 2012
Posted in: Life Science

Everybody lies. Whether it’s about big issues or little ones, lying seems to be an ingrained aspect of human psychology.

For decades, scientists have been investigating exactly what the body does when a person tells a lie. The most popular lie detection methods have centered on measuring heart rate, perspiration and general anxiety.

Now some researchers are becoming a bit more ambitious. By using a brain scanning system known as functional magnetic resonance imaging (fMRI) — a way of measuring brain cell activity by tracking blood flow — they are claiming to be able to “see” when a person lies.

Joel Huizenga, the CEO of No Lie MRI — a company specializing in fMRI lie detection — believes that his scanners could change the face of lie detection in the United States and around the world. “If this technology was accepted, people would have to stop telling lies.”

Huizenga began his career using MRIs to detect plaque accumulation patterns that could lead to heart disease. He authored a number of papers on the subject and has a master’s degree in molecular biology from State University of New York, Stony Brook, but he has not published papers specifically about fMRI lie detection.

If Huizenga’s machines are able to detect lying with 90 to 99 percent accuracy, as he claims, then fMRI is far more accurate than any lie detection method currently on the market. For example, polygraphs — the lie detectors of choice since 1924 — perform with about 60 percent accuracy.

Although No Lie MRI’s results may seem compelling, some scientists think that the data are not quite so convincing. “We don’t understand how the brain processes a lot of things,” says Steven Hsiao, a neuroscientist at Johns Hopkins University. “The more complex those aspects of perception and cognition, the more difficult it is to isolate them.”

Hsiao thinks that fMRI data is useful in certain cases, like creativity studies or potentially mapping out mental disorders, but he cautions that too many researchers could draw untenable conclusions from these scans.

Paul Glimcher, a New York University neuroscientist, agrees with Hsiao’s assessment. Glimcher explains that some researchers are able to tell if a subject is lying based on brain scans, but it isn’t as simple as Huizenga makes it seem. It takes extensive testing to be able to predict if an individual is lying, says Glimcher.

fMRIs operate by taking a series of snapshots of brain activity and then layering the pictures to create a high-resolution multicolored image. The more data a researcher collects, the more detailed the final picture. Although this seems straightforward, the machine can only measure one individual's brain at a time. Even the brain's anatomical structure — the physical lumps and depressions within a skull — varies greatly from person to person. Glimcher points out that because of these differences, it would be extremely difficult to generalize about the patterns any brain produces when lying. No two brains are alike, he says.

“No one in the public domain is getting results like [No Lie MRI],” says Glimcher. “It could be that they’ve figured it out, but I’m betting they haven’t.”

Huizenga has a different explanation for why these scans are not accepted among scientists and in the public sphere. To him, it has nothing to do with the science of the scan.

It’s all about the money, Huizenga says. “There is huge opposition to this. It’s because people are fearful of the government sticking their heads into an MRI and asking if they paid their taxes. They don’t even want people to know that anyone’s heard of it. People want to be able to lie.”

Huizenga started No Lie MRI by securing the proprietary rights to a scanner based on research conducted by Daniel Langleben, a University of Pennsylvania neuroscientist who has written extensively on the topic. Huizenga has attempted to get these scans into courtrooms since founding the company in 2005.

Judges around the country, however, seem to disagree with Huizenga. Tennessee and California courts have both thrown out fMRI scans because the system doesn’t withstand the so-called Daubert Standard. This test was established when Jason Daubert sued Merrell Dow Pharmaceuticals for creating a drug that his parents believed caused his birth defects. The case was eventually thrown out because the evidence Daubert referenced during the trial was not widely accepted within the scientific community. Judges now use the standard to decide whether certain scientific expert testimony should be admissible during a trial.

A study from Harvard published in 2009 revealed an immense amount of variability within fMRI data. It showed that cells in the prefrontal cortex, the brain’s personality center, activate in different patterns depending upon the kind of lie a person is telling. If a lie is simply a bending of the truth, the brain’s activity might look different than when someone is lying about something more serious.

Because there is so much unpredictability when analyzing this data, the authors of the paper — psychologists Joshua Greene and Joseph Paxton — think that fMRI lie detection is not commercially viable. “fMRI lie detection technology has not been shown to reliably detect lies in realistic contexts, with real people genuinely attempting to deceive authorities,” says Greene.

While the specifics of No Lie MRI’s technology are not available to the public, and Glimcher acknowledges the possibility that Huizenga’s scanner is a more advanced fMRI, he remains doubtful.

“This could have huge implications for our legal system. We’re nowhere near being able to tell when someone is lying from a brain scan,” says Glimcher. “Anyone who says that is either lying to you or to themselves.”

