
Impulsive destruction? Frontal-lobe damage. Explosive anger? Amygdala damage. The likeliest reason Charles Whitman killed his mother and wife, then climbed a tower at the University of Texas at Austin and shot 45 people with a high-powered rifle? A brain tumor, found in an autopsy, compressing the amygdala. Whitman left a note saying he didn’t understand his own mind anymore: “I cannot rationally pinpoint any specific reason for doing this.”
Then there are the true psychopaths, whose brains have never fully registered other people’s fear or pain. They can reason logically, but they don’t possess enough empathy to develop a moral compass, and they don’t experience guilt or remorse. There was a time we would have called them “soulless.”
Back then, society blamed violent crime on spiritual causes: Satan lured us into sin. In that schema, justice meant punishment and retribution.
In the 20th century, reformers cited social causes instead—poverty, lack of education, a childhood history of neglect and abuse. Slowly and spottily, emphasis started to shift to rehabilitation.
Now neuroscience is pointing to physical causes, from shrunken or squeezed brain structures to faulty neurotransmitters.
And that throws both punishment and rehabilitation into question.
•
Psychology never won much respect in the legal system: too fuzzy for the judges, too complicated for the juries.
Neuroscience, now—that’s a different story. It’s harder, more empirical, and it’s even got brain scans for show and tell. Neuroscience can wow a jury the way DNA evidence does. And what it’s finding could someday rewrite our laws and reorganize our prisons.
The problem is, neuroscience is still in its infancy. It’s just beginning to give us insights into why somebody might have committed a certain crime, how much conscious control he would have had over his actions, and whether treatment could prevent him from doing it again. It’s also beginning to correct some dangerous assumptions about how people (juries as well as criminals) make decisions. And it may soon be able to tell us, far more reliably than the polygraph, whether someone is lying.
Neuroscientists are the first to urge wariness, however. They say the companies selling expensive functional magnetic resonance imaging (fMRI) scans that purport to "prove" deception are premature: showing a jury a picture of somebody's brain activity at a given moment, or pointing out some structural quirk of size or shape, is misleading at best.
Enter the MacArthur Foundation’s Law & Neuroscience Project.
Dr. Marcus Raichle—a Washington University School of Medicine neurologist with expertise in radiology, neurobiology, and psychology—helped shape the project and co-directs one of its sections.
He remembers MacArthur writing to 100 of its fellows four years ago, asking what the foundation should be paying attention to. Robert Sapolsky, a neuroendocrinologist at Stanford University, fired back a suggestion. “He said that because of the advances in our understanding of how the brain works, neuroscience has the possibility of radically changing the criminal-justice system,” Raichle recalls. “Provocative, to say the least. It got their attention, and a group was assembled to give some thought to this.”
Raichle was a member, as was his friend Michael Gazzaniga, a psychologist who directs the SAGE Center for the Study of the Mind at the University of California, Santa Barbara. Famous for his work with split brains, Gazzaniga is known as one of the founders of cognitive neuroscience. “He called me and said, ‘Hey, are you interested in neuroscience and the law? Stay tuned,’” Raichle recalls.
The Law & Neuroscience Project’s initial committee mushroomed into working teams, then drew in neuroscientists, legal scholars, and philosophers from around the country. Raichle wound up on the board of governance, whose honorary chair was former Supreme Court Justice Sandra Day O’Connor. He also co-directs a research network with a two-pronged focus: legal decision-making (how criminals, judges, and jurors make decisions) and neuroscientific evidence in the courtroom. A second research network focuses on criminal responsibility (how to assess it in cases of brain trauma or damage, psychopathy, or addiction) and on predicting future behavior and response to treatment.
Now finishing its third and final year, the project has aimed:
• To figure out how much neuroscience is being used already—and with what results—in courtrooms, parole hearings, sentencing recommendations, or plea bargains.
• To bring neuroscientists and lawyers together to rethink addiction and psychopathy, both of which have roots in the physical structures of the brain.
• To give judges and lawyers a guide to neuroscience: what the terms mean, what's possible and what's not, and what can reliably be concluded from a brain scan.
Participants say it’s been a bit like a reality show, watching the interactions between lawyers—who navigate a centuries-old, black-and-white system—and neuroscientists, who probe delicate, complex brain responses we’ve never seen before. They’ve managed to learn a common language (“Suddenly, I’m talking about ‘mens rea’ [criminal intent],” Raichle chuckles) because their common goals are so clear.
Move to the deeper, underlying issues, though, and nobody agrees.
“The question is whether understanding neuroscience challenges the law’s fundamental notions of free will and responsibility,” explains Josh Greene, an assistant professor of psychology at Harvard University who helped fuse the new field of neurophilosophy. Intuitively, he says, we assume “there is a nonphysical, conscious self pulling the strings, a deep inner you that is independent of the physical mechanisms of your brain in the world. But human behavior is ultimately caused by physical mechanisms, and there’s no separate ‘mind’ that’s in charge. The notion that there is wickedness in your soul is a joke.
“That doesn’t mean the notion of responsibility goes out the window,” he adds quickly. “But the impulse to punish people simply because they ‘deserve’ it—deservingness and choice, in a deep metaphysical sense, are an illusion. We make choices, and they are informed by our beliefs and values, but these are all ultimately physical things that are encoded in the neurons inside our heads.
“What everyone wants to know is, ‘Is it really him, or was it a brain tumor?’” Greene adds wearily. “But the neuroscientific view of the world is that there is nothing but your brain, and any bad behavior is going to have some physical explanation, some misfiring. No one is ‘deep in their soul’ innocent or guilty.”
Greene takes a pretty radical position on criminal responsibility. “When we see actions as physical events, rather than soul events, we stop being moralistic and start being pragmatic,” he says. “We need to rethink our criminal-justice system so it’s about what will work and not about retribution. ‘Are you the kind of criminal who can respond to a positive environment, or do we just need to contain you?’ Right now, we put everybody into prison—where might makes right—and we don’t think about the fact that most of these people are going to come out again. It’s just ‘Get these people away from us, and make them suffer.’ But spending years in a social network of criminals is not very good training to come back to society.”
•
Early in the Law & Neuroscience Project, Wash. U.'s Raichle found himself sitting in the audience, listening to the dean of the College of Law at the University of Illinois at Urbana-Champaign give "a dazzling lecture full of legalese. She was talking about the issue of consent, and one word stuck in my craw: 'reasonable.' It's a concept in the law: 'the reasonable person.' But when someone makes a decision, how much of that decision-making process is even conscious? Most of what your brain is doing has to do with internal operations, and very little, surprisingly, with information coming to you. The amount of data that gets to our brain is a fraction of what is out there, and the amount that gets into the brain is far less. Your brain is taking an impoverished amount of information and making a best guess. If every step had to be reasoned out, you would never get out of bed."
Not only do we work from limited data, but as Gazzaniga’s split-brain studies showed, the interpreter part of our brain often provides rational explanations for our decisions after we’ve made them. “So here I am, listening to elegant legal discussions about consent, and realizing this is really complicated,” Raichle grins. “It gets into the touchy discussion of free will. One of the judges said, ‘Man, that just scares the bejesus out of us.’ You remember Flip Wilson, ‘The devil made me do it’? Well, you could turn that around and say, ‘The brain made me do it.’”
Brain researchers have found instances in which the brain is already starting to produce an action before the individual makes a conscious decision to act. They have found instances in which we make a choice irrationally, and then swiftly rationalize it to sustain the illusion that we’re being reasonable. They note that with certain types of brain damage, deficit, or trauma, it is possible to know the difference between right and wrong, yet be unable to regulate your behavior.
“Organic impairment,” Sapolsky calls it. “Neurobiologically based volitional impairment.” Or, in kindergarten terms, the inability to “make good choices.”
The legal system determines the severity of an offense by considering how consciously and deliberately the person acted. But what if our actions aren’t as conscious as we think they are—or as controllable?
Take, for instance, George Porter Jr., a Korean War veteran who pleaded guilty to murdering his former girlfriend and her boyfriend in Melbourne, Fla. In December, the Supreme Court overturned his death sentence, because Porter’s lawyer had failed to inform the jury that Porter had suffered lifelong psychological effects from childhood abuse and horrific combat experiences.
Sapolsky’s own research focuses on the effects of stress on the brain: on memory, decision-making, impulse control, anxiety, fear, pleasure, and depression. “An emerging literature suggests that chronic stress—most interestingly, in the form of low socioeconomic status in kids—produces a frontal cortex that doesn’t work as well in decision-making and impulse control,” he notes. “And a newer literature is showing that stress can cause atrophy of neurons in the frontal cortex. All this winds up being very relevant in making sense of why some individuals are better at regulating their behaviors than others.”
Asked what distortions drive him craziest, Sapolsky, like Greene, mentions the common assumption “that there is a person who is somehow separate from the sum of their neurons/neurotransmitters/enzymes/etc.” That separate person we’d all like to hypothesize would have complete control over his or her behavior, regardless of damage or deficits in the brain, and could therefore be held responsible for all actions.
That separate person doesn’t exist—and we’re not sure what to do with that realization.
An Italian court, for instance, recently reduced convicted murderer Abdelmalek Bayout's sentence by a year, because he possessed five genes linked to violent behavior, most notably the low-activity variant of the monoamine oxidase A (MAO-A) gene, dubbed "the warrior gene" in the popular press. Studies show this variant may predispose individuals to an aggressive response when provoked. A researcher at Florida State University recently found men with the variant twice as likely to join gangs and, once in those gangs, four times as likely to use weapons.
The notion that genetic makeup should be considered in sentencing provokes a bit of a violent reaction itself.
“Do we have no free will?” Raichle asks. “Some people would say we have ‘free won’t.’ To think we are completely rational about everything is a complete fiction. But we can consciously interrupt the brain’s programs. We have veto power.”
Walter Sinnott-Armstrong, co-director of the Law & Neuroscience Project and Stillman professor of practical ethics at Duke University, also lands somewhere in the middle. “People ought to recognize that responsibility is not an on-off proposition,” he says firmly. “You get knee-jerk reactions when you suggest that people with brain problems are not responsible for their actions. But saying that doesn’t mean they are not at all responsible—simply that they’re not fully responsible.
“The law’s been so black-and-white,” he adds. “If you think that way, your picture is going to be way too crude to capture the subtleties of real human situations.”
•
The most chilling example of our old assumptions’ irrelevance? The psychopath.
Neuroscientist Kent Kiehl, an associate professor at the University of New Mexico and a principal investigator for The Mind Research Network, is the colleague to whom everybody else in the Law & Neuroscience Project instantly refers questions on psychopaths. He’s been screening their brains with a mobile fMRI unit, and he believes psychopathy is caused by a defect in the parts of the brain that process emotion, form inhibition, and control attention.
Kiehl’s working his way through the 15 to 30 percent of the North American prison population that scores high on the Hare Psychopathy Checklist–Revised (PCL-R). The list looks for such endearing traits as grandiose self-image, deceitfulness, conning, manipulation, and a parasitic lifestyle. (And no, it’s not self-reported; these people lie.)
We spend only a fraction as much studying psychopathy as we spend on, say, schizophrenia research. As a result, there's a lot we don't know yet: how hereditary psychopathy is; whether you can be just a little bit psychopathic; how many "successful" psychopaths are out there, leading law-abiding lives.
“Ted Bundy was halfway through law school,” notes Kiehl. “If you had assessed him without knowing he was a serial killer, he wouldn’t have scored high on our checklist. We have no measures that adequately capture the lack of guilt and remorse in a community sample.”
There was little doubt, though, about Brian Dugan, an Illinois serial killer who pleaded guilty to raping and murdering a 10-year-old girl. He scored 37 out of 40 points on the PCL-R, putting him in the upper half of the 99th percentile, and fMRI scans were used in the sentencing phase of his trial in an effort to illustrate deficits in his brain. Called as an expert witness, Kiehl carefully avoided suggesting that brain abnormality caused Dugan to commit his crimes, but said the data clearly provided "evidence that his brain is different."
The jury initially returned a life sentence, then went back into deliberation and gave Dugan the death penalty. But the use of fMRI evidence set an interesting precedent.
Roughly one-fifth of the prison population meets the diagnostic criteria for psychopathy. A Canadian study showed that the average psychopath is convicted of four violent crimes by the age of 40. Psychopaths are the most likely offenders to wind up back in prison within a year of release.
Their impulsivity decreases with age, but the core traits—lack of empathy, guilt, and remorse—remain stable. And because they lack empathy and remorse, it doesn't even bother them to know they lack empathy and remorse.
“They see all that as an encumbrance that burdens the rest of us,” Kiehl observes wryly. “They don’t like the label ‘psychopath,’ though, because it keeps them from getting out of prison. I’ve had one inmate cross out the word ‘psychopath’ and write in ‘superman.’”
•
Asked where the legal system most needs to exercise caution, Sapolsky replies by email, “Premature, ridiculous (mis)uses of neurobiology.” His best example? “The use of functional brain imaging to tell if someone is being truthful, most unnervingly in this company called No Lie MRI. There is zero scientific basis for it being used in the courts.”
According to No Lie MRI’s website, “The technology used by No Lie MRI represents the first and only direct measure of truth verification and lie detection in human history! No Lie MRI is a proprietary product that objectively measures intent, prior knowledge, and deception.”
Raichle snorts. “No Lie MRI and Cephos are two companies that will for several thousand dollars administer a lie-detection test using fMRI imaging. The evidence that it works any better than the polygraph is unconvincing.” After years teaching “the CIA and the FBI, all the spy catchers” about lie detection and the flaws of the polygraph, Raichle was ready to stop. “It was interesting, but I thought, ‘Gol’ darn, that’s enough,’” he says. Then he heard that one of the fMRI lie-detection tests was going to be introduced into a courtroom, and he sighed heavily and agreed to join his colleagues as expert witnesses challenging its accuracy. “The company withdrew,” he says.
David Faigman, the John F. Digardi Distinguished Professor of Law at the University of California Hastings College of Law, understands why attorneys are so tempted by such claims. With polygraphs ruled inadmissible in court, there’s a burning need to pinpoint liars and validate truth-tellers. “Brain imaging is a brand-new technology, and it just might win acceptance from the courts,” Faigman says, explaining that instead of relying on secondary, variable responses such as blood pressure, heart rate, or sweating, it goes straight to the brain. “The theory is that the brain looks different when it’s being deceptive than when it’s simply retrieving information. And if you can see it in the brain, that is going to be very persuasive to a jury.
“But I don’t think we’re there yet.”
Faigman fully expects neuroscientific evidence to reliably identify broad, general aspects of deception—as well as signs of physical pain or psychological trauma—within the next 10 or 20 years. Not next week. “Some lawyers have a tendency to use science before it’s fully baked,” he says ruefully. “Lawyers are always looking for the next best thing, and neuroscience is very sexy.”
The jury’s out on the degree to which fMRI and other brain imagery will bias a jury. Some studies have suggested that jurors might automatically give such evidence more weight, but recent research confounded the Law & Neuroscience experts by suggesting just the opposite.
In any event, “snapshots” of the brain will need to be interpreted in context—and that’s not easy. “We show you scans with pretty things lit up, as though we turned something on,” Raichle says. “You assume the brain wasn’t doing anything until something activated it. And that’s absolutely false. The picture only represents a few percentage points’ change. Cells in the brain are talking to each other continuously, and the conscious part is only the tip of the iceberg.
“It’s easy to look at what the brain does when you ask it to do something,” Raichle adds. “It’s an entirely different matter to ask what the brain is doing when it’s not doing anything you can see. What people used to call ‘noise’ contains a huge amount of information.”
Slowly, that information’s becoming accessible to us. But if it enters the courtroom prematurely, it runs the risk of getting all similar evidence pronounced inadmissible. Setting that precedent will make it virtually impossible to get such evidence into court once it really is reliable.
And it’s too promising to risk that.
The only blog that staff writer Jeannette Cooperman reads faithfully (other than SLM’s) is The Frontal Cortex—which has her convinced that neuroscience is explaining human nature far better than Freud did.