Should a 15-year-old be able to vote? Should a 13-year-old be able to decide when to go to bed? Should a 16-year-old be able to decide to sext with their significant other? Should a 12-year-old be allowed to choose what to eat and what not to eat for dinner?
Chances are that in answering these questions, at least some people will turn to neuroscience for their answers. Brain imaging and brain science have become central to many discussions of young people’s autonomy and young people’s choices. A typical popular article declares that “we now know why” young people get drunk at parties or wear outlandish clothes. “The areas of the brain that control decision-making don’t fully develop until early adulthood.” Another headline reads, “Why teens make bad decisions: Blame the brain.” Young people shouldn’t vote; they shouldn’t sext; they shouldn’t be trusted to choose food or bedtimes. Their brains don’t work right. It’s science.
But it isn’t science. There is no straightforward way to get from findings in neuroscience to an explanation of why a young person’s sense of style is different from their parents’. You can’t look at a frontal lobe and understand why a young person—or an adult, for that matter—decides to get drunk on a particular night. Using research studies in brain imaging to explain individual behavior is not supported by science. It’s quackery. It doesn’t show that young people can’t make decisions. It does show, though, that adults, developed brains or no, often look for excuses to make poor decisions about children.
Adolescent risk-taking
The most famous and most cited neuroscience findings about adolescents involve risk-taking. A good deal of research suggests that adolescents and young adults take more risks than older people. They text while driving, for example, and may engage in less safe sexual practices. Brain imaging studies suggest that young people’s frontal cortex doesn’t fully develop until they are in their 20s. The frontal cortex is responsible for decision-making, emotional regulation, and impulse control. So many popular accounts conclude that adolescents make poor decisions because they have a cognitive deficit; their brains aren’t yet fully formed, and so they cannot control reckless and dangerous desires.
This is a straightforward conclusion that fits with folk and pop portrayals of adolescents as id-driven, irresponsible, and out of control. However, the science supporting it is thin. In 2008, Temple University psychology professor Laurence Steinberg cautioned: “It is important to point out that our knowledge of changes in brain structure and function during adolescence far exceeds our understanding of the actual links between these neurobiological changes and adolescent behavior.” In other words, while scientists have brain imaging data about how adolescent brains develop, and some information about adolescent behavior, they don’t know that the first causes the second.
A 2017 review of the literature argued, in fact, that teens are not especially impulsive. Instead, a lot of behavior that is labeled as dangerous or risk-taking is in fact exploratory. Lead author Daniel Romer, Ph.D., research director of the Annenberg Public Policy Center of the University of Pennsylvania, specifically argues that adolescent behaviors “are not symptoms of a brain deficit.” Teens find novel experiences stimulating. This can have some downsides. But it’s also vital because teens are at a point in their lives where they need to learn lots of new things—how to drive, how to form new peer relationships, how to gain new skills for jobs. “Risk taking” in many cases is just young people trying things for the first time. You could just as easily say that adult brains have deteriorated, and use that to explain why adults are inflexible, staid, overweight, dress badly, and listen to crappy Billy Joel and Phil Collins records.
Blaming adult brains for Billy Joel seems ridiculous, in part because we assume adults are the standard and that an adult brain is the ideal, fully functional brain to have. But in fact, there is no one point of stable optimum brain development. Brain development and wiring can continue into the late 20s. That’s also the point at which cognitive decline can begin. “Men in their 40s shed 50% of the cells in their hypothalamus,” Stephanie Simon-Dack, a professor of psychological science at Ball State University (and, full disclosure, my cousin) told me. “We do see huge brain changes in older people that can lead to in some cases more adaptable thinking and in some cases less adaptable thinking. There’s a spectrum.”
There’s no point at which people attain some perfect pinnacle of brain development and judgment. Humans are always changing and flawed. And for that matter, there isn’t one perfect brain optimization for all situations and all life periods. Someone more comfortable with risk and change may be better able to deal with some situations; someone more risk-averse may make better decisions in others. The fact that adolescent brains are different from older people’s brains doesn’t mean that older people make better decisions. It was older people, after all, who were more likely to vote for Donald Trump.
Rights and autonomy
Using brain science or assumptions about intelligence and mental competence to make decisions about people’s rights and autonomy also has an ugly history. “Neuroscience is a terrible route for this,” Simon-Dack says. “Saying, ‘Oh, look at the science,’ when you don’t really understand the science or you’re just picking and choosing the science—I think it’s dangerous.”
In the early 20th century, respected scientists argued that women shouldn’t vote because they had inferior brains and too much thinking and deliberation harmed their ability to bear children. Psychologist Arthur Jensen in the 1970s, and political scientist Charles Murray in 1994, argued that Black children and children of color had less innate cognitive ability than white children; the argument implicitly justified less expenditure on education for Black children.
Arguments from “brain science,” then, have traditionally told us less about the brains of marginalized people and more about the interests of those in power. Popular uses of brain science tell us little about the behavior or capacity of marginalized people. But they often tell us a depressing amount about how people justify inequality and the status quo.
Neuroscience and policy
It’s true that there are some cases in which arguments about children’s brains have been used to advance admirable policies. Neuroscience findings influenced a Supreme Court decision to end life without parole for juveniles. Similarly, age-of-consent laws sometimes use neuroscience as a justification. But these decisions are a double-edged sword. Justice Antonin Scalia argued that if juveniles really did not have the capacity for decision-making, then young people’s right to decide on abortion should be curtailed. Arguing for children’s incapacity tends to make them more, not less, vulnerable to state intervention. Advocates of shorter sentences for youth might do better to argue for young people’s inexperience and capacity for change over time. And the disproportion of power between adults and young people provides a firmer footing for age-of-consent laws than the uncertain science around brain imaging does.
In general, instead of basing policy on young people’s brains, we should think about what it means that people are so eager to embrace inconclusive scientific findings to abrogate young people’s autonomy. When parents demand that a young child hug a relative, even if they don’t want to, is that a sign that adult brains are more advanced? Or is it just a sign that adults don’t take children’s consent seriously? Is too much screentime really a huge threat to young people? Or are parents misassessing risk because they’re nervous about new things? If parents spank their children, is it because young people’s brains can’t be trusted? Or is it because older people can’t?
Rather than outsourcing our ethics to tendentious and ill-informed readings of neuroscience data, we need to acknowledge that the real corrupter of judgment, of kindness, and of tolerance is power. If we refuse to let young people advocate for themselves, or refuse to grant them bodily autonomy, it is not because there is something wrong with their decision-making capacity. It’s because there is something wrong with ours.