Future wars may be fought by synapses
Neuroscientists consider defense applications of recent insights into the brain
By Laura Sanders
Web edition: Friday, November 11th, 2011
Instead of the indiscriminate destruction of the atom bomb or napalm, the signature weapon of future wars may be precise, unprecedented control over the human brain. As global conflicts become murkier, technologies based on infiltrating brains may soon enter countries' arsenals, neuroethicists claim in a paper published online October 31 in Synesis. Such "neuroweapons" have the capacity to profoundly change the way war is fought.
Advances in understanding the brain's inner workings could lead to a pill that makes prisoners talk, deadly toxins that can shut down brain function in minutes, or supersoldiers who rely on brain chips to quickly lock in on an enemy's location.
Brain-based technologies span a wide range that includes the traditional psychological tactics used in earlier wars. But the emerging technologies are vastly more capable and may make it possible to coerce enemy minds with exquisite precision.
In the paper, neuroscientists James Giordano of the Potomac Institute for Policy Studies in Arlington, Va., and Rachel Wurzman of Georgetown University Medical Center in Washington, D.C., describe emerging brain technologies and argue that the United States must be proactive in neuroscience-based research that could be used for national intelligence and security.
"A number of these different approaches are heating up in the crucible of possibility, so that's really increased some of the momentum and the potential of what this stuff can do," Giordano says.
In the not-too-distant future, technologies called brain-machine interfaces could allow the combination of human brains with sophisticated computer programs. Analysts with a brain chip could quickly sift through huge amounts of intelligence data, and fighter pilots merged with computer search algorithms could rapidly lock onto an enemy target, for instance.
Neuroscience could also find its way into interrogation rooms: As scientists learn more about how the brain generates feelings of trust, drugs could be developed that inspire that emotion in prisoners and detainees. Oxytocin, a hormone produced by mothers' bodies after childbirth, is one such candidate. Perhaps a whiff of oxytocin could dampen a person's executive functions, turning an uncooperative detainee into a chatty friend.
Other sorts of psychopharmacological manipulation could be used to boost soldiers' performance, allowing them to remain vigilant without sleep, heighten their perceptual powers and erase memories of their actions on the battlefield. Because neuroscientists are beginning to understand how the brain forms memories, it's not inconceivable that a drug could be designed to prevent PTSD. The same technology could have more sinister applications, though, such as creating soldiers who wouldn't remember atrocities they committed or detainees who couldn't recall their own torture.
Some of these abilities are more probable than others, says bioethicist Jonathan Moreno of the University of Pennsylvania in Philadelphia. Drugs exist that increase alertness, but so far no drug has clearly boosted brain function. "Honestly, there isn't much, compared to caffeine or nicotine," he says.
Giordano and Wurzman also describe drugs, microbial agents and toxins derived from nature that could harm enemy brains in a more traditional way. The list includes a water-soluble shellfish neurotoxin that can be aerosolized and causes death within minutes; a bacterium that can induce hallucinations, itchiness and strange tastes; and an amoebic microbe that crawls up the olfactory nerve to invade the brain, where it destroys tissue.
"The article contains an arsenal of neuroweapons, and these raise lots of ethical and legal issues," says bioethicist Jonathan Marks of Pennsylvania State University in University Park. "Any kind of drug that you administer for national security purposes raises profound questions."
Some scientists have already committed to resisting the application of their research to what they consider illegal or immoral military purposes. "It's not enough just to study the issue of ethics," says Curtis Bell of Oregon Health & Science University in Portland. "The potential for misuse of this knowledge is so strong that the responsibility of neuroscience goes further than just studying."
Bell has circulated a petition for neuroscientists, pledging signatories not to participate in developing technology that will be knowingly used for immoral or illegal purposes. "Neuroscientists should not provide tools for torture," he says. So far, about 200 neuroscientists from 18 countries have signed, he says.
Ideally science would have no place in combat, Giordano acknowledges, but that view ignores reality. "On one hand, what you'd like to say is science and technology should never be used to do bad things," says Giordano, who also holds positions at the University of New Mexico and the University of Oxford in England. "Yeah, and Santa Claus should come at Christmas and the Easter Bunny should come at Easter, and we should all live happily. History teaches us otherwise, so we have to be realistic about this."
The United States military is investing in brain-related research, though it's difficult to get a solid estimate of how much research is happening, Moreno says. The Defense Advanced Research Projects Agency, or DARPA, lists several neuroscience-related projects on its website, including "Accelerated Learning," "Neurotechnology for Intelligence Analysts" and "Cognitive Technology Threat Warning System."
"The fact of the matter is that we do live in a world in which there are people who would like to do bad things to us or our friends," Moreno says. "Eventually, some of this stuff is going to be out there."