Homeland Security: to Read Our Minds & Emotions
#1
Homeland Security Embarks on Big Brother Programs to Read Our Minds and Emotions


By Liliana Segura, AlterNet. Posted December 9, 2009.

Half-baked Homeland Security is spending millions to develop sensors capable of detecting a person's level of 'malintent' as a counterterrorism tool.


In the sci-fi thriller Minority Report, Tom Cruise plays a D.C. police detective, circa 2054, in the department of "pre-crime," an experimental law enforcement unit whose mission -- to hunt down criminals before they strike -- relies on the psychic visions of mutant "pre-cogs" (short for precognition) who can see the future. It may be futuristic Hollywood fantasy, but the underlying premise -- that we can predict (if not see) a person's sinister plans before they follow through -- is already here.

This past February, the Department of Homeland Security (DHS) awarded a one-year, $2.6 million grant to the Cambridge, MA-based Charles Stark Draper Laboratory to develop computerized sensors capable of detecting a person's level of "malintent" -- or intention to do harm. It's only the most recent of numerous contracts awarded to Draper and assorted research outfits by the U.S. government over the past few years under the auspices of a project called "Future Attribute Screening Technologies," or FAST. It's the next wave of behavior surveillance from DHS, and taxpayers have spent some $20 million on it so far.

Conceived as a cutting-edge counter-terrorism tool, the FAST program will ostensibly detect subjects' bad intentions by monitoring their physiological characteristics, particularly those associated with fear and anxiety. It's part of a broader "initiative to develop innovative, non-invasive technologies to screen people at security checkpoints," according to DHS.

The "non-invasive" claim might be a bit of a stretch. A DHS report issued last December outlined some of the possible technological features of FAST, which include "a remote cardiovascular and respiratory sensor" to measure "heart rate, heart rate variability, respiration rate, and respiratory sinus arrhythmia," a "remote eye tracker" that "uses a camera and processing software to track the position and gaze of the eyes (and, in some instances, the entire head)," "thermal cameras that provide detailed information on the changes in the thermal properties of the skin in the face," and "a high resolution video that allows for highly detailed images of the face and body … and an audio system for analyzing human voice for pitch change."

Ultimately, all of these components would be combined to take the form of a "prototypical mobile suite (FAST M2) … used to increase the accuracy and validity of identifying persons with malintent."

DHS officials say that FAST, coupled with the Transportation Security Administration's Behavior Detection Officers, 3,000 of whom are already scrutinizing travelers' expressions and body language at airports and travel hubs nationwide, will add a potentially lifesaving layer of security to prevent another terrorist attack. "There's only so much you can see with the naked eye," DHS spokesperson John Verrico told AlterNet. "We can't see somebody's heart rate…. We may be able to see movements of the eye and changes in dilation of the pupil, but will those give us enough [information] to make a determination as to what we're really seeing?"

Ideally, Verrico says, FAST mobile units would be used for security, not just at airports, but at "any sort of a large-scale event," including sporting events or political rallies. ("When the Pope visited Washington D.C.," he says, "it would have been nice to have something like this at the entrance of the stadium.")

"Basically," says Verrico, "we're looking to give the security folks just some more tools that will help to add to their toolbox."

If you think eye scanners and thermal cameras sound like the twisted props of some Orwellian dystopia, you're not alone. FAST may be years from being operational, but civil libertarians have already raised concerns over its implications.

"We think that you have an inherent privacy right to your bodily metabolic functions," Jay Stanley, director of the ACLU's Technology and Liberty program, told AlterNet. "Just because somebody can build some high-tech piece of equipment that can detect your pulse and perspiration and breathing and heart rate, that doesn't mean that it should be open season to detect that on anybody without suspicion."

Besides, he says, the FAST program is based on "the same old pseudo-scientific baloney that we've seen in so many other areas. As far as I can tell, there's very little science that establishes the efficacy of this kind of thing. And there probably never will be."

Bruce Schneier, a security technologist and bestselling author who has been one of the most vociferous critics of such new high-tech DHS initiatives, concurs. In fact, he says, all the evidence suggests the opposite. "The problem is the false positives," he says.

Beyond the fact that ordinary travelers are likely to exhibit many of the symptoms supposedly indicative of malintent (how many people run to catch a plane and end up overheated and out of breath?), compare the rarity of terrorist attacks with the millions of travelers who pass through a security checkpoint. Statistically, Schneier argues, it's a fool's errand. "If you run the math, you get several million false positives for every real attack you find. So it ends up being as useless as picking people randomly. If you're going to spend money on something, you can spend money on dice -- it's cheaper. And equally as effective."
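
Schneier's "run the math" remark is a base-rate argument, and it can be checked with back-of-the-envelope numbers. The sketch below uses assumed, purely illustrative figures (the passenger volume, detector accuracy and number of real attackers are not from DHS or Schneier) to show why even a highly accurate screen drowns in false alarms when the thing it screens for is vanishingly rare.

```python
# Back-of-the-envelope base-rate calculation with assumed numbers:
# even a fairly accurate screen produces millions of false alarms when
# attackers are vanishingly rare among the people being screened.

def screening_outcomes(travelers, attackers, sensitivity, false_positive_rate):
    """Return expected (true_positives, false_positives) for one screening pass."""
    innocents = travelers - attackers
    true_positives = attackers * sensitivity            # real attackers correctly flagged
    false_positives = innocents * false_positive_rate   # ordinary travelers wrongly flagged
    return true_positives, false_positives

# Assumed, purely illustrative figures (not DHS data):
travelers = 700_000_000      # rough order of annual U.S. air passengers
attackers = 1                # actual attackers hidden in that population
sensitivity = 0.99           # chance the system flags a real attacker
false_positive_rate = 0.01   # chance it flags an ordinary traveler

tp, fp = screening_outcomes(travelers, attackers, sensitivity, false_positive_rate)
print(f"expected real attackers flagged:     {tp:.2f}")
print(f"expected innocent travelers flagged: {fp:,.0f}")
# Roughly 7,000,000 false alarms for every genuine hit -- the
# "several million false positives" Schneier is talking about.
```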

'The Theory of Malintent'

The FAST program and others like it have been in the works for a few years. In 2007, New Scientist reported on a DHS project called Project Hostile Intent, which "aims to identify facial expressions, gait, blood pressure, pulse and perspiration rates that are characteristic of hostility or the desire to deceive." Under the purview of DHS's Advanced Research Projects Agency (HSARPA), the project would "include heart rate and breathing sensors, infrared light, laser, video, audio and eye tracking."

According to New Scientist, "PHI got quietly under way on 9 July, when HSARPA issued a 'request for information' in which it asked security companies and U.S. government labs to suggest technologies that could be used to achieve the project's aims. It hopes to test them at a handful of airports, borders and ports as early as 2010 and to deploy the system at all points of entry to the U.S. by 2012."

Subsequent news reports have conflated Project Hostile Intent and FAST, claiming that the latter is the same program, under a new name. But Verrico says this is incorrect. They are two separate programs, both seeking to "find the things that we can't see with the naked eye."

FAST was inspired by what DHS officials refer to as the "Theory of Malintent." Don't bother Googling it; it seems to exist primarily in relation to FAST, apparently pioneered in the service of the program. According to Verrico, the theory was -- and continues to be -- developed by Dr. Dan Martin, an adviser to the program, who posits that one can identify specific physiological cues that are diagnostic of malintent. Currently, Verrico says, researchers are trying to devise an algorithm that can differentiate between people whose heart rate is up because they are, say, afraid of flying, and those who are potential terrorists about to carry out some sort of attack. Verrico says they are searching for the "combination of signs that will tell us the difference between somebody who's just stressed or out of breath or overheated or whatever … and somebody who really is planning to do something nasty." But can such (admittedly common) variables really be distilled into an equation and fed into a machine?
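
For readers wondering what "distilling those variables into an equation" might even look like, here is a toy sketch that combines a few physiological readings into a single score with a logistic formula. Every feature, weight and threshold in it is invented for illustration; it has nothing to do with the actual FAST algorithm, which has not been made public, and it mainly shows how easily an ordinary stress response would cross whatever threshold such a formula sets.

```python
import math

# Purely hypothetical "malintent score": a logistic combination of a few
# physiological readings. The features, weights and threshold are invented
# for illustration and do not describe the (non-public) FAST design.

WEIGHTS = {
    "heart_rate_bpm": 0.03,        # elevated pulse
    "respiration_rate": 0.05,      # rapid breathing
    "skin_temp_delta_c": 0.4,      # facial warming seen by a thermal camera
    "gaze_shifts_per_min": 0.02,   # fidgeting picked up by an eye tracker
}
BIAS = -6.0
THRESHOLD = 0.5

def malintent_score(readings):
    """Map sensor readings to a 0-1 score with a logistic function."""
    z = BIAS + sum(w * readings.get(name, 0.0) for name, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

# A traveler who just sprinted to the gate: overheated and out of breath.
late_for_flight = {
    "heart_rate_bpm": 120,
    "respiration_rate": 28,
    "skin_temp_delta_c": 1.5,
    "gaze_shifts_per_min": 40,
}
score = malintent_score(late_for_flight)
print(f"score = {score:.2f}, flagged = {score > THRESHOLD}")
# The innocent stress response alone pushes the score over the threshold,
# which is exactly the false-positive problem the critics describe.
```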

Stanley argues that it is misguided to place so much faith in "this idea that everything can be reduced to machinery and numbers." He says it shows naivete on the part of government officials about the limits of technology. He also blames it on "vendors pushing expensive new products." In the search for the next cutting-edge counter-terrorism tool, DHS has thrown millions of dollars at scientists purporting to be developing the Next Big Thing in security technology. As private military contractors know, providing security equals big bucks.

"I've heard it called the 'Security Industrial Complex,'" says Schneier. "There's money to be made and there are people out there who are going to say it can be done. And, yeah, it's techie and sexy and sounds good."

Schneier, who travels around the world speaking about the intersection of security and technology, says this has been especially true since 2001. "After 9/11 the money faucets turned on. And anybody with any half-baked security idea got funded."

Technology v. Fourth Amendment

It will probably be years before FAST is implemented. "It's sort of at the 'gee whiz' stage," says Stanley. The technology has been tested on human subjects only twice: once last year, at the Prince George's County Equestrian Center in Upper Marlboro, MD, and again this summer at Draper Labs.

According to Verrico, the demonstrations were partly intended to test the theories behind FAST, "but were mostly done to demonstrate the system to government observers and the media."

"We can't go into too much detail on the laboratory protocol," he says, "but basically, participants were told they were going to attend a special event. Some of them were asked to create some sort of disturbance. As they were entering the facility, they were asked a series of questions while being observed by the various sensors. The earlier tests were done to determine whether the sensors could detect the physiological signs we were looking for and to validate their accuracy. For example, some people wore contact heart monitors and readings were compared to those picked up by the remote sensor."

Verrico is quick to clarify that none of the study's participants had their personal data stored; last December DHS issued an official Privacy Impact Statement asserting that subjects would have their privacy vigorously protected.

As for broader privacy concerns about the program itself, Verrico denies there's a problem. "We're not X-raying you," he says. Besides, "these are things that you are already presenting. Your body temperature is what it is. The fluctuations on your skin are what they are. Your heart rate is what it is. All we're doing is trying to see it a little better."

But when similar logic was presented to the Supreme Court in Kyllo v. United States a few years back, the justices concluded that it did amount to a violation of the Fourth Amendment. In that case, federal agents used a thermal imaging device to scan the home of an Oregon man named Danny Lee Kyllo. According to authorities, an unusually high level of heat was radiating from Kyllo's garage compared with the rest of the house, suggesting that there were high-intensity lamps inside, of the type used to grow marijuana. On these grounds, federal agents searched the house, uncovering more than 100 marijuana plants, a crime for which Kyllo was subsequently convicted. Kyllo's appeal reached the Supreme Court, and in 2001, the justices ruled 5 to 4 in his favor.

"It would be foolish to contend that the degree of privacy secured to citizens by the Fourth Amendment has been entirely unaffected by the advance of technology," Justice Antonin Scalia wrote for the majority. "The question we confront today is what limits there are upon this power of technology to shrink the realm of guaranteed privacy."

'We Don't Live in a Police State'

Existing precursors to FAST, like the TSA's SPOT (Screening Passengers by Observation Technique) program, have so far had pretty dismal results. As I reported last month, in 2008 alone, TSA's Behavior Detection Officers across the country pulled 98,805 passengers aside for additional screenings, out of whom only 813 were eventually arrested. SPOT's defenders argue that at least this means we are catching "bad guys" -- as Dr. Paul Ekman, who helped pioneer the program, told AlterNet, "I would think that the American public would not feel badly that they are catching money or drug smugglers, or wanted felons for serious crimes" -- but Bruce Schneier calls this "ridiculous."

"I can just invent a program where I arrest one in every ten people in the street," he says. "I guarantee you I'm gonna catch bad guys. I mean, shoot, how about we arrest everybody whose name starts with G?"

"We don't live in a police state," says Schneier, "so be careful of the logic, 'Well, you know, we catch some bad guys.'"

Jay Stanley hopes the FAST machinery will never get off the ground. "But it's possible that this kind of thing could be perceived as blunderingly effective, even though it's violating privacy rights and it could catch people who are nervous for other reasons," he warns. "The authorities could push to expand it and that's a very troublesome notion. I think that only concerned citizens making their voices heard could stop it if things turn out this way."

"I think maybe we need more English majors in the Department of Homeland Security," he jokes, "because each person is like a walking War and Peace: We all have complicated lives that could be written into thousand-page novels. The idea that somebody could take a snapshot of our breathing rate and decide that, of all the possible sources of human stress and excitement, that it is a terrorist attack we're plotting is simply absurd."
"Where is the intersection between the world's deep hunger and your deep gladness?"
#2
Yeah, I saw another article on this matter. They are even trying to be able to read thoughts, not just mood. So, I guess now if you are about to seek revenge on an unfaithful lover, et al., you'll be arrested for mal-intent; held in some dungeon without a lawyer and tortured. Soon only those mindless happy-face types will be safe. Have a nice day!

What they really want is a mental loyalty test - the loyalty to what is in question. Corporate fascism, methinks.....and mindlessness - certainly no political convictions at odds with the 'authorities'; even those [allegedly] allowed under the [say what!] Constitutional Rights [say what?].
"Let me issue and control a nation's money and I care not who writes the laws." - Mayer Rothschild
"Civil disobedience is not our problem. Our problem is civil obedience! People are obedient in the face of poverty, starvation, stupidity, war, and cruelty. Our problem is that grand thieves are running the country. That's our problem!" - Howard Zinn
"If there is no struggle there is no progress. Power concedes nothing without a demand. It never did and never will" - Frederick Douglass
#3
The ability to read thoughts and emotions (if it is ever feasible and accurate?) is mind-numbingly dangerous, as sooner or later it will be spun off into various corporations that can use it to test employees and potential employees.

We must pray it remains in the realm of science fiction as mankind will be truly Godlike with such a tool. Imagine the state of the world under such conditions. The phrase "fuck up" doesn't even come close.
The shadow is a moral problem that challenges the whole ego-personality, for no one can become conscious of the shadow without considerable moral effort. To become conscious of it involves recognizing the dark aspects of the personality as present and real. This act is the essential condition for any kind of self-knowledge.
Carl Jung - Aion (1951). CW 9, Part II: P.14

