Homeland’s Future Invasion of Privacy



This past summer, at an undisclosed location in a northeastern metropolis, the U.S. Department of Homeland Security (DHS) was trying to predict the future. There were no psychics or crystal balls, just a battery of sensors designed to determine human intention through the subtlest of changes in heart rate, gaze, and other physiological markers.

Together, the sensors are called Future Attribute Screening Technology, or FAST, a $20 million federal project that aims to highlight airport passengers whose bodies betray hostile intentions. In theory, FAST has the potential to detect terrorists in the final minutes before they act, but critics warn that the system may have other consequences, such as flagging innocent travelers through false positives while letting some with ill intent slip past through false negatives. The DHS, for its part, maintains that FAST is merely improving on a far older and more fallible crime predictor: human judgment.

About 3,000 DHS officers already roam the nation’s airports scanning for suspicious behavior and facial expressions in a program called Screening of Passengers by Observation Techniques, or SPOT. The automated FAST system is intended to supplement SPOT by catching signals that are undetectable to the naked eye. FAST is not designed to replace the decision-making of human screeners, but government officials hope it will eventually be able to passively scan airport passengers and single out those worth pulling aside for additional screening.

In recent trials, DHS recruited subjects and had them attend a mock event, such as a technology expo. Some of the subjects, chosen at random, were asked to perform an objectionable action at the event (not bring in a bomb, obviously, but perhaps steal a CD).
Before entering the expo, subjects reported to a kiosk containing a suite of body sensors, each able to take precise measurements from about 20 feet away: a cardiovascular and respiratory sensor measured heart rate and breathing, an eye tracker followed gaze and eye position, thermal cameras measured heat on the face, and floor sensors and a high-resolution video system tracked body movement.

That first round of measurements is an essential step, says FAST program manager Robert Middleton, since it ensures that individuals are measured against their own baseline rather than some universal standard of agitation. “The system was designed this way from the beginning to avoid simply identifying individuals who enter screening already anxious or angry,” he says.

After the baseline was established, volunteers were asked a series of questions ranging from innocuous (“Have you been in the area all day?”) to direct (“Are you planning to commit a crime?”). The interview acts as a stimulus: theoretically, it should trigger a more robust physiological response from conspirators than from innocent passengers. John Verrico, the DHS spokesman for the project, acknowledges the impracticality of a screening system that relies on interrogation but suggests commercial versions will be better. Ultimately, he says, the measurements would be taken in a process akin to passing through a metal detector, and only people with suspicious vital signs would be taken aside for questioning.

DHS’s faith in the technology is based on the controversial theory of malintent, developed in 2007 by clinical psychologist and FAST research consultant Daniel Martin. By combining ideas from neuroscience, psychophysiology, psychology, and counterterrorism, Martin concluded that the physiological signs of a future hostile actor would increase with the severity of the impending act and as the moment of the crime approached. If so, a terrorist who plans to blow up a plane in an hour should be easier to detect than a man who plans to cheat on his wife during a business trip.

Martin also concluded that the physiological signs, such as heart rate and skin temperature, would be too minute to manipulate, eliminating the possibility that terrorists could outsmart the system. “The system analyzes responses that people have little or no control over,” he claims. “And even if someone can avoid detection on one sensor, it is unlikely he can avoid detection on all.”

So far, DHS has tested FAST on more than 2,000 subjects; the results have been better than chance but not overwhelming. The system correctly determined whether a person was going to commit a malevolent act 78 percent of the time. FAST officials have released some of their findings for peer review and have repeatedly stated their intention to release more, but without complete data in the public forum, some scientists have questioned the feasibility of the program.

Stephen Fienberg, a professor of statistics and social science at Carnegie Mellon University, is one of the most vocal skeptics. In 2003 he led a National Research Council panel that found polygraph testing too imprecise for use as a screening tool in government hiring. Fienberg has reached a similar conclusion about FAST. “It’s mainly baloney,” he says. “What evidence do we have coming out of physiology, psychology, or brain imaging that we can do any of this? Almost all of what I’ve seen and heard is hype.”

Even if Martin’s theory holds up, the success of FAST also hinges on the reliability and sensitivity of sensors that were not designed for airport security. Middleton cites numerous internal studies showing that remote sensors have the sensitivity required to do the job and that they performed on par with sensors connected directly to the body.
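The base-rate arithmetic behind the critics’ false-positive worry can be made concrete. In the sketch below, only the 78 percent accuracy figure comes from the reported trials; the traveler volume and the one-in-a-million prevalence of genuine hostile intent are illustrative assumptions, not DHS figures.

```python
# Hypothetical base-rate illustration: the 78% accuracy comes from the
# reported trials; prevalence and traveler volume are assumed, not DHS data.
accuracy = 0.78          # reported rate of correct determinations
prevalence = 1e-6        # assumed fraction of travelers with hostile intent
travelers = 1_000_000    # assumed screening volume

true_threats = travelers * prevalence            # 1 genuine threat
innocents = travelers - true_threats             # 999,999 innocent travelers

flagged_threats = true_threats * accuracy        # threats correctly flagged
flagged_innocents = innocents * (1 - accuracy)   # false positives (~220,000)

# Fraction of all flagged travelers who actually pose a threat
precision = flagged_threats / (flagged_threats + flagged_innocents)
print(f"innocents flagged: {flagged_innocents:,.0f}")
print(f"chance a flag is a real threat: {precision:.6%}")
```

Even a detector that is far better than chance, applied to an extremely rare event, produces a flood of false alarms for every genuine hit, which is the false-positive concern the critics describe.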
“Scientists who have reviewed our work have typically been impressed with our progress,” he says. However, that work has yet to be scrutinized by the wider scientific community.

Criticism of FAST is not limited to scientific and technological issues. Some argue that the system violates a basic tenet of personhood: free will. Thinking about doing something and actually doing it are two very different things, says Steven Aftergood, a senior research analyst at the Federation of American Scientists, which provides science-based recommendations on national security issues. “FAST is potentially dehumanizing,” he says. “It denies that a person has the capacity to direct his own actions, to change his mind, and to adapt his behavior to new circumstances.”

The DHS has plenty of time to mull these issues, as FAST is years away from reaching a real airport. But Middleton insists the technology is worth pursuing. “Metal detectors and X-ray machines have been around for a long time, and most people assume they are necessary to keep citizens safe,” he says. “Our program is attempting to do something new, and we are using scientific methods to verify that it works.”

