First they saw us naked. Now they want to look deep inside our heads

Can a lie detector for the airport reveal a brain with bad intent?

This is an older report, from the January 2010 issue of IEEE Spectrum magazine. How far has this deception-detecting technology progressed since then?
The U.S. Department of Homeland Security (DHS), whose Transportation Security Administration operates airport security checkpoints in the United States, is spending upward of US $7 million a year trying to develop technology that can detect the evil intent of the terrorists among us. Yes, you read that correctly: They plan to find the bad guys by reading their minds.

Dozens of researchers across the country are in the middle of a five-year program contracted primarily to the Charles Stark Draper Laboratory, in Cambridge, Mass. They've developed a psycho-physiological theory of "malintent"—basically, a hodgepodge of behaviorism and biometrics according to which physiological changes can give away a terrorist's intention to do immediate harm. So far, they've spent $20 million on biometric research, sensors, and a series of tests and demonstrations.

The project is called the Future Attribute Screening Technology, or FAST. This is no mere fantasy, DHS officials insist. And it isn't: it's a noble fantasy. And it's destined to be a noble failure.

"We're not reading minds," says Robert P. Burns, a deputy director of innovation at the Homeland Security Advanced Research Projects Agency and the FAST project manager. "We're just screening."
Critics say there are two main problems with this approach. First, bad guys can train themselves, or take mood-altering drugs, to suppress the tell-tale signs of anxiety the system looks for. Second, and more important, the number of false positives would be immense.
Such a system would be useless in the field, says Bruce Schneier, an expert on applying technology to security problems and a frequent critic of airport screening systems. He notes that more than 700 million people board airliners every year in the United States, and there have been very few attacks. "Imagine an attack every five years," he says, "and a system with a very good 0.1 percent false-positive rate," meaning that of every 1000 people screened, one gets stopped but isn't a bad guy. "Over those five years, the system will still have 3.5 million false alarms." And that's for a system with far better performance numbers than the "87 percent accurate" system that DHS says it now has.
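Schneier's arithmetic is easy to verify. Below is a minimal back-of-the-envelope sketch in Python using only the figures quoted above; the single attack over five years is his hypothetical, and the perfect detection rate is an added best-case assumption, not a claim about FAST itself:

```python
# Back-of-the-envelope check of Schneier's false-alarm arithmetic.
# Figures from the article: ~700 million U.S. boardings per year,
# one attack every five years (his hypothetical), and a 0.1 percent
# false-positive rate -- far better than the "87 percent accurate"
# system DHS says it has.

boardings_per_year = 700_000_000
years = 5
false_positive_rate = 0.001  # 0.1%: 1 innocent flagged per 1,000 screened

total_screened = boardings_per_year * years
false_alarms = total_screened * false_positive_rate
print(f"False alarms over {years} years: {false_alarms:,.0f}")
# -> 3,500,000, matching Schneier's figure

# Base-rate problem: assume (generously) a perfect detection rate,
# so the one real attacker is always flagged. The chance that any
# given flagged passenger is actually a threat is still tiny.
attackers = 1  # assumed: one attacker over the five-year window
p_threat_given_flag = attackers / (attackers + false_alarms)
print(f"P(threat | flagged) = {p_threat_given_flag:.2e}")
# -> ~2.9e-07
```

Even granting the system a flawless hit rate, fewer than one in three million flagged passengers would actually be a threat. This base-rate problem is what dooms any screening system for events as rare as terrorist attacks.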
Loser: Bad Vibes. A quixotic new U.S. government security system seeks to look into your soul. IEEE Spectrum >>
