Saw this linked at SayUncle.
At airport security checkpoints in Knoxville, Tenn., this summer, scores of departing passengers were chosen to step behind a curtain, sit in a metallic oval booth and don headphones.

And:

With one hand inserted into a sensor that monitors physical responses, the travelers used the other hand to answer questions on a touch screen about their plans. A machine measured biometric responses -- blood pressure, pulse and sweat levels -- that were then analyzed by software. The idea was to ferret out U.S. officials who were carrying out carefully constructed but make-believe terrorist missions.
The test alone signals a push for new ways to combat terrorists using technology. Authorities are convinced that beyond hunting for weapons and dangerous liquids brought on board airliners, the battle for security lies in identifying dangerous passengers.

Then they go on to describe SPOT:

The method isn't intended to catch specific lies, says Shabtai Shoval, chief executive of Suspect Detection Systems, the start-up business behind the technology dubbed Cogito. "What we are looking for are patterns of behavior that indicate something all terrorists have: the fear of being caught," he says.
To date, the TSA has more confidence in people than machines to detect suspicious behavior. A small program now is using screening officers to watch travelers for suspicious behavior. "It may be the only thing I know of that favors the human solution instead of technology," says TSA chief Kip Hawley.

I suppose the personal observation tests are more reasonable, but it makes me wonder how far they will be allowed to go if a person just flatly refuses to cooperate. Frankly, I doubt this would survive a Fourth Amendment challenge. But then, you don't have the right to fly either. Personally, I'm never happy about flying, so my behavior in an airport isn't my norm.

The people-based program -- called Screening Passengers by Observation Technique, or SPOT -- began undergoing tests at Boston's Logan Airport after 9/11 and has expanded to about a dozen airports. Trained teams watch travelers in security lines and elsewhere. They look for obvious things like someone wearing a heavy coat on a hot day, but also for subtle signs like vocal timbre, gestures and tiny facial movements that indicate someone is trying to disguise an emotion.
TSA officers observe passengers while consulting a list of more than 30 questionable behaviors, each of which has a numerical score. If someone scores high enough, an officer approaches the person and asks a few questions.
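Mechanically, that's just a weighted checklist with a cutoff. A minimal sketch of the idea in Python, with the behaviors, point values, and threshold all invented for illustration (the real TSA list and scores aren't public):

```python
# Hypothetical weighted-checklist screening, in the spirit of SPOT.
# Every behavior name, score, and the threshold below are made up;
# the actual TSA criteria are not published.
BEHAVIOR_SCORES = {
    "heavy_coat_on_hot_day": 3,
    "avoids_eye_contact": 1,
    "trembling_voice": 2,
    "repeated_throat_clearing": 1,
}

APPROACH_THRESHOLD = 4  # invented cutoff

def should_approach(observed_behaviors):
    """Sum the scores of the observed behaviors; flag the passenger
    if the total meets or exceeds the cutoff."""
    total = sum(BEHAVIOR_SCORES.get(b, 0) for b in observed_behaviors)
    return total >= APPROACH_THRESHOLD

print(should_approach(["heavy_coat_on_hot_day", "trembling_voice"]))  # True
print(should_approach(["avoids_eye_contact"]))                        # False
```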
"All you know is there's an emotion being concealed. You have to find out why the emotion is occurring," says Paul Ekman, a San Francisco psychologist who pioneered work on facial expressions and is informally advising the TSA. "You can find out very quickly."
More than 80% of those approached are quickly dismissed, he says. The explanations for hiding emotions often are innocent: A traveler might be stressed out from work, worried about missing a flight or sad because a relative just died. If suspicions remain, the traveler is interviewed at greater length by a screener with more specialized training. SPOT teams have identified about 100 people who were trying to smuggle drugs, use fake IDs and commit other crimes, but not terrorist acts.
The TSA says that, because the program is based on human behavior, not attributes, it isn't vulnerable to racial profiling. Critics worry it still could run afoul of civil rights. "Our concern is that giving TSA screeners this kind of responsibility and discretion can result in their making decisions not based on solid criteria but on impermissible characteristics such as race," says Gregory T. Nojeim, associate director of the American Civil Liberties Union's Washington legislative office.
But then, I would probably completely refuse to step into a machine that obviously has no cognitive abilities and will decide whether I'm hiding something or not. I don't like their false positive rate either.
The biggest challenge in commercializing Cogito is reducing false results that either implicate innocent travelers or let bad guys slip through. Mr. Shoval's company has conducted about 10 trials in Israel, including tests in which control groups were given terrorist missions and tried to beat the system. In the latest Israeli trial, the system caught 85% of the role-acting terrorists, meaning that 15% got through, and incorrectly identified 8% of innocent travelers as potential threats, according to corporate marketing materials.

Where does one go when you're singled out falsely? You miss the flight, miss the deal, get fired, all because a machine was screwing up. You think they'll institute a frequent-flyer program that lets people get prescreened and skip this thing?

The company's goal is to prove it can catch at least 90% of potential saboteurs -- a 10% false-negative rate -- while inconveniencing just 4% of innocent travelers.
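Those percentages sound respectable until you fold in how rare actual terrorists are among passengers. A quick Bayes back-of-the-envelope in Python, using the article's trial figures and an assumed base rate of one attacker per million passengers (my number, not theirs):

```python
# Back-of-the-envelope positive predictive value for the Cogito figures.
# Detection and false-positive rates come from the article; the base
# rate of real attackers per passenger is an assumed figure.
base_rate = 1e-6        # assumed: one attacker per million passengers
detection = 0.85        # caught 85% of role-acting terrorists
false_pos = 0.08        # flagged 8% of innocent travelers

flagged_bad  = base_rate * detection          # true positives
flagged_good = (1 - base_rate) * false_pos    # false positives

ppv = flagged_bad / (flagged_bad + flagged_good)
print(f"Chance a flagged passenger is a real threat: {ppv:.4%}")
# ~0.0011% -- roughly one real threat per 94,000 people flagged.
```

Even at the company's stated goal of 90% detection and 4% false alarms, a flagged passenger would still be innocent well over 99.99% of the time.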