face misrecognition software exposed


3 January 2002, ACLU Press Release, "Drawing a Blank: Tampa Police Records Reveal Poor Performance of Face-Recognition Technology."

NEW YORK--Facial recognition technology on the streets of Tampa, Florida is an overhyped failure that has been seemingly abandoned by police officials, according to a report released today by the American Civil Liberties Union.

System logs obtained by the ACLU through Florida's open-records law show that the system never identified even a single individual contained in the department's database of photographs. And in response to the ACLU's queries about the small number of system logs, the department has acknowledged that the software -- originally deployed in June 2001 -- has not been actively used since August.

"Tampa's off-again, on-again use of face-recognition software reminds us that public officials should not slavishly embrace whatever latest fad in surveillance technology comes along," said Howard Simon, Executive Director of the ACLU of Florida, which made the records request last August.

The logs obtained by the ACLU also indicate that the system made many false matches between people photographed by police video cameras as they walked down Seventh Avenue in Tampa's Ybor City district and photographs in the department's database of criminals, sex offenders, and runaways. The system made what were to human observers obvious errors, such as matching male and female subjects and subjects with significant differences in age or weight.

"Face recognition is all hype and no action," said Barry Steinhardt, Associate Director of the ACLU and an author of the report. "Potentially powerful surveillance systems like face recognition need to be examined closely before they are deployed, and the first question to ask is whether the system will actually improve our safety. The experience of the Tampa Police Department confirms that this technology doesn't deliver."

[...] Several government agencies have already abandoned facial-recognition systems after finding they did not work as advertised, including the Immigration and Naturalization Service, which experimented with using the technology to identify people in cars at the Mexico-U.S. border.

And Steinhardt noted that more controlled studies of face recognition software -- by the federal government's National Institute of Standards and Technology, by the Defense Department, and by independent security expert Richard Smith -- have found levels of ineffectiveness similar to those in Tampa [...]



15 January 2002, John Schwartz, "New Side to Face-Recognition Technology: Identifying Victims," The New York Times.

[...] No facial recognition system is perfect, or even close: all make mismatches and overly broad matches. Many can be confounded by simple subterfuges like wigs or glasses. Civil liberties and other groups say they cast too wide a net, invading privacy and extending the reach of surveillance too far.

And the technology's credibility has not been helped, many experts agree, by exaggerated claims for its effectiveness. "These software companies have popped off numbers that they can't really substantiate," said Ron Cadle, a vice president of Pelco Inc., which is adapting facial recognition systems for use in Fresno Yosemite International Airport. "It's kind of given them a black eye."

Mr. Amanovich agreed. "There's a lot of false claims out there and a lot of specious claims to what all technologies can do," he said.

Nevertheless, Mr. Cadle, who uses recognition programs from Visionics Inc. and Viisage, said his company had boosted the reliability of his partners' software so that it can make a match 80 percent of the time and falsely claim a match with just 1 of every 500 passengers. Mr. Amanovich, however, said such figures are so malleable at this early stage that claims are not useful [...]



5 March 2002, Bill Berkowitz, "Surveillance cameras are watching you in the name of the 'war on terrorism,'" Working for Change.

Surveillance technology research by government agencies has been going on for decades. In mid-April, the Washington Times reported that government agencies have spent more than $50 million during the past five years developing camera surveillance technology, and proposed federal spending on such systems has increased since September 11, according to a recent report released by the General Accounting Office [GAO].

At the request of House Majority Leader Dick Armey (R-Texas), the GAO surveyed 35 agencies and found that "17 reported obligating $51 million to [red-light, photo radar and biometric camera surveillance] as of June 2001, with the largest amount reported for facial recognition technology." According to the report, "Following the terrorist attacks on September 11, federal interest in facial recognition technology as a security measure appears to have increased." The report found that the development of facial recognition technology was not a new project; the first funding requests for photo-radar cameras came from the Navy in 1974. But no one seems to recall why the Navy wanted the devices or what they planned to do with them. The report also found that the Defense and Justice departments have spent more money than the other agencies combined on facial recognition since 1997.

Katie Corrigan, legislative counsel for the American Civil Liberties Union, said part of the reason the government has not fully committed to using facial recognition is that "the technology is ineffective and fallible. Several government studies have found, including a National Institute of Standards and Technology [report], that the technology has a high number of false negatives." She said that its study found that after faces were inserted into a database for 18 months, 43 percent of all scans turned up false negatives [...]



14 May 2002, the American Civil Liberties Union, "Data on Face-Recognition Test at Palm Beach Airport Further Demonstrates Systems' Fatal Flaws."

MIAMI--Interim results of a test of face-recognition surveillance technology obtained by the American Civil Liberties Union from Palm Beach International Airport confirm previous results showing that the technology is ineffective, the ACLU said today.

"Once again, a test of facial recognition by security professionals has shown that the technology is just not an effective way to increase our safety," said Randall Marshall, Legal Director of the ACLU of Florida.

"First there were lab tests, then experiments at the Super Bowl and on the streets of Tampa, and now at the Palm Beach airport. In every case, the experience has been the same: facial recognition is a clunker that holds little promise to make us safer."

According to documents released to the ACLU pursuant to a request under Florida's open-records law (the "Sunshine" law), the system failed to match volunteer employees who had been entered into the database fully 503 out of 958 times, or 53 percent of the time.
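The "53 percent" figure is simple arithmetic on the two counts in the released documents; a quick check using only the numbers quoted above:

```python
# Failure rate implied by the Palm Beach test counts quoted above.
attempts = 958   # total match attempts against enrolled volunteers
misses = 503     # attempts on which the system failed to match
rate = misses / attempts
print(f"{rate:.1%}")  # ~52.5%, reported as "53 percent"
```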

"Even with recent, high quality photographs and subjects who were not trying to fool the system, the face-recognition technology was less accurate than a coin toss," said Barry Steinhardt, director of the National ACLU's Technology and Liberty Program. "Under real world conditions, Osama Bin Laden himself could easily evade a face recognition system."

Seemingly disappointed airport officials also noted other limitations of the system. In order to work well:

--The subject could not be wearing glasses: "Eyeglasses were problematic," according to a summary of the test findings. "Glare from ambient light and tinted lenses diminished the system's effectiveness."

--The angle of the facial image could not vary: "There was a substantial loss in matching if test subject had a pose 15 to 30 degrees (up/down, right/left) off of input camera focal point."

--The subject had to be perfectly still: "Motion of test subject head has a significant effect on the system ability to both capture and alarm on test subject."

--The subject had to be properly lit: "System required approximately 250 lux of directional lighting" to work.

--The airport had to have high quality photographs. "Input photographs populating the database need to be of a good quality."

"It hardly takes a genius of disguise to trick this system," said Marshall. "All a terrorist would have to do, it seems, is put on eyeglasses or turn his head a little to the side." The findings were based on the first four weeks of an eight-week trial. The test was conducted on about 5,000 passengers and employees a day at Palm Beach's Concourse C security checkpoint using a test group of 15 airport employees and a database of 250 photographs that, according to press accounts, features suspected criminals. There was no indication that the system successfully flagged any of those 250 suspects.

In addition, Steinhardt noted that the system's two to three false alarms per hour could prove debilitating if officials tried to expand the system to all passengers, since the number of errors would presumably expand as more photographs were added to the database.
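Steinhardt's scaling point can be sketched with a toy probability model. The per-comparison false-match rate and the database sizes below are assumed values for illustration, not figures from the airport's report; under the simplifying assumption that each database photo independently risks a false match, the chance that any one scanned person triggers an alarm grows quickly with the number of photos:

```python
# Toy model: probability that a single scanned face triggers at least
# one false alarm, assuming each of n_photos comparisons independently
# produces a false match with probability p. (p and the database sizes
# are assumed for illustration, not taken from the report.)
def false_alarm_prob(p: float, n_photos: int) -> float:
    return 1 - (1 - p) ** n_photos

small = false_alarm_prob(1e-4, 250)    # database size from the trial
large = false_alarm_prob(1e-4, 5000)   # a hypothetical expanded database
print(f"{small:.1%} vs {large:.1%}")
```

Under this model, a twenty-fold larger database pushes the per-person false-alarm chance from roughly 2 percent to nearly 40 percent, which is the substance of the ACLU's objection to expanding the system.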

"Once again, even in a pristine test using photographs of cooperative subjects taken under ideal conditions, face-recognition is a disaster," said Steinhardt. "We hope that Palm Beach County will recognize that this system is a waste of money and preserve scarce security resources for programs that will actually make us safer."



16 May 2002, Julia Scheeres, "Airport Face Scanner Failed," Wired News.

Facial recognition technology tested at the Palm Beach International Airport had a dismal failure rate, according to preliminary results from a pilot program at the facility. The system failed to correctly identify airport employees 53 percent of the time, according to test data that was obtained by the American Civil Liberties Union under Florida's open records law.

"The preliminary results at the Palm Beach International Airport confirm that the use of facial recognition technology is simply ineffective and of no value," said Randall Marshall, legal director of the state ACLU chapter.

The manufacturer of the system, Visionics, said the results were poor because their product was not used correctly.

Ever since the Sept. 11 terrorist attacks, face scanning technology has been touted by manufacturers as the perfect device for recognizing terrorists in airports. In theory, the systems use surveillance cameras to scan crowds for bad guys and sound an alarm when a match is made between a live person and the system's database of known criminals.

The Palm Beach airport tried Visionics' FaceIt system, which snaps photographs of passersby using a security camera and breaks down their facial features into a numeric code that is matched against the photograph database.
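The "numeric code" matching the article describes can be sketched as a nearest-neighbor search over feature vectors with an alarm threshold. The vectors, names, and threshold below are invented for illustration and are not Visionics' actual encoding:

```python
# Minimal sketch of threshold-based face matching: each face is reduced
# to a numeric feature vector, a live capture is scored against every
# database entry, and a match fires only if the best score clears a
# tuned threshold. All values here are illustrative.
import math

def similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def best_match(capture, database, threshold=0.95):
    """Return the database id with the highest similarity, or None."""
    score, pid = max((similarity(capture, vec), pid)
                     for pid, vec in database.items())
    return pid if score >= threshold else None

db = {"suspect_17": [0.9, 0.1, 0.4], "suspect_42": [0.2, 0.8, 0.5]}
print(best_match([0.88, 0.12, 0.41], db))  # close to suspect_17 -> match
print(best_match([0.5, 0.5, 0.5], db))     # below threshold -> None
```

The threshold is the trade-off the articles keep circling: set it low and false alarms multiply; set it high and enrolled faces walk through undetected.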

The month-long test compared 15 employees against a database containing the mug shots of 250 airport workers, said airport spokeswoman Lisa De La Rionda, who declined to comment on the quality of the system.

"They never made promises to us about how successful the system would be," she said, stressing that it was tested free of charge.

But the ACLU said the study was done under optimal conditions and still exhibited fatal flaws. Out of 958 attempts to match the 15 test employees' faces to the database, the system succeeded only 455 times.

The Tampa police department has also been testing the FaceIt system over the last six months, and the technology has yet to make a match with a database of known criminals.

"The system could be serving as a deterrent for criminal activity... we still believe in its potential for law enforcement," said police department spokeswoman Katie Hughes.

The airport trial found that the photographs included in the database had to be good quality to avoid false alarms and ensure successful matches. Head motion, indirect lighting, sunglasses and eyeglasses also flummoxed the system.

The finicky nature of the software was previously documented by Internet privacy and security consultant Richard Smith. Last fall, Smith analyzed the FaceIt software and found a 50 percent failure rate as he adjusted for variables such as face angle and hats.

"If you adjusted everything just right you could get OK results," he said.

Visionics, whose face scanning systems are being tested at four U.S. airports, bristled at the ACLU's conclusions.

"The decision makers will not be reading a report from the ACLU, they'll be looking at the real data," said Visionics spokesman Meir Kahtan.


He said that similar tests at the Dallas-Fort Worth and Boston Logan airports showed a 90 percent success rate and insisted that the poor results at the Palm Beach International airport were due to incorrect lighting. Results for the other pilot programs were not immediately available.



15 October 2002, Karen Dearne, "Biometric checks must improve."

The accuracy of biometric systems needs to improve if widespread acceptance is to be achieved, the Biometrics Institute Conference has been warned. ANZ Bank fraud risk head Lawrence Cox said a 1.6 per cent false acceptance rate "on 80 million cheques issued across Australia in 2001 would mean more than one million cheques being issued and accepted falsely".

"In 2001, we identified that 12,500 cheques had been falsely issued, so there's a big difference there," he said. "How much is that in dollar terms? Likewise, the false rejection rate is a concern. I don't want to be called to my managing director's office and asked why someone is locked up in an overseas country because their cheque or smartcard has been refused." There was potential to use biometrics to improve customer service and the security of systems, but the error rate would need to be cut before there was widespread acceptance, Mr Cox said.
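Cox's cheque figures are straightforward arithmetic, using only the numbers he quotes:

```python
# A 1.6% false acceptance rate applied to 80 million cheques.
cheques = 80_000_000
false_acceptance_rate = 0.016
print(f"{cheques * false_acceptance_rate:,.0f}")  # 1,280,000 -> "more than one million"
```

Against the 12,500 cheques ANZ actually identified as falsely issued in 2001, that hypothetical error rate is roughly a hundred times worse, which is the "big difference" Cox refers to.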

The conference, held in Sydney last Friday, canvassed the latest developments in iris, facial, fingerprint, handwriting and voice recognition systems. Many of these technologies are being deployed in security environments such as airports for passenger control and customs, as well as the financial sector and government agencies [...]



31 December 2002, Randy Dotinga, "Biometrics Benched for Super Bowl."

When the Super Bowl comes to San Diego next month, hundreds of eyes will scan the stands looking for criminals and terrorists. But none of those eyes will belong to computers. Facial-scan technology, which made a flashy and controversial debut at the 2001 Super Bowl in Tampa Bay, Florida, is essentially out of the business of working the crowds.

"We're stepping back from surveillance and concentrating more on controlling access," said Cristian Ivanescu, chief technology officer of Graphco Technologies. The company is one of several manufacturers of systems that confirm identities by using biometrics -- measurements of parts of the body such as the face, retina and fingerprints or of other physical characteristics such as the voice. It turns out that facial-scan technology works best when it's used to confirm whether people are who they say they are. Picking a crook out of a cast of thousands is a lot harder, and false alarms are more common.

That wasn't so obvious two years ago when Graphco teamed up with several other companies to bring facial scans to the Super Bowl. Civil libertarians howled in outrage before the game. Afterward Tampa Bay police reported that the technology pinpointed 19 people with criminal records out of a crowd of 100,000. This year, in preparation for one of the most security-conscious Super Bowls in history, police barely considered the idea of facial scans.

Assistant police chief Bill Maheu, head of Super Bowl security for the San Diego Police Department, said he'd heard too many bad reports about such systems' inability to correctly "capture" faces (in other words, match them to those in databases of bad guys). "If you have your settings low enough to capture people, it captures everybody -- way too many people," Maheu said. "If you set it too high, it doesn't capture anyone." Ivanescu said the problem isn't so much sensitivity as getting clear face scans. The technology works well "as long as you can control the lighting, the glare and the position of the camera," he said. In a public place where many things can go wrong with the scanning process, "the technology is more of a deterrent than anything else," Ivanescu said.

But the story is different in places where people being scanned stand still for a few seconds. Casinos compare the faces of suspicious gamblers to databases of infamous cheats, government facilities scan faces to make sure imposters don't get into secure areas and cops use facial scans to confirm the identity of misdemeanor suspects, said Cameron Queeno, vice president of marketing for Viisage, a manufacturer of facial-recognition technology. It appears, however, that biometrics technology isn't making a big dent in the fight against terrorism, although it could conceivably screen passengers at airports.

Back in San Diego, where the big game looms on Jan. 26, technology won't be entirely absent from Super Bowl security. An estimated 50 video cameras will watch the crowd, and everyone who enters the stadium will go through a metal detector. But the lion's share of the security work will fall to ordinary cops and government agents, said Maheu. "They're not only looking at faces. They're looking at other indicators too -- a guy with a trench coat on a sunny San Diego day, a guy with a backpack. Any kind of indicator that would raise your eyebrows a bit." Just the type of thing that wouldn't get a second glance from a face-scanning computer.



27 February 2003, Karen Dearne, "Face recognition fails test."

The much-heralded SmartGate facial recognition trial at Sydney Airport has suffered an embarrassing setback, with two Japanese visitors fooling the system simply by swapping passports. The automated system, which is a world-first attempt at using photo-matching technology for border control, falsely identified both men as matching the images contained in each other's travel documents. SmartGate is intended to replace the identity check performed by Customs officers and is supposed to take into account differences such as age and ethnicity and variations due to hairstyles or glasses. The men were attending a demonstration along with other international members of IATA's Simplifying Passenger Travel committee meeting in Sydney this week.

Minister for Customs Chris Ellison has confirmed that two Japanese participants were falsely accepted as each other. "At the time, SmartGate was in demonstration mode and the men were not subject to all the security checks incorporated within the system," he said. "The two men have previously been misidentified by a different face recognition system on trial overseas, and on this occasion they deliberately tried to fool SmartGate under the supervision of Customs officers."

Face recognition and other biometric identity systems have been widely criticised by security and privacy advocates because the technologies are immature and high failure rates are common. When Senator Ellison launched the $1.2 million project involving 3000 Qantas aircrew in January, he said SmartGate was faster and more accurate than other biometric systems on trial elsewhere. But IT privacy expert Dr Roger Clarke described facial recognition as "atrociously bad technology", while Geoffrey Ross, head of Australia's largest IT security firm, SecureNet, said facial recognition was "categorically not" an answer for airport security and identity checking. Senator Ellison said Customs officers conducted "additional testing" with the two men, providing "valuable data" that would assist in refining the system.

"This is the only 'false accept' in more than 16,000 transactions involving 3500 people," he said. "This compares favourably with other biometric systems and Customs is very happy with SmartGate's performance."



20 August 2003, Brady Dennis, "Ybor cameras won't seek what they never found," The St. Petersburg Times.

TAMPA - In the end, everyone the secret cameras scanned turned out to be just another face in the crowd. Two years after Tampa became the nation's first city to use facial-recognition software to search for wanted criminals, officials are dropping the program. It led to zero arrests.

"I wouldn't consider it a failure," said police spokesman Joe Durkin. "You are always looking for new and efficient ways to provide the best service to the community. There's going to be ups and downs."

Tampa Mayor Pam Iorio did not return calls Tuesday seeking comment about the practice.

The city first toyed with the technology during the 2001 Super Bowl, when surveillance cameras monitored people entering Raymond James Stadium. That led critics to dub the game, "Snooper Bowl." And although cameras picked up 19 "hits," or possible matches with wanted criminals, none were arrested.

That June, New Jersey-based Visionics Corp. offered the city a free trial use of a similar program called Face-It, and the software was installed on 36 cameras in the Ybor City entertainment district. A Tampa police officer in a room three blocks away monitored a wall of televisions and, with a click, could pick out faces from the crowd to scan and run through a criminal database to search for matches.

Even as the software proved unsuccessful in nabbing wanted offenders, it did a superb job of attracting outrage from critics. Republican Dick Armey, the House Majority Leader at the time, called for congressional hearings on the controversial surveillance technology. Leaders from the American Civil Liberties Union denounced the practice, likening it to something out of George Orwell's novel 1984. Scores of protesters donned bandanas, masks and Groucho Marx glasses and took to the streets of Ybor City on a busy Saturday night to show their contempt for the face-scanning system.

The software also created false alarms, faces that seemed to match but didn't. In at least one instance, both police and a Tampa man ended up embarrassed. Rob Milliron, then 32, wound up on a surveillance camera one day while at lunch in Ybor City. Tampa police used his photo to demonstrate the system to local news media. A woman in Tulsa, Okla., saw his picture and fingered him as her ex-husband who was wanted on felony child neglect charges. Three police officers showed up at Milliron's construction job site, asking if he was a wanted man. Turns out he had never married, never had kids, never even been to Oklahoma. "They made me feel like a criminal," Milliron said at the time.

Critics of Face-It celebrated on Tuesday, saying that the Millirons of the world can finally walk down the street without fear of humiliation. "It's a relief," said Darlene Williams, chairwoman of the Greater Tampa Chapter of the ACLU. "Any time you have this sort of technology on public streets, you are subjecting people who come to Ybor to an electronic police lineup, without any kind of probable cause. The whole episode was very troubling."

Scanning companies such as Visionics and Identix (which have since merged and are known as Identix) saw their stocks soar in the wake of the Sept. 11 attacks. But the technology's success in actually catching wanted criminals or terrorists has apparently been marginal. Critics claim it is unreliable and ineffective, and potential customers such as the Palm Beach International Airport have passed on the equipment after test runs, saying it gave too many false positives and wasn't cost-effective. The company could not be reached for comment.

Durkin emphasized Tuesday that the trial run with Face-It didn't cost the city any money. But even so, he said, its use likely benefited the city. "Something that's intangible is how many wanted persons avoided (Ybor City) because the cameras were there," he said. "That's something we may never calculate." Durkin said even without the face-recognition software, the cameras in Ybor will remain.

Meanwhile, facial-recognition technology has been in use at the airport, jail and jail visitation center in Pinellas County for more than a year, and at the courthouse since late April. And Pinellas sheriff's officials have no plans to discard it, although they have not attributed any arrests to the technology. Pinellas sheriff's Lt. James Main, who heads the program for Sheriff Everett Rice, said Rice's office is confident the technology works well and is a useful security tool, despite the lack of arrests.

"We don't have any plans to change anything here," Main said. "The fact that we aren't making arrests doesn't mean the technology isn't working." He said Tampa's use of the technology is far different than in Pinellas. In Tampa, the technology isn't used in a controlled environment like the inside of a well-lighted courthouse, where people can be asked to take off hats and glasses. Rather, he said city officials across the bay gambled on the ability to pick faces out of a crowd. "To Tampa's credit, they were trying something new."



23 August 2003, "About face: Two years of controversial failure finally persuaded the Tampa Police Department to give up on facial-recognition technology in Ybor City," editorial, The St. Petersburg Times.

With some strong nudging by Tampa Mayor Pam Iorio, the Tampa Police Department has decided to face facts. After two years, it is finally dumping a facial-recognition technology system that had been deployed in the entertainment district of Ybor City. The ostensibly ground-breaking technology was touted as a way to scan a crowd and tell the good guys from the bad. As it turns out, it couldn't differentiate the guys from the girls. The department is ending the failed experiment because the Face-It software didn't lead to a single arrest. That is not to say the software didn't produce "matches," just that all the matches were false positives. According to Tampa police records, Face-It sometimes even matched a male face with a female identity.

The real question is why it took two years to call it quits. Just how much police time had to be wasted before the department decided to turn off the cameras and get on with serious law enforcement efforts? According to Capt. Bob Guidara, spokesperson for the Tampa Police Department, the cost of the technology was "zero." He said the software was donated and the officer deployed to monitor the dozens of cameras would have been doing so anyway. But in fact, the Face-It system diverted substantial police resources.

On any given Friday night, about 125,000 people visit Ybor City. For facial comparisons to take place, an officer has to focus on an individual, scanning his or her face into the computer. The process takes time and diverts attention from more general surveillance, particularly if the system keeps spitting out false "matches."

From the start, it was clear this technology was not ready for prime time. Ever since it landed in Tampa with a dud during the 2001 Super Bowl, or "Snooper Bowl," experts have warned that all sorts of nuances can defeat the system, including sunglasses (gee, sunglasses in Florida?) and changes in facial hair. A 2000 report on face recognition technology by the National Institute of Standards and Technology said a mere 15-degree difference in position between the comparison photos will "adversely affect performance."

Private companies and government grants are pushing more biometric surveillance and identification, but not enough attention has been paid to the impact of these systems on privacy. The Pinellas County Sheriff's Office has received millions of dollars in federal grants to use facial-recognition systems and has deployed them at the airport, the jail visitation center and the courthouse. No public hearings occurred before all of us became facial-recognition guinea pigs.

We should be free to walk into public buildings and on public sidewalks without being subjected to an electronic line-up. As the technology improves, pressure will increase to utilize computer scanning and identity retrieval systems. But these methods go far beyond the cop on the beat. It is more like living in a fishbowl - a condition that might make us a bit safer, but sacrifices too much privacy.



1 September 2003, Richard Willing, "Airport anti-terror systems flub tests," USA TODAY.

Camera technology designed to spot potential terrorists by their facial characteristics at airports failed its first major test, a report from the airport that tested the technology shows. Last year, two separate face-recognition systems at Boston's Logan Airport failed 96 times to detect volunteers who played potential terrorists as they passed security checkpoints during a three-month test period, the airport's analysis says. The systems correctly detected them 153 times. The airport's report calls the rate of inaccuracy "excessive." The report was completed in July 2002 but not made public. The American Civil Liberties Union obtained a copy last month through a Freedom of Information Act request.
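The inaccuracy the report calls "excessive" follows directly from the two counts quoted above:

```python
# Miss rate implied by the Logan figures quoted above.
misses, hits = 96, 153
print(f"{misses / (misses + hits):.1%}")  # ~38.6% of test passes went undetected
```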

Logan is where 10 of the 19 terrorists boarded the flights that were later hijacked Sept. 11, 2001. The airport is now testing other security technology, including infrared cameras and eyeball scans, spokesman Jose Juves says.

Face recognition works by matching faces picked up by surveillance cameras with pictures stored in computer databases. Relationships between a face's identifying features, such as cheekbones and eye sockets, are converted to a mathematical formula and used to make a match. In the Logan Airport experiment, photographs of 40 airport employees were put into a database. The employees then attempted to pass through two security checkpoints where face-recognition cameras were used.

The ACLU opposes facial recognition because it says the government can use the technology to invade citizens' privacy. "But before you even get to the privacy concern, there's a fundamental question about our security," says Barry Steinhardt, who specializes in privacy issues at the ACLU's national office in New York. "The thing just plain doesn't work."

A spokesman for one of the companies whose system was tried at Logan Airport says the test was not a fair measure of the technology. Meir Kahtan of Identix of Minnetonka, Minn., says the technology is far better suited for "one-to-one" identification, such as comparing photos on passports or driver's licenses, than random searches of photo databases. A government test in 2002 found that face-recognition systems scored correct matches more than 90% of the time when used for such one-to-one identifications.

A spokesman for Viisage Technology of Littleton, Mass., the other company that failed the Logan test, declined to comment.

The Logan Airport report is the latest piece of bad news for a technology that was once touted as the state-of-the-art method for picking faces out of crowds. Last month, Tampa police announced that they were shutting down face-recognition cameras because they had failed to make any matches during a two-year test period. The cameras, which were mounted in a popular tourist area, were designed to match pictures captured at random against stored photos of wanted suspects and runaway children. Virginia Beach police, who have operated a similar system for the past year, reported no matches as of July.

The Logan experiment was the largest test of facial-recognition technology made public. The technology has also been tested using smaller groups of volunteers at airports in Dallas/Fort Worth, Fresno, Calif., and Palm Beach County, Fla., with similar results. The Transportation Security Administration, which is responsible for passenger screening, has tested other airport security technology but has not made results public. Phone calls requesting comment on the Logan Airport test were not immediately returned.

Kelly Shannon, spokeswoman for the State Department's consular affairs office, said the Logan Airport results would not affect plans to use face recognition to enhance passport security. Beginning in October 2004, the United Kingdom, Japan and 25 other countries whose nationals are permitted to travel to the USA without visas will be required to convert to passport photos that are compatible with face-recognition systems.



3 September 2003, Face recognition devices failed in test at Logan, by Shelley Murphy and Hiawatha Bray, Boston Globe.

Facial recognition technology designed to detect terrorists failed to match identities of a test group of employees at Logan International Airport in 38 percent of the cases, according to a study released yesterday by the American Civil Liberties Union. Other technology that scanned the eyes of airport employees entering secure areas to verify their identities was rejected recently by Massachusetts Port Authority officials, partly because some employees found it too intrusive. Yet, officials said other technologies tested recently at Logan have been more successful and will soon be adopted permanently, including the installation of infrared cameras to detect intruders around the perimeter of the airport, and hand-held computers that State Police can use to run background checks on people or to check license plates.

"The reason we do [testing] programs is to determine what's effective in a real world environment, to get them out of the lab and into the passenger terminal to see how they work," said Jose Juves, spokesman for Massport.

The highly touted facial recognition technology, which was tested at Logan between January and April 2002 and rejected last summer, failed to detect employees who volunteered in the program on 95 occasions when they passed through checkpoints at two terminals, according to a study that the ACLU obtained from Massport after requesting it under the Freedom of Information Act. During the test, the photographs of 40 employees who volunteered for the program were scanned into a database. Cameras at two checkpoints at the airport relayed the images of everyone -- passengers and employees -- passing through to a computer, which compared them to the pictures stored in its memory. It used facial recognition technology to come up with a match.

The technology, provided by Viisage, in Littleton, and Identix, in Minnesota, successfully identified employees 153 times, and falsely matched wrong people with the stored images three times, according to the study.
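The study's headline "38 percent" failure rate follows directly from the figures reported above. A minimal arithmetic sketch, using only the numbers given in the article (153 correct identifications, 95 missed detections, 3 false matches), shows how the rates work out:

```python
# Figures reported in the ACLU/Massport study of the Logan test
hits = 153          # volunteer employees correctly identified
misses = 95         # occasions when volunteers passed through undetected
false_matches = 3   # innocent people wrongly matched to stored images

attempts = hits + misses          # 248 total detection opportunities
detection_rate = hits / attempts  # fraction of volunteers caught
failure_rate = misses / attempts  # fraction who slipped through

print(f"detection rate: {detection_rate:.1%}")  # prints "detection rate: 61.7%"
print(f"failure rate: {failure_rate:.1%}")      # prints "failure rate: 38.3%"
```

Note that Identix's quoted 85.7 percent figure (below) does not match this 61.7 percent detection rate; the article does not state what denominator the company used, so the discrepancy cannot be resolved from the reported numbers alone.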

"It is making a match where none exists, so innocent people could well be targeted while at the same time it may not catch terrorists," said Carol Rose, executive director of the ACLU of Massachusetts. "It's dangerous to go down the path of high technology or newfangled ideas that not only won't keep us safe, but threaten our liberty." USA Today reported the results of the facial recognition study at Logan yesterday.

The ACLU isn't alone in its doubts about the use of face scanning at airports. Viisage president and CEO Bernard Bailey, who was named head of the company after the Logan test was completed, said that he, too, is opposed to the idea, simply because the technology still isn't good enough.

"I don't think that's the best use of our technology," Bailey said. "The hype of this technology got way ahead of the capabilities of it." Bailey said that the accuracy of airport facial scanning is hampered by the fact that scanning devices still have trouble coping with different lighting conditions and poses. Even changing the position of a person's head can cause an inaccurate identification. Bailey said that the Viisage technology is better suited for tasks such as photographing visa applicants or people who have been arrested, situations in which photos could be taken under precisely controlled conditions, with subjects required to face the camera in a consistent manner.

But Meir Kahtan, spokesman for Identix in Minnetonka, Minn., defended his company's FaceIt identification system, saying company statistics showed an 85.7 percent correct identification rate during the Logan test. "To characterize it as a failure is disinformation," Kahtan said. "Given the results of the Logan test, the catch rate was sufficient to have stopped between 11 and 12 of . . . the 19 terrorists" on Sept. 11, 2001, said Kahtan.

State Police Major Thomas Robbins, who oversees Troop F at the airport, said officials were hopeful that facial recognition would help them spot and arrest terrorists if they came to the airport. But the new technology "was very labor-intensive . . . and quite frankly, it's not ready for prime time." Both Robbins and Juves said officials had decided not to use iris scanning technology after testing the equipment earlier this year on airport employees. "That was another example of weeding out products or technologies that don't work for an airport," said Juves, who added that some employees had an "aversion" to having their eyes scanned to verify their identities before they could enter certain areas. He said that Massport is considering whether fingerprint scanning would be more effective.

Tom Nutile, spokesman for Tier Technologies, the Walnut Creek, Calif., company that loaned the iris scanning equipment to Logan, said it's one of the least intrusive technologies for identification. "Many new technologies take some getting used to," he said.

Still, Massport officials and State Police have embraced other technologies since trying them out this year. After testing six high-tech surveillance systems to guard the perimeter of Logan with infrared cameras, Juves said, Massport is poised to permanently install such a system at the airport. Massport will be seeking bids from vendors within the next week or so to install the heat-sensitive system that will guard the airport's perimeter from intruders. Two video cameras mounted on Logan's control tower focus on sources of heat using infrared technology. The cameras relay images to a computer at Massport's operations center and to handheld computers carried by State Police assigned to the airport.

There were other testing programs at the airport within the past year that have led to new security upgrades as well. State Police currently use a scanning system that checks the authenticity of licenses and passports for employees, and hope eventually to deploy it at passenger terminals.




Contact the Surveillance Camera Players

By e-mail SCP@notbored.org or by snail mail: SCP c/o NOT BORED! POB 1115, Stuyvesant Station, New York City 10009-9998



NOT BORED!