Tag: Biometrics

Most of us secure our digital lives with passwords — hopefully different, strong passwords for each service we use. But the inherent weaknesses of passwords are becoming more apparent: this week, Evernote reset 50 million passwords after a breach, and that’s just the latest in a series of high-profile password-related gaffes from Twitter, Sony, Dropbox, Facebook, and others.

Isn’t security a problem that biometrics can solve? Biometric technologies allow access based on unique and immutable characteristics about ourselves — ideally something that can’t be stolen or faked. Although it’s often the stuff of spy movies, big businesses, governments, and even parts of academia have been using biometric authentication like fingerprints, voice recognition, and even facial scans to authenticate users for years.

So why aren’t the rest of us using these tools to guard our devices, apps, and data? It can’t be that hard … can it?

The myth of the fingerprints

For centuries (maybe even millennia), fingerprints have been used to verify identity, and fingerprint scanners have been options in mainstream business computers for about a decade. Typically, users swipe a finger over a narrow one-dimensional scanner, and the system compares the data to an authorized user. The process of scanning and matching is complex, but as the technology has evolved, accuracy has improved and costs have come down.
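The matching step can be illustrated with a toy minutiae comparison. Real matchers extract ridge endings and bifurcations from the scan, align them for rotation, and score the overlap; the coordinates, types, and tolerance below are invented purely for illustration:

```python
import math

def match_minutiae(candidate, template, tolerance=6.0):
    """Count how many template minutiae have a candidate minutia of the
    same type within `tolerance` pixels. A toy model: real matchers also
    align rotation and compare ridge angles before scoring."""
    matched = 0
    used = set()
    for (x, y, kind) in candidate:
        for i, (tx, ty, tkind) in enumerate(template):
            if i in used or kind != tkind:
                continue
            if math.hypot(x - tx, y - ty) <= tolerance:
                matched += 1
                used.add(i)
                break
    return matched / max(len(template), 1)

# Enrolled template and a fresh, slightly shifted scan (illustrative data).
template = [(10, 12, "ending"), (40, 44, "bifurcation"), (70, 20, "ending")]
scan     = [(11, 13, "ending"), (39, 45, "bifurcation"), (90, 90, "ending")]
score = match_minutiae(scan, template)
print(score)  # 2 of 3 template minutiae matched
```

A real system would compare `score` against a threshold tuned to trade false accepts against false rejects, which is exactly where worn or damaged ridges cause trouble.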

But fingerprint readers have downsides. The most obvious are injuries like burns and cuts — imagine being locked out of your phone or computer for a week because a potholder slipped. Stains, ink, sunscreen, moisture, dirt, oils, and even lotion can interfere with fingerprint readers, and some fingerprints just can’t be scanned easily. I’m personally a good example — parts of my fingertips are worn smooth (or blistered) from playing instruments, but lots of people who work with their hands often have thin ridges (or none at all) on their fingers. Trained law-enforcement personnel could tease prints off me if I get hauled down to county, but my luck with fingerprint readers in notebooks is abysmal, and I can’t imagine using one on a phone — outside in the rain, since I live in Seattle.

So far, only one mainstream smartphone has shipped with a fingerprint reader — the Motorola Atrix, which mostly went nowhere along with Motorola’s Webtop technology. But things might be changing: Apple’s acquisition of AuthenTec last year has fueled speculation that future iPhones and iOS devices will offer fingerprint recognition, and long-time fingerprint tech firm Ultra-Scan says it has an ultrasonic reader 100 times more accurate than anything on the market.

“I believe the shift to using fingerprints in consumer electronics is set to happen very fast,” wrote Vance Bjorn, CTO of access management firm Digital Persona. “Passwords are widely regarded as the weakest link of security, and they are becoming even less convenient as consumers need to type them in on smartphones and via touch screens. The use of a fingerprint for authentication solves both problems — security and convenience.”

Your voice is your password

Voice authentication seems well-suited to smartphones: They’re already designed to handle the human voice, and include technologies like noise filtering and signal processing. Just as with fingerprint readers, voice authentication technology has existed for years in high-security facilities, but hasn’t broken into mainstream consumer electronics.

Approaches to speaker identification vary, but they all have to handle variations in speech, background noise, and differences in temperature and air pressure. Some systems compare speakers with pre-recorded phrases from an authorized user, like a spoken PIN. More sophisticated systems perform complex calculations to work out the acoustic characteristics of a speaker’s vocal tract, and even compare speech patterns and style to determine whether someone is (literally) who they say they are.
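Whichever features a system extracts, verification usually reduces to comparing a fresh feature vector against an enrolled voiceprint and applying a threshold. A minimal sketch, with made-up vectors standing in for real acoustic measurements such as MFCCs:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def verify_speaker(features, enrolled, threshold=0.95):
    """Accept if the new utterance's features are close enough to the
    enrolled voiceprint. Vectors and threshold are illustrative."""
    return cosine_similarity(features, enrolled) >= threshold

enrolled = [0.8, 0.1, 0.4, 0.3]    # illustrative enrolled voiceprint
same     = [0.82, 0.12, 0.38, 0.31]  # same speaker, slight variation
other    = [0.1, 0.9, 0.2, 0.7]      # different speaker
print(verify_speaker(same, enrolled))   # True
print(verify_speaker(other, enrolled))  # False
```

The threshold is the crux: set it too tight and a head cold locks out the owner; too loose and a good recording gets in.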

The first approach is brittle: it can lock out users who have a cold or a cough, and it is vulnerable to recordings. Reformed hacker Kevin Mitnick has told of tricking a financial firm’s CEO into saying the numbers zero through nine, recording the digits, and using them to bypass the firm’s phone-based voice authentication.

More sophisticated approaches can be computationally intensive. These days, the heavy lifting is often done at the receiving end — which means your biometric data gets transferred, often with no protection at all.

“Existing voice technology uses the equivalent of plaintext passwords,” said Manas Pathak, a researcher in privacy-preserving voice authentication who recently completed his Ph.D. at Carnegie Mellon University. “You’re giving part of your identity to the system.”

A voiceprint stored on a remote system can be stolen just like a password file. Moreover, voice data alone can reveal our gender and nationality — even our age and emotional state.

Recent developments aim to work around these problems. A new open standard being developed by the FIDO Alliance would support multiple authentication methods, but biometric data would never leave users’ devices. Pathak and other researchers have developed a sophisticated system that creates a stable representation of a user’s voice (talking about anything), divides it up into similar samples, then cryptographically protects them before performing any comparison. The remote systems never see users’ biometric data, and it can’t be recreated.
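The FIDO-style flow can be sketched in a few lines: the biometric comparison happens on the device, and only a keyed response to a server challenge ever leaves it. (HMAC stands in here for the public-key signature a real FIDO authenticator would use; the names and data are illustrative.)

```python
import hashlib
import hmac
import os

DEVICE_KEY = os.urandom(32)  # provisioned at registration; never the biometric itself

def local_match(sample, template):
    # Placeholder for the on-device biometric comparison;
    # a real matcher would do fuzzy feature comparison here.
    return sample == template

def authenticate(challenge, sample, template):
    """If the local biometric check passes, answer the server's challenge
    with a keyed digest. The template and sample never leave the device."""
    if not local_match(sample, template):
        return None
    return hmac.new(DEVICE_KEY, challenge, hashlib.sha256).hexdigest()

challenge = os.urandom(16)           # sent by the server
template = b"enrolled-voiceprint"    # stays on the device
response = authenticate(challenge, b"enrolled-voiceprint", template)
print(response is not None)                                   # True
print(authenticate(challenge, b"someone-else", template))     # None
```

The server verifies only the response, so a breach of the server exposes no voiceprints, which is the property the FIDO design and Pathak’s cryptographic approach both aim for.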

“We have about 95 percent accuracy,” said Pathak. “We would need to install it on more phones with wider deployment and testing. That’s outside the realm of academic research, but it’s ready for commercialization.”

Still, it’s hard to discount noise and environmental factors. Traffic noise, conversation, televisions, music, and other sounds can all reduce accuracy. There are times voice recognition is simply not practical: Imagine someone on a bus or subway shouting at their phone to unlock it.

Face the face

Face recognition and iris scans are other common forms of biometric authentication — and could make sense for consumer electronics, since almost all our phones and tablets have high-resolution cameras.

Facial recognition systems work by noting the size, shape, and distances between landmarks on a face, like the eyes, jaw, nose, and cheekbones. Some systems also consider elements like wrinkles and moles, while some high-end gear constructs 3D models — those work even with profile views.
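A toy version of the landmark approach: measure pairwise distances between facial landmarks and normalize by the inter-eye distance, so the signature survives changes in image scale. The landmark positions and tolerance are invented for illustration:

```python
import itertools
import math

def face_signature(landmarks):
    """Pairwise landmark distances divided by the inter-eye distance,
    giving a scale-invariant signature of the face."""
    scale = math.dist(landmarks["left_eye"], landmarks["right_eye"])
    keys = sorted(landmarks)
    return [math.dist(landmarks[a], landmarks[b]) / scale
            for a, b in itertools.combinations(keys, 2)]

def faces_match(sig_a, sig_b, tolerance=0.05):
    return all(abs(x - y) <= tolerance for x, y in zip(sig_a, sig_b))

face = {"left_eye": (30, 40), "right_eye": (70, 40),
        "nose": (50, 60), "chin": (50, 95)}
# The same face photographed at twice the size:
scaled = {k: (2 * x, 2 * y) for k, (x, y) in face.items()}
print(faces_match(face_signature(face), face_signature(scaled)))  # True
```

This also shows why angled shots are hard: a head turn changes the apparent distances, which is what the 3D-model systems are built to compensate for.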

Most of us have seen face recognition technology on Facebook and in applications like Apple’s iPhoto — and the results are pretty uneven. Commercial face recognition systems are more robust, but they struggle with the same things that can make Facebook’s efforts laughable: crappy photos. Bad lighting, glasses, smiles, goofy expressions, hats, and even haircuts can cause problems. Even the best facial recognition systems struggle with angled images, and people’s faces can change radically with age, weight, medical conditions, and injury.

“A few years ago we had a high-end facial recognition system failing a senior engineer almost every time,” said a security coordinator for a Boston-area firm who didn’t want to be identified. “Why? He looks like a mad scientist, complete with thick glasses, crazy hair, and full beard. But I failed the same system myself after eye surgery when I wore a patch for a couple weeks.”

Iris recognition applies pattern matching technologies to the texture (not color) of a user’s iris, usually with a little help from infrared light.

Iris patterns are probably as unique as fingerprints — and they’re generally much more stable. Moreover, matching iris patterns doesn’t require tons of processing power and (in good conditions) has very low false-match rates.
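That matching step is typically a bitwise comparison: Daugman-style systems encode the iris texture as a binary “iris code” and measure the fraction of differing bits. A toy sketch with 12-bit codes (real systems use roughly 2,048 bits plus occlusion masks, which is why so little processing power is needed):

```python
def hamming_fraction(code_a, code_b):
    """Fraction of differing bits between two iris codes --
    the core comparison in Daugman-style iris recognition."""
    assert len(code_a) == len(code_b)
    diff = sum(a != b for a, b in zip(code_a, code_b))
    return diff / len(code_a)

def same_iris(code_a, code_b, threshold=0.32):
    # Scores well below ~0.32 indicate the same iris;
    # unrelated irises cluster around 0.5.
    return hamming_fraction(code_a, code_b) < threshold

enrolled = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]  # toy 12-bit code
rescan   = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1]  # one bit flipped by noise
stranger = [0, 1, 0, 1, 1, 0, 0, 1, 1, 0, 1, 0]
print(same_iris(enrolled, rescan))    # True  (distance 1/12)
print(same_iris(enrolled, stranger))  # False
```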

For years, iris recognition technology was largely locked up by key patents held by Iridian, but those patents expired a few years ago and the field has seen significant new development. However, much of it has been aimed at government-funded identification programs, rather than consumer applications.

Iris recognition also has pitfalls. Users would likely have to hold a device close to their face, with decent lighting and little to no motion. Most eyewear would have to be removed, and some drugs and medications can deform an iris pattern by dilating or constricting the pupils — try passing an iris scan right after an eye exam.

Just as voice authentication can be vulnerable to recordings, iris scanners can be fooled by quality photographs, or even contact lenses printed with fake irises. As a result, right now the technology is mostly used in human-supervised situations — like immigration and passport control — rather than automated systems.

Securing ourselves

There’s no doubt passwords are an increasingly feeble way to secure our digital lives, and biometric authentication technologies can lock things down using something we are rather than just something we know. However, none of these technologies are a magic bullet for digital security: They all fail for some people some of the time, and they all carry risks and vulnerabilities. Moreover, if biometric information is ever compromised, there may be no going back. After all, you can change your password, but good luck changing your thumbs.

Nonetheless, biometric technologies seem poised to move into consumer electronics soon, most likely in multi-factor systems offering a layer of security in addition to passwords.

“Passwords will never go away — ‘what you know’ will remain a critical tool for security,” noted Digital Persona’s Vance Bjorn. “But I do see the day soon where that tool is no longer viewed as sufficient for most services consumers or employees access.”

View Source

Gun owners and advocates are fond of saying, “Guns don’t kill people, people kill people.”

This might be a more useful aphorism: Smart-guns don’t kill the wrong people.

Technology exists, or could exist, that would make guns safer. The idea of a safe gun might seem to be the ultimate oxymoron: guns are designed to kill. But something missing from the gun-control debate that has followed the killing of 20 children and six adults at an elementary school in Newtown, Conn., is the role of technology in preventing or at least limiting gun deaths.

Biometrics and grip pattern detection can sense the registered owner of a gun and allow only that person to fire it. For example, the iGun, made by Mossberg Group, cannot be fired unless its owner is wearing a ring with a chip that activates the gun.

But you would be hard pressed to find this technology on many weapons sold in stores. “The gun industry has no interest in making smart-guns. There is no incentive for them,” said Robert J. Spitzer, a professor of political science at SUNY Cortland and the author of four books on gun policy. “There is also no appetite by the government to press ahead with any kind of regulation requiring smart-guns.”

Why can we open our front doors with our iPhones and have cars that drive themselves, but we can’t make a gun that doesn’t fire unless its registered owner is using it?

“We can,” Dr. Spitzer said. “These safety options exist today. This is not Buck Rogers type of stuff.” But gun advocates are staunchly against these technologies, partly because so many guns are bought not in gun shops, but in private sales. “Many guns are bought and sold on the secondary market without background checks, and that kind of sale would be inhibited with fingerprinting-safety technologies in guns,” he said.

I called several major gun makers and the National Rifle Association. No one thinks a smart-gun will stop a determined killer. But I thought Smith & Wesson and Remington, for instance, would want to discuss how technology might help reduce accidental shootings, which killed 600 people and injured more than 14,000 in the United States in 2010. The gunmakers did not respond, and neither did the N.R.A.

A Wired magazine article from 2002 gives a glimpse of the N.R.A.’s thinking. “Mere mention of ‘smart-gun’ technology elicited sneers and snickers faster than a speeding bullet,” the magazine wrote. It quoted the N.R.A.’s executive vice president, Wayne LaPierre, as saying, “Tragic victims couldn’t have been saved by trigger locks or magazine bans or ‘smart-gun’ technology, or some new government commission running our firearms companies.”

After the massacre at Sandy Hook Elementary School in Newtown in December, Mr. LaPierre created a new aphorism: “The only thing that stops a bad guy with a gun is a good guy with a gun.” He said violent video games and movies were part of the problem, but he didn’t mention smart-guns as a solution.

TriggerSmart, an Irish company, has patented a childproof smart-gun. One feature is a “safe zone” that can be installed in schools and acts as a force field, disabling any TriggerSmart gun that enters a designated area. Robert McNamara, the company’s founder, has been trying to persuade gun makers to adopt the technology. He isn’t having much luck. “One gun manufacturer told us if we put this technology in one particular gun and some kid gets shot with another gun, then they will have to put them in all guns,” he said.

“We believe we could have helped prevent the Newtown massacre.”

View Source

The days of sneaking out for three-hour lunch breaks will soon be over at a Bay Street law firm after it decided to install fingerprint-scanning technology to monitor its employees’ whereabouts.

Last month, McCague Borlack LLP announced plans for a revamped security system that will require staff (except lawyers who spend much of their time with clients) to clock in and out of the office with a finger swipe, keeping track of morning late-comers or those who try to jump-start their weekends by slipping out early on a Friday.

“Some people were abusing the system,” said founding partner Howard Borlack, 58. “We had people taking two to three hours for lunch and we had no way of knowing. . . . Some people were complaining.”

Other Toronto firms use security passes and honour systems to keep track of time worked. McCague Borlack, which focuses mostly on insurance law and employs about 200 people, has gone a step further with a system that not only provides office access via fingerprint, but also records employees as they enter and leave.

Come mid-November, when the system is expected to go live, the office will be equipped with finger-scanning machines supplied by Utah-based Qqest, Inc. that will keep a rolling record of the time spent in the office.

Read More

Police may soon be able to catch criminals by the ink they are sporting.

Computer scientists are developing a new program that will be able to identify suspects by their tattoos and match them to photos in police databases or on social media.

Automatic identification through recognition of body art could provide a much-needed breakthrough in detective work, which is often thwarted by surveillance footage too grainy for facial recognition to make out a criminal’s face.

‘Those photos are often so bad that face recognition wouldn’t come even close’ to finding a match in a database, Terrance Boult, a computer science professor at the University of Colorado, explained to Live Science.

To rectify this problem, Boult worked with a team of researchers to develop a computer program that reviews body ink, scars, moles and visible skin markings in photos.

The program scans images for these identifiable skin symbols and then looks for people bearing the same markings in a photo database.

The program is designed to pick up patterns in tattoos and could even link together members of gangs, who often share body tags.

Though this isn’t the first program to examine body markings for identification, it was designed to better handle low-quality photos, like those taken with a smartphone.

Read More

The gender of suspects can now be determined from latent fingerprints. Peptides and proteins left behind in fingerprints can be processed using a technique called mass spectrometric imaging, a new milestone in fingerprinting technology.

The technique, called “matrix-assisted laser desorption ionisation” (MALDI), was developed by Simona Francese and Rosalind Wolstenholme of the Biomedical Research Centre at Sheffield Hallam University. The project is supported by the UK’s Home Office to enhance the country’s law enforcement capacity.

In certain cases, the peptides and proteins left behind in fingerprints will be the only data police will have on suspects, especially if the suspect’s profile is not matched within the UK’s National Fingerprint Database. Determining the gender, based on the chemical composition of a finger mark, can help narrow down suspects.

In a study using matrix-assisted laser desorption ionisation profiling, the technique determined gender with 85 percent accuracy. The process can also detect drug use or drug handling from lipids present in latent fingerprints, and the profiling can help investigators build suspect composites covering nutritional habits, drug use, and hormonal status.

View Source

It could be time for you to start worrying about what Facebook might be doing with the identity information collected on you and “tagged” photos.

The Hamburg Commissioner for Data Protection and Freedom of Information in Germany has announced legal action against the company and charged that Facebook’s use of facial-recognition technology is illegal.

In addition, the Federation of German Consumer Organizations is ordering Facebook to stop giving third-party applications users’ data without their consent.

If the social network doesn’t do this by Sept. 4, the FGCO will sue. Earlier this month, Norway also announced that it is looking into the legality of the social network’s use of face-matching technology.

Unlike the United States, Germany has regulations that allow Internet users control over their data.

Regarding photo tags, a Facebook spokesperson told CNET: “We believe that the photo tag suggest feature on Facebook is fully compliant with EU data protection laws. During our continuous dialogue with our supervisory authority in Europe, the Office of the Irish Data Protection Commissioner we agreed to develop a best practice solution to notify people on Facebook about photo tag suggest.”

Facebook: facial recognition profiles without user consent

A number of companies – like Facebook, Apple and Google – have facial recognition or detection as an automatic part of various services and apps.

With Apple and Google, users must opt in, and they can opt out. Facebook’s facial recognition feature, by contrast, is active by default, although users can remove tags.

But what happens with that information? It’s not just that Facebook is using facial recognition (biometric data) to increase the worth of its data for sale, trade, or for whatever currency it’s lining litterboxes with in Menlo Park.

In its December 2011 comments the Electronic Privacy Information Center told the Federal Trade Commission:

(…) the Commission should specifically prohibit the use of Facebook’s biometric image database by any law enforcement agency in the world, absent a showing of adequate legal process, consistent with international human rights norms.

Facebook reportedly possessed an estimated 60 billion photos by late 2010, and approximately 2.5 billion photos are uploaded to Facebook each month.

The democratization of surveillance

EPIC’s comments came after the FTC held a day-long forum called “Face Facts: A Forum on Facial Recognition Technology,” focusing on the commercial applications of facial recognition technology and its potential privacy implications.

The “Face Facts” participants came from disparate sides of the discussion. They included FTC attorneys, the Face.com CEO, Facebook’s senior privacy advisor and director, and reps from Google, the Privacy Rights Clearinghouse, the Center for Democracy and Technology and the ACLU.

Demos were given by Intel’s AIM Suite (Audience Impression Metrics, a CMS-friendly, API-ready, public-use face detection product) and by Andrew Cummins, a self-described strategy expert in the tech/defense markets and chief technology officer of controversial app-maker SceneTap.

There was also a representative from the National Institute of Standards and Technology. Interestingly, in 2010 NIST tested various facial recognition systems and found that the best algorithm correctly recognized 92 percent of unknown individuals from a database of 1.6 million criminal records.

FTC Chairman Jon Leibowitz opened “Face Facts” saying this summit was timely because, “Facebook has launched new facial recognition technology” and that “These sorts of technologies have already taken hold in law enforcement and the military; in that area, they are as controversial as they are interesting.”

I’m not sure if his use of a clip from the Tom Cruise film Minority Report in his opening remarks was meant to be ironic or not. Perhaps Mr. Leibowitz misses working for the MPAA (where he was chief lobbyist until being tapped for the FTC by George W. Bush in 2004).

Leibowitz did say, “We must confront openly the real possibility that these technologies, if not now, then soon, may be able to put a name with the face, so to speak, and have an impact on our careers, credit, health, and families.”

The Face Facts meeting raised an alarm for privacy organizations; Privacy Rights Clearinghouse director Beth Givens stated outright that there is insufficient public awareness about all aspects of facial recognition tech, and zero auditing mechanisms in place for any entity using the technologies.

Six months after the FTC meeting, Facebook acquired one of the biz-dev side participants, Face.com.

It’s clear that after meetings and summits, even with good intentions for privacy protections, regulators like the FTC are still on the outside looking in.

Ties between video profiling in private and government sectors more murky than ever

Back in July at a Senate Judiciary subcommittee hearing Senator Al Franken said, “Facebook may have created the world’s largest privately held database of face prints without the explicit knowledge of its users.”

Franken continued to link the holes in citizen protections and stressed implications with the then-new Federal Bureau of Investigation facial-recognition pilot program.

Franken stated that any law-enforcement gains from the program could come at a high cost to civil liberties. “The FBI pilot could be abused to not only identify protesters at political events and rallies, but to target them for selective jailing and prosecution, stifling their First Amendment rights,” he said.

Think about the implications of facial recognition profiles on social media sites along with current trends in cybersecurity legislation hysteria.

Remember CISPA? The surveillance bill would have given Homeland Security a backdoor pass to access your email, private information and social network data without a warrant or notice if it fit into a plan to stop “cybersecurity” threats. CISPA would have made it so that Facebook would be completely unrestricted (say, by your rights) to cooperate with Homeland Security to the fullest extent.

Just in the past few weeks, private intelligence documents leaked through WikiLeaks have revealed a surveillance product called TrapWire, which combines intelligent surveillance technologies with tracking and location data, individual profile histories from sources such as datamining and social media, and image analysis (including facial recognition, TrapWire’s video component) to monitor people under the guise of threat detection.

TrapWire is a commercial product sold to and implemented by private entities, the US Government “and its allies overseas.”

Too little FTC, too late?

No longer held back by high costs and poor accuracy, facial recognition technologies are moving quickly into the commercial sector, and they are increasingly aimed at recording faces in public places and business establishments rather than only online.

The FTC had stressed that “Face Facts” would address only commercial uses, not the use of facial recognition technologies for security purposes or by law enforcement or government actors.

Right now there is nothing that requires any private entity to notify individuals that facial recognition information is being collected, how long it will be stored, or how it will be used.

There is nothing preventing private entities (businesses, app developers, data brokers or advertisers) from selling, trading, or otherwise profiting from an individual’s biometric information – or from disclosing or disseminating the information without obtaining the individual’s consent or pursuant to a valid warrant or subpoena.

It will be interesting to see how Facebook handles its newest privacy problem in Germany.

View Source

FBI To Add Tattoos To Biometric ID Capabilities

The Federal Bureau of Investigation is looking to add tattoos to its portfolio of human identifying characteristics, and it’s reaching out to industry for information on how to do that, according to a request for information.

The FBI’s Biometric Center of Excellence (BCOE) is seeking information regarding existing databases of tattoos, according to the July 13 RFI, which was posted on the Federal Business Opportunities website, FBO.gov, and first reported by NextGov.

The FBI has a list of questions for industry and law enforcement agencies on tattoo databases, including whether “possible meanings and gang affiliations” are provided and whether gang experts are involved in that analysis. The RFI inquires about the underlying database technology, how information is extracted, and standards for metadata and images.

The FBI describes BCOE as a “one-stop shop for biometric collaboration and expertise.” Best known for its fingerprint and DNA identification services, the center is also developing capabilities in voice, iris, and other identifying characteristics. In the area of emerging biometrics, it’s exploring footprints and hand geometry.

Tattoo recognition is part of the FBI’s Next Generation Identification program, a multiyear initiative to develop ID capabilities beyond fingerprints and criminal mug shots. On July 18, Jerome Pender, deputy assistant director of the FBI’s Criminal Justice Information Service division, told a Senate subcommittee that three phases of the program are in development, including one focusing on scars, marks, and tattoos. The capability to support that is scheduled for deployment in the summer of 2014.

Read more

With at least 30 million surveillance cameras watching Americans every day, one aspect of the world of George Orwell’s dystopian novel 1984 has already come to pass, and more is on the way. In the next two years, for example, the FBI plans to test a nationwide database for searching iris scans to more quickly identify persons “of interest” to the government. The human iris, which is the doughnut-shaped, colored part of the eye that surrounds the black pupil, exhibits a pattern unique to each individual, just as fingerprints do, and iris recognition has been a staple of science fiction stories and films for years.

Iris scanning is part of the FBI’s Next-Generation Identification system, a multiyear $1 billion program built by Lockheed Martin and already well underway for several years, which will expand the FBI’s server capacity to allow for rapid matching not only of iris scans, but also of additional physical identifiers, such as fingerprints, palm prints and facial images. The FBI intends to test the system in conjunction with prisons, some of which already use iris scans to track prisoners and prevent mistakes of identification. According to the FBI, the time for urgent criminal fingerprint searches will eventually be reduced from 2 hours to 10 minutes, while the use of iris scans and other markers should ensure greater accuracy.

Although privacy advocates have little criticism of the use of iris scanning in correctional settings, the fact that the FBI and state prison officials are using a database owned and maintained by a private corporation, BI2 Technologies, gives many pause. Jennifer Lynch, a staff attorney at the digital rights group Electronic Frontier Foundation, points out that privately-run databases, including well-encrypted ones at banks and other financial businesses, have experienced serious data breaches exposing private customer information, and that leaks of fingerprints or iris scans would be potentially much more serious. “You can change your credit card data. But you can’t change your biometric data.”

And in light of the fact that the New York Police Department, in cahoots with major Wall Street banks and finance firms, used security cameras to identify Occupy Wall Street protesters, suspicions that iris scans might be used to target non-criminals disliked by the powerful cannot be dismissed out of hand.

Read more

The FBI plans to test by 2014 a database for searching iris scans nationwide to more quickly track criminals, according to budget documents and a contractor working on the project.

The Next-Generation Identification system, a multiyear $1 billion program already under way, is expanding the server capacity of the FBI’s old fingerprint database to allow for rapid matching of additional physical identifiers, including facial images and palm prints.

Today, iris scans conjure images of covert agents accessing high-security banks and laboratories. But, increasingly, law enforcement agencies are spending state and federal funds on iris recognition technology at jails to monitor inmates. Some Missouri prisons are buying the same system the FBI acquired, partly so that they can eventually exchange iris images with federal law enforcement officials. And many counties are storing pictures of prisoner irises in a nationwide database managed by a private company, BI2 Technologies.

The FBI expects to collect many of these state and local iris images, according to BI2 officials and federal documents.

A May 17 budget justification document states one of the “planned accomplishments for BY13” — the budget year that begins Oct. 1 — is to “demonstrate iris recognition capabilities via the iris pilot.”

A June FBI advisory board memo that Nextgov reviewed states, “supervised release/corrections are candidates for the pilot, being that many already have the capability in place. The additional goal is to start to build an iris repository.” Iris recognition is a helpful identification tool, according to the memo, because it “is very accurate,” does not require human intervention and “the hardware footprint is also very small [due] to the size of the iris image.”

The aim of iris recognition at corrections facilities, according to law enforcement officials, is to promptly catch repeat offenders and suspects who try to hide their identities.

Building a Repository

Officials at the Pinal County Adult Detention Center in Florence, Ariz., appreciate the nonintrusiveness of the BI2 iris recognition system, which does not touch prisoners’ faces when snapping photos of irises or scanning eyes for recognition. The inmates place their eyes three to 10 inches away from binocular-like lenses, which record the iris image, so wardens stay out of harm’s way during head counts, county officials said. The technology also ensures the center does not mistakenly release similar-looking siblings, twins or parents, when one family member comes up for parole, they added.

President and Chief Executive Officer Sean G. Mullin said BI2 Technologies has been working closely with the FBI unit chief responsible for implementing NGI. “BI2 Technologies provided the FBI [Next-Generation Identification system] over 12,000 iris images from current law enforcement agency clients for analysis and testing by NGI,” he said. Company officials said they were not aware of a specific pilot program that has been undertaken to demonstrate iris searching capabilities.

Mullin said his company was told the FBI plans to conduct an iris pilot in 2014. Local agencies in 47 states now participate in BI2’s nationwide Inmate Identification and Recognition System, or IRIS, which has been operating for six years, he said.

FBI officials declined to comment on progress using NGI for iris matching. “Because we are in the early stages of development of additional biometric capabilities, including the facial recognition pilot, there is no new information to report at this time,” said Stephen G. Fischer Jr., a spokesman for the FBI’s criminal justice information services division.

The interstate network that BI2 maintains uses a high-resolution camera to obtain an image of an offender’s iris during the booking process. Special software then transforms the picture into a digital file that is encrypted and stored with the company. For recognition purposes, the camera takes a live shot of an individual’s iris and the software then compares the new image with archived iris pictures collected during intake to confirm the person’s identity.
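The enroll-at-booking, verify-at-release workflow described above can be sketched roughly as follows. Everything here is an illustrative assumption rather than BI2’s actual system: the template size, the noise model, and the fixed match threshold are made up, though commercial iris matchers do typically encode the iris as a bit pattern and compare templates by fractional Hamming distance.

```python
# Toy sketch of an enroll/verify iris-matching pipeline. Real systems
# (e.g., Daugman-style matchers) encode the iris as a bit template and
# compare captures by fractional Hamming distance; the sizes and
# threshold below are illustrative, not BI2's actual parameters.

import secrets
from typing import Optional

TEMPLATE_BITS = 256      # toy template size
MATCH_THRESHOLD = 0.32   # max fraction of differing bits for a match


def capture_template(noise_bits: int = 0,
                     base: Optional[int] = None) -> int:
    """Simulate capturing an iris and encoding it as a bit template.

    A fresh capture of the same eye differs from the enrolled template
    by a handful of noisy bits; a different eye yields an unrelated
    random template."""
    if base is None:
        base = secrets.randbits(TEMPLATE_BITS)
    template = base
    for _ in range(noise_bits):
        template ^= 1 << secrets.randbelow(TEMPLATE_BITS)  # flip one bit
    return template


def hamming_fraction(a: int, b: int) -> float:
    """Fraction of template bits that differ between two captures."""
    return bin(a ^ b).count("1") / TEMPLATE_BITS


def verify(enrolled: int, live: int) -> bool:
    """Match a live capture against the template stored at booking."""
    return hamming_fraction(enrolled, live) <= MATCH_THRESHOLD


# Enrollment at intake, verification at release:
enrolled = capture_template()
same_eye = capture_template(noise_bits=10, base=enrolled)  # noisy recapture
other_eye = capture_template()                             # different person
```

A recapture of the same eye differs in only a few bits, so it falls well under the threshold, while an unrelated template differs in roughly half its bits and is rejected; this is why the same-person check at release described by Kimble can be automated.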

“Everybody that gets booked into our adult detention center, we get a capture of their iris. That gets hooked to their photo. And then everybody that’s being released goes through the system again to make sure we’re getting ready to release the same person,” said James Kimble, deputy chief of the Pinal County Adult Detention Center.

Pinal County used $30,000 in state funds to buy three cameras, supporting devices and access to BI2’s nationwide iris database, he said. Within a few months, some Pinal patrol officers will receive a handheld recognition tool that syncs with the database through an iPhone app.

The Yavapai County Sheriff’s Office in Arizona also is using iris recognition for many of the same safety purposes, said Dwight D’Evelyn, media/crime prevention coordinator for the office. Yavapai contracts with BI2 using in-house jail enhancement funds. “The data is stored in both the system of record at the Yavapai County Sheriff’s Office and the national server,” he said. “The iris images are stored, accessed and utilized by participating agencies on the national server, which is located at a secure site in Texas.”

D’Evelyn stressed that the iris files are the property of the sheriff’s office and, “during transmission, the iris images are always encrypted.”

Security Concerns

Jennifer Lynch, a staff attorney at the Electronic Frontier Foundation, a digital rights group, found the concept of a privately run, national iris network disconcerting because of the many recent data breaches at businesses. She cited financial institutions exposing customer account data and passwords stolen from job seekers using the professional networking website LinkedIn.

“That’s really concerning to me — the fact that they are held by a private company,” Lynch said. “You can change your credit card data. But you can’t change your biometric data.”

Cybersecurity experts are quick to note, however, that the data stolen in such incidents often was not adequately encrypted.

BI2’s iris images are “encrypted using strong cryptographic algorithms to secure and protect them,” the company website states. “Thus, standing alone, biometric templates cannot be reconstructed, decrypted, reverse-engineered or otherwise manipulated to reveal a person’s identity. In short, biometrics can be thought of as a very secure key: Unless a biometric gate is unlocked by using the right key, no one can gain access to a person’s identity.”

The average iris recognition time — from when an image is captured to when an officer receives a response — is 7.8 seconds, Mullin said.

“No agency — and there are more than 400 BI2 systems in operation across the nation — that has implemented BI2’s IRIS technology has ever had an erroneous or mistaken release because of an identification error,” he said.

During a six-month period at the Los Angeles County Sheriff’s Department, BI2’s system immediately spotted 119 repeat offenders previously booked by the department who provided different names and identification to avoid detection, Mullin said.

The June FBI advisory board memo states the bureau has chosen an L-1/MorphoTrust iris capture system for NGI. (L-1 Identity Solutions was acquired in 2011 by Safran and reorganized as MorphoTrust.) In 2011, the Missouri Sheriff’s Association bought the same system using federal grant money partly so the association’s database could eventually interface with NGI, said Jeff Merriman, a grant consultant for law enforcement agencies. He also works part time for the Jasper County Sheriff’s Office in Missouri, where he was a former police commander.

Jasper and more than 50 other Missouri agencies are hooked up to the association’s central system for statewide sharing, he said.

“Not only are we capturing multibiometrics at jails and prisons, we are also linking dozens of disparate criminal records systems across the state, connecting the dots between all the offenders and using that information tactically to combat crime,” Merriman said.

But the Missouri iris scans can’t get to the FBI. The problem is the Missouri State Highway Patrol, which is responsible for sharing criminal history records with the FBI, doesn’t have an iris database to collect the state’s iris files, he said. The FBI visited the Missouri Sheriff’s Association biometric system as part of the bureau’s NGI research, according to Merriman.

Now, he is working with law enforcement agencies in Oklahoma and Tennessee to acquire grant money for starting iris database systems that can connect with the Missouri Sheriff’s Association biometric system.

Separately, York County Prison in Pennsylvania has been using an LG Electronics iris recognition system for about a decade, prison spokesman Joe Borgiel said.

The Electronic Frontier Foundation’s Lynch said she was concerned by the breadth of iris recognition in the law enforcement realm. That said, she added, iris scans can be less sneaky than facial searches, which governments and social networks such as Facebook are embracing. Nextgov reported in 2011 that the FBI would begin a limited trial of facial recognition in early 2012.

“With iris scans and facial recognition, one of the differences is you can take a picture of a face surreptitiously,” Lynch said.

Thomas E. Bush III, who helped develop NGI’s system requirements when he served as assistant director of the FBI’s criminal justice information services division between 2005 and 2009, acknowledged people will worry about authorities combing through candid videos and photos for suspects, and, inadvertently, collecting images of innocent passersby.

“I’m an American citizen. I get that,” he said, but, “no, we will obtain these from the people who come into contact with law enforcement.”

Bush, now a private consultant, added, “It’s not public source data.” And, the FBI would not upload a bank vault’s iris database into NGI. “The FBI’s No. 1 priority is protection of civil rights,” he said.

In 2008, the bureau distributed a privacy impact assessment describing controls to ensure NGI complies with federal privacy regulations. FBI officials have said the bureau has an elaborate system of checks and balances to guard irises, palm prints, mug shots and all manner of criminal history data.

“The information sharing of the future is biometrically based,” Bush said. “That’s when you know that you have Tom Bush. This makes me more confident that I do have the right [bad] Tom Bush and then the good Tom Bush goes on his merry way. It’s about getting the right bad guy . . . We’ve got limited resources.”


Law enforcement officials searching for an easier way to confirm someone’s identity may have a solution in sight. A new device is nearing the mass production stage that combines iris, facial and fingerprint recognition scanning into a smartphone, giving nearly instantaneous identification results to an officer using it in the field.

First tested in 2010, the handheld Mobile Offender Recognition and Information System (MORIS) connects to a smartphone and allows a user to take a snapshot of a person’s face or iris, or a fingerprint scan, and then transmit the data over a secure wireless network to a national database. If a match is found, seconds later the officer using the device is given confirmation on the person in question, including any previous criminal history.
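The capture-transmit-match round trip described above can be sketched as follows. The message format, the shared-key integrity check, and the exact-digest lookup are all toy stand-ins, not BI2’s actual protocol: a real deployment would use TLS or authenticated encryption for transport, and real biometric matching is fuzzy rather than an exact hash lookup.

```python
# Toy sketch of a mobile capture-and-query round trip: capture a
# biometric sample, protect it for transit, send it to a matching
# service, and return any record found. The wire format, keys, and
# exact-digest "matching" are illustrative stand-ins only.

import hashlib
import hmac
from typing import Optional

SHARED_KEY = b"demo-key-not-for-production"  # stand-in for real key mgmt


def protect_for_transit(payload: bytes, key: bytes) -> dict:
    """Attach an HMAC tag so tampering in transit is detectable.
    (Real systems would also encrypt, e.g. via TLS.)"""
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return {"payload": payload.hex(), "tag": tag}


def verify_transit(message: dict, key: bytes) -> bytes:
    """Server side: reject any message whose tag does not check out."""
    payload = bytes.fromhex(message["payload"])
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, message["tag"]):
        raise ValueError("message tampered with in transit")
    return payload


# Stand-in for the national database: sample digest -> booking record.
RECORDS = {
    hashlib.sha256(b"iris-sample-alice").hexdigest():
        {"name": "Alice Example", "priors": 2},
}


def lookup(sample: bytes) -> Optional[dict]:
    """Server side: match the received sample against stored records.
    (Exact digest lookup here; real matchers tolerate capture noise.)"""
    return RECORDS.get(hashlib.sha256(sample).hexdigest())


def field_query(sample: bytes) -> Optional[dict]:
    """Officer's device: protect, 'transmit', and get the result back."""
    message = protect_for_transit(sample, SHARED_KEY)
    received = verify_transit(message, SHARED_KEY)  # server end of the wire
    return lookup(received)
```

The point of the sketch is the shape of the round trip, which is what makes the seconds-scale response times quoted later in the piece possible: the device sends only a small protected sample and the heavy matching happens server-side against the central database.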

The biometric technology itself isn’t new. Created by Biometric Intelligence and Identification (BI2) Technologies, MORIS is based on the company’s Inmate Identification and Recognition System (IRIS), which has been used at jails and detention centers for the past few years to identify and record the scans and information of inmates and sex offenders.

Sean Mullin, CEO of BI2 Technologies, said the smallest IRIS device weighed 12 to 15 pounds and had to be tethered to a server or laptop. But after feedback from clients that agencies wanted their officers to have the same functionality in the field, his company focused on miniaturizing its product and combining all three scans — iris, face and fingerprint — into a mobile platform.

Compared with the MORIS devices tested in 2010, Mullin said, the ergonomics were changed from a “boxy” configuration to a more hand-friendly shape. Another change reduced the number of steps required in the capture and recognition process.

“The fewer times the user has to hit a button or type something in, the better,” Mullin said. “I think we reduced it down to … about three button clicks.”

The company is in the process of getting its app and the MORIS hardware approved for use on Android devices and Apple’s iPhone. BI2 will also assemble and produce the finished device. The total price for each unit, including the smartphone, will be roughly $3,000, plus an annual maintenance fee, which Mullin priced at approximately 18 percent of the acquisition price.

Mullin said MORIS should be ready for a full rollout by early fall and will be available only to government agencies and officials.

“It’s a game-changer for law enforcement,” said Paul Babeu, sheriff of Pinal County, Ariz., who recently saw a demo of the mobile system at a meeting of Arizona sheriffs.

Babeu said that because his jurisdiction is larger than the state of Connecticut, it can sometimes take officers up to two hours to get back to the main office to verify someone’s identity. But with MORIS, Babeu is confident officers will be able to get that information in seconds and know what type of person they are dealing with, from minor offenders to wanted terrorists, and take the appropriate actions.

“The greatest benefit is timeliness,” Babeu said. “In 10 to 15 seconds you’re going to find out if that person is in our database.”

Mullin said an iris scan sent over the network will usually yield a result in five to seven seconds, depending on whether a person is located in a building or out in the open. Facial recognition results take a bit longer, averaging from 45 seconds to just more than a minute. Fingerprint results can vary as well, but typically take about a minute, according to Mullin.

Making a Connection

The algorithms behind IRIS and MORIS search for matches across various law enforcement databases. Mullin said fingerprint and facial recognition scans are compared against the Integrated Automated Fingerprint Identification System (IAFIS) biometric database, which is maintained by the FBI.

The FBI database contains mug shot photos, fingerprints and criminal history for more than 66 million people, including fingerprints from 73,000 known and suspected terrorists, according to the FBI’s website.

Mullin and Babeu both called scans of a person’s iris the most accurate form of identification. Iris scans taken by MORIS, however, are judged against a completely separate records system, built by BI2 Technologies in conjunction with the National Sheriffs’ Association.

While Mullin said the iris database doesn’t have as many records as IAFIS, nor is it as widely used, he expects that to change over time.

“To give you an idea, it is getting filled quickly and in a period [from] November and December 2010 to January 2011, this database performed 3.12 billion cross-matches successfully without one false match,” Mullin said.

Babeu, whose department currently uses IRIS, was confident in the accuracy of the system and the technology behind it. “There has not been one false positive,” Babeu said regarding scans taken by his department using IRIS. “Every part of the information has been accurate.”

Looking Ahead

MORIS, however, does have at least one drawback. It can’t simultaneously take and submit all three channels of information — iris, facial, fingerprint — and get one all-encompassing result. But Mullin hinted that advancement could be on the horizon.

“Today, except for what we are working on — in some let’s call it ‘department of defense-type intelligence applications,’ which are not publicly available — no one has put together what we call the ‘fusion’ algorithm,” Mullin said.

Babeu said he hasn’t seen anything else out there in the market that provides this sort of mobile technology and revealed that he has already placed an order for 75 MORIS devices. But he also added that a lot more law enforcement agencies need to commit to the technology for it to really become a force multiplier across the country.

“My only concern is that there is a building time,” Babeu said. “Even though we’re fully embracing it … for this to work to its maximum potential, we need to roll this out nationally.”
