Building on earlier work that sought to create partial fake fingerprints capable of fooling biometric scanners, researchers have used machine learning to construct full images of fake fingerprints.
Philip Bontrager, Aditi Roy, Julian Togelius and Nasir Memon, researchers at New York University's Tandon School of Engineering, and Arun Ross, a researcher at Michigan State University, developed DeepMasterPrints: AI-generated fake fingerprint images that could fool biometric sensors.
“This is the first work that creates a synthetic MasterPrint at the image level, thereby further reinforcing the danger of utilizing small-sized sensors with limited resolution in fingerprint applications,” the researchers wrote in their report. “This work directly shows how to execute this exploit and is able to spoof 23% of the subjects in the dataset at a 0.1% false match rate. At a 1% false match rate, the generated DeepMasterPrints can spoof 77% of the subjects in the dataset.”
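To put those figures in perspective, the false match rate is, by definition, the rate at which a random impostor print is accepted, so dividing the reported spoof rate by the FMR shows how much better than chance the generated prints perform. A quick back-of-envelope calculation using only the numbers quoted above (the comparison itself is illustrative, not from the paper):

```python
# Back-of-envelope comparison using the figures reported in the paper.
# At a given false match rate (FMR), a random impostor fingerprint is
# accepted at roughly the FMR itself, by definition; the reported spoof
# rates for DeepMasterPrints are far higher.

reported = {
    0.001: 0.23,  # at a 0.1% FMR, DeepMasterPrints spoof 23% of subjects
    0.01: 0.77,   # at a 1% FMR, they spoof 77% of subjects
}

for fmr, spoof_rate in reported.items():
    multiplier = spoof_rate / fmr  # how far above chance-level matching
    print(f"FMR {fmr:.1%}: spoofs {spoof_rate:.0%} of subjects "
          f"(~{multiplier:.0f}x the chance-level rate)")
```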
DeepMasterPrints build on MasterPrints
The group's past research into fake fingerprints — dubbed MasterPrints — exploited the fact that fingerprint sensors don't scan a subject's entire finger, only the portion that touches the sensor. This means fingerprint scanners compare partial images to each other and often focus on "minutiae points," which the researchers describe as the "ridge endings and ridge bifurcations" of a fingerprint.
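A toy sketch of what minutiae-based matching looks like (the tuple format, tolerances and threshold here are made up for illustration; the matchers used in the study are far more sophisticated):

```python
import math

# Toy minutiae-based matcher (illustrative only). Each minutia -- a ridge
# ending or bifurcation -- is modeled as an (x, y, angle) tuple.

def minutiae_match(probe, template, dist_tol=10.0, angle_tol=0.3, threshold=12):
    """Count probe minutiae that pair with an unused template minutia
    within position/angle tolerances; declare a match above a threshold."""
    paired = 0
    used = set()
    for (px, py, pa) in probe:
        for i, (tx, ty, ta) in enumerate(template):
            if i in used:
                continue
            if math.hypot(px - tx, py - ty) <= dist_tol and abs(pa - ta) <= angle_tol:
                paired += 1
                used.add(i)
                break
    return paired >= threshold
```

Because only a partial impression is compared against the stored template, a single image packed with commonly occurring minutiae can clear the pairing threshold against many different users' templates, which is the weakness the researchers exploit.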
Because of this, the researchers realized they could develop DeepMasterPrints, which include multiple common minutiae points, and use the same generated fingerprint image to impersonate more than one user. The DeepMasterPrints were “twice as good at spoofing a system as a random real fingerprint.”
“This suggests that the generated images display common features more often than the real data distribution,” the researchers wrote. “As a sanity check, we provide images of randomly generated noise to the matchers and they found no minutiae points. This means that the generator is not only producing images that look like fingerprints to humans, but they are algorithmically being identified as fingerprints too.”
The researchers said DeepMasterPrints could be used to run practical attacks against biometric sensors, analogous to dictionary attacks against passwords. And because DeepMasterPrints are full images of fake fingerprints that look real to the human eye and are identified as real fingerprints by matchers, the attacks could scale as well.
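The dictionary-attack analogy can be made concrete with some simple probability. The sketch below assumes each try with a master print succeeds independently at the reported 23% spoof rate, and assumes a five-attempt limit of the kind many smartphone sensors impose; both assumptions are illustrative simplifications, not claims from the paper.

```python
# Back-of-envelope: how a small "dictionary" of master prints could scale.
# Assumes each attempt succeeds independently at the same spoof rate,
# which is an illustrative simplification, not a result from the paper.

def at_least_one_match(spoof_rate, attempts):
    """Probability that at least one of `attempts` tries matches."""
    return 1 - (1 - spoof_rate) ** attempts

# Using the reported 23% spoof rate (at a 0.1% false match rate) and an
# assumed five-attempt limit before the sensor locks:
print(at_least_one_match(0.23, 5))  # roughly 0.73
```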
“The method proposed in this paper was found to (1) result in DeepMasterPrints that are more successful in matching against fingerprints pertaining to a large number of distinct identities, and (2) generate complete images — as opposed to just minutiae templates — which can potentially be used to launch a practical DeepMasterPrint attack,” the researchers wrote. “Experiments with three different fingerprint matchers and two different data sets show that the method is robust and not dependent on the artifacts of any particular fingerprint matcher or dataset.”
Bimal Gandhi, CEO of Uniken, agreed the potential for attacks with DeepMasterPrints is real.
“This news of potential synthetic biometrics is alarming and could eventually turn out to be a new permutation in credential stuffing, as hackers are able to access parts of fingerprints, reproduce them, then use them in large scale attacks,” Gandhi said. “Institutions seeking to thwart the threat of these attacks need to move beyond relying on solely a biometric, and consider invisible multifactor authentication solutions that cannot be replicated by third parties, such as cryptographic key based authentication combined with device, environmental and behavioral technologies. By their very nature, they are easy to use, issued and leveraged invisibly to the user, defying credential stuffing and the threat of synthetic biometrics.”
Synthetic fingerprint research has upside
Sam Bakken, senior product marketing manager at OneSpan, suggested the DeepMasterPrints research could also be used to improve the security of biometric authentication and shouldn’t be cause for immediate concern.
“The costs of executing such an attack are far from negligible and attackers probably don’t see a good return on investment at this time,” Bakken said. “In addition, no security system should rely solely on fingerprint authentication. Defense-in-depth with multiple safeguards can prevent such an attack. A layered approach might include taking into account additional contextual data to score the risk associated with the transaction and if that risk is too high, ask the user to provide another authentication factor.”