AI Technology Creates Fake Fingerprints to Bypass Biometric Scanners

Researchers at New York University have developed a method for generating synthetic fingerprints with artificial intelligence that can deceive both biometric scanners and the human eye. The synthetic prints, dubbed DeepMasterPrints, matched 23% of fingerprints in systems operating at a false match rate of one in a thousand; at a looser false match rate of one in a hundred, they matched 77%.

These artificial fingerprints are likely to be most effective against systems that store fingerprints for many people, as opposed to a personal device like your smartphone, which typically holds only a few of your own prints. An attacker can rely on trial and error, much like the brute-force or dictionary attacks used against passwords: present a handful of synthetic "master" prints and hope one of them matches someone in the database.
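As a rough sketch of why scale matters to this kind of attack, the back-of-the-envelope calculation below shows how the chance of fooling at least one enrolled print grows with the size of the database. The false match rate and enrollment counts here are illustrative numbers, not figures from the study.

```python
# Back-of-the-envelope illustration: if a single attempt has false-match
# probability p against any one enrolled print, the chance of matching at
# least one of n enrolled prints grows quickly with n.
# p and the enrollment counts are illustrative, not the study's measurements.
p = 0.001  # a 1-in-1,000 false match rate
for n in (1, 100, 1_000, 10_000):
    prob_any_match = 1 - (1 - p) ** n
    print(f"{n:>6} enrolled prints -> {prob_any_match:6.1%} chance of at least one match")
```

A small set of well-chosen synthetic prints tried against a large database therefore behaves much like a short password dictionary tried against many accounts.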

Most fingerprint sensors capture only partial prints, which is why setup asks you to move your finger around the sensor—as with Touch ID on iOS or fingerprint unlock on Android. The system never reconstructs a complete image of the fingerprint; instead, it compares each scan against a set of stored partial records. That raises the odds that a segment of a generated print will match a segment of a legitimate one.
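The matching rule can be pictured as "accept if the probe matches any stored partial record." The snippet below is a hypothetical illustration of that rule, not any vendor's actual verification code; the `score` function and template format are assumptions.

```python
# Hypothetical illustration of partial-template verification: the probe is
# accepted if it clears the score threshold against ANY stored partial record,
# so every extra partial record gives an attacker one more chance to match.
from typing import Callable, Sequence

def verify(probe, partial_templates: Sequence, score: Callable, threshold: float) -> bool:
    return any(score(probe, template) >= threshold for template in partial_templates)
```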

DeepMasterPrints exploit the fact that while complete fingerprints are unique, they often share common features, so a synthetic print built around those shared features is far more likely to match than a purely random one. To create the prints, the researchers trained a generative adversarial network on a dataset of real fingerprint images, then searched the generator's latent space for synthetic prints that match as many partial fingerprint records as possible.
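The loop below is a minimal sketch of that overall idea, not the researchers' code or models: it searches a stand-in generator's latent space for a synthetic print that matches as many enrolled partial templates as possible. The paper pairs a trained GAN generator with CMA-ES ("latent variable evolution"); this sketch substitutes a random linear generator, random templates, and a simple evolutionary hill-climb purely for illustration.

```python
# Illustrative sketch only: search a (stub) generator's latent space for a
# latent vector whose synthetic print "matches" as many enrolled partial
# templates as possible. Generator, matcher, and templates are stand-ins.
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 100      # typical GAN latent size (assumption)
NUM_TEMPLATES = 500   # number of enrolled partial-print templates (stand-ins)
TEMPLATE_DIM = 64     # toy feature-vector length

# Stand-in "enrolled templates": random unit feature vectors.
templates = rng.normal(size=(NUM_TEMPLATES, TEMPLATE_DIM))
templates /= np.linalg.norm(templates, axis=1, keepdims=True)

# Stand-in generator: a fixed random linear map from latent space to feature
# space (a real attack would use a trained GAN generator instead).
projection = rng.normal(size=(LATENT_DIM, TEMPLATE_DIM))

def generate_print_features(z: np.ndarray) -> np.ndarray:
    feats = z @ projection
    return feats / np.linalg.norm(feats)

def match_count(z: np.ndarray, threshold: float = 0.25) -> int:
    """Count how many enrolled templates the synthetic print would 'match'
    (cosine similarity above the verifier's threshold)."""
    scores = templates @ generate_print_features(z)
    return int((scores > threshold).sum())

# Simple evolutionary hill-climb over the latent space; the paper used CMA-ES.
best_z = rng.normal(size=LATENT_DIM)
best_score = match_count(best_z)
for _ in range(200):
    candidates = best_z + 0.3 * rng.normal(size=(16, LATENT_DIM))
    scores = [match_count(c) for c in candidates]
    i = int(np.argmax(scores))
    if scores[i] > best_score:
        best_z, best_score = candidates[i], scores[i]

print(f"best synthetic print matches {best_score}/{NUM_TEMPLATES} templates")
```

The key design point the sketch captures is that the attacker never edits fingerprint images directly; the search happens in the generator's latent space, so every candidate remains a realistic-looking print.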

DeepMasterPrints also look convincing enough to mislead human observers. By contrast, an earlier effort known as MasterPrints produced synthetic prints with sharp, angular edges that human scrutiny could easily flag as fake, even when they fooled scanners.

The researchers aim to enhance the security of biometric systems through their findings. Philip Bontrager from NYU's engineering school stated, "Without ensuring that a biometric sample is from a living person, many adversarial attacks become feasible. Our ultimate goal is to advance liveness detection in biometric sensors."
