News of synthetic biometrics is alarming: it could turn out to be a new permutation of credential stuffing, in which attackers obtain partial fingerprints, reproduce them, and then use them in large-scale attacks.
Institutions seeking to thwart these attacks need to move beyond relying solely on a biometric and consider invisible multifactor authentication solutions that cannot be replicated by third parties, such as cryptographic key-based authentication combined with device, environmental and behavioral signals.
By their very nature, these solutions are easy to use, issued and leveraged invisibly to the user, resisting both credential stuffing and the threat of artificially produced synthetic biometrics.
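The core idea behind key-based authentication is a fresh challenge signed by a secret that never leaves the device, so there is no reusable credential to steal or stuff. The sketch below is a simplified illustration only: it uses a symmetric HMAC from the Python standard library as a stand-in, whereas real deployments (e.g., FIDO2/WebAuthn) use asymmetric keys held in secure hardware; all names here are hypothetical.

```python
import hmac
import hashlib
import os
import secrets

# Hypothetical device-bound secret, provisioned once at enrollment.
# In production this would be an asymmetric private key in secure hardware.
DEVICE_KEY = secrets.token_bytes(32)

def sign_challenge(key: bytes, challenge: bytes) -> bytes:
    """Device side: answer the server's one-time challenge."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify(key: bytes, challenge: bytes, response: bytes) -> bool:
    """Server side: constant-time comparison. A replayed or stuffed
    response fails because each login uses a fresh challenge."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)  # fresh per login attempt
response = sign_challenge(DEVICE_KEY, challenge)
print(verify(DEVICE_KEY, challenge, response))        # True
print(verify(DEVICE_KEY, os.urandom(16), response))   # False: stale response
```

Because the challenge is never reused, capturing one response gives an attacker nothing to replay, unlike a fingerprint image, which is static and, once copied, is compromised forever.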
Recently, researchers used a neural network to generate artificial fingerprints that work as a "master key" for biometric identification systems, proving that convincing fake fingerprints can be created.
As with much security research, exposing flaws in existing authentication systems is an important part of developing more secure ones. The report explains that "the underlying method is likely to have broad applications in fingerprint security as well as fingerprint synthesis." Specifically, the results revealed that DeepMasterPrints succeed in matching fingerprints belonging to a large number of distinct identities, and that they are complete images rather than just minutiae templates, which means they could potentially be used to launch a practical DeepMasterPrint attack.
In layman's terms, this means these systems do not blend all the partial images to compare a full finger against a full record. They simply compare the partial scan against the stored partial records. So essentially, an attacker has to match just one of the tens or hundreds of saved partial fingerprints in order to be granted access.
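The arithmetic behind this weakness is simple: if a forged print has some small chance of matching any one stored partial template, the chance of matching at least one grows quickly with the number of templates. A minimal sketch, assuming independent match attempts and an illustrative per-template false-accept rate:

```python
def false_accept_any(p_match: float, n_templates: int) -> float:
    """Probability that a forged partial print matches at least one of
    n_templates stored partial templates, assuming each comparison is an
    independent trial with per-template false-accept rate p_match."""
    # Complement rule: 1 minus the probability it matches none of them.
    return 1 - (1 - p_match) ** n_templates

# An illustrative 0.1% per-template false-accept rate across 30 stored
# partials inflates to roughly a 3% chance of being granted access.
print(round(false_accept_any(0.001, 30), 4))  # ≈ 0.0296
```

The rates here are hypothetical, but the compounding effect is exactly why a DeepMasterPrint only needs to resemble one saved partial, not the whole finger.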