Second irrational fear: maybe a criminal or an enemy will make a copy of my face using a 3D printer (they're more affordable than ever before, guys!). Maybe this will become a hot new crime wave: find a friendly drunk girl. Take a hi-res photo. Steal her phone. Print her face. Steal EVERYTHING. Minority Report is not that far from reality, I'm just saying.
The increasing ubiquity of high-quality cameras, microphones, and fingerprint readers in today's mobile devices means biometrics will continue to become a more common method of authenticating users, particularly now that Fast ID Online (FIDO) has specified new standards for biometric authentication, including two-factor authentication with biometric factors.
However, the adoption of face recognition technologies like these is occurring without meaningful oversight, without proper accuracy testing of the systems as they are actually used in the field, and without the enactment of legal protections to prevent internal and external misuse. This has led to the development of unproven, inaccurate systems that will impinge on constitutional rights and disproportionately impact people of color.
People should not be forced to submit to criminal face recognition searches merely because they want to drive a car. They should not have to worry their data will be misused by unethical government officials with unchecked access to face recognition databases. They should not have to fear that their every move will be tracked if face recognition is linked to the networks of surveillance cameras that blanket many cities. Without meaningful legal protections, this is where we may be headed.
Technical issues endemic to all face recognition systems mean false positives will continue to be a common problem for the foreseeable future. Face recognition technologies perform well when all the photographs are taken with similar lighting and from a frontal perspective (like a mug shot). However, when photographs that are compared to one another contain different lighting, shadows, backgrounds, poses, or expressions, the error rates can be significant.5 Face recognition is also extremely challenging when trying to identify someone in an image shot at low resolution6 or in a video,7 and performs worse overall as the size of the data set (the population of images you are checking against) increases, in part because so many people within a given population look similar to one another. Finally, it is also less accurate with large age discrepancies (for example, if people are compared against a photo taken of themselves when they were ten years younger).
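The claim that accuracy degrades as the data set grows can be made concrete with a back-of-the-envelope calculation. The per-comparison false-match rate below is an illustrative assumption, not a measured figure for any real system, and real comparisons are not fully independent; this is only a sketch of why one-to-many searches against large galleries produce false positives so readily.

```python
# Back-of-the-envelope: how one-to-many search inflates false positives.
# Assumes a fixed, independent per-comparison false-match rate (FMR);
# real systems vary and comparisons are correlated.

def prob_at_least_one_false_match(fmr: float, gallery_size: int) -> float:
    """Probability that a search against `gallery_size` non-matching faces
    returns at least one false match, assuming independent comparisons."""
    return 1.0 - (1.0 - fmr) ** gallery_size

for n in (1_000, 100_000, 10_000_000):
    p = prob_at_least_one_false_match(fmr=1e-5, gallery_size=n)
    print(f"gallery={n:>10,}  P(>=1 false match) = {p:.3f}")
```

Even with an optimistic one-in-100,000 false-match rate per comparison, a search against a gallery of millions is nearly guaranteed to return at least one wrong hit.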
All government data is at risk of breach and misuse by insiders and outsiders. However, the consequences of a breach of face recognition or other biometric data could be far worse than those of a breach of other identifying data, because our biometrics are unique to us and cannot easily be changed.
As of December 2017, NGI included more than 74 million biometric records in the criminal repository and over 57.5 million records in the civil repository.41 By the end of fiscal year 2016, it also already contained more than 51 million civil and criminal photographs searchable through face recognition.42
Adding face recognition to body-worn cameras would also undermine the primary original purposes of these tools: to improve police interactions with the public and increase oversight and trust of law enforcement. People are much less likely to seek help from the police if they know or suspect not only that their interactions are being recorded, but also that they can be identified in real time or in the future. This also poses a grave threat to First Amendment-protected speech and the ability to speak anonymously, which has been recognized as a necessity for a properly functioning democracy since the birth of the United States.107 Police officers are almost always present at political protests in public places and are increasingly wearing body-worn cameras while monitoring activities. Using face recognition would allow officers to quickly identify and record specific protesters, chilling speech and discouraging people who are typically targeted by police from participating. Face recognition on body-worn cameras will also allow officers to covertly identify and surveil the public on a scale we have never seen before.
Although some believe that Congress is best positioned to ensure that appropriate safeguards are put in place for technologies like face recognition, Congress has been unable to make non-controversial updates to existing law enforcement surveillance legislation,120 much less enact new legislation. For that reason, the best hope at present is that states will fill the void, as several states have already in other contexts by passing legislation that limits surveillance technologies like location and communications tracking.121
Microsoft has established several principles to address the ethical issues of facial recognition systems. It has released training resources and new materials to help its customers become more aware of the ethical use of this technology.
The security of the biometric authentication data is vitally important, even more than the security of passwords, since passwords can be easily changed if they are exposed. A fingerprint or retinal scan, however, is immutable. The release of this or other biometric information could put users at permanent risk and create significant legal exposure for the company that loses the data.
Apple unveiled its new iPhone X Tuesday, and it will include extensive face recognition capabilities. Face recognition (as I have discussed) is one of the more dangerous biometrics from a privacy standpoint, because it can be leveraged for mass tracking across society. But Apple has a record of achieving widespread acceptance for technologies that it incorporates into its phones. So what are we to think of this new deployment?
Of course, whatever promises Apple makes today could be rolled back in the future, not to mention ignored by other companies if the technology becomes standard. Our big worry is that face recognition will be used to identify and tag people in new, privacy-invasive contexts, leading ultimately perhaps to a pervasive system of identification that tracks Americans in their every movement. Face recognition from mobile phone unlocking could certainly in the future become a key part of such a surveillance infrastructure.
So, to wrap it all up: Apple's neural network has been trained on an abundance of information about human faces (think Arya Stark with slightly less murder); your device then learns your individual face and uses that model to authenticate you. This all happens in an instant and doesn't drain your battery. The best part: the information never leaves your device. Just like Touch ID, all your personal biometric data is locked within the Secure Enclave and never uploaded to anyone's servers.
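Apple's internals aren't public beyond its white paper, but systems like this are commonly described as reducing a face to a numeric embedding and comparing it against the enrolled template with a distance threshold. A minimal illustrative sketch, with made-up three-dimensional embeddings and an arbitrary threshold (real embeddings have hundreds of dimensions and carefully tuned thresholds):

```python
# Illustrative sketch (not Apple's actual implementation): on-device face
# authentication as a similarity check between a fresh face embedding
# and the enrolled template stored on the device.
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def authenticate(enrolled, probe, threshold=0.8):
    """Accept only if the probe embedding is close enough to the template."""
    return cosine_similarity(enrolled, probe) >= threshold

enrolled  = [0.10, 0.90, 0.30]
same_user = [0.12, 0.88, 0.31]   # slight variation of the same face
stranger  = [0.90, 0.10, -0.40]  # very different embedding

print(authenticate(enrolled, same_user))  # True: embeddings nearly parallel
print(authenticate(enrolled, stranger))   # False: embeddings far apart
```

The privacy property described above comes from where the comparison runs: both the template and the check stay on the device, so no server ever sees the biometric data.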
There are a lot more nuances to how Face ID works, and Apple was generous enough to provide a white paper on the details. Our team is already working on new applications that will utilize the APIs provided for the 3D mapping of your face. Stay tuned by subscribing to the AndPlus blog!
MFA [multifactor authentication] adoption will continue to grow for both business and personal use, including increased use of biometric forms of authentication that improve security and convenience (that is, unlocking devices with a fingerprint or face identification).
Accelerated digital transformation, remote working, more connected devices, new technology, and demand for mobility and access create ever-growing environments for security teams to guard and protect. More and more security signals from across entire organizations will generate growing volumes of disparate log and event data that must be collected, investigated, and responded to quickly to effectively address potential issues.
Machine learning algorithms (MLAs), both supervised and unsupervised, are used heavily in well-known applications such as spam filtering, expert systems, and friend suggestions in social networks. Programming libraries in many languages allow an MLA to be implemented in a few lines, letting researchers focus on the application being developed and the interpretation of the data. Figure 8 shows the most popular MLAs utilized in the smart-device sensor hidden-data-extraction works surveyed here. As the figure shows, the number of such algorithms is too large for them all to be introduced in one paper; instead, three main algorithms will be introduced in the next sections: random forest, support vector machine (SVM), and artificial neural network (ANN). These algorithms were selected because they have been leveraged in more than 70% of the research surveyed in this paper.
Information technology is used for all kinds of surveillance tasks. It can be used to augment and extend traditional surveillance systems such as CCTV and other camera systems, for example to identify specific individuals in crowds, using face recognition techniques, or to monitor specific places for unwanted behaviour. Such approaches become even more powerful when combined with other techniques, such as monitoring of Internet-of-Things devices (Motlagh et al. 2017).
In two decades, biometric recognition transitioned from spy-movie futuristic tech to an everyday utility. Biometric passports allow us to jump queues at airports. Personal devices, such as mobile phones, now come with state-of-the-art fingerprint scanners for unlocking. Law enforcement agencies all over the world use facial recognition to look for suspects and identify threats. The tech is so easily available today that some drive-in restaurants, such as Wow Bao in the US, have started to use face recognition to match payments and authorise take-out delivery. Biometric signatures are billed as the solution to people forgetting usernames and passwords. But the tech is not fool-proof, especially the consumer-grade kits that are emerging everywhere, and the dangerous trend of overusing flawed tech makes me fear potential disasters.