Beyond the iris scans of science fiction and spy movies, the real world relies on fingerprint and face authentication on smartphones and on voice verification by smart speakers and in brokerage calls. In daily life, generative artificial intelligence (AI) has transformed “selfies” into beautified “shallow fakes” and culturally idealized avatars, which are recycled to train dynamically updated AI models or extended into “deepfakes” used to defraud the public. Facial authentication and recognition technology facilitates the identification of individuals in crowds or of victims of war, but it is also vulnerable to bias and misuse.
Massachusetts protected against identity theft by providing security for “biometric indicators” with the 2007 enactment of G.L. c. 93I; it also regulated law enforcement use of facial recognition with the 2021 enactment of G.L. c. 6, § 220. Laws and regulations in other states and countries that recognize the sensitivity of biometric data (such as the Illinois Biometric Information Privacy Act, which provides a private right of action) apply to many data collectors and processors with which regional individuals and businesses routinely interact.
Join us for an online briefing on the law, risks, and opportunities of biometric authentication.
MCLE webcasts are delivered entirely online for your convenience. There are no printed materials; all written materials are available electronically only. They are posted 24 hours before the program and can be accessed, downloaded, or printed from your computer.