On May 16, 2018, the ACLU of Massachusetts – together with two members of the 2018 Assembly cohort at Harvard Law School’s Berkman Klein Center and the MIT Media Lab, represented by the law firm Fish & Richardson – filed a Freedom of Information Act request with the U.S. Department of Homeland Security (DHS) for documents related to facial recognition technology used by Customs and Border Protection (CBP) and the Transportation Security Administration (TSA).
The ACLU and the Berkman Klein Center/MIT researchers submitted the records request to obtain more information about the facial recognition technologies deployed at airports and other facilities, as well as the CBP and TSA policies governing their use. The request asks for information about the reliability of the government’s facial recognition systems, and specifically for records documenting how the algorithms’ performance varies with the race, gender, national origin, and/or ethnicity of the data subject. The request also seeks information about DHS policies and procedures, including all policy directives and internal memos related to the use of facial recognition systems, the privacy policies that govern the technology’s use, and all contracts with external entities, including airlines and airports.
“We as a society are barreling head first into a nightmare scenario, in which privacy and anonymity in public are no longer available to ordinary people,” said Kade Crockford, director of the Technology for Liberty Program at the ACLU of Massachusetts. “But at present, we know very little about how CBP and TSA are using facial recognition technology, despite its serious implications for civil rights and civil liberties. We are making this request to learn more about the technology and the legal framework for its use across the country because we have a right to know what our government is doing with our sensitive biometric data. Only when we have a clear picture of the government’s surveillance activities can we meaningfully engage in a democratic debate about the appropriate limitations of those powers.”
The ACLU’s request follows a trial collaboration between JetBlue and CBP last summer that allowed the airline to experiment with facial recognition software in place of boarding passes. On flights from Boston’s Logan Airport to Aruba, passengers could board using a biometric scan of their faces instead of showing their boarding passes. The images were compared against passport or visa photos on file with CBP, and when a match was found, customers were able to board without showing their tickets. Facial recognition technology has since expanded to several airlines and airports nationwide.
"We believe ordinary people should have complete knowledge about how their personal data are retained, shared, and processed," said Thom Miano, a Berkman Klein and MIT Media Lab Assembly member. "At the moment, informed consent doesn't really exist for data-driven technologies. We want to empower humans so that they can assert their voices among the machines. Given the growing use of facial recognition and other similar machine learning systems, we seek to examine the policy and societal implications of these technologies as they are currently implemented, giving critical consideration to their present and possible negative externalities.”
“Because artificial intelligence can be spoofed or tricked, there must be an open discourse around the necessary policy mechanisms and governance frameworks for this new surveillance technology,” said Daniel Pedraza, a Berkman Klein and MIT Media Lab Assembly member. “We are hoping to get answers to help determine the unintended consequences and imperfections in the plan to deploy facial recognition technologies widely, including a deeper understanding of the limitations and biases in the machine learning algorithms.”