It took decades of challenging systemic barriers and exclusion for disabled Americans to secure their right to exist and fully participate in our society. Thirty-three years ago, Congress passed the Americans with Disabilities Act to protect people with disabilities from discrimination. In essence, the law was intended to guarantee access to transportation and public places, such as restaurants and workplaces. However, people with disabilities still face a myriad of challenges that infringe on their right to survive and thrive in our society, and mass surveillance technology only makes the problem worse.
Data-driven surveillance technology has been incorporated into almost every sector of public life, appearing in our malls, recreation centers, and the transit lines that connect them. City officials and corporate representatives promised that these tools would make our lives easier and our forays into the public sphere more secure. However, countless studies have shown that these technologies are inherently biased and discriminatory because they are not built or used with accessibility in mind.
Biometric monitoring software is programmed to compare behaviors using a baseline built into its design that does not account for diversity and nuances of disabilities. Assumptions rooted in ableism about how disabilities can and should be viewed are entrenched in these systems, putting people with disabilities at risk of being singled out or suffering dehumanizing punishment for simply existing as themselves.
Take, for example, Amazon’s Flex program, which uses an app to track Amazon delivery drivers with the intention of incentivizing or penalizing them based on their efficiency. This ignores the experiences of workers with disabilities, and Amazon’s algorithmic management system has reportedly fired the slowest workers, regardless of a person’s disability or access needs.
The spread of biometric technology is also a threat to the health and safety of people with disabilities. In March, New York City Mayor Eric Adams proposed that stores require customers to remove their masks and expose their faces to surveillance cameras equipped with facial recognition. This policy discriminates against immunocompromised individuals, putting those who rely on the life-saving health benefits of masks at risk. Additionally, by flagging shoppers who cannot remove their masks, Mayor Adams is setting a standard of denying entry to, and even criminalizing, those who cannot adjust their behavior and appearance to the demands of the surveillance state.
Mass surveillance technologies imposed on people with disabilities threaten the very meaning of accessibility, misinterpreting their behavior as dangerous, robbing them of their autonomy, and risking their exclusion from society. As a country, we must do more to ensure that technological change does not come at the expense of disability rights and justice.
Lawmakers have been slow to act, as they often are with technological changes, but there is some progress. The New York City Council recently introduced a landmark ordinance that bans facial recognition in places of public accommodation, ensuring that biased technology does not threaten or exclude people with disabilities from public life.
As these surveillance tools continue to become more prominent and inescapable, the urgency with which lawmakers must act cannot be overstated.
Sarah Roth is a Development and Communications Fellow at the Surveillance Technology Oversight Project (STOP).
Evan Enzer is a privacy professional and legal member of STOP.