Group argues AI is putting students’ privacy at risk.

Student Groups Band Together to Prevent Use of F/R on Campuses

Evan Greer, deputy director of Fight for the Future, announced the group has teamed up with advocacy group Students for Sensible Drug Policy to work toward banning facial recognition on U.S. campuses. To kick off the grassroots movement, the organizations launched a website and an organizing toolkit for student groups, according to a story by Kyle Wiggers, who has written many stories about how AI facial recognition programs are being reined in by city and town councils in several areas across the country. Some people view F/R simply as a crime-fighting tool. But that is not the program's only use, and criminals are not the only ones it threatens: all of the information it collects can be hacked and used for harm as well.

The push is part of Fight for the Future’s broader Ban Facial Recognition campaign, which launched in July 2019 and calls on local, state, and federal lawmakers to prevent government and law enforcement use of facial recognition. While facial recognition isn’t widely deployed on U.S. campuses, Greer of Fight for the Future asserts that it is likely to threaten privacy, civil liberties and equity as companies increasingly market the tech to schools.

“Facial recognition surveillance spreading to college campuses would put students, faculty, and community members at risk. This type of invasive technology poses a profound threat to our basic liberties, civil rights, and academic freedom,” said Greer. “Schools that are already using this technology are conducting unethical experiments on their students. Students and staff have a right to know if their administrations are planning to implement biometric surveillance on campus … The data collected is vulnerable to hackers and in the wrong hands could be used to target and harm students. And it’s invasive, enabling anyone with access to the system to watch students’ movements; analyze facial expressions; [and] monitor who they talk to, what they do outside of class, and every move they make.”

A number of efforts to use facial recognition systems within schools have met with resistance from parents, students, alumni, community members and lawmakers alike. The Lockport City School District in upstate New York abandoned plans to pilot components of a face-recording system after parents learned that the district planned to flag suspended students. At the college level, a media firestorm erupted after people learned that a University of Colorado professor secretly photographed thousands of students, employees, and visitors on public sidewalks for a military anti-terrorism project, and after University of California San Diego researchers admitted to studying footage of students’ facial expressions to predict engagement levels.

Perhaps unsurprisingly, a growing number of activists, academics, and lawmakers have called for restrictions or outright bans on facial recognition technology. Last fall, California imposed a three-year moratorium on facial recognition use in law enforcement body cameras, preceding San Francisco’s ban on facial recognition use by police and city departments. Oakland followed suit in June, after which Berkeley passed a ban of its own.

And in two House Oversight and Reform committee hearings last summer, some of the most prominent Republicans and Democrats in the U.S. Congress joined together on proposals for legislative reform, after the introduction of a bill that would force businesses to receive consent before using facial recognition.

Governments tend not to tell their citizens about their spying activities, as evidence revealed in the Pentagon Papers showed. Just last year, the Afghanistan Papers revealed that Americans weren't told the truth. Certain agencies or nefarious employees might abuse such technology. China already uses F/R nationally to track its citizens; the system can flag jaywalkers and litterers, not to mention demonstrators in Hong Kong. Facial recognition can serve as a tool of oppression, something Americans have a problem with.