Fighting Disciplinary Technologies – from EFF

An expanding category of software, apps, and devices is normalizing cradle-to-grave surveillance in more and more aspects of everyday life. At EFF we call them “disciplinary technologies.” They typically show up in the areas of life where surveillance is most accepted and where power imbalances are the norm: in our workplaces, our schools, and in our homes.

At work, employee-monitoring “bossware” puts workers’ privacy and security at risk with invasive time-tracking and “productivity” features that go far beyond what is necessary and proportionate to manage a workforce. At school, programs like remote proctoring and social media monitoring follow students home and into other parts of their online lives. And at home, stalkerware, parental monitoring “kidware” apps, home monitoring systems, and other consumer tech monitor and control intimate partners, household members, and even neighbors. In all of these settings, subjects and victims often do not know they are being surveilled, or are coerced into it by bosses, administrators, partners, or others with power over them.

Disciplinary technologies are often marketed for benign purposes: monitoring performance, confirming compliance with policy and expectations, or ensuring safety. But in practice, these technologies are non-consensual violations of a subject’s autonomy and privacy, usually with only a vague connection to their stated goals (and with no evidence they could ever actually achieve them). Together, they capture different aspects of the same broader trend: the appearance of off-the-shelf technology that makes it easier than ever for regular people to track, control, and punish others without their consent.

The application of disciplinary technologies does not meet standards for informed, voluntary, meaningful consent. In workplaces and schools, subjects might face firing, suspension, or other severe punishment if they refuse to use or install certain software—and a choice between invasive monitoring and losing one’s job or education is not a choice at all. Whether the surveillance is happening on a workplace- or school-owned device versus a personal one is immaterial to how we think of disciplinary technology: privacy is a human right, and egregious surveillance violates it regardless of whose device or network it’s happening on.

And even when its victims might have enough power to say no, disciplinary technology seeks a way to bypass consent. Too often, monitoring software is deliberately designed to fool the end-user into thinking they are not being watched, and to thwart them if they take steps to remove it. Nowhere is this more true than with stalkerware and kidware—which, more often than not, are the exact same apps used in different ways.

There is nothing new about disciplinary technology. Use of monitoring software in workplaces and educational technology in schools, for example, has been on the rise for years. But the pandemic has turbo-charged the use of disciplinary technology on the premise that, if in-person monitoring is not possible, ever-more invasive remote surveillance must take its place. This group of technologies and the norms it reinforces are becoming more and more mainstream, and we must address them as a whole.

To determine the extent to which certain software, apps, and devices fit under this umbrella, we look at a few key areas:

The surveillance is the point. Disciplinary technologies share similar goals. The privacy invasions from disciplinary tech are not accidents or externalities: the ability to monitor others without consent, catch them in the act, and punish them is a selling point of the system. In particular, disciplinary technologies tend to create targets and opportunities to punish them where none existed before.

This distinction is particularly salient in schools. Some educational technology, while inviting in third parties and collecting student data in the background, still serves clear classroom or educational purposes. But when the stated goal is affirmative surveillance of students—via face recognition, keylogging, location tracking, device monitoring, social media monitoring, and more—we look at that as a disciplinary technology.

Consumer and enterprise audiences. Disciplinary technologies are typically marketed to and used by consumers and enterprise entities in a private capacity, rather than the police, the military, or other groups we traditionally associate with state-mandated surveillance or punishment. This is not to say that law enforcement and the state do not use technology for the sole purpose of monitoring and discipline, or that they always use it for acceptable purposes. What disciplinary technologies do is extend that misuse into private hands.

With the wider promotion and acceptance of these intrusive tools, ordinary citizens and the private institutions they rely on increasingly deputize themselves to enforce norms and punish deviations. Our workplaces, schools, homes, and neighborhoods are filled with cameras and microphones. Our personal devices are locked down to prevent us from countermanding the instructions that others have inserted into them. Citizens are urged to become police, in a digital world increasingly outfitted for the needs of a future police state.

Discriminatory impact. Disciplinary technologies disproportionately hurt marginalized groups. In the workplace, the most dystopian surveillance is used on the workers with the least power. In schools, programs like remote proctoring disadvantage disabled students, Black and brown students, and students without access to a stable internet connection or a dedicated room for test-taking. Now, as schools receive COVID relief funding, surveillance vendors are pushing expensive tools that will disproportionately discriminate against the students already most likely to be hardest hit by the pandemic. And in the home, it is most often (but certainly not exclusively) women, children, and the elderly who are subject to the most abusive non-consensual surveillance and monitoring.

And in the end, it’s not clear that disciplinary technologies even work for their advertised uses. Bossware does not conclusively improve business outcomes, and instead negatively affects employees’ job satisfaction and commitment. Similarly, test proctoring software fails to accurately detect or prevent cheating, instead producing rampant false positives and overflagging. And there’s little to no independent evidence that school surveillance is an effective safety measure, but plenty of evidence that monitoring students and children decreases perceptions of safety, equity, and support, negatively affects academic outcomes, and can have a chilling effect on development that disproportionately harms minoritized groups and young women. If the goal is simply to use surveillance to give authority figures even more power, then disciplinary technology could be said to “work”—but at great expense to its unwilling targets, and to society as a whole.

The Way Forward
Fighting just one disciplinary technology at a time will not work. Each use case is another head of the same Hydra, reflecting the same impulses and surveillance trends. If we narrowly fight stalkerware apps but leave kidware and bossware in place, the fundamental technology will still be available to those who wish to abuse it with impunity. And fighting student surveillance alone is untenable when bossware can still leak into school and academic environments.

The typical rallying cries around user choice, transparency, and strict privacy and security standards are not complete remedies when the surveillance itself is the selling point. Fixing the spread of disciplinary technology needs stronger medicine. We need to combat the growing belief, fed by disciplinary technology’s makers, that spying on your colleagues, students, friends, family, and neighbors through subterfuge, coercion, and force is somehow acceptable behavior for a person or organization. We need to show how flimsy disciplinary technologies’ promises are; how damaging their implementations can be; and how, for every supposedly reasonable scenario their glossy advertising depicts, the reality is that misuse is the rule, not the exception.

We’re working at EFF to craft solutions to the problems of disciplinary technology: demanding that anti-virus companies and app stores recognize spyware more explicitly, pushing companies to design against abuse cases, and exposing the misuse of surveillance technology in our schools and in our streets. Tools that put machines in power over ordinary people are a sickening reversal of how technology should work. It will take technologists, consumers, activists, and the law to put it right.