Who (not What) is Cybersecurity For?
Author: Lana Ramjit is a Computing Innovations Postdoctoral Fellow at Cornell Tech, where she serves as the Director of Operations for the Clinic to End Tech Abuse, a program that provides free consultative services to survivors of intimate partner violence experiencing technology-facilitated abuse. She earned her PhD in Computer Science at UCLA and her BA in Computer Science at Columbia University.
In July 2021, a groundbreaking investigative report revealed that spyware produced by the Israeli firm NSO Group had been used to target not just violent criminals, terrorists, and military targets, but also activists, journalists, and political opponents. Known as Pegasus, NSO Group's spyware was unprecedented: it could read encrypted messages, listen in on phone calls, and track location without the target ever knowing. Even seasoned cybersecurity experts struggled to detect it.
That Pegasus was being used to target private, law-abiding citizens for political reasons was shocking in and of itself. But one of the first confirmed targets was chosen for reasons that were not military, criminal, or even political, but domestic. During her divorce from Sheikh Mohammed bin Rashid Al Maktoum of Dubai, the phone of Jordanian Princess Haya bint Hussein was targeted by Pegasus 11 times.
Most people are not in danger of being targeted with tools as sophisticated as Pegasus. Nonetheless, Princess Haya’s story fits a common pattern of technology abuse, in which technology is used as a tool of abusive control by an intimate partner. When survivors attempt to assert their independence by leaving, seeking a divorce, or fighting for custody, abusers use whatever power they have to reassert control. And technology, both a literal and figurative conduit of power, is yet another battleground for this power struggle.
Safety, Security & Power
Cybersecurity has long been associated with military-grade defenses, protecting corporate intellectual property, and prosecuting networks of violent criminals, so much so that its vocabulary is steeped in the language of warfare. Recently, the computer security and privacy community has pushed back, reconceptualizing the field as one of safety for people rather than security for technical systems.
This shift acknowledges that the dynamics of power that play out at the level of nation-states and corporations are often mirrored, with equally devastating impact, at the scale of human relationships. Sheikh Mohammed, as a political figure, had access to far more powerful tools than the average abuser, but the mirroring holds even among private citizens. And at the scale of human relationships, we are woefully lacking in our capacity to respond.
As the director of operations of the Clinic to End Tech Abuse at Cornell Tech, I oversee a manifestation of cybersecurity as a tool for safeguarding people, not systems. At CETA, we work directly with survivors of intimate partner violence who are referred to us by New York City anti-violence organizations. We develop technology safety plans tailored to our clients’ individual situations. For example, removing tracking software from a survivor’s phone might escalate their abuse; a survivor-centered safety plan may instead teach them how to evade or manipulate the software’s location tracking data.
Working with such clients lays bare the utter inadequacy of our legal, municipal, and medical systems in responding to the threat of technology abuse within everyday society. Examples include:
- Law enforcement failing to investigate online harassment.
- Insufficient protocols for enforcing restraining orders with respect to digital communication, including nonconsensual intimate imagery (NCII), aka “revenge porn.”
- The failure to regulate the manufacture, sale, and misuse of devices that enable stalking, such as GPS trackers and spyware apps.
- Public aid resources such as housing assistance failing to provide robust online systems resistant to abuse.
- A lack of mental health care for clients struggling with the trauma of technology abuse.
Put another way, technology abuse touches all areas of our complex social systems. And while cybersecurity expertise is necessary to address tech abuse, it is insufficient on its own.
A Necessary Shift toward Human Interests
Cybersecurity experts are trained to identify and address vulnerabilities in technical systems. That concern typically stops at the system boundary; it should extend to the humans who depend on those systems.
Who is most vulnerable to technology abuse? The same people who are most vulnerable to abuses of power: children and adolescents, the LGBTQ+ community, disabled persons (including the Deaf and blind communities), the elderly, refugees, and the unhoused, to name just a few. Yet these populations receive scant attention from cybersecurity experts. The frameworks developed by CETA are born of careful research and testing with one particular community, survivors of intimate partner violence. We need clinics similarly tailored to other vulnerable populations, work we hope to inspire and enable through the Technology Abuse Clinic Toolkit.
There is no one-size-fits-all approach to safeguarding individuals across diverse communities, as each has distinct needs and threats. Security mechanisms that work for one population may be devastating for others, or even for different individuals within the same population. For example, turning on second-factor authentication (the use of a second device to authorize logins) may help some survivors of intimate partner violence secure their devices, but for others it may cost them access to their accounts altogether, for instance when the abuser controls or monitors the second device.
And for the unhoused, who often lack a second device, second-factor authentication can lock them out entirely from systems we all depend on for basic services. Similarly, disappearing messages can help political dissidents or LGBTQ+ people connect with one another safely, but they can also be wielded by abusers for harassment and digital gaslighting.
Designing systems that serve all members of our communities across complex and sometimes conflicting needs will undoubtedly be slow, difficult, and error-strewn. It will require us to work beyond our siloed communities and domains of expertise. We must learn to value flexible, human-mediated processes, rather than rigid systems and magic-bullet products.
This way of doing business might not be as flashy (or as profitable) as the typical approach of Silicon Valley. Think of it as a new form of expertise, one that requires not just technical skills but empathy, curiosity, compassion, and adaptability. We will need these capacities just as much as knowledge of new protocols and platforms to safeguard our individual and collective rights to security and well-being in the digital age.