How PIT Prepares Students to Become Changemakers in Tech

Institutionalizing PIT

March 2023

Author: Toby Shulruff is a writer, a technology safety project manager, and a graduate student in the Public Interest Technology program at Arizona State University. She recently published a report on Trust & Safety work based on interviews with practitioners in the field.

After nearly 20 years in the gender-based violence prevention field, I decided to go back to school. The knowledge, skills and confidence I developed through Arizona State’s Master’s in Public Interest Technology program have immediately helped me address tech abuse further upstream, rather than patching together solutions after harms have been inflicted, or, worse, simply playing down those harms and shirking responsibility.

Early in my career, I was often “the tech person” at nonprofits, fixing the printer, building a database, or updating websites. These two threads of technology and gender-based violence combined in 2003, when I began working with the Safety Net Project to help local support programs across the US respond to the countless ways in which new technologies like GPS, mobile phones, and social media are used by abusers to stalk and intimidate survivors of violence.

Though we helped many people cope with the immediate threats they were facing, we were too far downstream from the companies designing these tools to stem the tide. 

As we began partnering with privacy advocates and tech companies, I became driven by two key questions:

  1. Why is technology the way that it is?
  2. Could including a wider range of voices shift the direction technology is going?

What I learned at ASU is that public interest tech is not a set of solutions, but rather a whole new way of thinking. It’s a paradigm shift that takes a lot of time, energy and work. Reframing one’s thinking about technology – not to mention effecting real change after graduation – requires the support of a dedicated, interdisciplinary PIT faculty who can help students explore issues from multiple angles and ask the kinds of questions that actually lead to paradigm change. 

My first-year courses at ASU explored the roots of public interest technology in responsible innovation, technology assessment and governance, and public engagement with science and technology. My perspective was broadened immensely by classmates who came from UX design, public policy and entertainment. In the core Codesigning the Future course, we learned how to bring more voices into the design process from the earliest stage of defining the problem, to ensure that tech serves a wide range of human needs, not just the share price of the company building it. We won a prestigious award from the School for the Future of Innovation in Society for our work on a variety of projects, including a COVID vaccine locator app, safer websites for domestic violence survivors, community financial tools, and a wildlife streaming camera.

Once I finished my core courses, I had the chance to dive into the research literature on a specific topic. For the first 20 years of my career, there was very little academic research on gender-based violence and tech. In 2017, the topic caught fire in academia, and I discovered that there was a vibrant community of cybersecurity, sociology and criminology experts in Australia, the U.K. and the U.S. working on this intersection. I found a new vitality in my own field that I hadn’t seen before.

My core faculty at ASU didn’t specialize directly in this area, but I was able to connect with faculty from the School of Social Work, and together they showed me how to construct high-quality research questions, do literature reviews and translate academic research into actionable insights. Through my existing work and contacts, I also dove into global technology and development questions, like grassroots innovation and cross-border, bilingual research collaborations, both within my program and through a growing network of international scholars and practitioners from the U.K., Nigeria and Argentina.

The standard thinking for many years was that stalking, harassment and domestic abuse are old behaviors, and tech just makes them a little easier for people to do. But the newer literature shows that technology has completely transformed these dynamics. It accelerates and amplifies abuse, and enables abusers to reach across space and time. In the past, someone had to physically follow you around to stalk you. Now, you never know when your abuser might pop back up and start monitoring you with some new form of spyware, or harassing you on social media. There’s no endpoint. 

We’ll never arrive at a single solution, but we can articulate important principles for ethical design to guide the decision making process.

I came across cybersecurity researchers at Cornell Tech who were analyzing spyware companies to identify software and settings that are easily leveraged by abusers, and documenting how serial stalkers accelerate the uptake of these practices through online forums. University College London and the Oxford Internet Institute are defining an Intimate Partner Threat model based on cybersecurity principles, a field that for so long focused only on national security, corporate espionage and consumer harms, and completely neglected what happens to people inside their own homes. End-to-end encryption is pretty useless if a violent partner or family member can just coerce me into giving them my password. There’s a growing movement of feminist technologists who are pushing back against these underlying assumptions in computer science and cybersecurity.

In my research and work with tech companies, I can now lift up these voices and insights, and because of my PIT coursework, I can recognize the limitations of my clients’ design frameworks and help them be more responsive to the real-life impacts of their products. So often, they just want to know, “what’s the one big threat to worry about?” or “what are the one or two personas we’re designing for?” I push them to consider power dynamics, as well as the multitude of perspectives and needs of people who have to cope with the disproportionate impacts of technology.

For example, the newer location tracking devices for lost wallets or keys have been used for stalking. Understanding both the threat models of gender-based violence and design processes helps companies consider really tricky design questions that weren’t on their radar, like how to help people find out if they are being tracked with one of the company’s devices, or what kind of governance the industry as a whole could adopt to protect users. With my clients, we can surface all kinds of tradeoffs they would not have recognized on their own: if the device makes a noise, would that end up putting the threatened person in more danger? What if the person is deaf, so a noise wouldn’t be helpful? There are so many users that there’s never just one persona, and how you design tech can determine the outcome of life-or-death situations.

The first step to designing better tech is identifying the issues you weren’t seeing before, and establishing new principles and goals accordingly. Thanks to the hard work of faculty and administrators at Arizona State who laid the groundwork for my comprehensive PIT education, the work I do every day is addressing the foundational questions about tech that really drive me – and that I hope will drive change in the industry in the years to come.