Shaping Ethical Technologists in the Age of AI
Author: Denise Ferebee is an Associate Professor of Computer Science at Rust College. Denise has a B.S. from Mississippi University for Women and an M.S. and Ph.D. from the University of Memphis. She does research in cybersecurity and artificial intelligence, and in her spare time, she likes to garden.
As AI systems are being rapidly integrated into all aspects of life, we must help young technologists understand and mitigate the myriad unintended consequences that come with automating decision making.
As part of a 2022 PIT-UN Challenge Grant, Zina Parker (LeMoyne-Owen College) and I led a team that created a workbook for faculty and students interested in learning about public interest technology through classroom learning and research projects. Two of the major areas in this workbook focused on how to explore core PIT questions and concerns through data analytics and artificial intelligence.
Data analytics describes the methods used to find trends and draw conclusions from large datasets. Like any technological toolset, data analytics can be wielded in ways that help or harm society.
A cautionary tale we use in the workbook involves a person who was denied medical care because of a NarxCare score. The NarxCare score is a data-driven metric intended to help determine when someone is at risk for opioid abuse or doctor shopping. In this instance, the person who needed medical care was denied access, and their patient/doctor relationship was terminated, because their NarxCare score indicated opioid abuse. In reality, they had merely been purchasing pain medication for their pets under a system that logged drug purchases under the owner's name. The data used to calculate the score was wrongfully attributed to them, leading to the loss of critical healthcare.
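The mechanism behind this failure can be sketched in a few lines of code. The following Python example is purely illustrative — the record fields, names, and scoring rule are invented for this sketch and are not the actual NarxCare algorithm — but it shows how a score keyed only on the purchaser's name silently absorbs veterinary purchases:

```python
# Hypothetical sketch: prescription records logged under the purchaser's
# name, so a pet's medications count toward the owner's risk score.
# All fields and the scoring rule are invented for illustration.

OPIOIDS = {"tramadol", "oxycodone", "hydrocodone"}

records = [
    {"purchaser": "J. Doe", "drug": "tramadol",  "patient": "dog"},
    {"purchaser": "J. Doe", "drug": "tramadol",  "patient": "dog"},
    {"purchaser": "J. Doe", "drug": "ibuprofen", "patient": "self"},
]

def naive_risk_score(records, name):
    """Count opioid purchases keyed only by purchaser name."""
    return sum(1 for r in records
               if r["purchaser"] == name and r["drug"] in OPIOIDS)

def attributed_risk_score(records, name):
    """Same count, but only where the purchaser is also the patient."""
    return sum(1 for r in records
               if r["purchaser"] == name
               and r["patient"] == "self"
               and r["drug"] in OPIOIDS)

print(naive_risk_score(records, "J. Doe"))       # 2 — pet purchases inflate the score
print(attributed_risk_score(records, "J. Doe"))  # 0 — veterinary purchases excluded
```

The bug is not in the arithmetic; it is in the assumption that the purchaser and the patient are the same person — exactly the kind of unexamined design decision PIT asks students to surface.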
The algorithms used in data analytics are the same as, or similar to, those used in most artificial intelligence (AI) applications, such as generative AI and natural language processing (NLP) tools like ChatGPT, raising complicated questions about intellectual property, data privacy, and automated inequality.
AI tools are further implicated in the rise of misinformation, disinformation, and cybersecurity threats, raising critical questions about the role of AI in our democracy. To get students thinking about these issues, and to help them develop the skills needed to apply a public interest technology framework to their careers in technology, I focus my computer programming courses on a specific set of questions about the fundamental building blocks of computer science.
Asking Key Questions and Getting Specific
Because of our education system’s focus on quantitative measures of success, the students we work with are often primarily focused on completing their assignments and getting a good grade. The main objective of PIT, however, is creating socially responsible technology. PIT professors must contend with this tension and take the time and effort to make sure our students understand the deeper “why” and “how” of PIT.
Computer science is, generally, a utilitarian field that values efficiency and operability. Too often, computer scientists approach problems from a purely technical standpoint without considering values, purpose, and design thoroughly enough, leading to blind spots that can cause dire unintended consequences.
Take computer programming, for example — a core competency and a major portion of the creative process in computer science. While there are elements of programming that involve rote memorization and practical skill-building, there is far more to creating socially responsible software and technology than completing a test and earning a good grade. There is foundational work that starts before we ever write a single line of code.
We must first consider:
- What problem are you trying to solve with technology?
- What technologies are available to you?
- What is the right technology to help you solve it?
We cannot write code just to write code. We must understand the purpose, the risks and how our code can be misused or influenced. Coders and developers must be aware of unintended consequences from the very beginning.
A key principle for public interest technologists learning how to code is specificity. You cannot be ambiguous when writing code, or your program will end up delivering outputs you did not anticipate. While no program or application can account for every harmful possibility, we do need to make sure that, at the very least, the predictable harmful possibilities are not "baked in" from the beginning.
As a beginning assignment, I ask my students to write instructions on how to make a sandwich (an assignment that some writing instructors also use to teach students about the importance of specificity in language). Many students at first write ambiguous instructions: "get the bread" does not tell the user where the bread is located, for example. Without specific instructions on where or how to "spread the peanut butter," such an instruction could be interpreted as "spread the peanut butter all over the walls." With an instructor asking probing questions about each of the steps, students realize that they have to determine what is missing in their instructions, and how those gaps could lead to unintended consequences.
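The same lesson can be shown in a few lines of code. This deliberately silly Python sketch (all names are invented for this example) shows how an instruction that omits its target forces the program to fill the gap on its own:

```python
# Hypothetical illustration of an under-specified instruction:
# when the caller does not say where to spread, the code "decides."

def spread(ingredient, surface=None):
    # Ambiguity resolved by a default the user never chose.
    target = surface or "everything within reach"
    return f"spread {ingredient} on {target}"

print(spread("peanut butter"))           # spread peanut butter on everything within reach
print(spread("peanut butter", "bread"))  # spread peanut butter on bread
```

The program runs either way; only the specific version does what the user actually meant. That gap between "runs" and "does what was intended" is where unintended consequences live.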
While the stakes are low when making a sandwich, they are quite high when writing code; and with so many of our public goods now administered or informed by automated systems, the instructions we embed in code have wide-ranging implications for people's lives.
These are fundamental questions of values and design. When we do not thoroughly and systematically ask these questions about the technology that we design, we end up with thorny problems like social media algorithms that optimize for outrage and web traffic rather than quality information and civil discourse.
The problems of social media go beyond the realm of coding, of course. They involve many stakeholders, from product managers to CEOs to marketing departments and regulators. Not all of our students will go on to be programmers, but everyone who comes through a computer science course should understand the foundational questions of values and design that undergird computer code.
Taking time to understand the uses and misuses of AI can help students chart a path in public interest technology. It can also help them understand that the data gathered to train AI models is far from perfect, and that there are ample opportunities for improvement and research.
Taking a step back can help one to see and understand how technology impacts communities in ways that its designers often do not anticipate, and how we can create technologies that protect human rights and support democratic processes. At Rust College, we will continue to guide students along a constructive path that provides them with the building blocks of public interest technology, cultivating a new generation of technologists equipped and inspired to advance the public interest in whatever career path they choose.