
2023 PIT-UN Convening

The 2023 Convening at Boston University explored how PIT practitioners can partner across academia, government, civil society, and the private sector. Topics included data privacy, equitable design, cybersecurity, AI governance, climate tech, and more.

Below, explore videos of panel discussions featuring PIT-UN members and keynotes from Jasmine McNealy (University of Florida / Mozilla), Danielle Allen (Harvard), Rumman Chowdhury (Humane Intelligence), Deirdre Mulligan (White House OSTP), and more.

Mya Turner speaks at the 2023 PIT-UN Convening

Day 1 | Thursday, October 12, 2023

KEYNOTE

Welcome from PIT-UN Director Andreen Soley 

PIT-UN Director Andreen Soley opens the convening, reflecting on five years of building the field of public interest technology and inviting members to connect, collaborate, and reaffirm their shared commitment to the work.

Good morning, and welcome to the 2023 PIT-UN Convening. We made it, we’re finally in the same room together! We have been building this network over Zoom for many years. We ventured out last year at the City University of New York. And we’re excited that Boston decided to host this year’s convening and bring us such good weather and such wonderful views. So thank you all.

I know, because I’ve spoken to some of you, that many of you have heavy hearts because you are thinking of conflicts that are happening outside of this room. So I want to also encourage you to take time to check in on each other and have the deep conversations that need to happen in a community and to take care of yourselves and each other.

I also want to begin by saying that we have been very thoughtful in putting together the panels that you have seen. And part of that is a recognition that I’ve often heard people say, “What is this thing, PIT-UN? What is this thing?” And then the first thing we jump to is the pain points: this is going to be a challenge to do. So the next two days are opportunities for you to reflect on the work of our grantees. They’re all in your panels. And they have been an important part of the work that we have been doing together for the past five years in building this field. They represent just a small sample of the things that have inspired us to continue to grow and shape this field. So I hope that you take it as inspiration and not a complete picture. Because, of course, it’s a curated experience, right?

The other thing I want you all to leave with, if you leave with anything at all, is that equitable innovation is powered by and for people. When I say “the public interest,” I am curious about which public and how we continue to refine and define that really clearly.

Our goal for the convening, however, is to continue to build the connective tissues of our diverse and ambitious community. I hope that you will take the time to connect with old and new friends, share some of your big ideas and yes, your small ones, too, and find potential collaborators and then celebrate and reaffirm our shared commitment to this work.

The 2023 convening marks a significant milestone in our work at PIT-UN. When I came to New America in 2018, public interest technology was a nascent idea. At the time, our goal was to explore and ground technological development and deployment with an eye toward people and their needs. More and more people were asking questions like, “Where’s all this technology taking us?” “Why does it seem like technology is working against the things we value and care about?” “How can we do things differently?” And importantly, a few people whispered, “Are we even comfortable marrying technology with questions of values?”

An emerging group of technologists, philanthropists, policymakers, researchers, and activists — many in this room — came together to develop a shared vision of a field that would draw on the insights of many disciplines, among them computer science, engineering, public policy, the social sciences, the humanities, and law. Our goal was to make the public interest a core concern of technology, design, development, and governance, not just an afterthought.

We started by organizing a group of universities to define what the space could be. We wanted to imagine what the coursework would look like, to imagine the experiences that students should have, and to identify core competencies for public interest technologists. We started with 21 universities, and we are now at 63 universities.

Over the next two days, you will discover that each university is pursuing public interest technology in unique ways. We have some that are focused on data science, clearly, and they are leveraging data in ways that are more equitable and representative of our diverse communities. We have others that are building cybersecurity clinics that give students experiential learning opportunities with local communities and nonprofits. And I just want to give a shoutout to Stillman, because you’ve just received the funding to continue your work, one of our first HBCUs to do so. And then others are building technical tools to increase access to court proceedings for marginalized defendants. Those are just a few examples of the work that’s happening in our network right now.

Over the past five years, we have invested over $15 million in 145 public interest technology projects through our annual Network Challenge. This fall, nine of our member universities are hosting PIT-specific career fairs. And our member-led working groups, which some of you saw happening this morning, have forged connections among universities and have submitted recommendations grounded in the public interest technology framework to the National Science Foundation.

Looking ahead, we’re excited to explore a regional model and how you can work together locally to advance public interest technology. We’re also going to debut an OER, an open educational resource, in 2024, with resources from some of the grantees you are going to meet today and others you will continue to learn about as the network grows.

What connects all of these efforts is a commitment to staying with the difficult questions that technology raises: Who stands to benefit from a given technology? And who might that technology harm? How can we better shape technology? And how should our technologies be governed? Those are the grounding questions that come up within our community all the time.

I think it’s often assumed in technology circles (and, I might add, in other circles, too) that innovation means sacrificing someone or some community for the greater good. Public interest technology demands that we interrogate those assumptions and ask again and again: How are we defining innovation? And what is the greater good in the first place? Can we clearly articulate in whose interests we are designing and shaping our technology tools? Public interest technology demands that we resist an easy tech-solution frame. The idea that tech on its own will solve society’s problems persists in many circles. As an alternative, PIT offers us ways to introduce many different perspectives on the problems we’re trying to solve, whatever the issue may be: access to housing, clean water, education, voting rights, health care.

PIT asks us to approach these complex problems with some humility and some recognition that we cannot solve these problems alone. Each of us needs the other to try to solve these entrenched problems.

Recent global events, from the turmoil in Ukraine, to Israel, to January 6, underscore the double-edged sword that is technology married with violence. On one hand, social media grants individuals the unparalleled ability to share their personal truths, build community, rally support, and offer invaluable documentary evidence. It can serve as a beacon for truth and openness, and it can even facilitate healthy debate. On the other hand, we’re increasingly witnessing the darker side of social media, where apps are weaponized to marginalize, spread falsehoods, deepen societal chasms, and incite acts of violence.

These issues intersect with ethics, trust, philosophy, civic engagement, human rights, and institutional stability. These are the various pillars that public interest technology seeks to buttress. The burgeoning interest among students at PIT-UN–affiliated universities to delve into these critical issues is really a call to action for us. We must pave clear pathways for these conversations and ensure that every participant feels an integral part of our broader field, which is dedicated to fortifying democracy, bolstering institutions, enhancing accessibility, and championing justice.

In essence, our collective endeavor should be centered around strengthening the communities that we serve. That’s our aim and our goal.

Thanks to the leadership of BU and Howard University, we have a small but mighty student track this week with papers, posters, and conversation circles on PIT. I invite you to visit and join the activities on the second floor today. This is reiterating what was said, but I am really encouraged and excited by the student work, which runs from 2:30 to 5 p.m. on the second floor, as your schedule allows.

One thing to note is that BU and Howard hosted a wildly successful hackathon last year, and it is coming back again in this very space in February 2024. So, looking forward to inviting your students from across the network.

Finally, I want to give a nod to some of the keynotes that are coming up today and why they are so inspiring to us. Deirdre Mulligan was one of the founding members of PIT-UN and co-authored the founding definition of public interest technology. She’s now serving in the White House Office of Science and Technology Policy and embodies the kind of collaboration we need between academia and government. So, thank you, Deirdre.

Rumman Chowdhury, who has served on our evaluation committee a few times, is a pioneer in the field of applied algorithmic ethics, creating cutting-edge sociotechnical solutions for ethical, explainable, and transparent AI.

And finally, last but not least, and a great segue into our morning keynote, is Jasmine McNealy, who you’ll hear from. She is a lawyer, a critical public interest technologist, and a social scientist who works to influence law and policy surrounding technological ecosystems, privacy, surveillance, and data governance.

We know that our work does not take place in an ivory tower. We are very much connected to real-world problems. We’re living through challenging times in which many public institutions and public goods are under attack and being eroded. Higher education is one of them. This is one of the reasons it was so important for us to start the morning with Jasmine. We invited Jasmine to open our convening and set the tone for two days of conversation because she offers a powerful interdisciplinary framework for understanding how technology shapes society and the role that universities can play in advancing public interest technology and democracy itself.

Jasmine is a senior fellow in tech policy with the Mozilla Foundation, an associate professor at the University of Florida, and a faculty associate at the Berkman Klein Center for Internet and Society. In 2022, she served as a technology advisor to the federal government. Jasmine personifies the kind of rigorous, interdisciplinary, and accessible public scholarship that demonstrates the best of what higher education has to offer our students, our communities, and our institutions.

It’s with great pleasure that I ask you to welcome our opening keynote, Jasmine McNealy.

KEYNOTE

Welcome from Associate Provost Azer Bestavros

Progress in public interest technology cannot be achieved in isolation. BU Associate Provost for Computing & Data Sciences Azer Bestavros introduces the theme for the 2023 PIT-UN Convening, “Partnerships for Impact.”

Distinguished guests, it’s really an honor to stand before you today to discuss the crucial role of public interest technology. I believe that this event marks a significant milestone in our collective, ongoing commitment to fostering innovation for the greater good. As President Freeman mentioned, the theme of this year’s convening is Partnership for Impact. We chose the theme because it is important to recognize and underscore that progress cannot be achieved by individual institutions or organizations alone. It has to be achieved in partnership, and in many types of partnerships. I’ll get back to this point.

Collaboration among stakeholders is key to addressing our society’s complex challenges, and this year’s convening is all about that. Our agenda is packed with inspiring keynotes, thought-provoking panel discussions, interactive workshops, and an enriching and inspiring student track with papers, posters, and conversation circles on core public interest technology topics. I really want to invite you to visit and join those activities on the second floor. This is where the students will be this afternoon, if you’re available.

The program was also built to ensure enough time was available for peer collaboration through workshops and member-led sessions. These include the morning sessions that started earlier today, and we have planned other sessions this afternoon. The significance of these sessions lies not only in the knowledge that will be shared but also in the connections and partnerships that will hopefully be enabled.

In preparing these remarks, I went back to what we wrote to New America in May of last year, when we concluded a study at BU on the national landscape of what our universities are already doing around PIT. As I reflect on that report, I want to share with you some observations that I hope you’ll keep in mind as you participate in the event today. First, the challenges facing our society are complex, and the only way to effectively address them is through interdisciplinary research, especially research at the nexus of technology, policy, law, ethics, and the social sciences in general. As members and leaders of institutions that put the public interest above all, we need to continue to foster and recognize the value of socio-technical, interdisciplinary collaborations.

Second, and this is important: The speed with which technology is developing is increasing the gap between the haves and have-nots, between its deployment in the public interest and its deployment by industry for profit. As members and leaders of institutions that put the public interest above all, we need to double down on education and training programs that equip students with the skills to work at this intersection of technology and the public interest and to introduce them to career opportunities in the public sector.

Third, for any of our work at the intersection of technology and the public interest to be effective, not to mention meaningful, it should be reflective of the makeup of our society and should be built on participatory research and development. So as members and leaders of institutions that put the public interest above all, we need to broaden participation in STEM programs to prioritize inclusivity, diversity, and equity in our programs and initiatives.

Fourth, recent polls show that only 36% of Americans have confidence in higher education, down by 20 percentage points in just eight years. We have a duty to reverse that trend. As I often say about BU’s creation of the Faculty of Computing & Data Sciences, this is not going to happen unless we mold our ivory towers into public squares, in which the wants and needs of our society are heard and in which we can share with society what technology can do for the greater good. So as members and leaders of institutions that put the public interest above all, we need to reach out to our communities at the local, state, and national levels to bring the public square to our campuses.

Lastly, while there are established models for supporting longer-term research — NSF and NIH — and while there are many Wall Street backers of short-term technology development, we have yet to figure out sustainable models for supporting the highly applied research and development of technology in the public interest. So as members and leaders of institutions that put the public interest above all, we need to figure this out through long-term partnerships with organizations that believe in the cause, and by identifying strategies to share resources and scale successful projects and initiatives across our institutions.

Friends, we know that the path ahead is not easy. Our member institutions are facing more and more challenges, from the sustainability of recent initiatives to the political realities that we have to navigate. But I deeply believe that in partnership with each other, with industry and foundations, and with government, we can tackle these challenges. So I encourage you to participate in today’s sessions. But before we all go to work, and on your behalf, I want to thank all those who worked hard for almost a year to make this convening happen.

First, I want to acknowledge New America for giving us the opportunity to host this convening and for working shoulder to shoulder with us. Thank you, Andreen Soley and team New America. Second, I want to thank my colleagues at many of our academic institutions in the great state of Massachusetts for working with us at BU on the conception of the theme and the program for the convening. In particular, I want to give a shoutout to my colleague Fran Berman at UMass Amherst. Fran, thank you. Speaking of the need to forge partnerships, this convening is a case in point of what regional partnerships can do. Last but not least, I want to thank and acknowledge my colleagues at BU who worked tirelessly to take the concept and make it a reality. Team BU, please stand up to be recognized. All of you, please stand up. In particular, I want to thank Ziba Cranmer, Maureen McCarthy, and Carolina Rossini. Thank you, Carolina. We are in your debt.

KEYNOTE

Jasmine McNealy

What is the role of higher education in a healthy democracy? 

Opening keynote from Jasmine McNealy, Associate Professor at the University of Florida and Mozilla Senior Fellow.

Thank you, Andreen. That was very kind. Before I begin, I’d like us to hold space for the times we are in, acknowledging those experiencing conflicts in Gaza and Israel, in Ukraine and Russia, and in various armed conflicts around the world. Can we take a moment of silence for those affected by these conflicts? Thank you.

I want to express my gratitude to New America’s Public Interest Technology University Network and the BU Faculty of Computing and Data Science for inviting me to speak. I must warn you that this is less of a traditional keynote and more of a provocation. We are in a space where we need to engage in some provocations about what we want to be, what we want to see, and how we want things to progress in the realm of technology and its interaction with society. It’s particularly important in the United States, but it extends to the global context, considering how technology and people interact and how we govern ourselves.

I have been asked to discuss the role of universities in democracy, which is an ongoing debate, especially in the United States as we enter an election season at various levels of government. The question arises: what value, worth, and benefit do universities bring to society? It’s a valid question, as universities don’t always effectively communicate their roles, and not everyone attends college. We need to acknowledge that there are people who don’t participate in higher education and may feel excluded from conversations about public interest technology.

Data from the Education Data Initiative shows that in the spring of 2022, total college enrollment in the United States was approximately 16.2 million students. While this number is substantial, it leaves a significant portion of the population that didn’t complete their college studies or didn’t attend college at all. To address this issue, we must recognize the importance of inclusivity in conversations about public interest technology and what truly serves the public interest.

We also need to remember that those not attending college are taxpayers. Thus, universities have a responsibility to demonstrate the value of their programs and services, as well as to provide accurate information about their activities. This is particularly important because we need to persuade taxpayers that their investments in higher education are worthwhile.

Furthermore, it’s essential to ensure that what the public sees and hears about university activities is accurate. In today’s world of social media, information can be easily distorted or manipulated, so universities should play a role in offering transparency and promoting trust in various institutions.

The debate about the value of university programs has been ongoing, with some humanities and social sciences often criticized. Some people question the utility of humanities degrees and the value of certain courses. It is crucial for faculty, staff, students, and administrators to defend the worth of these programs and demonstrate their relevance to society. This defense should not focus solely on how university activities serve individuals but also how they contribute to democratic governance.

Land-grant universities in the United States have a history of using their resources to serve the public interest. These universities receive land and funding from the federal government and, in return, are expected to benefit their communities. They are also known for their cooperative and translational nature. They work closely with various stakeholders, share knowledge, and provide education to improve agriculture and other aspects of society.

While the history of land-grant universities is not without its criticisms, such as deceptive treaties with indigenous communities and the separate universities created for black students, the idea of being beneficial to the public is essential. Land-grant universities translate research into practical applications, addressing the public interest and contributing to societal progress.

One way to apply the principles of land-grant institutions to public interest technology is by establishing an extension model. In this model, educators work closely with communities to translate research findings and knowledge into actionable steps that benefit the public. This model involves both translational and cooperative elements, as extension workers collaborate with one another to address the public interest.

In summary, the role of universities in democracy is multifaceted. Universities should not only focus on individual students’ educational needs but also play a crucial role in improving democratic governance. We can learn from the principles of land-grant institutions and consider incorporating an extension model into the realm of public interest technology. This would involve translating research into practical solutions, working closely with communities, and promoting transparency and trust to serve the public interest.

PANEL

Partnering within and between Universities

Public interest technology is interdisciplinary by nature and requires us to build partnerships across departments and campuses.

Learn how PIT has spurred partnerships within and across institutions that unlock new possibilities in the design, deployment, and governance of technology.

 

PANEL

Partnering with Civil Society

PIT requires that we pay close attention to the impact of technology on human lives and communities, particularly those most vulnerable and marginalized.

Learn how PIT practitioners have established and grown collaborative partnerships with nonprofits and civil society organizations to foster individual and community rights, advance justice, and build resilience. 

PIT Lightning Talks

Public interest technology thrives on the cross-pollination of ideas and expertise from many domains. 

Learn how practitioners in K-12 education, game design, cybersecurity, and higher education apply PIT frameworks to operationalize values of equity, accessibility, and community in technology.

PANEL

Partnering with the Private Sector

Public interest technology invites us to develop business models that put people first. 

From new design tactics such as ethics and privacy by design, to more revolutionary open innovation and open business strategies, companies have a unique opportunity to support community well-being while also creating profitable and sustainable firms.

Day 1 Image Gallery

Day 2 | Friday, October 13, 2023

KEYNOTE

Danielle Allen

What is the role of technology in a healthy democracy?

Harvard Professor and Founder of the Allen Lab for Democracy Renovation Danielle Allen’s keynote address from the 2023 PIT-UN Convening.

Good morning, everyone. My charge this morning is pretty simple. You all know how important your work is. I’m just going to tell you again how important it is, and why public interest technology is of profound significance to democracy and to human flourishing everywhere.

Let me start by introducing myself. I am a Harvard professor and also a democracy advocate inside and outside the university. I focus on work that I call “democracy renovation.” When folks ask me what I work on, I always give the same answer: It’s just democracy, past, present, and future thereof, with no question mark at the end. I come by that focus honestly, as a matter of basic family inheritance. On my dad’s side, my granddad helped found one of the first NAACP chapters in northern Florida in the 1940s. I don’t know how much you know about northern Florida, but it’s basically the same thing as southern Georgia. In the ’40s, lynchings were on the rise, and my granddad was taking his life into his hands with his NAACP activities. It was very dangerous work. And on my mom’s side, my great-grandparents helped fight for women’s right to vote. So my great-granddad marched with suffragettes on Boston Common in 1917. And my great-grandmother ended up as president of the League of Women Voters in Michigan in the ’30s.

They were all people who were told that the things they wanted were impossible: that social equality for African Americans in the South was impossible; that women having the right to vote was impossible. Their answer to this was, of course, not only are these things possible, but these things are necessary. So the question is not “Is it possible?” The only question is how to achieve it. I was fortunate to grow up in a network of very civically engaged people who had that attitude toward the world around them.

My family also understood that empowerment is the bedrock for human flourishing, and with empowerment as the bedrock, so too is democracy. So I’ll admit, as a kid, I took the value of democracy for granted. It wasn’t really until I was watching my own generation come up in the world that the question of democracy’s value got to be a lot more complicated for me.

In my parents’ generation, everybody pretty much moved up. That same granddad was a fisherman, and his kids became small business owners; on the other side of the family, people went from factory workers to accountants and professors, and so on. But my generation (I think I’m older than most of you in the room at this point) has lived through something quite different. It’s what I call “the Great Pulling Apart.” So here I stand in this incredibly beautiful space, with an amazing view of the gorgeous city of Boston, a tenured faculty member at a great university, which is a role of incredible privilege. Forget Elon Musk, or Jeff Bezos, or whatever people think is the sort of pinnacle: being a tenured faculty member is the most privileged role there is. I feel that every day, and I have a brother who’s a corporate executive. At the same time, I have cousins who aren’t with us any longer, for all the reasons that are among the hardest things we struggle with in our society: substance use disorder, incarceration, homicide. I lost my youngest cousin Michael in 2009. And that was the moment I realized what my family was living through: some of us were here, and others were trapped in really dark and difficult circumstances. Over the course of my lifetime, that’s exactly what our entire country has lived through.

So my 50-plus years on this planet perfectly coincide with the graphs that show you the rise of income inequality, the rise of wealth inequality, the rise of incarceration, the rise of polarization. And I began to ask myself: hang on, this democracy concept is not supposed to be abstractly valuable. I mean, yes, we love the ideals of freedom and equality. Yes, we can name empowerment as something important in human life. But we embrace those ideals, those goals, because the idea is that doing so gives us a society that makes it possible for each generation to move forward, one after the next, each generation doing a bit better than the previous one, as a whole cohort.

So I began to ask myself how we can change the dynamics in our society so that this democracy is delivering on that promise. And as I started to do that work, which, for me, became the work of democracy renovation, I realized that when I was an undergraduate, I was graduating into a world where this great pulling apart was just starting to happen. The first time economists and politicians and policymakers really named the challenge of rising income inequality was 1992, and they were debating: is it or isn’t it happening? Does it matter or doesn’t it? I mean, it should matter; I don’t think the debate should have been as complicated as it was, in all honesty, but that was what was happening back in 1992. So here we are, it’s 2023, and our students are graduating into a world that we all know is the Age of AI, the age of incredible impacts from technology. And the most important question really is whether or not that age will make the problems of the Great Pulling Apart worse, or better.

And technology is truly the most influential force right now for its impact on that question. Is technology going to make the Great Pulling Apart worse, with more inequality, more polarization, more disenfranchisement, more alienation? Or is it going to be part of the solution and help us come out of those dynamics? I take that question to be the work of public interest technology.

And therefore the work that you’re all doing on campuses, with courses, with research centers, with community engagement programs, is fundamental to governing innovation and governing technology, so that we can change those basic dynamics that are delivering so much injustice in our society, reverse that dynamic, and deliver justice instead. There’s a moment of great opportunity right now. When I talk about the need for the public interest technology perspective to govern technology, that is about the students you are developing and sending out into the world. There are roles for them in the public sector; there is huge opportunity right now, with the CHIPS Act and other governmental funding for meaningful partnerships and programs that can be sustained on campus through access to federal and state funding, and it’s really worth pursuing those.

It is also about making sure that every kid who goes into the private sector is ready to say, inside those technology companies, that technology is for human flourishing, not profit. All right, I mean, yes, you need sustainable revenue to support any enterprise, any human enterprise. But at the end of the day, technology, like journalism, even like business, is best understood as being for human flourishing and nothing else. We’re watching a transition from shareholder capitalism to stakeholder capitalism. It is about that insight that the project is human flourishing. And we need our public interest technology networks on university campuses to ensure that that lesson, that technology is for human flourishing, is fully embedded in the private sector, and that the sector is well equipped. So I’ve taken too long; I was just supposed to be the warmup act for your next speaker. So let me go ahead and invite Rumman to the podium.

KEYNOTE

Rumman Chowdhury

What ideologies will guide the development of artificial intelligence – and how can we build public feedback into AI tools? 

AI Ethicist Rumman Chowdhury’s keynote address from the 2023 PIT-UN Convening.

Thank you so much. And thank you for the lead-in to what I’m about to talk about. I’m going to go slightly off script, and I apologize to the organizers. But you know, I was going to talk today, and I will talk today, about public accountability and the work I’m doing on red teaming. But I actually wanted to start off with what I was woken up to this morning. I was woken up to this article that was published in Politico about the extent to which the existential risk movement and effective altruism are starting to bleed into D.C. And I don’t know how much folks here are paying attention to what’s going on politically. But Professor Allen was absolutely correct, right there. There is a war happening right now. It’s actually a war for the soul of how technology is being used. And if we think about artificial intelligence as the next iteration of the “Great Pulling Apart,” it is a battle we’re currently losing.

Because for all the hundreds of millions of dollars that the UK government spent in the past few years on responsible AI, effective altruism, in a matter of a few years, has come in and captured the entire government. So I, and probably some folks in this room, will be at the UK AI summit in a couple of weeks. And that summit is going to have a very different tone and tenor than what we are used to seeing in the UK. Frankly, when I built my practice at Accenture in 2017, even though I lived in San Francisco, I set my hub in London because it was the center of applied algorithmic ethics. The first tool that we built at Accenture, the fairness tool — which, by the way, has since become an entire industry of bias detection and mitigation technologies — was built with the Alan Turing Institute. And now folks like myself and the folks at the Turing Institute are scrambling to remain relevant.

This article that I woke up to this morning was about how there are funded internships for effective altruism all over D.C. And I can tell you, I’ve been aware of this for months. We know that folks in this movement are very well funded, and they are well organized. But so are we. So how are we shaping ourselves to have an affirmative vision? What do we stand for? And what are we about? What is our goal? What are we telling children when we work with them, when we do projects with them? What are we telling them their goal and their mission is? And I think, frankly, that sometimes we are a little bit confused. We, the field of responsible AI, are certainly very good at pointing out problems. I hope that my career has been built on making solutions as well. And while harm mitigation and risk mitigation are good things to do — and we certainly should be thinking about them — we also need to think about what we are building toward.

Professor Allen had an amazing example of her grandparents, who actually worked toward an affirmative vision. They wanted something, and you can mobilize people when you want a thing, and you make them want that thing as much as you want it, too. So my question to you is to think through what it is that we want. How can we put in one line the things we will tell an undergraduate, a high school student, a Ph.D. student, a member of Congress: This is what I want. And this is how you can help me get it. Because we really have to make our language that simple.

I want to talk a little bit about what I’m working on lately, which is building out public accountability. The little thing that I’m trying to solve here is how we get better structured public feedback into AI systems.

So as I mentioned, the UK government had spent hundreds of millions of dollars building out all sorts of institutes focused on responsible AI. But one thing I saw consistently missing — and it’s something I still see missing — is the role of public feedback. Now we have methods for getting public opinion in the U.S. government, and various other governments have ways of getting public feedback, too, right? You do RFIs, and people provide commentary. And I will talk you through the ways in which all of those simply don’t work today. When the National Telecommunications and Information Administration put out their RFI, for instance, they received, I believe, about 2,500 submissions. Now how are they supposed to parse through 2,500 seven-page essays about people’s perspectives and opinions, all of which I’m sure are very well written? I’m sure lots of people in this room contributed as well. I mean, I did, too. But this is not a tenable way to truly get public opinion.

Another one that we see constantly is sort of public commentary posted on some sort of a website. Before I worked in responsible AI, I taught data science at a boot camp. And one of my students, who actually now is at ProPublica, the investigative journalism unit, did a project where he demonstrated that half of the public commentary that was provided on net neutrality was actually developed by bots. Half! People were creating informational bot rings to make it look as if the public thought a certain thing. The fundamental problem, then, is that even with public feedback, it’s not always clear what people want in aggregate. As a political scientist, I can tell you there are no “people,” there is no “public,” right? There is no singular voice. So how do we get the kind of feedback we need to improve AI systems?

And my hope is that we’re actually able to give structured feedback to say, hey, you told us this. And by the way, this is what we did. Now, this is a one-off project that was built on goodwill. My goal with my nonprofit Humane Intelligence is to make this something more sustainable and lasting. And I’m going to give you some numbers from the DEFCON challenge. So as I mentioned, we have eight large language model companies. It’s every company you know, and a few that you probably don’t. We had 20 hours of people coming in and entering this competition. We had 21 different challenges. A couple of them were around hacking, but most of them were about the concept of what I call “embedded harms,” which is ways in which large language models can surface biased or misleading information that could be harmful in society, including misinformation, incorrect refusals, meaning, you know, discriminatory output, based on how the model is deciding to communicate and talk. We also had 2,200 people show up and take on the competition. We had a line that was over an hour long out the door. We were actually not able to bring in everybody who wanted to be part of the competition. And it’s something we’re constantly being asked to continue today. And the legacy does continue. We’re hosting our next red teaming events on Oct. 25.

So what’s next? My hope is that we grow this concept of red teaming, because structured public feedback has done well when we get the right people in the room who have the right lived experience to tell us how language models and artificial intelligence can be improved to enable human flourishing. What’s next is that I would love to work with folks in this room to figure out how we develop a coordinated affirmative vision for what we’re going to do. Because as I mentioned, there’s a war being fought, and right now we are losing. Thank you.

PANEL

Institutionalizing PIT

The long-term success of public interest technology depends upon building and sustaining institutional support to undergird and operationalize the values of community, collaboration, equity, and justice that are the core of PIT.

In this conversation, leaders across the field discuss how they have developed institutional support for PIT, how they’ve navigated political questions and relationships, and what steps members can take to build relationships and activate resources on their campuses.

KEYNOTE

Deirdre Mulligan

How can government promote technological development that protects human rights, democratic values, security and safety?

Deirdre Mulligan, a founding member of PIT-UN now serving in the White House Office of Science & Technology Policy, lays out the Biden-Harris administration’s vision for seizing the opportunities and minimizing the risks of AI.

Read her full remarks here at whitehouse.gov.

Photo by Mike Spencer

PANEL

Partnering with Government

Government is a key stakeholder not only in the governance of technology but also in its development and deployment, from publicly-funded research labs and grants to digital public services and infrastructure.

How can universities collaborate with elected officials, local and federal government agencies, and policymakers to ensure that our ever more technological society supports innovation, access to justice, quality employment opportunities, environmental protections, and the flourishing of people and communities?

PANEL

Priorities for PIT Funding

Philanthropy has played a key role in organizing, formalizing, publicizing, and funding the field of public interest technology. Five years on from PIT-UN’s inception, we have a number of successes to celebrate, from proof-of-concept projects to interdisciplinary research centers to PIT degree programs and more.

KEYNOTE

Joel Christian Gill

What is the role of storytelling in fostering change?

Cartoonist and Director of the Visual Narrative MFA at Boston University Joel Christian Gill’s closing keynote at the 2023 PIT-UN Convening.

Transcript forthcoming

Day 2 Photo Gallery
