Arvind Narayanan takes helm at joint technology-policy center

June 29, 2023

By Karen Rouse, Center for Information Technology Policy

Arvind Narayanan. Photo by David Kelly Crow

About a decade ago — back when he was a postdoctoral fellow at Stanford University and Silicon Valley was rebounding from the dot-com bust — computer scientist Arvind Narayanan co-founded a startup, pursuing what he thought was his career path.

“It was just very intellectually exciting,” said Narayanan, a Princeton University professor of computer science and expert on algorithmic fairness, artificial intelligence and privacy. “To a large extent, I believed back then that you can save the world with tech.”

But after four years embedded in Silicon Valley’s tech culture, Narayanan said, “I also started to recognize how some aspects of the culture then lead to some of the engineering practices that lead to some of the harmful impacts of technology.”

What he observed was a flagrant disregard for ethics as startups, facing pressure from venture capitalists to expand rapidly, adopted a scale-up-at-all-costs mentality with few to no consequences for ethical lapses or irresponsible practices.

“It’s kind of this move-fast-break-things culture,” he recalled. “I decided that that was not what I was going to do with the rest of my life.”

Narayanan instead chose to use research to protect consumers and society from the kind of excesses he saw. In 2012, he joined Princeton University’s Center for Information Technology Policy (CITP), whose work aligned with his core values. At CITP, faculty, fellows, researchers and scholars collaborate on research that exposes the ways tech can deceive, manipulate, discriminate against and violate the privacy of users, among the center’s other functions.

“I think that’s how I can maximize how my research can be helpful to the world,” he said. “That’s what I find meaningful.”

LEADERSHIP IN A CHALLENGING TIME FOR TECH

On July 1, Narayanan becomes the fourth director of CITP, which he said is well-positioned to be a resource and honest broker for policymakers and others seeking to understand technologies like artificial intelligence and chatbots.

“We were all about the importance of tech policy for the last 15 years, and we were ahead of the curve,” Narayanan said. “Now, developments have caught up to the insights that we had and so it’s our opportunity to seize the day and to have leadership in this area. I think that is super, super exciting.”

Narayanan’s appointment was announced last year, just before he went on sabbatical as a visiting researcher at the Knight First Amendment Institute at Columbia University. Prateek Mittal, a professor of electrical and computer engineering, served as CITP’s interim director from July 2022 through June 2023.

CITP is a joint initiative of the School of Public and International Affairs (SPIA) and the School of Engineering and Applied Science (SEAS). “I look forward to working with Arvind to enhance CITP’s research and teaching impacts,” said SPIA Dean Amaney Jamal. “His expertise, experience, and leadership are perfectly suited for this point in time when technology policy is so important.”

SEAS Dean Andrea Goldsmith said Narayanan’s work “has been highly influential in areas such as artificial intelligence, blockchain technologies and social media, cutting through hype and technical detail to reveal how to best harness these technologies for good, and the public policies that can help facilitate that goal.”

“The impact of rapidly changing technology on society has never been more apparent,” Goldsmith said. “I look forward to working with Arvind in advancing CITP’s teaching and research mission to enhance the role of technology in benefiting humanity. I am also very grateful to Prateek Mittal for his visionary leadership as interim director.”

Narayanan takes the helm at a time when citizens are trying to reconcile the benefits of promising technologies with the risks and harms — like misinformation online, the negative impacts of social media content on teenagers, dark patterns that trick consumers into spending more while shopping online or voting for a particular candidate, and algorithms that perpetuate discrimination.

“This is a critical period for understanding and improving the relationship between technology and society,” said Princeton sociology professor Matthew Salganik, who was CITP director from July 1, 2019 to June 30, 2022 and knows Narayanan well.

“We’ve done research together, taught a class together, and served on committees together,” Salganik said. “I’m always impressed with his ability to go right to the heart of a complex problem. He has a great sense of what is important, he sees things differently from other people, and he fights for what he believes.”

Narayanan graduated from the Indian Institute of Technology in Madras, India, with dual bachelor’s and master’s degrees in 2004, before earning a Ph.D. in computer science at the University of Texas at Austin in 2009 — the year he joined Stanford University as a postdoctoral fellow. He joined Princeton CITP as an assistant professor of computer science in 2012.

INTERDISCIPLINARY COLLABORATIONS, RECORD OF RESEARCH CORE TO CITP

Narayanan’s conscious decision to do work that improves technology’s role in society aligned with CITP’s mission. The center was founded by renowned computer scientist Ed Felten in 2005 as an interdisciplinary hub where researchers, academics and scholars would, in a collaborative setting, research technology for the good of society.

The center’s priorities are research, social engagement and training. Among its current programs are the CITP Tech Policy Clinic, the CITP Digital Witness Lab and the Siegel Public Interest Technology Summer Fellowship.

“Back in the late 2000s, there were hardly any centers like this one,” Narayanan said. “I very much respected Ed and, at the time when I was in grad school, he was one of very, very few people who was doing this kind of work, so that was inspiring; it was the kind of work that I wanted to do.”

He was also drawn to the interdisciplinary and collaborative model. “It felt like a good mix of academic strength and people who I admired and the opportunity to do influential policy work.”

That includes work by CITP faculty member Aleksandra “Sasha” Korolova, who was studying discrimination on app platforms before it became a widely recognized issue, Narayanan said. He also cited work by Jonathan Mayer, who, since graduate school, “realized that third-party tracking on the web is a serious privacy problem,” and built a tool called FourthParty to help researchers and journalists uncover third-party tracking.

To illustrate the foundational nature of CITP research, Narayanan noted that he was able to collaborate with former CITP graduate student Steven Englehardt and former CITP postdoctoral fellow Günes Acar to build upon FourthParty and create the web privacy measurement tool OpenWPM. OpenWPM, in turn, was later used by data journalist and engineer Surya Mattu to build Blacklight, a research tool designed to help journalists expose when websites are tracking users. Mattu later joined CITP and now leads its Digital Witness Lab.

“Surya was able to do something where it’s not just a research tool that is obscure and you need to be a tech expert to use it,” Narayanan said. “It’s something anyone can use, and he combines it with journalism so that he can keep companies’ feet to the fire.”

CALLING OUT BIG TECH

Narayanan first made national headlines in 2006, when he and his advisor, Vitaly Shmatikov, were able to extract the identities of some Netflix subscribers from a dataset that the company had released as part of a contest, with the claim that the dataset was anonymized.

In an interview with Cornell Tech, Shmatikov described their skepticism: “My colleague at the University of Texas, Arvind Narayanan, and I were already working on various privacy-related things and then one day he walked into my office and said, ‘Did you hear Netflix released this huge dataset for their data mining competition and they claim it’s all anonymous? There is no way to reconstruct people’s identities.’ And that just sounded bogus.”

They proved it was. “We just went and wrote a simple program that scraped information from a separate Internet movie database website and tried to match it against what was in the Netflix Prize dataset. And it worked,” Shmatikov told Cornell Tech. The two later co-wrote “Myths and Fallacies of ‘Personally Identifiable Information.’”

It was Narayanan’s first high-profile moment of pulling back the curtain on Big Tech. He has since earned a reputation among students, fellow researchers, policymakers, journalists — and his 114,000 Twitter followers — as a credible voice of reason; he isn’t afraid to separate the hype and noise from the truth about what digital technologies can actually do. In an extensive profile this year, Quanta magazine called Narayanan “The Researcher Who Would Teach Machines to Be Fair.”

He has won numerous awards and grants, including from the National Science Foundation. During the last year, while at the Knight First Amendment Institute, he launched a symposium, Optimizing for What? Algorithmic Amplification and Society, that sought to demystify recommendation algorithms and how they impact society.

Prior to that, he led the Princeton Web Transparency and Accountability Project, an automated study to monitor and track the information that a million websites were collecting on users, among other projects. More recently, he has become an international authority on AI and ChatGPT. He and CITP graduate student Sayash Kapoor are co-writing a book, AI Snake Oil, that is under contract with Princeton University Press.

They are chronicling the book’s development in a blog by the same name, in which they call out bad practices. But, to be sure, the purveyors of good tech have little to worry about.

“If something is a problem that needs to be addressed, we’ll say that,” Narayanan said. “But on the other hand, if some of the risks are being blown out of proportion, and we think there’s not much to worry about here, then we’ll say that.”

“We don’t have to have a horse in the race.”

This story originally appeared on the CITP website.