SBU Professor Stephanie Dinkins named TIME 100 AI Innovator

Stephanie Dinkins, a Stony Brook Arts Professor and transmedia artist who works at the intersection of emerging technologies and social collaboration, was named an “Innovator” on this year’s TIME 100 AI list. Her name appeared alongside renowned industry leaders and stalwarts, including Sam Altman, CEO of OpenAI; Demis Hassabis, CEO and Co-Founder of Google DeepMind; Elon Musk, Founder of xAI; Geoffrey Hinton, Emeritus Professor at the University of Toronto; and former SBU Computer Science faculty member Yejin Choi.

In an article explaining the selection process, TIME Editor-in-Chief Sam Jacobs said, “This group of 100 individuals is in many ways a map of the relationships and power centers driving the development of AI. They are rivals and regulators, scientists and artists, advocates and executives—the competing and cooperating humans whose insights, desires, and flaws will shape the direction of an increasingly influential technology.”

SBU Prof. Dinkins named TIME 100 AI Innovator
Illustration by TIME

Recognizing the honor, Steven Skiena, Director of the AI Institute at Stony Brook, added, "Stephanie is a great example of how Stony Brook is leading in AI far beyond our traditional strengths in science and engineering."

Seeing her name on the list came as a surprise to Dinkins, who was puzzled when she received the news. "I wondered if there was someone more qualified than me to be on this list, but I also recognize that I've been doing the work for a long time, so I feel like I can accept it in some way. But it's just very strange."

Dinkins was recognized for her work on Not the Only One, an ongoing project in which she trained an AI on the oral histories of three generations of Black women to give it cultural roots, a deep history, and a perspective that existing systems do not offer. However, for Dinkins, the movement has only just begun.

Drawn to AI as a medium after a jarring interaction with a Black female robot, Dinkins has spent years creating platforms for dialogue with and about AI. She works with communities of color to co-create more inclusive, fair, and ethical AI ecosystems, driven by the conviction that AI can and should offer care and support.

“When I say that, people look at me like I’m crazy. But why can’t we design systems that computationally care? If we can try for so many different outcomes, care and support could be one that we build into everything.”

When she started working on Not the Only One, Dinkins was not satisfied with the data she had to build on. The artist collaborated with coders and engineers to look for more neutral-sounding data about their community, while also exploring the idea of working only with the existing data. The project had a deep learning curve, and carried lessons about what it meant to have AI intersect with human lives and how to nurture it into an entity that might be conducive to a better society. Earlier this year, the project helped Dinkins land a $100,000 award from the Guggenheim Museum, given to artists pushing the boundaries of technology-based art.

Inviting communities to interact with Not the Only One

Dinkins' other projects continue creating spaces for communities to engage with AI in person. These include Conversations with Bina48, an experience exploring the possibility of long-term friendship between a person and an autonomous robot, and AI.Assembly, an art and tech incubator. While this may sound dangerous to some, Dinkins believes it is more detrimental not to engage with AI at all, leaving the technology entirely to those who build it.

"Consider slavery as an example. I get blocked a lot when I ask questions around it," she said. "But it's a historical fact, and to not be able to generate [results] from our history scares me." For Dinkins, training AI should not mean piling on guardrails, even though the technology presents real dangers. It is better to know what we are up against than to shut the conversation down. "If it means that, every once in a while, I find something that is hurtful or offensive, I should be able to logically think about it, and ask for it to be corrected." It is a role we have to play as a society: nurturing AI to be more mature and understanding.

Dinkins is working with several other artists in this space — Beth Coleman, who talks about rewilding algorithms; Louis Chude-Sokei, who focuses on electronic personhood and human futurity; Joy Buolamwini, who uses art to illuminate the social implications of AI; and Kite, an artist working with Indigenous communities.

“We are making an impact, but now we need to negotiate the effects of these impacts. There’s a conversation to be had — on a global level.”

More artists are being called upon to join the movement, but the intersection of AI and art demands working deeply: digging into and picking apart an existing system, or building upon it to create something no one thought could exist.

The journey for Stony Brook’s Prof. Dinkins started with intimate questions about her family. And now, she’s digging deeper by considering things from a global perspective and inviting artists from varied backgrounds to join the conversation, collaborate, and start asking the right questions.

“I’m hoping people can start to hear about it [my work] and incorporate language and things that are broader, with the hope that we can help these systems, so we may help ourselves.”

The recognition of Dinkins' work at the intersection of AI and humanity honors not only her contributions to the Stony Brook community but also her role in safeguarding our future histories.

Ankita Nagpal
Communications Assistant