AI Institute’s Seed Grant Consultants Share Findings at AI³ Showcase


Left to Right: Deboparna Banerjee, Vismay Vora, Jayesh Rathi

Stony Brook, NY, December 31, 2025 — When Stony Brook’s AI Innovation Institute (AI³) launched its AI Seed Grant Program in 2024, the goal was clear: to inspire interdisciplinary collaborations that use AI to help solve complex societal problems and explore new areas of study.

In 2025, the program awarded 13 projects more than $500,000 in funding, encouraging collaborations between AI experts and faculty in areas from climate and medicine to education and humanities. One track of the program offered something particularly unusual, pairing faculty members with graduate AI consultants who would help design and jump-start projects that rely on machine learning and data science.

Three of those consultants — Deboparna Banerjee, Jayesh Rathi, and Vismay Vora — have spent the past year using AI to help tackle problems as diverse as recycling, gene editing, burn wound healing, and parsing 18th-century literature. Their work was highlighted at the AI³ Showcase on November 17, 2025, where recipients of the AI³ Seed Grant shared progress with colleagues from across campus.

Deboparna Banerjee: Making AI Accessible for Biologists

Deboparna, a master’s student in the Department of Computer Science, worked in the life sciences with David McKinnon, a professor in the Department of Neurobiology and Behavior at the College of Arts and Sciences (CAS). His lab works with single-cell “multi-omics” data — large, complex measurements of individual cells. Deboparna also worked with Eric Josephs, Empire Innovation associate professor in the Department of Biomedical Engineering, who studies CRISPR gene-editing technologies.

In McKinnon’s lab, many of the students are primarily trained in biology, but modern single-cell research depends heavily on machine learning. Before Deboparna joined, the group tried to replicate methods from papers, treating AI as a mysterious black box.

“My job was to help them understand the internal mechanisms of AI,” she said. She started running informal crash courses on core machine-learning and deep-learning concepts, and when the lab wanted to follow a new paper, she’d spend an hour walking them through the math and modeling choices. After that, the students could clone code, tweak models, and adapt them to the lab’s own data. “With this team, it was more about teaching and training so they could be independent.”

The CRISPR project with Josephs was more hands-on. Here, Deboparna helped build and adapt general models that examine RNA sequences and estimate how likely they are to pair correctly — a step forward in personalized gene therapy. “The data is still limited, so the project is exploratory, but early machine-learning results are promising,” Deboparna said.

She saw a common pattern across both labs: students and faculty in biology know they need AI, and many even experiment with it, but they struggle without a strong foundation. “Artificial Intelligence is not a black box,” she reminds them. “These models are just math — and if you understand the math, you can learn to use AI effectively.”

Jayesh Rathi: AI for Literature, Climate, and Wound Healing

Jayesh’s consulting portfolio might be the most eclectic of all. His seed-grant projects span European literature, coastal climate change, and — in a newer collaboration with Deboparna — burn wound healing.

With Giuseppe Gazzola, associate professor in the Department of Languages and Cultural Studies, Jayesh built a web tool that lets scholars look through a large corpus of 18th-century Italian, French, and Spanish texts and then request AI-assisted analysis of these texts.

A simple search returns the most relevant passages from across the corpus. If the user wants to analyze certain parts of the content, the system can summarize relevant pages, synthesize findings at the book level, and even pull information across all the books by a certain author. For literary scholars facing shelves of dense historical texts, this is a powerful shortcut.
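The retrieval step described above can be sketched in miniature. This is purely illustrative, not the project’s actual system: it scores toy passages against a query by cosine similarity of word-count vectors and returns the best match, the same basic idea behind ranking relevant passages in a corpus.

```python
from collections import Counter
import math

# Toy retrieval sketch (illustrative only): rank passages against a query
# by cosine similarity of their word-count vectors.
def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

passages = [
    "the sublime in alpine landscapes",
    "letters on commerce and trade",
    "a treatise on the sublime and beautiful",
]
query = Counter("sublime beautiful".split())
scores = [cosine(Counter(p.split()), query) for p in passages]
best = max(range(len(passages)), key=scores.__getitem__)
print(passages[best])  # "a treatise on the sublime and beautiful"
```

A production tool would use richer representations (TF-IDF weighting or learned embeddings) and operate over thousands of pages, but the ranking logic is the same.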

On the climate side, Jayesh worked with Kevin Reed, associate provost for Climate and Sustainability Programming, on a project that uses AI to better understand how extreme precipitation is changing along US coasts. Using decades of data from weather stations and dozens of global climate models, the team looked at how those models have historically over- or under-estimated heavy rainfall.

AI models then learned those error patterns and adjusted their future projections accordingly. The ultimate goal was to give urban planners better information about what future “2-year” or “100-year” storms might really look like — a key input for designing stormwater systems and resilient infrastructure.
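The error-correction idea can be illustrated with a minimal sketch. This is a hypothetical simplification, not the team’s method: synthetic “model” rainfall values systematically underestimate synthetic “observed” values, a simple fit learns that bias from the historical period, and the learned mapping is then applied to future projections.

```python
import numpy as np

# Hypothetical illustration of learning a climate model's error pattern.
rng = np.random.default_rng(0)

# Synthetic observed daily rainfall maxima (mm) and a model that
# systematically underestimates them by about 20%.
obs_hist = rng.gamma(shape=2.0, scale=20.0, size=500)
model_hist = obs_hist * 0.8 + rng.normal(0, 2, size=500)

# Learn a simple linear correction from the historical period: obs ~ a*model + b
a, b = np.polyfit(model_hist, obs_hist, deg=1)

# Apply the learned correction to (synthetic) future projections.
model_future = rng.gamma(shape=2.2, scale=22.0, size=500)
corrected_future = a * model_future + b

print(round(a, 2))  # roughly 1.25, undoing the ~20% underestimate
```

Real bias correction of precipitation extremes uses far more sophisticated statistical and machine-learning methods, but the core logic is the same: learn the historical error, then adjust the projection.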

More recently, Jayesh and Deboparna have teamed up with researchers in the Department of Materials Science and Chemical Engineering to study how burn wounds heal over time. They analyzed videos of thick skin being gently poked near a wound and observed how the skin springs back. By turning those subtle motions into quantitative measures, they plan to build AI tools that can tell clinicians how well the tissue under the surface is recovering, and which treatments are working best.

Vismay Vora: From Recycling Streams to Hidden Ocean Waves

Vismay consulted on two seed-funded projects: one in civil engineering and one in marine and atmospheric sciences. With associate professor Ruwen Qin, he helped build an AI-assisted system to analyze video from Long Island recycling facilities. Cameras captured hours of footage of bottles, cans, and other materials moving along conveyor belts. “Manually labeling each item in those videos would’ve taken months,” Vismay explained, “far too slow for serious analysis.”

Instead, he built a pipeline that automatically breaks the videos into frames and identifies, labels, and outlines items like plastics, aluminum, and biodegradable waste. The result is a rich dataset the team can use to answer basic questions: How much plastic is coming through this facility? How does that change over time?
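The per-frame labeling step can be sketched with a toy example. This is illustrative only (the real pipeline ran detection models over video frames): given a synthetic grayscale frame, a threshold mask picks out a bright “item” and a helper returns its bounding box, the outline a downstream labeler would attach to, say, a plastic bottle.

```python
import numpy as np

# Toy sketch of outlining one item in a frame (illustrative, not the
# project's code): threshold the frame, then box the True pixels.
def bounding_box(mask: np.ndarray):
    """Return (row_min, row_max, col_min, col_max) of True pixels."""
    rows = np.any(mask, axis=1)
    cols = np.any(mask, axis=0)
    rmin, rmax = np.where(rows)[0][[0, -1]]
    cmin, cmax = np.where(cols)[0][[0, -1]]
    return int(rmin), int(rmax), int(cmin), int(cmax)

frame = np.zeros((8, 8))
frame[2:5, 3:7] = 255          # a synthetic "bottle" on the conveyor belt
box = bounding_box(frame > 128)
print(box)                      # (2, 4, 3, 6)
```

Repeating this over every frame, with a trained detector instead of a threshold, is what turns raw footage into the labeled dataset the team can query.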

AI-assisted system to analyze video from Long Island recycling facilities

The same project is now expanding beyond conveyor belts. The team is beginning to collect “multimodal” data from workers themselves — audio of how they talk to each other about different materials and videos from cameras mounted on their hats and clothing — to better understand expert decision-making on the sorting line.

Vismay’s second project was with Jack McSweeney, associate professor in the School of Marine and Atmospheric Sciences. Using long records of temperature measurements from sensors along California’s coast, the team studied “internal waves” — wave motions that occur below the ocean’s surface. Vismay’s role was to find patterns in those complex signals using machine-learning methods that cluster similar wave shapes together. The work is still in progress, but it’s already revealing recurring patterns in how internal waves move toward the shoreline. 
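The shape-clustering idea can be sketched as follows. This is an illustrative toy, not the project’s pipeline: two families of synthetic signal segments (a steep, bore-like front and a gentle oscillation) are grouped by a bare-bones k-means, the way recurring internal-wave forms might be clustered together.

```python
import numpy as np

# Illustrative sketch (not the team's code): cluster synthetic
# temperature-signal segments by shape with a minimal k-means.
rng = np.random.default_rng(1)

t = np.linspace(0, 1, 50)
sharp = np.tanh(10 * (t - 0.5))   # a steep, bore-like front
smooth = np.sin(2 * np.pi * t)    # a gentle sinusoidal oscillation
signals = np.vstack(
    [sharp + 0.05 * rng.normal(size=50) for _ in range(10)]
    + [smooth + 0.05 * rng.normal(size=50) for _ in range(10)]
)

def kmeans(X, k, iters=20):
    """Minimal k-means: alternate assignment and centroid update."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):        # keep the old center if a cluster empties
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(0)
    return labels

labels = kmeans(signals, k=2)
# The two shape families should land in separate clusters.
print(labels[:10], labels[10:])
```

With real coastal records the segments are noisier and far more numerous, but grouping similar wave shapes is exactly what surfaces the recurring patterns described above.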

Vismay has since worked on two more research initiatives: conducting exploratory analysis on Snapchat data to characterize behavioral patterns, usage trends, and human-AI engagement dynamics, and developing generative AI models to create design options for mechanisms — assemblies of moving parts that perform a complete function in a larger machine.

What ties these seemingly unrelated projects together is the thrill of seeing AI make a difference outside computer science. “When you work only within CS, you usually practice on toy datasets,” Vismay said. “Here, we were pairing AI knowledge with real-world domains to solve real problems and actually seeing our work’s impact.”

For Deboparna, the early meetings often flipped the usual dynamic: “For the first two sessions, I was the one learning things, because they were telling me their biology concepts. It was not until the third or fourth session that I started making suggestions.” 

The AI³ Showcase felt like a milestone, both a celebration and a chance to look ahead. Preparing to share their progress forced everyone to step back, organize their work, and ask, What happens next? The event drew strong attendance across both morning and afternoon sessions, with different audiences cycling through.

What happens next?

If there’s one message the three consultants want to share, it’s that demand for this kind of help is enormous. “Almost every professor doing research work could use some support with the technical side,” Jayesh said. Many are also trying out AI tools on their own and feeling overwhelmed. He imagines AI³ playing an expanding role in helping faculty “retrain” for AI-rich research.

Vismay regularly hears a different recurring question from faculty collaborators: When are you graduating? Do you have time to keep working with us? For him, it’s proof that the model works, and that there’s room to bring many more students into similar roles.

Deboparna sees a parallel need among students in other departments who are teaching themselves AI without the mathematical grounding needed to adapt the models they use.

Together, AI³’s consultants have shown how a relatively small investment — in their time, training, and stipends — can unlock new capabilities across Stony Brook’s community, making sure that when researchers seek AI, it doesn’t remain a black box, but becomes a shared solution.

News Author

Ankita Nagpal