
AI With Purpose

How disciplines and research initiatives across the U of A are adapting to AI.

Fall 2025
Digital artwork with glowing blue and red light patterns resembling data streams and circuitry. Large white text in the foreground reads “AI WITH PURPOSE.”



Arizona Names First Chief AI Officer 

Story by Stephanie Doster  

A headshot of David Ebert in a suit and tie. The background is gray.

David Ebert

Artificial intelligence is rapidly reshaping how we live, from health care and water resources to education and national security. At the University of Arizona, AI is becoming integral to how we learn, teach, research and serve. 

Leading this integration at the U of A is David Ebert, named in May as the university’s first chief AI officer (CAIO), a position still rare in higher education but increasingly essential.  

A global leader in human-centered AI, data visualization and human-centered computing, Ebert brings a collaborative, interdisciplinary vision for accelerating and guiding an AI transformation across campus. He previously served as director of the Data Institute for Societal Challenges at the University of Oklahoma and now holds the inaugural Computer Science Engineering Endowed Innovation Chair, established through a $3.5 million anonymous gift. 

Ebert sees the U of A becoming a national model for how AI and data science can empower cross-disciplinary teams to solve real-world problems while preparing students and communities for an AI- and data-driven future. 

“AI is impacting everything in higher education, from compliance and curriculum to student services and research,” he says. “With the right strategy, the U of A can lead more than ever before in turning knowledge into real-world impact, ethically, responsibly and transparently advancing industry, society and student success.” 

INSTITUTIONAL MOMENTUM 

New leadership and strategic priorities are accelerating the university’s AI trajectory. President Suresh Garimella has outlined three key imperatives: success for every student, research that shapes the future and community engagement that creates opportunity. AI aligns with all of them. 

Additionally, a $20 million investment by the Arizona Board of Regents will drive innovation in high-impact areas: sustainable mining and critical minerals, space sciences and national security, fusion energy, and AI-driven health care. 

“Artificial intelligence is already advancing research across health care, space sciences, agriculture and national security at the University of Arizona,” says Tomás Díaz de la Rubia, senior vice president for research and partnerships. “David brings the strategic insight and collaborative mindset to connect these efforts, accelerate discovery and expand partnerships that will shape the university’s impact locally and globally.” 

Ebert and his team, guided by input from across campus, are building a comprehensive, university-wide AI road map. It will integrate innovation, research, education and operations across disciplines, from biomedical, physical, social and optical sciences to humanities, agriculture and the arts. 

“This road map reflects our commitment to the land-grant mission. We’re applying AI to enhance clinical care, reimagine curriculum, accelerate research and modernize operations, ensuring the U of A leads responsibly in this transformative era,” Ebert says. “We also need to remember that soon, incoming students will have been raised using data and AI tools. They will be AI-native, and we must ensure that we provide top-notch education and training to prepare graduates for future careers.” 

Ebert also is creating an AI and data science institute that will unify existing data-focused programs in the Office of Research and Partnerships, and he is developing an initiative centered on AI in health care. 

AI FOR IMPACT 

The university is already leveraging AI to revolutionize medicine. Faculty and researchers are developing tools for earlier and more accurate diagnoses, improved patient-provider interactions, and faster, smarter drug discovery. Among the leaders in this area is Dan Theodorescu, director of the University of Arizona Comprehensive Cancer Center, who is featured elsewhere in this issue. His research applies AI to identify new biomarkers, personalize therapeutic options and predict patient responses to therapy. 

Highlighted in this AI section are other research applications already in motion: mining safety; innovations in agriculture, space sciences and national security; wearable health technologies; and thoughts on the intersection of AI and the humanities. 

“I am not exaggerating when I say that data is changing everything,” Ebert says. “The ability to manipulate and understand data is now critical to discovery and innovation. Data science and AI are the key drivers turning knowledge into impact in Arizona, across the U.S. and around the world, and I am proud to be a part of it.” 


NATIONAL SECURITY 

Story by Stephanie Doster 

An image of Roberto Furfaro smiling at the camera with his arms crossed. He is in a dark green sweater.

Roberto Furfaro

As threats in orbit grow more complex, the U of A is advancing national security through cutting-edge AI research that will support space situational awareness, satellite tracking and planetary defense. 

Roberto Furfaro works in extreme environments. From punishing hypersonic speeds to the increasingly congested orbital space between Earth and the moon, the systems and industrial engineering professor is applying artificial intelligence to tackle the complexity inherent in aerospace defense and space traffic management. 

Furfaro, head of the Space Systems Engineering Laboratory and deputy director of the Space4 Center, is in the vanguard of AI research focused on improving national security and space safety. His team is developing intelligent systems that can predict and track the motion of objects in space and enable vehicles and satellites to make decisions in real time — without human involvement — under high-risk conditions. 

“We’re building flight systems that don’t just follow instructions: they learn, adapt, and respond to uncertainty like a pilot would,” Furfaro says. “That’s what autonomy means when you’re operating in an unpredictable, high-speed, high-stakes environment.” 

In the hypersonics field, Furfaro is completing a three-year, $4.5 million grant from the U.S. Department of Defense to develop improved AI-driven guidance, navigation and control systems for vehicles traveling at speeds greater than Mach 5 — more than five times the speed of sound. These include interceptors, which are high-speed, maneuverable vehicles designed for defense against high-speed threats. Furfaro’s research involves training systems to act as onboard brains that can rapidly adapt to shifting trajectories and withstand extreme heat, turbulence and shockwaves. 

The Joint Hypersonics Transition Office, through the University Consortium for Applied Hypersonics, sponsored the research, which includes collaborators from the U of A College of Engineering, the University of Texas at Austin, and the RTX Corporation (formerly Raytheon Technologies Corporation). Researchers gathered data from simulations and wind tunnel tests to characterize how vehicles behave in hypersonic flow and to create a simulated environment for training the adaptive brain of the system. 

Beyond Earth’s atmosphere, Furfaro is applying similar AI techniques to space traffic management and defense. Space is becoming increasingly crowded with satellites and debris — defunct rocket bodies and smaller objects like paint flecks. An estimated 20,000 to 100,000 new satellites could launch over the next decade, creating hazards to astronauts, spacecraft and communications. 

Furfaro’s lab uses machine learning to better detect and track space objects and predict their trajectories. The team focuses on objects in the geostationary belt, about 22,000 miles above Earth, and in cislunar space — the volume of space between and around Earth and the moon. 

His team is also building and testing AI-integrated telescope systems that can interpret observational data, reprioritize targets in space and direct the telescopes autonomously, in real time — a capability critical in scenarios in which human operators are unavailable or delayed. 

“Intelligent systems are becoming essential for managing space as it becomes more contested and congested,” Furfaro says. “We’re working toward predictive capabilities that anticipate threats and enable real-time response with no human in the loop.” 


MINING 

Story by Eric Van Meter 

The view from inside a mining truck as the driver makes a left turn, with the mine visible in the background.

One of the most technologically advanced industries today, mining and mineral extraction requires highly skilled workers. The university is prioritizing the health and safety of these workers, using AI and machine learning to reduce accidents and injuries in underground mines while training the next generation of mining engineers. 

“There are many opportunities to obtain data in the mining industry,” says Nathalie Risso, assistant professor in the Department of Mining and Geological Engineering. “The data can be numbers, images, or video and help us learn how something works to make it better.” 

Risso led the development of an algorithm to visually scan worksites for issues with personal protective equipment (PPE) compliance. To train the algorithm, the researchers issued a campus-wide invitation that rapidly populated a database of more than 2,000 photos of students and U of A community members in mining gear, ensuring diversity representative of multinational mining workforces. 

Researchers have also created an app that allows anyone — even those without safety expertise — to use smartphone cameras to determine risks for mine collapse and falling rock. Users capture video or photos that an algorithm analyzes in real time. The university’s San Xavier Underground Mining Laboratory provided visuals to train the algorithm, and the app will crowdsource more data from ongoing use, continually refining its sensitivity to hazardous conditions.  

The coexistence of human-operated and autonomous equipment, such as the massive haul trucks, presents another safety challenge. Risso’s team is drawing on AI along with virtual and extended reality to develop training that prepares new workers to interact safely and effectively with mining robots. Her team is also using AI to train autonomous drones and robots for rescue missions. These devices can help evaluate injuries, identify hazards such as toxic gases, and locate safe exit routes, helping rescue teams coordinate their response. 

The innovations could be applied in any high-risk emergency (think earthquakes or nuclear plant malfunctions) as well as in non-crisis scenarios, potentially including extraterrestrial mining operations that provide water and resources for space exploration. 


AGRICULTURE  

 Story by Eric Van Meter 

Agricultural technology on a Yuma farm: two tractor-pulled machines assist with crop picking.

Through partnerships among the Arizona Experiment Station, Cooperative Extension and the Yuma Center of Excellence for Desert Agriculture (YCEDA), the U of A is working hand in hand with growers, industry and local leaders to ensure solutions are practical and scalable. The result? Yuma is becoming a global proving ground for AI-powered agriculture, helping producers worldwide navigate shifting climates, limited resources and the demand for sustainable, high-yield food systems. 

When Tanya Hodges sees giant robots gliding through Yuma’s lettuce fields, distinguishing crops from weeds in milliseconds, she doesn’t just see machines. She sees the future of farming.  

“Yuma’s agricultural landscape offers an unparalleled opportunity to serve as a test site for ag-tech implementation,” says Hodges, who leads YCEDA. “The region’s diverse climate and cropping systems make it an ideal environment for a testbed for innovations in specialty crops and arid agriculture.” 

Traditional farming in Yuma County has long relied on manual labor for essential tasks like weeding. But as younger generations seek different careers and border restrictions tighten labor availability, technology is filling the gap.  

“Instead of hoeing by hand, automatic weeders and thinners are now being developed,” Hodges explains. “That’s all through AI and pixels — pictures that teach these systems to determine what a weed looks like versus what a lettuce plant looks like.”  

The robots — which can work up to 12 crop beds simultaneously — represent just one application of AI in Yuma’s evolving agricultural landscape. The U of A is also working with ag-tech companies using AI to distinguish beneficial insects from harmful ones, precision that matters because modern integrated pest management aims to maintain a balanced ecosystem rather than to eliminate all insects.  
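What Hodges describes, pictures teaching a system to tell a weed from a lettuce plant, is at its core supervised image classification. The short sketch below shows one common way such a classifier can be built: fine-tuning a pretrained vision model on photos sorted into one folder per class. It is a minimal illustration rather than the Yuma teams’ actual software; the folder layout, model choice and training settings are assumptions, and the example uses the open-source PyTorch and torchvision libraries.

# Minimal sketch: train a two-class (lettuce vs. weed) image classifier.
# Assumes a hypothetical folder layout field_photos/lettuce/*.jpg and
# field_photos/weed/*.jpg; these names are illustrative, not from the article.
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),  # pretrained ResNet expects 224x224 inputs
    transforms.ToTensor(),
])

# Each subfolder name becomes a class label (lettuce = 0, weed = 1).
dataset = datasets.ImageFolder("field_photos", transform=transform)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# Start from a model pretrained on generic images, then swap in a two-class head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):  # a few passes over the labeled field photos
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

A production system riding on a field robot would go further, locating each plant in the camera frame quickly enough to drive a blade or sprayer, but the core idea is the same: labeled pictures teach the model what each class looks like.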

Water management is another frontier where AI is making an impact. Yuma’s agricultural systems are already 84% to 90% efficient in water use, but AI-powered sensors could push that efficiency even higher — a critical advantage in the arid Southwest.  

The U of A’s commitment extends beyond the field through a network of partnerships, facilities, and infrastructure designed to fast-track innovation. The Yuma Agricultural Center functions as a real-world lab for testing and refining agricultural technologies, while the annual Ag Tech Conference will showcase emerging tools directly to farmers.  

AI-driven agriculture depends on massive data flows, from high-resolution crop images to real-time soil analytics. To enable this, the U of A and its partners, with support from state and county funding, helped launch a broadband expansion and smart farm initiative. This initiative is building a wireless network, anchored by more than 30 planned broadband network towers across Yuma’s greenbelt. Once complete, the system will provide high-speed connectivity to over 180,000 acres of irrigated farmland, creating a countywide fiber backbone and the largest connected tract of agricultural land in the country for precision agriculture innovation. 

The effort positions Yuma not just as a center for ag-tech, but also as a testbed for the future of farming worldwide. 


HEALTH CARE  

Story by Stacy Pigott 

An image of a person wearing a wearable device. The device screen shows a human body with different pieces of information and charts available for viewing. The person's finger is hovering over the heart icon.

The combination of AI and wearable devices provides new opportunities to improve health through research. U of A researchers are harnessing data to detect stress and transform how we monitor our health — before symptoms appear.  

Anyone who owns a car knows that when the check engine light comes on, it’s time for a look under the hood or a trip to a mechanic. The human body doesn’t have a check engine light, but wearable sensors could fill that role for people thanks to the power of data. 

“Most of us wear health and fitness trackers on our wrists or fingers, and that’s an amazing window into our biology and how we operate,” says Shravan Aras, assistant director of sensor analysis and smart health platforms at the Center for Biomedical Informatics and Biostatistics. 

Aras, who also is an assistant research professor at the Mel and Enid Zuckerman College of Public Health, wants to help fellow scientists get the most out of wearable sensors by incorporating them into research studies and optimizing how the data they capture is explored and analyzed. The timing is perfect, as the growth of AI and machine learning has created tremendous opportunities for advancement. 

Aras earned a doctorate in computer science from the U of A. Over time, his research focus has shifted from space and satellites to the data collected by wearable sensors. He currently is part of a study — along with Esther Sternberg and J. Ray Runyon of the U of A’s Andrew Weil Center for Integrative Medicine — using AI to assess people’s stress response via digital, sweat-based biomarkers.  

Projects like this are the tip of the iceberg when it comes to using AI for health sciences research. Aras says AI lets researchers gather and handle significantly more data in a faster and more accessible way than before. And by combining AI with wearable sensors, Aras aims to shift health care from reactive to proactive — using data not just to monitor, but to predict.  

“My goal is to be able to predict things that are nonsymptomatic and do it in a proactive manner rather than doing it reactively, where you’re trying to always keep treating the symptoms,” he says. “I want to be able to predict, using sensors, that something is coming.” 


HUMANITIES 

Story by Mike Powell  

A decorative image showing a green profile of a face overlaid with additional human profile views in tan, brown, and gray.

When we talk about the everyday applications of AI, we tend to frame them in terms of science or tech: the molecular models that help predict the toxicity or efficacy of prospective drugs, the algorithms that refine and accelerate our weather forecasts, the super-evolved fraud-detection software, and so on. But as these technologies evolve, their implications raise fundamentally human questions: Can AIs have emotional lives they can recount for others? Can they build and maintain meaningful relationships based on trust and vulnerability? Can they be funny? 

Ken McAllister, a professor and the associate dean of research and program innovation in the College of Humanities, is one of dozens of scholars at the U of A who have spent no small part of their careers exploring such questions. To him, now is just another exciting moment in the long, strange story of us.   

“If somebody were to just ask me what the relationship between AI and humanities is, I’d say they’re inextricable and they’ve always been inextricable,” McAllister says. “This creates lots of inroads for people in the humanities. It’s culture; it’s spirituality; it’s communicating with people who don’t speak our language — these are the things that people in the humanities have devoted their lives to thinking about, analyzing and advancing.” 

“People get fixated on the machinery,” he says. “We focus on the accuracy of the printing press, or the speed of the steam-driven loom, forgetting that these machines — including today’s massive data centers — are mediators between parties. Humanities researchers interrogate the troubles and triumphs that attend those mediations, and signal to society when opportunities and dangers impend.”  

Language translation, for example. Or using predictive analytics to reconstruct lost buildings, or tireless customized tutoring on any subject. AI might surpass the human race; it also might just become another very useful tool. “My own perspective is that this is a liminal moment, a threshold,” McAllister says. “The human race has lived through many, many of these, where there’s a lot of anxiety about what’s going to happen — that we’re going to be replaced or die off.” So far, we always end up lasting longer than we expect.   

McAllister says one of the thrills of being in a moment like this is that the machinery of change is still visible enough to observe and question, whether our interests are environmental (what impact will AI have on energy and water usage?), material (how will AI affect the cost and production of human labor?), or otherwise.  

“Once we get a little further across the threshold and the door to the machine room starts to close, questions about how things work get much more difficult to answer — and to ask.”  

For his part, McAllister remains optimistic but says he tempers that optimism with activism. “The tremendous good that can come from these technologies is there for everyone to see,” he says. “And people are seeing it.  

“But we have to drive innovation in good’s direction. It has to be worked toward.”
