Research tackles human-AI teaming for critical decision-making

Image: A person wears an augmented reality headset in a lab as a digital screen displays data in the background. (Photo: Karl Maasdam)
Wearable sensors and augmented reality displays enable better two-way communication between humans and artificial intelligence agents.

Key Takeaways

  • With the rise of AI, automated systems are becoming increasingly capable of planning, inferencing, and assisting humans in making decisions.
  • David Kaber’s research probes the challenges humans face when interacting with complex automated or AI systems, particularly under time pressure and high operational stress.
  • Effective human-automation teaming requires better two-way communication and situational awareness.

Imagine you’re a Black Hawk helicopter pilot, flying through unexpected severe weather. The air is turbulent, visibility is dropping, and critical emergency checklists are flashing. Your workload is skyrocketing, and stress is mounting.
 
But in this cockpit, you’re not alone. An artificial intelligence copilot, monitoring your physiological state through an array of wearable sensors, detects your rising stress levels. Based on this real-time assessment of your cognitive workload, it knows you need more than just a standard checklist display.  

Through your augmented reality goggles, visual cues appear — perhaps a pointing hand highlighting a specific instrument or simplified instructions guiding you through the critical steps, delivered precisely when you need them without overloading your senses.  

That scenario, in which advanced automation works with the human operator, understands their state, and adjusts the level of support it provides, is at the heart of the research conducted by David Kaber, a new industrial engineering faculty member at Oregon State University.

New lab focuses on human-AI teaming

“When people get stressed out and are under a really high workload, they become prone to cognitive tunneling,” Kaber said. “They focus intensely on a narrow set of information and overlook other relevant details. Decision-making is impaired, as people in a tunnel state tend to rely on heuristics — whatever has worked before, even though that might not be the best option for the current circumstances.”  

Image: Three people stand in a lab with a digital screen in the background; one wears an augmented reality headset and gloves.
David Kaber, center, is working to better integrate human-machine systems.

Kaber, who is the College of Engineering Dean’s Professor and associate head of research programs, is establishing a new lab focused on human-AI teaming, bringing a wealth of experience in human factors and human systems engineering to Oregon State’s growing robotics and AI programs. 

“My work is focused on how to effectively design autonomous agents and human interfaces to better integrate human-machine systems,” Kaber said.  

Historically, automation has focused on executing routine tasks, like an autopilot following a flight plan, leaving more complex work to the human brain. However, with the rise of AI, automated systems are becoming increasingly capable of planning, inferencing, and assisting humans in making decisions.  

"Automation is moving more into the cognitive aspect of operation,” Kaber said. “We need to effectively figure out how we can integrate human creativity and inventiveness with machine speed analytics from a decision-making perspective.”  

Using AI for better human decision-making

The problem at the core of Kaber’s research is the set of challenges humans face when interacting with complex automated or AI systems, particularly under time pressure and high operational stress. As illustrated by the pilot scenario, these challenges can lead to cognitive tunneling.  

“The result is poor situation awareness and, ultimately, bad decision-making,” Kaber said. “Our goal is to help humans make better decisions through effective interface and automation design.” 

Kaber’s research involves several key methodologies: 

  • Computational cognitive performance modeling: Kaber’s lab uses computerized tools to predict task performance time and cognitive workload. 
  • Real-time cognitive state assessment: Wearable physiological sensors measure cardiac responses, galvanic skin response, blood volume pulse, EEG, blink rate, and pupil size.  
  • Machine learning classification: Real-time physiological data is processed through machine learning classifiers to determine the operator’s cognitive workload state within seconds (see the sketch following this list).
  • Adaptive automation and AR/VR interface design: Cognitive state is used to dynamically adjust the level or type of automation assistance provided.  
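
To make the classification-and-adaptation loop concrete, here is a minimal sketch, not Kaber’s actual pipeline: it trains a generic classifier on synthetic windows of wearable-sensor features and maps the predicted workload state to a level of assistance. The feature set, the three workload classes, and the assistance responses are hypothetical assumptions for illustration.

```python
# Minimal sketch (hypothetical, not Kaber's pipeline): classify operator workload
# from windowed physiological features and pick an automation response.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic training data: each row is one multi-second window of features
# (e.g., heart rate, skin conductance, blink rate, pupil size -- assumed features).
n_windows = 300
X_train = rng.normal(size=(n_windows, 4))
y_train = rng.integers(0, 3, size=n_windows)   # 0 = low, 1 = moderate, 2 = high workload

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

def assistance_for(window_features: np.ndarray) -> str:
    """Map the classified workload state to an illustrative automation response."""
    state = int(clf.predict(window_features.reshape(1, -1))[0])
    return {
        0: "standard checklist display",
        1: "highlight the current checklist step",
        2: "step-by-step AR cues; suppress non-critical alerts",
    }[state]

# One incoming window of sensor features during flight.
print(assistance_for(rng.normal(size=4)))
```

In a deployed system, the model would be trained on labeled data from human-subjects experiments and validated before it drives changes to a cockpit display; the dictionary lookup here simply stands in for that adaptive-automation logic.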

Kaber’s research focus areas

Beyond the Black Hawk pilot simulation, conducted with funding from Northrop Grumman, Kaber’s research spans various potential application areas: 

  • Army field medics: Developing AI agents delivered via head-mounted displays to assist medics under extremely stressful field conditions.

  • Powered prosthetics: Collaborating on the design of interfaces for upper extremity prosthetics that use EMG signals and machine learning to control motion.

  • Police patrol with quadruped robots: Exploring how officers on foot patrol, who lose the information access available in their vehicle, could be assisted by dexterous quadruped robots carrying equivalent information sources.  

Looking ahead, Kaber envisions a future where the model of human-automation interaction is inverted. Rather than autonomous machines performing a set of programmed tasks with humans acting as passive overseers, humans will lead AI-powered agents, providing critical guidance and direction.   

“High-level automation should operate as a personable and honest partner that can recognize situations it cannot handle and communicate, ‘There’s a set of circumstances that I can’t handle here. Could you please help me?’” Kaber said. “This contrasts with current systems, which often act as if they can do everything or are frustratingly intrusive.”  

June 16, 2025
