Eye-tracking for tailored autonomy
By: Amy Sprague
March 24, 2025
Top image: Two pairs of eye-tracking glasses with movement sensors. Photos by Dennis Wise, University of Washington.
NSF funds eye-tracking research to help us create autonomous systems that can adjust to individual comfort levels.
The future of building trustworthy autonomous systems may lie in wearing glasses. A&A Assistant Professor Karen Leung, with co-Principal Investigator Anat Caspi, director of the Allen School’s Taskar Center for Accessible Technology, has received a $300,000 National Science Foundation grant to explore how specialized eyeglasses could help autonomous vehicles and robots better understand and adapt to human comfort levels. Undergraduate researchers Senna Keesing (A&A), Marc Alwan (CSE) and Kyshawn Warren (ECE) are carrying out the research in Leung’s Control and Trustworthy Robotics Lab (CTRL).
This work stems from a simple observation: people differ in how comfortable they feel around autonomous systems. "I've watched how people interact with autonomous systems in their daily lives," Leung shares. "What makes one person perfectly comfortable might make another quite nervous. We need to bridge this gap."
The research team’s approach involves specialized eyeglasses that observe how individuals scan their environment. These insights help autonomous systems understand each person's unique safety preferences and adapt accordingly. Picture an autonomous wheelchair that learns whether its user prefers to give other pedestrians a wide berth or is comfortable with closer encounters – all while maintaining core safety standards.
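To make the idea concrete, here is a hypothetical sketch (not the team's actual code) of how a per-user comfort preference might be learned from gaze behavior while a hard safety limit is always enforced. The class name, parameters, and the use of gaze-fixation frequency as a proxy for unease are all illustrative assumptions.

```python
# Hypothetical sketch: an autonomous platform keeps a per-user "comfort
# buffer" around pedestrians, nudged by how often the wearer's gaze
# fixates on nearby obstacles, but never below a hard safety floor.

HARD_SAFETY_MIN_M = 0.5   # non-negotiable clearance, never relaxed
DEFAULT_BUFFER_M = 1.5    # starting comfort buffer for a new user
LEARNING_RATE = 0.05      # how quickly the buffer adapts to gaze evidence


class ComfortModel:
    """Tracks one user's preferred clearance from gaze-derived signals."""

    def __init__(self, buffer_m: float = DEFAULT_BUFFER_M):
        self.buffer_m = buffer_m

    def update(self, fixation_ratio: float) -> float:
        """Adapt the buffer from the fraction of recent gaze fixations
        that landed on nearby agents (a rough proxy for unease).

        A high fixation_ratio suggests the user is watching obstacles
        closely, so widen the berth; a low ratio allows closer passes.
        """
        target = DEFAULT_BUFFER_M + 1.0 * (fixation_ratio - 0.3)
        self.buffer_m += LEARNING_RATE * (target - self.buffer_m)
        # Core safety standard always wins: never go below the hard floor.
        self.buffer_m = max(self.buffer_m, HARD_SAFETY_MIN_M)
        return self.buffer_m


if __name__ == "__main__":
    model = ComfortModel()
    # Simulated stream of gaze summaries: this user keeps fixating on
    # oncoming pedestrians, so the planner's clearance slowly widens.
    for fixation_ratio in [0.6, 0.7, 0.65, 0.8]:
        clearance = model.update(fixation_ratio)
        print(f"planner clearance: {clearance:.2f} m")
```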

The research tackles a crucial challenge in autonomous mobility: earning public trust. Traditional autonomous systems operate with fixed safety parameters, potentially making some users uncomfortable while frustrating others with overcautious behavior. Leung's team aims to create more nuanced systems that can recognize and respond to individual comfort levels.
Beyond wheelchairs, this research could transform how delivery robots navigate college campuses or how autonomous vehicles interact with pedestrians in urban environments. The project combines advances in computer vision, human behavior understanding, and adaptive control systems.
The NSF grant, jointly supported by the Dynamics, Controls, and System Diagnostics and Mind, Machine, and Motor Nexus Programs, underscores the project's interdisciplinary significance. Leung's team is particularly focused on including diverse perspectives in their research, actively engaging underrepresented groups in robotics and fostering collaboration between computer vision, controls, and robotics researchers.
"We're not just developing technology. We're working to create autonomous systems that truly understand and respect human preferences. That's the key to building trust."



Left: Kyshawn Warren monitors data collection during a collision-avoidance exercise. Top right: Marc Alwan models the eye-tracking glasses, which register real-time gaze data on a connected smartphone. Bottom right: The customized hard hats with the lab's logo have movement-tracking sensors mounted on top.
Demo of the dual camera view: the recorded eye movements, what the subject sees, and the accompanying data collection.