In a world where drones are becoming indispensable tools for infrastructure inspection, a recent study from NUS Computing researchers (Associate Professor Ooi Wei Tsang and CS PhD student Xu Peisen), in collaboration with colleagues in France (Assistant Professor Jérémie Garcia and Professor Christophe Jouffrais from Université de Toulouse) through the IPAL laboratory, funded by the DesCartes program, offers a compelling glimpse into the future of drone safety. Presented at the prestigious ACM Conference on Human Factors in Computing Systems (CHI) and awarded an Honourable Mention, a distinction reserved for the top 5% of submitted papers, this research goes beyond efficiency to tackle one of the biggest challenges in drone operations: situational awareness and information overload.
The researchers introduced SafeSpect, an adaptive augmented reality (AR) interface that revolutionizes how drone pilots interact with critical information during inspections. While much of the evolution in drone technology has focused on automation and speed, SafeSpect zeros in on the human operator, for whom decisions are made under pressure, attention is split, and safety is paramount. This innovation doesn't just enhance performance; it could redefine how we think about the synergy between humans and machines.
The Critical Problem of Split Attention
Imagine flying a drone along the facade of a high-rise building, searching for cracks, corrosion, or other signs of structural wear. You must simultaneously keep the drone within your line of sight (a legal requirement in many countries) while peering down at a tablet for the camera feed, battery level, GPS status, and more. When the drone is far away, the pilot tends to focus solely on the device screen, creating tunnel vision, a dangerous narrowing of attention that increases the risk of collisions, missed defects, or worse, catastrophic crashes.
Traditional 2D interfaces on tablets demand too much from pilots in high-stakes environments. The interface is cluttered, hard to read under sunlight, and non-contextual. And simply porting that same information into an AR heads-up display doesn’t fix the issue. Without intelligent filtering, it just moves the clutter from your hands to your face.
Enter SafeSpect: Adaptive, Smart, and Human-Centric
What sets SafeSpect apart is its adaptability. Instead of bombarding users with every bit of data at all times, the interface selectively displays information based on what’s happening in the moment. When flying a pre-programmed route on autopilot, for instance, pilots see only the next waypoint and a minimal data overlay. But the second something goes wrong—say, GPS drops out or the battery dips dangerously low—the system springs into action, surfacing critical safety indicators like heading direction, collision proximity, and return-to-home paths.
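The paper's actual rule set isn't reproduced here, but the context-aware filtering idea can be sketched in a few lines. In this hypothetical illustration (the state fields, element names, and thresholds are all assumptions, not taken from SafeSpect), nominal autopilot flight shows only a waypoint and a minimal status strip, and degraded conditions surface the relevant safety indicators:

```python
from dataclasses import dataclass

@dataclass
class DroneState:
    autopilot: bool
    gps_ok: bool
    battery_pct: float
    obstacle_near: bool

def visible_elements(state: DroneState) -> set[str]:
    """Return the set of AR overlay elements to display for the current state.

    A hypothetical priority scheme: keep the overlay minimal while everything
    is nominal, and add safety indicators only when a condition degrades.
    """
    elements = {"next_waypoint", "minimal_status"}
    if not state.gps_ok:
        # GPS dropout: pilot needs orientation cues to take manual control
        elements |= {"heading_indicator", "manual_control_hint"}
    if state.battery_pct < 20:
        # Low battery: surface the way home before it becomes critical
        elements |= {"return_to_home_path", "battery_warning"}
    if state.obstacle_near:
        elements |= {"collision_ring"}
    return elements
```

The design choice this illustrates is subtractive by default: every element must earn its place on screen through the current state, rather than being hidden only when a rule says so.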
This context-aware design was inspired not by theory, but by the people who actually fly drones for a living. The researchers conducted structured workshops with five professional drone pilots, all seasoned veterans with years of inspection experience. The pilots sketched their ideal AR interface, highlighting what mattered most for safety: wind conditions, drone heading, GPS blackouts, safe landing zones, and real-time defect tracking.
Building the System, Testing the Stress
To bring those insights to life, the researchers developed a high-fidelity drone simulator in virtual reality using the Meta Quest 3. They built a virtual replica of downtown Singapore and crafted a complex inspection task: locate six different types of building defects, navigate through wind gusts, dodge obstacles like a moving gondola, and handle mid-mission emergencies.
Three of the original pilots helped fine-tune the system. The research team iterated on everything from the positioning of data to the visual style of alerts. A green safety boundary box surrounded the inspection area. A dynamic locator ring around the drone showed its heading and alerted the pilot to nearby threats. Ground projections highlighted where the drone would fall if it crashed. And return-to-home paths changed color as battery life ticked down—green for good, yellow for caution, red for critical.
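The colour-coded return-to-home path can be pictured as a simple mapping from battery margin to urgency. The thresholds and the estimate of battery needed for the return flight below are illustrative assumptions, not values from the study:

```python
def rth_path_color(battery_pct: float, pct_needed_to_return: float) -> str:
    """Map remaining battery to a return-to-home path colour.

    The margin is the battery remaining beyond what the return flight is
    estimated to need; the 20- and 5-point cutoffs are hypothetical.
    """
    margin = battery_pct - pct_needed_to_return
    if margin > 20:
        return "green"   # comfortable reserve
    if margin > 5:
        return "yellow"  # caution: plan the return soon
    return "red"         # critical: return immediately
```

Anchoring the colour to the margin over the estimated return cost, rather than to raw battery percentage, means the warning adapts to how far the drone has strayed from home.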
All of these elements were designed not only to inform, but to direct pilot attention where it mattered most—without overwhelming them.
A True Field Test: Adaptive AR vs. The Status Quo
In a formal user study involving 15 amateur drone pilots, the SafeSpect team compared three interfaces: a traditional 2D tablet-like interface, a fully immersive AR interface with all elements visible at all times, and their adaptive AR interface, dubbed ADAPT-AR.
The results were revealing. While all three groups performed similarly in terms of objective outcomes (e.g., number of defects identified, path accuracy), the subjective feedback was where the adaptive interface shone. Pilots using ADAPT-AR reported significantly lower mental workload and significantly higher situational awareness compared to those using the tablet-style interface. They also felt more confident in their own performance and more attuned to their surroundings.
Interestingly, the adaptive interface was rated similarly to the full AR interface in many areas—but with one key difference: reduced visual clutter. By only displaying what’s needed, when it’s needed, ADAPT-AR gave pilots a cleaner, more focused experience. It was a smarter system, not just a flashier one.
Trust, Transparency, and the Human Factor
But no system is perfect. One of the most consistent pieces of feedback from users was the issue of trust. Some pilots expressed unease when the interface hid information, even if the system was intelligently deciding what was necessary. As one participant noted, “I want to see everything, just in case.”
This highlights a crucial design principle in safety-critical applications: transparency breeds trust. The best system isn’t the one that hides complexity—it’s the one that helps users understand when and why it’s simplifying. The researchers took this to heart, exploring ways to better communicate the logic behind SafeSpect’s decision-making.
Another challenge was comfort. Some pilots found the placement of 2D interface elements in the AR display to be physically awkward, especially when the drone was far away. Others experienced a slight lag when turning their heads—a technical hurdle that could be improved with future hardware.
Despite these limitations, the overarching takeaway was clear: augmented reality has enormous potential to support safer, smarter drone operations. But it must be grounded in real-world needs, shaped by expert feedback, and built with transparency and adaptability in mind.
What SafeSpect Means for the Future
While the focus of this study was on high-rise building inspections, the implications of SafeSpect stretch far beyond urban infrastructure. Any high-stakes drone operation—search and rescue missions, disaster assessment, firefighting reconnaissance, or even delivery in complex urban environments—could benefit from the core principles developed in this research.
In search and rescue, for example, real-time overlays could help locate missing persons while dynamically guiding the drone through treacherous terrain or weather. In agriculture, adaptive AR could highlight crop anomalies or detect pests without overwhelming the pilot with data. In the military or defense sector, AR-enhanced drone interfaces could reduce friendly fire incidents, improve threat detection, and enhance coordination between air and ground teams.
But perhaps the most exciting frontier lies in autonomous-human collaboration. As drones become more autonomous, human operators won’t disappear—they’ll shift roles, becoming supervisors rather than direct controllers. Adaptive AR systems like SafeSpect could become the essential interface layer between human intuition and machine execution, alerting operators to edge cases, malfunctions, or ethical concerns while maintaining their situational command.
Designing for Safety in a World of Distraction
In a hyper-connected, visually saturated world, drone operators face a barrage of information that threatens to obscure the very details they need to notice. SafeSpect, and the adaptive AR philosophy it embodies, represents a step not toward more information, but toward better information: displayed at the right time, in the right place, for the right reason.
This research doesn’t just push the boundaries of human-computer interaction—it reframes the conversation. It reminds us that the goal of interface design, especially in safety-critical systems, isn’t to dazzle. It’s to guide, clarify and empower.
As cities grow taller, drones fly further, and missions grow more complex, tools like SafeSpect could be what keeps pilots grounded, focused, and safe. It’s not just about flying smarter. It’s about seeing better. And in the skies of tomorrow, vision might be the most important safety system of all.
Further Reading: Xu, P., Garcia, J., Ooi, W.T. and Jouffrais, C. (2025). "SafeSpect: Safety-First Augmented Reality Heads-up Display for Drone Inspections." In Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI '25), April 26–May 1, Yokohama, Japan. https://doi.org/10.1145/3706598.3714283