Human-Drone Collaborations in Human-on-the-Loop Emergency Response Systems
The use of autonomous unmanned aerial vehicles (UAVs), or drones, in emergency response scenarios such as fire surveillance and search and rescue offers great potential for societal benefit. Onboard sensors such as GPS, LiDAR, and thermal cameras, combined with AI algorithms for planning and perception, enable UAVs to make decisions and act in the environment without requiring explicit human control. However, human planning and strategic guidance can enhance mission outcomes. Designing a real-world solution in which humans and multiple autonomous UAVs work as a team in a time-critical environment therefore demands bidirectional communication and collaboration. Our design of a UAV-driven emergency response system focuses on Human-to-UAV, UAV-to-Human, UAV-to-UAV, and Human-to-Human interactions.
We first followed a participatory design process in which we engaged domain experts in identifying and analyzing design tensions, and then designed a high-fidelity UI to support effective human situational awareness in a UAV-driven emergency response system. The design produced by this process reflected the domain knowledge and vision of the firefighters, as well as ideas for human-UAV partnerships in collaborative mission-centric environments. We built upon this initial design to explore UAV-to-Human interaction patterns, specifically highlighting the problem of information overload that arises when the autonomous behavior of multiple UAVs must be explained simultaneously. We conducted multiple user studies with domain experts and crowd workers to assess the impact of autonomy explanations on human awareness. Our analysis provided initial guidelines for designing explainable UIs for multi-UAV applications.
Second, we examined Human-to-UAV interactions in five different emergency response scenarios, namely River Search-and-Rescue, Defibrillator Delivery, Traffic Accident Surveillance, Water Sampling for hazardous chemicals, and Man Overboard, to identify patterns of human intervention in UAV autonomy. We leveraged the identified intervention patterns for two purposes. First, we presented a human multi-UAV intervention model that formalizes human intervention in UAV autonomy. Second, we developed a process, based on a set of probing questions, for eliciting and specifying human intervention requirements for multi-UAV use cases.
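To make the notion of an intervention model concrete, the minimal sketch below shows one way human interventions in UAV autonomy could be encoded in code. The intervention categories, state fields, and the `apply_intervention` function are illustrative assumptions for this example only, not the formalization presented in the dissertation.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class InterventionType(Enum):
    """Illustrative (hypothetical) categories of human intervention."""
    REPLAN_ROUTE = auto()      # human redirects the UAV to a new task or area
    PAUSE_MISSION = auto()     # human temporarily suspends autonomous behavior
    MANUAL_CONTROL = auto()    # human takes direct control of the vehicle
    ABORT = auto()             # human terminates the UAV's current task

@dataclass
class UavState:
    uav_id: str
    mode: str = "AUTONOMOUS"   # AUTONOMOUS | PAUSED | MANUAL | ABORTED
    task: str = "SEARCH"

def apply_intervention(state: UavState,
                       intervention: InterventionType,
                       payload: Optional[dict] = None) -> UavState:
    """Apply a single human intervention to one UAV's autonomy state."""
    if intervention is InterventionType.REPLAN_ROUTE:
        state.task = (payload or {}).get("new_task", state.task)
    elif intervention is InterventionType.PAUSE_MISSION:
        state.mode = "PAUSED"
    elif intervention is InterventionType.MANUAL_CONTROL:
        state.mode = "MANUAL"
    elif intervention is InterventionType.ABORT:
        state.mode = "ABORTED"
    return state

if __name__ == "__main__":
    uav = UavState(uav_id="UAV-1")
    apply_intervention(uav, InterventionType.REPLAN_ROUTE,
                       {"new_task": "DELIVER_DEFIBRILLATOR"})
    print(uav)
```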
Finally, we developed a novel collaboration framework called Rescue-AR to support Human-to-Human and UAV-to-UAV collaboration during an emergency response. For humans, Rescue-AR addresses challenges associated with sharing information over the radio, such as the lack of visualization and the transitory nature of the information. For UAVs, Rescue-AR leverages a shared augmented reality (AR) world space to establish collaboration between multiple UAVs. We utilized the paradigm of location-based AR to geo-spatially tag scene information on aerial video streams in real time. Rescue-AR builds upon previous designs in the literature to provide a pragmatic solution for alleviating the social and organizational challenges of communication and collaboration during a UAV-driven emergency response.
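As an illustration of the geo-spatial tagging idea, the sketch below projects a tagged GPS location onto the video frame of a UAV with a straight-down (nadir) camera. The pinhole-camera model, the fixed intrinsics, and the assumption that the image axes align with east and south are simplifications for this example and do not describe Rescue-AR's actual pipeline.

```python
import math

EARTH_RADIUS_M = 6_378_137.0  # WGS-84 equatorial radius

def geodetic_to_local_enu(lat, lon, ref_lat, ref_lon):
    """Approximate east/north offsets (meters) of a point from a reference
    position, using an equirectangular projection valid over short distances."""
    d_lat = math.radians(lat - ref_lat)
    d_lon = math.radians(lon - ref_lon)
    east = d_lon * EARTH_RADIUS_M * math.cos(math.radians(ref_lat))
    north = d_lat * EARTH_RADIUS_M
    return east, north

def project_to_frame(tag_lat, tag_lon, uav_lat, uav_lon, altitude_m,
                     fx=1000.0, fy=1000.0, cx=960.0, cy=540.0):
    """Project a geo-tagged ground point into pixel coordinates of a
    nadir-pointing camera using a simple pinhole model (assumed intrinsics).
    Returns None if the UAV has no positive altitude above the point."""
    east, north = geodetic_to_local_enu(tag_lat, tag_lon, uav_lat, uav_lon)
    if altitude_m <= 0:
        return None
    u = cx + fx * (east / altitude_m)    # image x assumed to grow eastward
    v = cy - fy * (north / altitude_m)   # image y assumed to grow southward
    return u, v

if __name__ == "__main__":
    # Hypothetical victim location tagged by a responder, rendered on the
    # aerial video stream of a UAV hovering 50 m above a nearby point.
    print(project_to_frame(41.7005, -86.2385, 41.7000, -86.2390, 50.0))
```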
In essence, this dissertation explores human-centered design solutions for achieving meaningful human-UAV partnerships.
History
Date Modified
- 2022-08-06
Defense Date
- 2022-06-24
CIP Code
- 40.0501
Research Director(s)
- Jane Cleland-Huang
Degree
- Doctor of Philosophy
Degree Level
- Doctoral Dissertation
Language
- English
Alternate Identifier
- 1338306814
Library Record
- 6264115
OCLC Number
- 1338306814
Program Name
- Computer Science and Engineering