TagTeam: Towards Wearable-Assisted, Implicit Guidance for Human--Drone Teams
About
The availability of sensor-rich smart wearables and tiny yet capable unmanned vehicles, such as nano quadcopters, opens up opportunities for a novel class of highly interactive, attention-shared human--machine teams. Reliable, lightweight, yet passive exchange of intent, data, and inferences within such human--machine teams makes them suitable for scenarios such as search-and-rescue, with significantly improved performance in terms of speed, accuracy, and semantic awareness. In this paper, we articulate a vision for such human--drone teams and the key technical capabilities they must encompass. We present TagTeam, an early prototype of such a team, and share a promising demonstration of a key capability (i.e., motion awareness).
Kasthuri Jayarajah, Aryya Gangopadhyay, Nicholas Waytowich• 2022
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Keypoint Detection | ShapeNetCore V2 (test) | DAS | 87 | 56 |
| 3D Keypoint Detection | ClothesNet Normal Placement Fold Clothes | DAS | 53.6 | 8 |
| 3D Keypoint Detection | ClothesNet SE(3) Transformation Fold Clothes | DAS | 51.9 | 8 |
| Keypoint Detection | KeypointNet | mIoU (Airplane) | 82.7 | 6 |
| 3D Keypoint Detection | ClothesNet Drop Clothes | DAS (Hat) | 55.7 | 4 |
| 3D Keypoint Detection | ClothesNet Drag Clothes | DAS (Hat) | 0.427 | 4 |