Archibald, B., Calder, M., Sevegnani, M. and Xu, M. (2021) Observable and Attention-Directing BDI Agents for Human-Autonomy Teaming. In: Third Workshop on Formal Methods for Autonomous Systems (FMAS 2021), 21-22 Oct 2021, pp. 167-175. (doi: 10.4204/EPTCS.348.12)
Text: 253955.pdf - Accepted Version (248kB)
Publisher's URL: http://eptcs.web.cse.unsw.edu.au/paper.cgi?FMAS2021.12
Abstract
Human-autonomy teaming (HAT) scenarios feature humans and autonomous agents collaborating to meet a shared goal. For effective collaboration, the agents must be transparent and able to share important information about their operation with human teammates. We address the challenge of transparency for Belief-Desire-Intention agents defined in the Conceptual Agent Notation (CAN) language. We extend the semantics to model agents that are observable (i.e. the internal state of tasks is available), and attention-directing (i.e. specific states can be flagged to users), and provide an executable semantics via an encoding in Milner's bigraphs. Using an example of unmanned aerial vehicles, the BigraphER tool, and PRISM, we show and verify how the extensions work in practice.
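The following is an illustrative sketch, not taken from the paper: a toy BDI-style agent in Python whose task states are observable to teammates and which can flag states for human attention, roughly in the spirit of the observability and attention-directing extensions described in the abstract. All names (`ObservableAgent`, `Task`, `flag_attention`) are hypothetical and do not reflect the CAN semantics or the bigraph encoding used in the paper.

```python
# Hypothetical illustration of observable, attention-directing agent state;
# not the paper's CAN/bigraph formalisation.

from dataclasses import dataclass, field
from enum import Enum
from typing import Callable, Dict, List


class TaskState(Enum):
    PENDING = "pending"
    ACTIVE = "active"
    SUCCEEDED = "succeeded"
    FAILED = "failed"


@dataclass
class Task:
    name: str
    action: Callable[[], bool]          # returns True on success
    state: TaskState = TaskState.PENDING


@dataclass
class ObservableAgent:
    tasks: List[Task] = field(default_factory=list)
    attention_flags: List[str] = field(default_factory=list)

    def observe(self) -> Dict[str, str]:
        """Expose the internal state of every task (observability)."""
        return {t.name: t.state.value for t in self.tasks}

    def flag_attention(self, message: str) -> None:
        """Mark a state for the human teammate's attention."""
        self.attention_flags.append(message)

    def step(self) -> None:
        """Execute the next pending task, updating its observable state."""
        for task in self.tasks:
            if task.state is TaskState.PENDING:
                task.state = TaskState.ACTIVE
                ok = task.action()
                task.state = TaskState.SUCCEEDED if ok else TaskState.FAILED
                if not ok:
                    self.flag_attention(f"task '{task.name}' failed")
                return


if __name__ == "__main__":
    agent = ObservableAgent(tasks=[
        Task("survey-area", lambda: True),
        Task("return-to-base", lambda: False),   # simulated failure
    ])
    agent.step()
    agent.step()
    print(agent.observe())          # task states visible to the whole team
    print(agent.attention_flags)    # e.g. ["task 'return-to-base' failed"]
```

In this toy version, `observe()` plays the role of exposing internal task state and `flag_attention()` that of directing the human's attention; in the paper these capabilities are given an executable semantics via BigraphER and verified with PRISM on a UAV example.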
| Item Type: | Conference Proceedings |
| --- | --- |
| Status: | Published |
| Refereed: | Yes |
| Glasgow Author(s) Enlighten ID: | Archibald, Dr Blair and Calder, Professor Muffy and Sevegnani, Dr Michele and Xu, Dr Mengwei |
| Authors: | Archibald, B., Calder, M., Sevegnani, M. and Xu, M. |
| College/School: | College of Science and Engineering > School of Computing Science |
| Research Group: | FATA |
| ISSN: | 2075-2180 |
| Copyright Holders: | Copyright © 2021 The Authors |
| First Published: | First published in Electronic Proceedings in Theoretical Computer Science 348:167-175 |
| Publisher Policy: | Reproduced in accordance with the publisher copyright policy |
| Related URLs: | |