Intelligent machines have the potential to transform the world of defence, from unmanned vehicles and aircraft to AI assistants. With this rise in technology comes an increased role for human-machine teaming (HMT) – where humans and autonomous systems work together towards a shared goal.
One of the challenges for human factors specialists will be to integrate HMT into already complex military systems and to optimise the team’s performance. How this could work and the considerations it involves – in particular, how to quantify the functional state of the human personnel in these systems – have been examined by Kate Ewing and Clare Borras from the Defence Science and Technology Laboratory (Dstl).
They emphasised the importance of being able to predict people’s ability to perform, and to trust and interact with, the AI elements of their team. This involves deciding which factors to monitor and measure. They also raised the question of whether human-centred design will need to happen alongside ‘agent-centred design’ to take AI agents into account.
They also highlighted ethical concerns surrounding how we monitor humans, including the use of implanted sensors and AI algorithms that can infer private mental states.
Kate presented the research at an online session of our Ergonomics & Human Factors 2022 conference earlier this month.