ETRI-Knowledge Sharing Platform

Transformer-Based MUM-T Situational Awareness Technology: Agent State Prediction
Authors
백재욱, 전성우, 김광용, 이창은
Issue Date
2023-11
Citation
로봇학회 논문지 (The Journal of Korea Robotics Society), v.18, no.4, pp.436-443
ISSN
1975-6291
Publisher
한국로봇학회 (Korea Robotics Society)
Language
Korean
Type
Journal Article
DOI
https://dx.doi.org/10.7746/jkros.2023.18.4.436
Abstract
With the advancement of robot intelligence, the concept of manned-unmanned teaming (MUM-T) has garnered considerable attention in military research. In this paper, we present a transformer-based architecture for predicting the health status of agents, using a multi-head attention mechanism to effectively capture the dynamic interactions between friendly and enemy forces. To this end, we first introduce a framework for generating a dataset of battlefield situations. These situations are simulated on a virtual simulator, allowing a wide range of scenarios without any restrictions on the number of agents, their missions, or their actions. We then define the elements crucial for characterizing the battlefield, with a specific emphasis on agent status. The battlefield data is fed into the transformer architecture, with classification heads on top of the transformer encoder layers to categorize the health status of each agent. We conduct ablation tests to assess the significance of various factors in determining agents' health status in battlefield scenarios. We perform 3-fold cross-validation, and the experimental results demonstrate that our model achieves a prediction accuracy of over 98%. In addition, the performance of our model is compared with that of other models such as a convolutional neural network (CNN) and a multilayer perceptron (MLP), and the results establish the superiority of our model.
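The abstract describes a transformer encoder with multi-head self-attention over agents and classification heads that predict each agent's health status. The paper's implementation is not provided on this record page, so the following is only a minimal PyTorch sketch of that kind of architecture; the class name, feature dimensions, number of classes, and hyperparameters are illustrative assumptions, not the authors' settings.

```python
# Minimal sketch (not the authors' code): a transformer-encoder classifier for
# per-agent health-status prediction, assuming each battlefield snapshot is a
# sequence of per-agent feature vectors. All hyperparameters are illustrative.
import torch
import torch.nn as nn


class AgentHealthTransformer(nn.Module):
    def __init__(self, feature_dim=32, d_model=128, n_heads=8,
                 n_layers=4, n_classes=3, max_agents=64):
        super().__init__()
        # Project raw per-agent features into the model dimension.
        self.input_proj = nn.Linear(feature_dim, d_model)
        # Learned embedding over agent slots (assumed; the paper may encode
        # agent identity differently).
        self.pos_emb = nn.Parameter(torch.zeros(1, max_agents, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, dim_feedforward=4 * d_model,
            batch_first=True)
        # Multi-head self-attention lets each agent's representation attend to
        # both friendly and enemy agents in the same situation.
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Classification head on top of the encoder output: one health-status
        # prediction per agent.
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, agent_feats, padding_mask=None):
        # agent_feats: (batch, n_agents, feature_dim)
        x = self.input_proj(agent_feats) + self.pos_emb[:, :agent_feats.size(1)]
        x = self.encoder(x, src_key_padding_mask=padding_mask)
        return self.head(x)  # (batch, n_agents, n_classes) logits


# Example: 16 simulated situations, 20 agents each, 32 features per agent.
model = AgentHealthTransformer()
logits = model(torch.randn(16, 20, 32))
pred = logits.argmax(dim=-1)  # predicted health class per agent
```

Such a model would be trained with a standard cross-entropy loss over the per-agent logits and evaluated with k-fold cross-validation, as the abstract reports for the 3-fold setting.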
KSP Keywords
Attention mechanism, Convolutional neural network (CNN), Dynamic interaction, Multi-head, Number of agents, Prediction accuracy, Robot intelligence, Virtual simulator, Wide range, Health status, Multilayer perceptron