TY - GEN
T1 - Using Self-Attention LSTMs to Enhance Observations in Goal Recognition
AU - Amado, Leonardo
AU - Paludo Licks, Gabriel
AU - Marcon, Matheus
AU - Fraga Pereira, Ramon
AU - Meneguzzi, Felipe
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/9/28
Y1 - 2020/9/28
N2 - Goal recognition is the task of identifying the goal an observed agent is pursuing. The quality of its results depends on the quality of the observed information, and in most goal recognition approaches accuracy decreases significantly in settings with missing observations. To mitigate this issue, we develop a learning model based on LSTMs, leveraging attention mechanisms, that enhances observation traces by predicting missing observations in goal recognition problems. We experiment with a dataset of goal recognition problems, applying the model to enhance observation traces that have missing observations. We evaluate the technique with a state-of-the-art goal recognizer in four domains, comparing accuracy between the standard and the enhanced observation traces. Experimental evaluation shows that recurrent neural networks with self-attention mechanisms improve the accuracy of state-of-the-art goal recognition techniques by an average of 60%.
AB - Goal recognition is the task of identifying the goal an observed agent is pursuing. The quality of its results depends on the quality of the observed information, and in most goal recognition approaches accuracy decreases significantly in settings with missing observations. To mitigate this issue, we develop a learning model based on LSTMs, leveraging attention mechanisms, that enhances observation traces by predicting missing observations in goal recognition problems. We experiment with a dataset of goal recognition problems, applying the model to enhance observation traces that have missing observations. We evaluate the technique with a state-of-the-art goal recognizer in four domains, comparing accuracy between the standard and the enhanced observation traces. Experimental evaluation shows that recurrent neural networks with self-attention mechanisms improve the accuracy of state-of-the-art goal recognition techniques by an average of 60%.
UR - http://www.scopus.com/inward/record.url?scp=85093848189&partnerID=8YFLogxK
U2 - 10.1109/IJCNN48605.2020.9207597
DO - 10.1109/IJCNN48605.2020.9207597
M3 - Published conference contribution
AN - SCOPUS:85093848189
T3 - Proceedings of the International Joint Conference on Neural Networks
SP - 1
EP - 8
BT - 2020 International Joint Conference on Neural Networks, IJCNN 2020 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2020 International Joint Conference on Neural Networks, IJCNN 2020
Y2 - 19 July 2020 through 24 July 2020
ER -