Classifying motion states of AUV based on graph representation for multivariate time series

Chen Feng* (Corresponding Author), Shuang Gao, Simin Chen, Zhongke Gao, Celso Grebogi


Research output: Contribution to journal › Article › peer-review

Abstract

Motion state monitoring and recognition are important issues that must be addressed to improve the reliability of Autonomous Underwater Vehicles (AUVs). In this work, we cast motion state classification as Multivariate Time Series Classification (MTSC). By combining two kinds of MTSC methods, those based on feature representation transformation and those based on Deep Neural Networks (DNNs), we propose a new classification method for Multivariate Time Series (MTS). Multivariate monitoring data from the AUV are fused to construct complex networks, whose graphs represent the motion states. A Graph Convolutional Neural Network (GCNN) is then used to extract features from these graphs and classify them. The effectiveness of our method is validated through sea experiments, with data drawn from three classes of navigational motion: near the surface, at fixed depth, and at fixed depth under the influence of unknown ocean currents. The experimental results show that the graph representation based on complex networks effectively describes the motion states. Compared with a Support Vector Machine (SVM), the GCNN extracts graph features automatically and achieves higher classification accuracy on the motion states. The experiments also show that the classification accuracy of our method exceeds that of two other DNNs.
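The abstract does not specify how the multivariate monitoring data are fused into a complex network. A common way to turn an MTS window into a graph, sketched below purely as an illustration (the thresholded-correlation rule, the window shape, and the `correlation_graph` helper are assumptions, not the paper's method), is to treat each monitored channel as a node and connect channels whose absolute pairwise correlation exceeds a threshold:

```python
import numpy as np

def correlation_graph(window, threshold=0.5):
    """Build an adjacency matrix from one window of multivariate
    monitoring data (shape: channels x samples).

    NOTE: hypothetical construction rule; the paper does not detail
    how its complex networks are built from the AUV data."""
    corr = np.corrcoef(window)                    # pairwise channel correlations
    adj = (np.abs(corr) >= threshold).astype(int) # keep strong links only
    np.fill_diagonal(adj, 0)                      # no self-loops
    return adj

# Toy example: 4 sensor channels, 100 samples each.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 100))
x[1] = x[0] + 0.1 * rng.standard_normal(100)      # make channels 0 and 1 correlated
adj = correlation_graph(x)
```

The resulting adjacency matrix (together with per-node features) would then be the input a GCNN classifies, one graph per labelled motion-state window.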
Original language: English
Article number: 113539
Number of pages: 10
Journal: Ocean Engineering
Volume: 268
Early online date: 28 Dec 2022
DOIs
Publication status: Published - 15 Jan 2023

Keywords

  • AUV
  • MTSC
  • Complex network
  • GCNN

