Domain-adapted driving scene understanding with uncertainty-aware and diversified generative adversarial networks

Yining Hua, Jie Sui, Hui Fang, Chuan Hu, Dewei Yi* (Corresponding Author)

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Autonomous vehicles are required to operate in uncertain environments. Recent advances in computational intelligence make it possible to understand driving scenes in varied environments by using a semantic segmentation neural network, which assigns a class label to each pixel. Optimising such a network requires massive amounts of pixel-level labelled data, yet collecting sufficient data and labels in the real world is challenging. An alternative is to obtain synthetic, densely labelled pixel-level data from a driving simulator. Although synthetic data is a promising way to alleviate the labelling problem, models trained on virtual data do not generalise well to real data because of domain shift. To fill this gap, the authors propose a novel uncertainty-aware generative ensemble method. In particular, ensemble members are obtained from different optimisation objectives, training iterations, and network initialisations, so that they complement each other and produce reliable predictions. Moreover, an uncertainty-aware ensemble scheme is developed to derive a fused prediction by accounting for the uncertainty of each ensemble member. This design exploits the strengths of the ensemble to enhance adapted segmentation performance. Experimental results demonstrate the effectiveness of the method on three large-scale datasets.
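The uncertainty-aware fusion step described in the abstract can be sketched as follows. This is only a minimal illustration of the general idea, assuming entropy-based per-pixel uncertainty and inverse-entropy weighting; it is not the paper's exact scheme, and the function names (`entropy`, `uncertainty_weighted_fusion`) are hypothetical.

```python
import numpy as np

def entropy(p, eps=1e-12):
    # Per-pixel predictive entropy of a softmax map p with shape (C, H, W).
    return -np.sum(p * np.log(p + eps), axis=0)  # (H, W)

def uncertainty_weighted_fusion(prob_maps):
    # prob_maps: list of softmax outputs, each (C, H, W), one per ensemble member.
    # Weight each member per pixel by the inverse of (1 + entropy), so more
    # certain members contribute more, then renormalise the fused distribution.
    weights = np.stack([1.0 / (1.0 + entropy(p)) for p in prob_maps])  # (M, H, W)
    weights /= weights.sum(axis=0, keepdims=True)
    fused = sum(w[None] * p for w, p in zip(weights, prob_maps))  # (C, H, W)
    return fused / fused.sum(axis=0, keepdims=True)

# Toy example: three members disagreeing on a 2-class, 1x1 "image".
members = [
    np.array([[[0.9]], [[0.1]]]),    # confident member
    np.array([[[0.55]], [[0.45]]]),  # uncertain member, down-weighted
    np.array([[[0.8]], [[0.2]]]),
]
fused = uncertainty_weighted_fusion(members)
label = fused.argmax(axis=0)  # per-pixel class prediction
```

In this sketch, the low-entropy (confident) members dominate the fused prediction at each pixel, which mirrors the abstract's goal of letting reliable ensemble members drive the adapted segmentation output.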

Original language: English
Number of pages: 12
Journal: CAAI Transactions on Intelligence Technology
Early online date: 8 Jul 2023
Publication status: E-pub ahead of print - 8 Jul 2023

Bibliographical note

Funding Information:
This work was supported by Fisheries Innovation & Sustainability (FIS) and the U.K. Department for Environment, Food & Rural Affairs (DEFRA) under grant numbers FIS039 and FIS045A.

Data Availability Statement

The data that support the findings of this study are openly available in the following repositories: the GTA5 dataset at https://download.visinf.tu-darmstadt.de/data/from_games/; the SYNTHIA dataset at http://synthia-dataset.net/downloads/; the Cityscapes dataset at https://www.cityscapes-dataset.com/.

Keywords

  • adaptive intelligent systems
  • autonomous vehicles
  • computer vision
  • measurement uncertainty
  • neural network
  • object segmentation
