Mapping field photographs to textured surface meshes directly on mobile devices

Christian Kehl*, Simon J. Buckley, Sophie Viseur, Robert L. Gawthorpe, James R. Mullins, John A. Howell

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

The mapping of photographs to surface geometry is an important procedure for many applications within the geosciences. This paper proposes an interactive framework for feature-based image-to-geometry mapping that works directly on mobile devices, under challenging imaging conditions and with limited available hardware performance. The framework makes use of openly available digital elevation models (DEMs) together with mobile position-and-orientation sensor data. It integrates calculation heuristics for result evaluation and feedback, synthesising available knowledge in current registration literature. The approach is assessed on two image datasets captured on separate occasions. Their interpretations are mapped to one textured lidar surface model and the projection accuracy is qualitatively assessed. The experiments show a significant accuracy improvement in photograph registration results, as well as the faithful mapping of image interpretations on the underlying surface geometry. This semi-automatic, user-guided, interactive approach is superior to comparable fully automatic registration methods.
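The image-to-geometry mapping described in the abstract can be illustrated in miniature: a 2D image point is back-projected through a pinhole camera model into a viewing ray, and that ray is intersected with a mesh triangle to find the corresponding surface point. The sketch below is illustrative only and assumes a simple pinhole camera and the standard Möller–Trumbore intersection test; it is not the paper's implementation, and all function names and parameter values are hypothetical.

```python
# Illustrative sketch (not the paper's code): map an image pixel to a
# point on a surface mesh by casting a camera ray and intersecting it
# with one triangle (Moller-Trumbore algorithm).

def _sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def _dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def pixel_ray(u, v, fx, fy, cx, cy):
    """Back-project pixel (u, v) into a viewing-ray direction in camera
    coordinates, assuming a pinhole model with focal lengths (fx, fy)
    and principal point (cx, cy); the camera looks along +z."""
    return ((u - cx) / fx, (v - cy) / fy, 1.0)

def intersect_triangle(origin, direction, v0, v1, v2, eps=1e-9):
    """Return the 3D point where the ray hits triangle (v0, v1, v2),
    or None if the ray misses it (Moller-Trumbore test)."""
    edge1, edge2 = _sub(v1, v0), _sub(v2, v0)
    h = _cross(direction, edge2)
    a = _dot(edge1, h)
    if abs(a) < eps:          # ray parallel to triangle plane
        return None
    f = 1.0 / a
    s = _sub(origin, v0)
    u = f * _dot(s, h)
    if u < 0.0 or u > 1.0:    # outside first barycentric bound
        return None
    q = _cross(s, edge1)
    v = f * _dot(direction, q)
    if v < 0.0 or u + v > 1.0:  # outside second barycentric bound
        return None
    t = f * _dot(edge2, q)
    if t <= eps:              # intersection behind the camera
        return None
    return tuple(origin[i] + t * direction[i] for i in range(3))
```

For a full mesh, the same test is run per triangle (usually accelerated with a spatial index), and the mobile position-and-orientation sensors supply the initial camera pose that the feature-based registration then refines.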

Original language: English
Pages (from-to): 398-423
Number of pages: 26
Journal: Photogrammetric Record
Volume: 32
Issue number: 160
Early online date: 4 Nov 2017
DOIs
Publication status: Published - Dec 2017

Bibliographical note

Funded by the Research Council of Norway. Grant number: 234111/E30

Keywords

  • Feature-based registration
  • Geological interpretations
  • Image-to-geometry
  • Interactive framework
  • Mobile-device application
  • Outdoor environments
