Direct Image-to-Geometry Registration Using Mobile Sensor Data

C. Kehl*, S. J. Buckley, R. L. Gawthorpe, I. Viola, J. A. Howell

*Corresponding author for this work

Research output: Contribution to journal › Conference article

4 Citations (Scopus)
4 Downloads (Pure)

Abstract

Adding supplementary texture and 2D image-based annotations to 3D surface models is a useful next step for domain specialists to make use of photorealistic products of laser scanning and photogrammetry. This requires a registration between the new camera imagery and the model geometry to be solved, which can be a time-consuming task without appropriate automation. The increasing availability of photorealistic models, coupled with the proliferation of mobile devices, gives users the possibility to complement their models in real time. Modern mobile devices deliver digital photographs of increasing quality, as well as on-board sensor data, which can be used as input for practical and automatic camera registration procedures. Their familiar user interface also improves manual registration procedures. This paper introduces a fully automatic pose estimation method that uses the on-board sensor data for initial exterior orientation, and feature matching between an acquired photograph and a synthesised rendering of the orientated 3D scene as input for fine alignment. The paper also introduces a user-friendly manual camera registration and pose estimation interface for mobile devices, based on existing surface geometry and numerical optimisation methods. The article further assesses the automatic algorithm's accuracy compared to traditional methods, and the impact of computational and environmental parameters. Experiments using urban and geological case studies show a significant sensitivity of the automatic procedure to the quality of the initial mobile sensor values. Changing natural lighting conditions remain a challenge for automatic pose estimation techniques, although progress is presented here. Finally, the automatically registered mobile images are used as the basis for adding user annotations to the input textured model.
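
To make the two-stage idea above concrete, the sketch below shows how such a pipeline could be assembled: device orientation readings seed an initial exterior orientation, and feature matches between the photograph and a synthetic rendering of the oriented model (together with the rendering's depth buffer) feed a robust PnP solve for the refined pose. This is a minimal illustration in Python using NumPy and OpenCV under a pinhole-camera assumption; the rendering step, the sensor-to-camera axis conventions and all parameter values are assumptions made for illustration, not the implementation described in the paper.

import numpy as np
import cv2

def rotation_from_device_angles(yaw, pitch, roll):
    # Compose a world-to-camera rotation from device orientation angles
    # (radians). Sensor-to-camera axis conventions differ between platforms,
    # so this particular mapping is an assumption that would need checking
    # against the device's sensor frame in practice.
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    return Rz @ Ry @ Rx

def refine_pose(photo_gray, render_gray, render_depth, K, R_init, t_init):
    # Refine an initial (sensor-derived) pose by matching the photograph
    # against a synthetic rendering of the oriented model and solving a
    # robust PnP problem on the resulting 2D-3D correspondences.
    # render_depth holds camera-space depth per pixel of the synthetic view,
    # rendered with intrinsics K and the initial pose, i.e.
    # x_cam = R_init @ X_world + t_init.

    # 1. Detect and match features between the photograph and the rendering.
    orb = cv2.ORB_create(nfeatures=4000)
    kp_p, des_p = orb.detectAndCompute(photo_gray, None)
    kp_r, des_r = orb.detectAndCompute(render_gray, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_p, des_r)

    # 2. Back-project matched rendering keypoints to 3D via the depth buffer.
    K_inv = np.linalg.inv(K)
    obj_pts, img_pts = [], []
    for m in matches:
        u, v = kp_r[m.trainIdx].pt
        z = render_depth[int(round(v)), int(round(u))]
        if z <= 0:  # pixel shows background; no geometry to back-project
            continue
        x_cam = z * (K_inv @ np.array([u, v, 1.0]))   # camera coordinates
        obj_pts.append(R_init.T @ (x_cam - t_init))   # world coordinates
        img_pts.append(kp_p[m.queryIdx].pt)

    if len(obj_pts) < 6:  # too few correspondences; keep the sensor pose
        return R_init, t_init

    # 3. Robust pose estimation from the 2D-3D correspondences.
    ok, rvec, tvec, _ = cv2.solvePnPRansac(
        np.asarray(obj_pts, dtype=np.float64),
        np.asarray(img_pts, dtype=np.float64),
        K, None, reprojectionError=4.0)
    if not ok:
        return R_init, t_init
    R_refined, _ = cv2.Rodrigues(rvec)
    return R_refined, tvec.ravel()

The sensitivity reported in the experiments maps onto the quality of the initial pose in this kind of pipeline: if the sensor-derived orientation is poor, the synthetic rendering differs too much from the photograph for the feature matching to find reliable correspondences, and changing natural lighting further reduces match quality.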

Original language: English
Pages (from-to): 121-128
Number of pages: 8
Journal: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences
Volume: 3
DOI: 10.5194/isprs-annals-III-2-121-2016
Publication status: Published - 2 Jun 2016
Event: 23rd International Society for Photogrammetry and Remote Sensing Congress, ISPRS 2016 - Prague, Czech Republic
Duration: 12 Jul 2016 - 19 Jul 2016

Keywords

  • Automatic Pose Estimation
  • Image-to-Geometry
  • Mobile Devices
  • Registration Interfaces
  • Virtual Outcrop Geology

ASJC Scopus subject areas

  • Earth and Planetary Sciences (miscellaneous)
  • Environmental Science (miscellaneous)
  • Instrumentation

Cite this

Direct Image-to-Geometry Registration Using Mobile Sensor Data. / Kehl, C.; Buckley, S. J.; Gawthorpe, R. L.; Viola, I.; Howell, J. A.

In: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Vol. 3, 02.06.2016, p. 121-128.

Research output: Contribution to journal › Conference article

@article{48455ddb050440cea5f5d27b0a50041c,
title = "Direct Image-to-Geometry Registration Using Mobile Sensor Data",
abstract = "Adding supplementary texture and 2D image-based annotations to 3D surface models is a useful next step for domain specialists to make use of photorealistic products of laser scanning and photogrammetry. This requires a registration between the new camera imagery and the model geometry to be solved, which can be a time-consuming task without appropriate automation. The increasing availability of photorealistic models, coupled with the proliferation of mobile devices, gives users the possibility to complement their models in real time. Modern mobile devices deliver digital photographs of increasing quality, as well as on-board sensor data, which can be used as input for practical and automatic camera registration procedures. Their familiar user interface also improves manual registration procedures. This paper introduces a fully automatic pose estimation method using the on-board sensor data for initial exterior orientation, and feature matching between an acquired photograph and a synthesised rendering of the orientated 3D scene as input for fine alignment. The paper also introduces a user-friendly manual camera registration- and pose estimation interface for mobile devices, based on existing surface geometry and numerical optimisation methods. The article further assesses the automatic algorithm's accuracy compared to traditional methods, and the impact of computational- and environmental parameters. Experiments using urban and geological case studies show a significant sensitivity of the automatic procedure to the quality of the initial mobile sensor values. Changing natural lighting conditions remain a challenge for automatic pose estimation techniques, although progress is presented here. Finally, the automatically-registered mobile images are used as the basis for adding user annotations to the input textured model.",
keywords = "Automatic Pose Estimation, Image-to-Geometry, Mobile Devices, Registration Interfaces, Virtual Outcrop Geology",
author = "C. Kehl and Buckley, {S. J.} and Gawthorpe, {R. L.} and I. Viola and Howell, {J. A.}",
year = "2016",
month = "6",
day = "2",
doi = "10.5194/isprs-annals-III-2-121-2016",
language = "English",
volume = "3",
pages = "121--128",
journal = "ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences",
issn = "2194-9042",

}

TY - JOUR

T1 - Direct Image-to-Geometry Registration Using Mobile Sensor Data

AU - Kehl, C.

AU - Buckley, S. J.

AU - Gawthorpe, R. L.

AU - Viola, I.

AU - Howell, J. A.

PY - 2016/6/2

Y1 - 2016/6/2

KW - Automatic Pose Estimation

KW - Image-to-Geometry

KW - Mobile Devices

KW - Registration Interfaces

KW - Virtual Outcrop Geology

UR - http://www.scopus.com/inward/record.url?scp=85021956754&partnerID=8YFLogxK

U2 - 10.5194/isprs-annals-III-2-121-2016

DO - 10.5194/isprs-annals-III-2-121-2016

M3 - Conference article

VL - 3

SP - 121

EP - 128

JO - ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences

JF - ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences

SN - 2194-9042

ER -