Improving content-based image retrieval by identifying least and most correlated visual words

Leszek Kaliciak*, Dawei Song, Nirmalie Wiratunga, Jeff Pan

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

Abstract

In this paper, we propose a model for directly incorporating image content into a (short-term) user profile, based on correlations between visual words and adaptation of the similarity measure. The relationships between visual words are explored at different contextual levels. We introduce and compare various notions of correlation, which we broadly categorize as image-level and proximity-based. The information about the most and the least correlated visual words can then be exploited to adapt the similarity measure. The evaluation, which precedes a planned experiment with real users (future work), is performed within the Pseudo Relevance Feedback framework. We test our new method on three large data collections, namely MIRFlickr, ImageCLEF, and a collection from the British Geological Survey (BGS). The proposed model is computationally cheap and scalable to large image collections.
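The abstract's core idea, computing image-level correlations between visual words from a bag-of-visual-words representation and using the most correlated words to adapt a similarity measure, can be sketched roughly as follows. This is an illustrative reconstruction only, not the authors' method: the function names, the binary co-occurrence correlation, and the weight-boosting scheme are all assumptions for the sake of the example.

```python
import numpy as np

def image_level_correlation(counts):
    """Correlation between visual words at the image level (illustrative).

    counts: (n_images, n_words) bag-of-visual-words histogram matrix.
    Returns an (n_words, n_words) matrix of Pearson-style correlations
    between binary word-occurrence patterns across the collection.
    """
    occ = (counts > 0).astype(float)          # binary occurrence per image
    occ -= occ.mean(axis=0)                   # centre each word column
    cov = occ.T @ occ / max(len(occ) - 1, 1)  # sample covariance
    std = np.sqrt(np.diag(cov))
    denom = np.outer(std, std)
    denom[denom == 0] = 1.0                   # guard words that never vary
    return cov / denom

def adapted_similarity(query, doc, corr, top_k=2, boost=0.5):
    """Cosine similarity where words most correlated with the query's
    dominant visual words get boosted weights -- a crude stand-in for
    the paper's similarity-measure adaptation."""
    weights = np.ones_like(query, dtype=float)
    dominant = np.argsort(query)[::-1][:top_k]
    for w in dominant:
        # most correlated partners, skipping the word itself (index 0
        # after the descending sort, since corr[w, w] is maximal)
        partners = np.argsort(corr[w])[::-1][1:top_k + 1]
        weights[partners] += boost
    q, d = query * weights, doc * weights
    norm = np.linalg.norm(q) * np.linalg.norm(d)
    return float(q @ d / norm) if norm else 0.0
```

The same correlation matrix could equally be thresholded from the other end to down-weight the least correlated (likely noisy) words, which is the complementary use the abstract mentions.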

Original language: English
Title of host publication: Information Retrieval Technology - 8th Asia Information Retrieval Societies Conference, AIRS 2012, Proceedings
Pages: 316-325
Number of pages: 10
DOIs: https://doi.org/10.1007/978-3-642-35341-3_27
Publication status: Published - 2012
Event: 8th Asia Information Retrieval Societies Conference, AIRS 2012 - Tianjin, China
Duration: 17 Dec 2012 - 19 Dec 2012

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 7675 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 8th Asia Information Retrieval Societies Conference, AIRS 2012
Country: China
City: Tianjin
Period: 17/12/12 - 19/12/12

Keywords

  • Content-based image retrieval and representation
  • Correlation
  • Local features
  • Pseudo relevance feedback
  • Similarity measure


Cite this

    Kaliciak, L., Song, D., Wiratunga, N., & Pan, J. (2012). Improving content-based image retrieval by identifying least and most correlated visual words. In Information Retrieval Technology - 8th Asia Information Retrieval Societies Conference, AIRS 2012, Proceedings (pp. 316-325). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 7675 LNCS). https://doi.org/10.1007/978-3-642-35341-3_27