A co-occurrence region based Bayesian network stepwise remote sensing image retrieval algorithm


Saved in:
Bibliographic Details
Main Authors: Rui Zeng, Yingyan Wang, Wanliang Wang
Format: Article
Language:English
Published: Universidad Nacional de Colombia 2018-01-01
Series:Earth Sciences Research Journal
Subjects:
Online Access:https://revistas.unal.edu.co/index.php/esrj/article/view/66107
Description
Summary:Although scholars have conducted extensive research on content-based image retrieval and achieved notable results, progress on remote sensing image retrieval has been limited, and both its theory and application systems remain immature. Because remote sensing images are characterized by large data volumes, broad coverage, vague themes, and rich semantics, research results on natural and medical images cannot be applied directly to remote sensing image retrieval. Even a well-designed content-based remote sensing image retrieval system faces many difficulties in data organization, storage, and management; feature description and extraction; similarity measurement; relevance feedback; network service modes; and system structure design and implementation. This paper proposes a remote sensing image retrieval algorithm that combines co-occurrence region based Bayesian network image retrieval with average high-frequency signal strength. Using Bayesian networks, it establishes correspondences between images and semantics, thereby enabling semantic-based retrieval of remote sensing images. In addition, integrated region matching is introduced for iterative retrieval, which effectively improves the precision of semantic retrieval.
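The abstract names integrated region matching (IRM) as the component used for iterative refinement. The sketch below is not the paper's algorithm; it is a minimal, generic illustration of the IRM idea, in which each image is a set of region feature vectors with area-based significance weights, and similarity is computed by greedily transferring weight between the closest region pairs. The function name `irm_distance` and the Euclidean region distance are assumptions for illustration only.

```python
import numpy as np

def irm_distance(regions_a, weights_a, regions_b, weights_b):
    """IRM-style distance between two images, each given as an array of
    region feature vectors plus per-region significance weights that sum
    to 1. Smaller return values indicate more similar images."""
    A = np.asarray(regions_a, dtype=float)
    B = np.asarray(regions_b, dtype=float)
    wa = np.asarray(weights_a, dtype=float).copy()
    wb = np.asarray(weights_b, dtype=float).copy()
    # pairwise Euclidean distances between all region feature pairs
    d = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
    # visit region pairs in order of increasing distance, transferring
    # as much significance weight as both regions still have available
    flat_order = np.argsort(d, axis=None)
    rows, cols = np.unravel_index(flat_order, d.shape)
    total = 0.0
    for i, j in zip(rows, cols):
        if wa[i] <= 0.0 or wb[j] <= 0.0:
            continue
        s = min(wa[i], wb[j])  # weight matched between this region pair
        total += s * d[i, j]
        wa[i] -= s
        wb[j] -= s
    return total
```

In an iterative retrieval loop of the kind the abstract describes, a distance like this could be recomputed against a semantically filtered candidate set at each round, so region-level matching refines the ranking produced by the Bayesian semantic stage.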
ISSN:1794-6190
2339-3459