Spatial feature recognition and layout method based on improved CenterNet and LSTM frameworks

Bibliographic Details
Main Authors: Yuxuan Gu, Fengyu Liu, Xiaodi Yi, Lewei Yang, Yunshu Wang
Format: Article
Language: English
Published: Electronics and Telecommunications Research Institute (ETRI) 2025-08-01
Series: ETRI Journal
Online Access: https://doi.org/10.4218/etrij.2024-0192
Description
Summary: Existing spatial feature recognition and layout methods rely largely on manual identification of spatial components, which is time-consuming and inefficient, and the constraint relationships between objects in a space can be difficult to observe. This study therefore introduces a spatial feature recognition and layout methodology built on enhanced CenterNet and LSTM (Long Short-Term Memory) frameworks, comprising two major components. First, HCenterNet-based feature recognition enhances feature extraction through an attention mechanism and feature fusion, refining the identification of small targets within complex backgrounds. Second, a GA-BiLSTM (Genetic Algorithm Bidirectional LSTM) spatial layout model uses a bidirectional LSTM network whose parameters are fine-tuned by a genetic algorithm (GA) to yield more accurate spatial layouts. In experiments, the proposed HCenterNet-DIoU model improved recognition performance by 7.44% over the baseline CenterNet model, and the GA-BiLSTM model improved overall layout accuracy by 10.08% over a plain LSTM model. A time-cost analysis further confirmed that the proposed model meets real-time requirements.
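The two components described in the abstract can be illustrated with short, hedged sketches. First, a minimal sketch of attention-guided feature fusion of the kind HCenterNet is described as using: a deep, low-resolution feature map is upsampled and fused with a shallow, high-resolution one, and an SE-style channel-attention block reweights the result to emphasize small targets. This is an illustrative assumption, not the paper's exact architecture; all module names and tensor shapes here are invented.

```python
# Illustrative sketch only -- not the HCenterNet architecture from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (an assumed design)."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels), nn.Sigmoid())

    def forward(self, x):
        w = self.fc(x.mean(dim=(2, 3)))   # (B, C) per-channel weights
        return x * w[:, :, None, None]    # reweight feature channels

class FusionBlock(nn.Module):
    """Upsample the deep map, concatenate with the shallow map, then attend."""
    def __init__(self, c_deep, c_shallow):
        super().__init__()
        self.proj = nn.Conv2d(c_deep + c_shallow, c_shallow, kernel_size=1)
        self.attn = ChannelAttention(c_shallow)

    def forward(self, deep, shallow):
        deep = F.interpolate(deep, size=shallow.shape[-2:], mode="nearest")
        return self.attn(self.proj(torch.cat([deep, shallow], dim=1)))

# Usage with dummy backbone features.
deep = torch.randn(1, 128, 16, 16)    # low-resolution, semantically rich
shallow = torch.randn(1, 64, 32, 32)  # high-resolution, detail-preserving
fused = FusionBlock(128, 64)(deep, shallow)
print(fused.shape)                    # torch.Size([1, 64, 32, 32])
```

Second, a minimal sketch of the GA-BiLSTM idea: a small genetic algorithm searches BiLSTM hyperparameters and keeps the configuration with the lowest validation loss. The choice of genes (hidden size and learning rate), the data, and the fitness function are all placeholders; the abstract does not specify which parameters the GA tunes.

```python
# Illustrative sketch only -- genes, data, and fitness are placeholders.
import random
import torch
import torch.nn as nn

def make_bilstm(input_dim, hidden_dim, out_dim):
    """Bidirectional LSTM followed by a linear head on the last time step."""
    class BiLSTM(nn.Module):
        def __init__(self):
            super().__init__()
            self.lstm = nn.LSTM(input_dim, hidden_dim,
                                batch_first=True, bidirectional=True)
            self.head = nn.Linear(2 * hidden_dim, out_dim)

        def forward(self, x):
            h, _ = self.lstm(x)           # (B, T, 2 * hidden_dim)
            return self.head(h[:, -1])    # prediction from the last step

    return BiLSTM()

def fitness(genes, x_tr, y_tr, x_va, y_va, epochs=5):
    """Briefly train a candidate; lower validation loss means fitter."""
    hidden_dim, lr = genes
    model = make_bilstm(x_tr.shape[-1], hidden_dim, y_tr.shape[-1])
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x_tr), y_tr)
        loss.backward()
        opt.step()
    with torch.no_grad():
        return loss_fn(model(x_va), y_va).item()

def evolve(x_tr, y_tr, x_va, y_va, pop=6, gens=3):
    """Tiny GA: truncation selection, gene-wise crossover, lr mutation."""
    population = [(random.choice([32, 64, 128]),
                   10 ** random.uniform(-4, -2)) for _ in range(pop)]
    for _ in range(gens):
        scored = sorted(population,
                        key=lambda g: fitness(g, x_tr, y_tr, x_va, y_va))
        parents = scored[:pop // 2]                      # fitter half survives
        children = [(random.choice(parents)[0],          # crossover: gene 1
                     random.choice(parents)[1]           # crossover: gene 2,
                     * random.uniform(0.5, 2.0))         # then mutate it
                    for _ in range(pop - len(parents))]
        population = parents + children
    return min(population, key=lambda g: fitness(g, x_tr, y_tr, x_va, y_va))

if __name__ == "__main__":
    # Synthetic stand-in data: 64 sequences, length 10, 8 features, 4 outputs.
    x, y = torch.randn(64, 10, 8), torch.randn(64, 4)
    print("best (hidden_dim, lr):", evolve(x[:48], y[:48], x[48:], y[48:]))
```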
ISSN: 1225-6463, 2233-7326