Dance video action recognition algorithm based on improved hypergraph convolutional networks

Bibliographic Details
Main Authors: Ni Zhen, Yiyi Jiang
Format: Article
Language: English
Published: Elsevier 2025-12-01
Series: Systems and Soft Computing
Online Access: http://www.sciencedirect.com/science/article/pii/S2772941925000651
Description
Summary: Dance as an art form is rich in movement and expressive information, and accurately recognizing the movements in dance videos is important for dance education, creation, and performance. To this end, the study takes the deep-learning hypergraph convolutional network as its base framework, optimizes its performance by introducing a self-attention module and a topology module, constructs a temporal refinement channel and a channel refinement channel, adds a spatio-temporal hypergraph convolutional network for channel fusion, and finally proposes a new video action recognition model. The experimental results show that the new model converges fastest, in as few as 250 iterations, at which point its recognition accuracy is 95%. The model's highest P-value is 0.094, its highest R-value is 0.098, and its highest F1-value is 0.082. In the confusion-matrix test, the model achieves recognition accuracy above 90%. When the dance categories were rated by judges, the accuracy, validity, and fluency scores were all above 90. In summary, the study improves the hypergraph convolutional network and applies it to dance video action recognition with greater effectiveness and better recognition accuracy; it aims to provide a more effective technical means for the development of dance education and performance.
ISSN: 2772-9419
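
For orientation, the sketch below shows what a single plain hypergraph convolution layer looks like in PyTorch, following the standard propagation rule X' = sigma(Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X Theta) from the hypergraph neural network literature (Feng et al., 2019). It illustrates only the base operation that the abstract's model builds on; the class name HypergraphConv, the tensor shapes, and the toy skeleton hyperedges are assumptions made for illustration, not the authors' self-attention, topology, or channel-fusion modules.

    # Minimal sketch of one hypergraph convolution layer (assumed names/shapes).
    import torch
    import torch.nn as nn

    class HypergraphConv(nn.Module):
        def __init__(self, in_channels, out_channels):
            super().__init__()
            # Theta: learnable per-layer feature transform
            self.theta = nn.Linear(in_channels, out_channels, bias=False)

        def forward(self, x, H, w=None):
            # x: (N, in_channels) node features, e.g. one node per skeleton joint
            # H: (N, E) incidence matrix, H[v, e] = 1 if node v is in hyperedge e
            # w: (E,) hyperedge weights; defaults to uniform weights
            if w is None:
                w = torch.ones(H.shape[1], device=H.device)
            dv = (H * w).sum(dim=1)            # vertex degrees
            de = H.sum(dim=0)                  # hyperedge degrees
            dv_inv_sqrt = dv.clamp(min=1e-6).pow(-0.5)
            de_inv = de.clamp(min=1e-6).reciprocal()
            # Normalized propagation: Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}
            left = dv_inv_sqrt.unsqueeze(1) * H * w * de_inv   # (N, E)
            adj = left @ (H.t() * dv_inv_sqrt)                 # (N, N)
            return torch.relu(adj @ self.theta(x))

    # Toy usage: 5 joints grouped into two overlapping hyperedges
    # (e.g. "left arm" and "torso"); features are 3-D joint coordinates.
    H = torch.tensor([[1., 0.],
                      [1., 0.],
                      [1., 1.],
                      [0., 1.],
                      [0., 1.]])
    x = torch.randn(5, 3)
    layer = HypergraphConv(3, 16)
    out = layer(x, H)      # (5, 16) refined joint features

The point of the hypergraph formulation is that a single hyperedge can connect more than two joints at once (for example, all joints of one limb), which is what distinguishes it from ordinary graph convolution on a pairwise skeleton graph.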