SemABC: Semantic-Guided Adaptive Bias Calibration for Generative Zero-Shot Point Cloud Segmentation

Due to the limited quantity and high cost of high-quality three-dimensional annotations, generalized zero-shot point cloud segmentation aims to transfer knowledge from seen to unseen classes by leveraging semantic correlations between them. Existing generative point cloud semantic segmentation approaches rely on generators trained on seen classes to synthesize visual features for unseen classes, helping the segmentation model generalize; however, this often biases predictions toward seen classes. To address this issue, we propose a semantic-guided adaptive bias calibration approach built on a dual-branch network architecture, which pairs a novel visual–semantic fusion branch with the primary segmentation branch to suppress the bias toward seen classes. Specifically, the visual–semantic branch exploits the visual–semantic relevance of the synthetic features of unseen classes to provide auxiliary predictions. Furthermore, an adaptive bias calibration module dynamically integrates the predictions of the main and auxiliary branches to produce unbiased segmentation results. Extensive experiments on standard benchmarks demonstrate that our approach significantly outperforms state-of-the-art methods on both seen and unseen classes.
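The dual-branch idea in the abstract — a main segmentation branch plus an auxiliary branch that scores point features against class semantic embeddings, with the two predictions adaptively blended — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: all shapes are toy-sized, and the calibration weight (here, the normalized entropy of the main branch's distribution, so the auxiliary branch counts more where the main branch is uncertain) is an assumed stand-in for the paper's learned adaptive module.

```python
import numpy as np

rng = np.random.default_rng(0)

num_points, sem_dim, num_classes = 4, 8, 5

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

# Main branch: per-point logits over all classes (seen + unseen).
main_logits = rng.normal(size=(num_points, num_classes))

# Auxiliary branch: cosine similarity between projected point features
# and per-class semantic embeddings (e.g., word vectors).
point_feats = rng.normal(size=(num_points, sem_dim))
class_sem = rng.normal(size=(num_classes, sem_dim))

def cosine_logits(feats, sem):
    f = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    s = sem / np.linalg.norm(sem, axis=1, keepdims=True)
    return f @ s.T  # (num_points, num_classes)

aux_logits = cosine_logits(point_feats, class_sem)

# Calibration: per-point blend weight w in [0, 1]; high main-branch
# entropy (uncertainty) shifts weight toward the auxiliary prediction.
p_main = softmax(main_logits)
entropy = -(p_main * np.log(p_main + 1e-12)).sum(axis=1)
w = (entropy / np.log(num_classes))[:, None]

p_fused = (1 - w) * p_main + w * softmax(aux_logits)
labels = p_fused.argmax(axis=1)  # final per-point class prediction
```

Because the fusion is a per-point convex combination of two probability distributions, each row of `p_fused` remains a valid distribution.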

Bibliographic Details
Main Authors: Yuyun Wei, Meng Qi
Format: Article
Language: English
Published: MDPI AG, 2025-07-01
Series: Applied Sciences
Subjects: adaptive bias calibration; visual–semantic contrastive learning; generalized zero-shot semantic segmentation
Online Access: https://www.mdpi.com/2076-3417/15/15/8359
ISSN: 2076-3417
DOI: 10.3390/app15158359
Citation: Applied Sciences, vol. 15, no. 15, article 8359 (2025)
Author affiliation: School of Information Science and Engineering, Shandong Normal University, Jinan 250358, China (both authors)
Collection: DOAJ (record id doaj-art-87c9ee646a0a413293123b457ea2ed1f)
Institution: Kabale University