Gradual Geometry-Guided Knowledge Distillation for Source-Data-Free Domain Adaptation

Bibliographic Details
Main Authors: Yangkuiyi Zhang, Song Tang
Format: Article
Language:English
Published: MDPI AG 2025-04-01
Series:Mathematics
Subjects: domain adaptation; source-data-free; geometry-guided; gradual knowledge distillation; object recognition
Online Access:https://www.mdpi.com/2227-7390/13/9/1491
author Yangkuiyi Zhang
Song Tang
collection DOAJ
description Because they require access to the source data during the transfer phase, conventional domain adaptation methods have recently raised safety and privacy concerns. Research attention has thus shifted to a more practical setting known as source-data-free domain adaptation (SFDA). The new challenge is how to obtain reliable semantic supervision in the absence of both the source-domain training data and labels on the target domain. To that end, in this work we introduce a novel <i>Gradual Geometry-Guided Knowledge Distillation</i> (G2KD) approach for SFDA. Specifically, to address the lack of supervision, we use the local geometry of the data to construct a more credible probability distribution over the potential categories, termed geometry-guided knowledge. Knowledge distillation is then adopted to integrate this extra information and boost the adaptation. More specifically, we first construct a neighborhood geometry for each target sample via a similarity comparison over the whole target dataset. Second, based on semantic estimates pre-obtained by clustering, we mine soft semantic representations expressing the geometry-guided knowledge through semantic fusion. Third, using these softened labels, we perform knowledge distillation regulated by a new objective. Given the unsupervised setting of SFDA, in addition to the distillation loss and the student loss, we introduce a mixed entropy regulator that minimizes the entropy of individual predictions while maximizing the mutual information with augmented data, thereby exploiting neighbor relations. Our contribution is that, through local geometry discovery with semantic representation and self-knowledge distillation, the semantic information hidden in local structures is transformed into effective semantic self-supervision. Moreover, our knowledge distillation works in a gradual way, which helps capture the dynamic variations in the local geometry, mitigating both guidance degradation and deviation.
Extensive experiments on five challenging benchmarks confirmed the state-of-the-art performance of our method.
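The pipeline sketched in the abstract (neighborhood construction by similarity, semantic fusion into softened labels, distillation with an entropy regulator) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function names, the cosine-similarity neighborhood, the choice of k, and the uniform-averaging fusion are illustrative assumptions, and the mutual-information term over augmented data is omitted for brevity.

```python
import numpy as np

def geometry_guided_soft_labels(features, probs, k=3):
    """Fuse each sample's class distribution with those of its k nearest
    neighbours (cosine similarity over the whole target set) to obtain
    softened, geometry-guided labels."""
    normed = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = normed @ normed.T
    np.fill_diagonal(sim, -np.inf)            # exclude each sample itself
    soft = np.empty_like(probs)
    for i in range(len(probs)):
        nbrs = np.argsort(sim[i])[-k:]        # indices of k most similar samples
        soft[i] = (probs[i] + probs[nbrs].sum(axis=0)) / (k + 1)
    return soft

def distillation_objective(student_probs, soft_labels, eps=1e-12):
    """Distillation loss (cross-entropy against the softened labels) plus a
    simple entropy regulator that sharpens individual predictions."""
    kd = -(soft_labels * np.log(student_probs + eps)).sum(axis=1).mean()
    ent = -(student_probs * np.log(student_probs + eps)).sum(axis=1).mean()
    return kd + ent
```

In a training loop, the soft labels would be recomputed periodically from the current feature extractor, which is what makes the distillation "gradual": the teacher signal tracks the evolving local geometry instead of being fixed once.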
id doaj-art-21592ea38c494afe8f2b9bd3be35814c
institution Kabale University
issn 2227-7390
affiliation IMI Group, University of Shanghai for Science and Technology, Shanghai 200093, China (both authors)
doi 10.3390/math13091491
citation Mathematics, vol. 13, no. 9, article 1491 (2025)
title Gradual Geometry-Guided Knowledge Distillation for Source-Data-Free Domain Adaptation
topic domain adaptation
source-data-free
geometry-guided
gradual knowledge distillation
object recognition
url https://www.mdpi.com/2227-7390/13/9/1491