RoSe-Mix: Robust and Secure Deep Neural Network Watermarking in Black-Box Settings via Image Mixup

Bibliographic Details
Main Authors: Tamara El Hajjar, Mohammed Lansari, Reda Bellafqira, Gouenou Coatrieux, Katarzyna Kapusta, Kassem Kallas
Format: Article
Language: English
Published: MDPI AG 2025-03-01
Series: Machine Learning and Knowledge Extraction
Online Access: https://www.mdpi.com/2504-4990/7/2/32
Description
Summary: Due to their considerable development costs, deep neural networks (DNNs) are valuable assets whose intellectual property (IP) needs protection. Consequently, DNN watermarking has gained significant interest, since it allows DNN owners to prove ownership. Various methods that embed ownership information in the model's behavior have been proposed. They must satisfy several requirements, among them security, which measures how difficult it is for an attacker to break the watermarking scheme, and robustness, which quantifies resistance to watermark removal techniques. Existing methods generally fail to meet both of these standards at once. This paper presents RoSe-Mix, a robust and secure deep neural network watermarking technique designed for black-box settings. It addresses limitations of existing DNN watermarking approaches by integrating key features from two established methods: RoSe, which uses cryptographic hashing to ensure security, and Mixer, which employs image Mixup to enhance robustness. Experimental results demonstrate that RoSe-Mix achieves security across various architectures and datasets, with robustness to removal attacks exceeding 99%.
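The image Mixup mentioned in the abstract is, in its standard form, a convex combination of two inputs weighted by a coefficient drawn from a Beta distribution. A minimal sketch of that generic operation, assuming NumPy arrays — this illustrates the Mixup primitive only, not the authors' Mixer trigger construction or the RoSe-Mix scheme itself:

```python
import numpy as np

def mixup(x1, x2, lam):
    """Blend two images: lam * x1 + (1 - lam) * x2, with 0 <= lam <= 1."""
    return lam * x1 + (1.0 - lam) * x2

# In standard Mixup, the mixing coefficient is drawn from Beta(alpha, alpha);
# alpha = 0.4 here is an illustrative choice, not a value from the paper.
rng = np.random.default_rng(seed=0)
lam = rng.beta(0.4, 0.4)

img_a = np.zeros((8, 8))   # stand-in for one image
img_b = np.ones((8, 8))    # stand-in for another image
blended = mixup(img_a, img_b, lam)
```

In training-time Mixup the corresponding labels are blended with the same coefficient, which is what lets a blended image carry information from both source inputs.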
ISSN: 2504-4990