Light Attack: A Physical World Real-Time Attack Against Object Classifiers

Bibliographic Details
Main Authors: Ruizhe Hu, Ting Rui, Yan Ouyang, Jinkang Wang, Qunyan Jiang, Yinan Du
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Subjects: Adversarial example; adversarial attack; light attack; real-time attack; object classifier
Online Access: https://ieeexplore.ieee.org/document/9791340/
_version_ 1849708248197758976
author Ruizhe Hu
Ting Rui
Yan Ouyang
Jinkang Wang
Qunyan Jiang
Yinan Du
author_facet Ruizhe Hu
Ting Rui
Yan Ouyang
Jinkang Wang
Qunyan Jiang
Yinan Du
author_sort Ruizhe Hu
collection DOAJ
description It is well known that deep neural networks (DNNs) are vulnerable to adversarial examples. In the digital world, most existing work makes classifiers or detectors fail by adding perturbations that are imperceptible to humans. In the physical world, existing work mostly defeats classifiers or detectors by adding large, unrealistic perturbations. In this paper, we attack target classifiers from a new perspective: light attack. First, we define three lighting modes (Line, Point, Area); then we control the wavelength, intensity, and position of the light to carry out the adversarial attack; finally, we conduct ablation experiments and compare our method with existing mainstream methods. Experiments demonstrate that the proposed method is effective in both the digital and physical worlds, and that in darker environments our attack performs much better than existing ones. Light attack enriches the current family of adversarial attacks while paving the way for future defenses against light attacks, as well as the study of the laws governing light attacks in nature.
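Since the record only carries the abstract, the following is a minimal, hypothetical sketch of the idea in the digital world: render a "Point"-mode light spot onto an image (an RGB colour standing in for wavelength, with controllable intensity and position) and check whether a pretrained classifier's prediction flips. This is not the authors' implementation; the file name, model choice, and parameter values are assumptions for illustration only.

```python
# Hypothetical sketch of a "Point"-mode light perturbation (not the paper's code).
import numpy as np
import torch
from PIL import Image
from torchvision import models, transforms

def add_point_light(img, center, radius, rgb, intensity):
    """Additively blend a soft circular light spot onto an HxWx3 float image in [0, 1]."""
    h, w, _ = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    dist2 = (yy - center[0]) ** 2 + (xx - center[1]) ** 2
    mask = np.exp(-dist2 / (2.0 * radius ** 2))            # Gaussian falloff around the spot center
    spot = mask[..., None] * np.array(rgb)[None, None, :]  # colour acts as a crude wavelength proxy
    return np.clip(img + intensity * spot, 0.0, 1.0)

# Off-the-shelf classifier as the target (assumption: any ImageNet model would do).
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT).eval()
preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# "example.jpg" is a placeholder input image.
img = np.asarray(Image.open("example.jpg").convert("RGB").resize((224, 224))) / 255.0
lit = add_point_light(img, center=(112, 112), radius=40, rgb=(1.0, 0.2, 0.2), intensity=0.6)

with torch.no_grad():
    for name, x in [("clean", img), ("lit", lit)]:
        tensor = preprocess(Image.fromarray((x * 255).astype(np.uint8))).unsqueeze(0)
        print(name, "predicted class:", model(tensor).argmax(dim=1).item())
```

In this sketch the spot's position, radius, colour, and intensity are the free parameters; a physical attack would instead realize them with an actual light source and search over them until the prediction changes.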
format Article
id doaj-art-5ea9f9bb2d8e481a84f8dd054ec7e23f
institution DOAJ
issn 2169-3536
language English
publishDate 2025-01-01
publisher IEEE
record_format Article
series IEEE Access
spelling
  Record ID: doaj-art-5ea9f9bb2d8e481a84f8dd054ec7e23f
  Last updated: 2025-08-20T03:15:43Z
  Language: eng
  Publisher: IEEE
  Series: IEEE Access (ISSN 2169-3536)
  Published: 2025-01-01, vol. 13, pp. 36601-36610
  DOI: 10.1109/ACCESS.2022.3181197
  IEEE document: 9791340
  Title: Light Attack: A Physical World Real-Time Attack Against Object Classifiers
  Authors: Ruizhe Hu (https://orcid.org/0000-0002-4891-9820), Ting Rui (https://orcid.org/0000-0002-2949-5874), Yan Ouyang (https://orcid.org/0000-0002-4481-9662), Jinkang Wang (https://orcid.org/0000-0001-8866-6744), Qunyan Jiang, Yinan Du
  Affiliation (all authors): Department of Mechanical Engineering, College of Field Engineering, Army Engineering University of PLA, Nanjing, China
  Abstract: It is well known that deep neural networks (DNNs) are vulnerable to adversarial examples. In the digital world, most existing work makes classifiers or detectors fail by adding perturbations that are imperceptible to humans. In the physical world, existing work mostly defeats classifiers or detectors by adding large, unrealistic perturbations. In this paper, we attack target classifiers from a new perspective: light attack. First, we define three lighting modes (Line, Point, Area); then we control the wavelength, intensity, and position of the light to carry out the adversarial attack; finally, we conduct ablation experiments and compare our method with existing mainstream methods. Experiments demonstrate that the proposed method is effective in both the digital and physical worlds, and that in darker environments our attack performs much better than existing ones. Light attack enriches the current family of adversarial attacks while paving the way for future defenses against light attacks, as well as the study of the laws governing light attacks in nature.
  Online access: https://ieeexplore.ieee.org/document/9791340/
  Keywords: Adversarial example; adversarial attack; light attack; real-time attack; object classifier
spellingShingle Ruizhe Hu
Ting Rui
Yan Ouyang
Jinkang Wang
Qunyan Jiang
Yinan Du
Light Attack: A Physical World Real-Time Attack Against Object Classifiers
IEEE Access
Adversarial example
adversarial attack
light attack
real-time attack
object classifier
title Light Attack: A Physical World Real-Time Attack Against Object Classifiers
title_full Light Attack: A Physical World Real-Time Attack Against Object Classifiers
title_fullStr Light Attack: A Physical World Real-Time Attack Against Object Classifiers
title_full_unstemmed Light Attack: A Physical World Real-Time Attack Against Object Classifiers
title_short Light Attack: A Physical World Real-Time Attack Against Object Classifiers
title_sort light attack a physical world real time attack against object classifiers
topic Adversarial example
adversarial attack
light attack
real-time attack
object classifier
url https://ieeexplore.ieee.org/document/9791340/
work_keys_str_mv AT ruizhehu lightattackaphysicalworldrealtimeattackagainstobjectclassifiers
AT tingrui lightattackaphysicalworldrealtimeattackagainstobjectclassifiers
AT yanouyang lightattackaphysicalworldrealtimeattackagainstobjectclassifiers
AT jinkangwang lightattackaphysicalworldrealtimeattackagainstobjectclassifiers
AT qunyanjiang lightattackaphysicalworldrealtimeattackagainstobjectclassifiers
AT yinandu lightattackaphysicalworldrealtimeattackagainstobjectclassifiers