Increasing Neural-Based Pedestrian Detectors’ Robustness to Adversarial Patch Attacks Using Anomaly Localization
Object detection in images is a fundamental component of many safety-critical systems, such as autonomous driving, video surveillance, and robotics. Adversarial patch attacks, being easy to implement in the real world, are an effective means of defeating object detection by state-of-the-art...
| Main Authors: | Olga Ilina, Maxim Tereshonok, Vadim Ziyadinov |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2025-01-01 |
| Series: | Journal of Imaging |
| Online Access: | https://www.mdpi.com/2313-433X/11/1/26 |
Similar Items
- OD-SHIELD: Convolutional Autoencoder-Based Defense Against Adversarial Patch Attacks in Object Detection
  by: Byeongchan Kim, et al.
  Published: (2025-01-01)
- Fortify the Guardian, Not the Treasure: Resilient Adversarial Detectors
  by: Raz Lapid, et al.
  Published: (2024-11-01)
- URAdv: A Novel Framework for Generating Ultra-Robust Adversarial Patches Against UAV Object Detection
  by: Hailong Xi, et al.
  Published: (2025-02-01)
- Adversarial patch defense algorithm based on PatchTracker
  by: Zhenjie Xiao, et al.
  Published: (2024-02-01)
- Patch is enough: naturalistic adversarial patch against vision-language pre-training models
  by: Dehong Kong, et al.
  Published: (2024-12-01)