Structure Estimation of Adversarial Distributions for Enhancing Model Robustness: A Clustering-Based Approach
Rasheed, Bader; Khan, Adil; Masood Khattak, Asad
Abstract
In this paper, we propose an advanced method for adversarial training that leverages the underlying structure of adversarial perturbation distributions. Unlike conventional adversarial training techniques that consider adversarial examples in isolation, our approach employs clustering algorithms in conjunction with dimensionality reduction to group adversarial perturbations, effectively constructing a more intricate and structured feature space for model training. Our method incorporates density- and boundary-aware clustering mechanisms to capture the inherent spatial relationships among adversarial examples. Furthermore, we introduce a strategy that uses adversarial perturbations to sharpen the delineation between clusters, leading to more robust and compact clusters. To substantiate the method's efficacy, we performed a comprehensive evaluation on well-established benchmarks, namely the MNIST and CIFAR-10 datasets, measuring the trade-off between adversarial and clean accuracy and demonstrating a significant improvement in both robust and standard test accuracy over traditional adversarial training methods. Through these experiments, we show that the proposed clustering-based adversarial training framework not only enhances the model's robustness against a range of adversarial attacks, such as FGSM and PGD, but also improves generalization on clean data.
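The abstract describes grouping adversarial perturbations via dimensionality reduction followed by density-based clustering. The snippet below is a minimal sketch of that pipeline, not the authors' implementation: it assumes an FGSM attack, PCA for reduction, and DBSCAN as the density-based clusterer, with a toy model and synthetic data standing in for MNIST so the example runs offline.

```python
# Minimal sketch (assumptions: FGSM perturbations, PCA + DBSCAN, toy model/data).
import torch
import torch.nn as nn
import torch.nn.functional as F
from sklearn.decomposition import PCA
from sklearn.cluster import DBSCAN

torch.manual_seed(0)

# Toy classifier standing in for the network under attack (illustrative only).
model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 64), nn.ReLU(), nn.Linear(64, 10))

# Synthetic stand-in for MNIST images and labels so the sketch is self-contained.
x = torch.rand(256, 1, 28, 28)
y = torch.randint(0, 10, (256,))

def fgsm_perturbation(model, x, y, eps=0.1):
    """Return FGSM perturbations delta = eps * sign(grad_x loss)."""
    x = x.clone().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    return eps * x.grad.sign()

delta = fgsm_perturbation(model, x, y)              # shape (N, 1, 28, 28)
flat = delta.view(delta.size(0), -1).detach().numpy()

# Dimensionality reduction, then density-based clustering of the perturbations;
# n_components, eps, and min_samples are illustrative values, not tuned ones.
reduced = PCA(n_components=20).fit_transform(flat)
labels = DBSCAN(eps=1.0, min_samples=5).fit_predict(reduced)
print("cluster labels found:", set(labels.tolist()))
```

The resulting cluster labels could then be used to structure the adversarial examples fed back into training; how the clusters are exploited during training is specific to the paper and is not reproduced here.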
Citation
Rasheed, B., Khan, A., & Masood Khattak, A. (2023). Structure Estimation of Adversarial Distributions for Enhancing Model Robustness: A Clustering-Based Approach. Applied Sciences, 13(19), Article 10972. https://doi.org/10.3390/app131910972
Journal Article Type | Article
---|---
Acceptance Date | Sep 15, 2023
Online Publication Date | Oct 5, 2023
Publication Date | Oct 1, 2023
Deposit Date | Dec 1, 2023
Publicly Available Date | Dec 6, 2023
Journal | Applied Sciences
Publisher | MDPI
Peer Reviewed | Peer Reviewed
Volume | 13
Issue | 19
Article Number | 10972
DOI | https://doi.org/10.3390/app131910972
Keywords | Deep neural networks; Robustness; Adversarial attacks; Adversarial training; Clustering
Public URL | https://hull-repository.worktribe.com/output/4407860
Files
Published article (PDF, 392 KB)
Publisher Licence URL: http://creativecommons.org/licenses/by/4.0
Copyright Statement: © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).