Research Repository
Exploring the Impact of Conceptual Bottlenecks on Adversarial Robustness of Deep Neural Networks (2024)
Journal Article
Rasheed, B., Abdelhamid, M., Khan, A., Menezes, I., & Masood Khattak, A. (2024). Exploring the Impact of Conceptual Bottlenecks on Adversarial Robustness of Deep Neural Networks. IEEE Access, 12, 131323-131335. https://doi.org/10.1109/ACCESS.2024.3457784

Deep neural networks (DNNs), while powerful, often suffer from a lack of interpretability and vulnerability to adversarial attacks. Concept bottleneck models (CBMs), which incorporate intermediate high-level concepts into the model architecture, prom...
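To make the abstract's description concrete: a concept bottleneck model routes the prediction through an intermediate layer of named, human-interpretable concept scores, so the final label depends only on those concepts. The sketch below is a minimal illustrative toy in plain Python, not the architecture or code from the paper; the weights and layer sizes are hypothetical.

```python
def dot(w, x):
    """Dot product of a weight vector and an input vector."""
    return sum(wi * xi for wi, xi in zip(w, x))

def concept_bottleneck(x, concept_weights, label_weights):
    # Stage 1: map raw input features to concept scores
    # (each row of concept_weights predicts one named concept).
    concepts = [dot(w, x) for w in concept_weights]
    # Stage 2: predict class scores from the concept layer ONLY,
    # which is what makes the bottleneck inspectable: one can read
    # or intervene on `concepts` before the final prediction.
    scores = [dot(w, concepts) for w in label_weights]
    return concepts, scores

# Hypothetical toy weights: 2 concepts from 3 input features,
# 2 class scores from the 2 concepts.
cw = [[1.0, 0.0, 0.0],
      [0.0, 1.0, 1.0]]
lw = [[1.0, -1.0],
      [-1.0, 1.0]]
concepts, scores = concept_bottleneck([0.5, 0.2, 0.3], cw, lw)
```

Because the class scores are a function of the concept layer alone, any adversarial perturbation of the input must alter the concept predictions to change the label, which is the property whose effect on robustness the article investigates.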