
AROS: Adversarially Robust Out-of-Distribution Detection through Stability

Overview

This repository contains the code for the paper "Adversarially Robust Out-of-Distribution Detection Using Lyapunov-Stabilized Embeddings". The method, termed AROS, employs Neural Ordinary Differential Equations (NODEs) with Lyapunov stability to create robust embeddings for OOD detection, significantly improving performance against adversarial attacks. Additionally, the repository includes two notebooks: one demonstrates the training and evaluation process on the CIFAR-10 and CIFAR-100 datasets, while the other focuses on the ablation study.

Preprint

Check out our preprint on arXiv: Adversarially Robust Out-of-Distribution Detection Using Lyapunov-Stabilized Embeddings. Hossein Mirzaei & Mackenzie W. Mathis. Oct 2024.

Key Features

  • Lyapunov Stability for OOD Detection: Ensures that perturbed inputs converge back to stable equilibrium points, improving robustness against adversarial attacks (see the sketch after this list).
  • Fake Embedding Crafting Strategy: Generates fake OOD embeddings by sampling from the low-likelihood regions of the ID data feature space, eliminating the need for additional OOD datasets.
  • Orthogonal Binary Layer: Enhances separation between ID and OOD embeddings, further improving robustness.
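
As a rough illustration of the first point, the snippet below sketches one common way to express a Lyapunov-style stability penalty for NODE dynamics in PyTorch: a quadratic Lyapunov function V(z) = ||z - z_eq||^2 should decrease along trajectories, so any positive time derivative is penalized. This is only a conceptual sketch, not the objective implemented in stability_loss_function.py, and the names dynamics, z, and z_eq are hypothetical.

import torch

def lyapunov_penalty(dynamics, z, z_eq):
    """Hinge penalty on dV/dt for V(z) = ||z - z_eq||^2.
    Stability asks that dV/dt = 2 (z - z_eq) . f(z) be negative along trajectories."""
    f_z = dynamics(z)                             # dz/dt predicted by the NODE at the embeddings z
    v_dot = 2.0 * ((z - z_eq) * f_z).sum(dim=-1)  # dV/dt for each sample in the batch
    return torch.relu(v_dot).mean()               # only violations (dV/dt > 0) contribute

See stability_loss_function.py for the loss actually used in AROS.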

Demo

  • AROS.ipynb (Open In Colab): replicates and analyzes the results presented in Table 1 of the AROS paper, focusing on out-of-distribution detection performance under both adversarial attack scenarios and clean evaluation.
  • Ablation_Study.ipynb (Open In Colab): demonstrates the ablation study.

Repository Structure

  • AROS/
    • data_loader.py: Contains the data loading utilities for training and evaluation.
    • evaluate.py: Implements the evaluation metrics and testing routines for the AROS model.
    • Main.py: The main script for training and testing AROS, combining all components.
    • stability_loss_function.py: Defines the Lyapunov-based loss function used for stabilizing the NODE dynamics.
    • utils.py: Includes various helper functions used throughout the project.
  • requirements.txt: Lists the dependencies required to run the project.
  • Notebooks/
    • AROS.ipynb: training and evaluation of AROS on CIFAR-10 and CIFAR-100 (Table 1 results).
    • Ablation_Study.ipynb: the ablation study.

Installation

To install the necessary packages, run:

pip install git+https://github.com/RobustBench/robustbench.git
pip install aros-node

To install the necessary packages from source (locally), run:

pip install -r requirements.txt
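
For example, to train and evaluate from a local clone (a minimal sketch; whether Main.py runs with sensible defaults or requires command-line arguments is not documented here, so consult the script):

git clone https://github.com/AdaptiveMotorControlLab/AROS.git
cd AROS
pip install -r requirements.txt
python AROS/Main.py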

Citation

@article{mirzaei2024aros,
      title={Adversarially Robust Out-of-Distribution Detection Using Lyapunov-Stabilized Embeddings}, 
      author={Hossein Mirzaei and Mackenzie W. Mathis},
      year={2024},
      eprint={2410.10744},
      archivePrefix={arXiv},
      primaryClass={cs.LG},
      url={https://arxiv.org/abs/2410.10744}, 
}