This is the implementation of Iterative Circuit Repair Against Formal Specifications (ICLR'23). It builds on the ML2 library.
ML2 is an open source Python library for machine learning research on mathematical and logical reasoning problems. The library includes the (re-)implementation of the research papers Teaching Temporal Logics to Neural Networks, Neural Circuit Synthesis from Specification Patterns and Iterative Circuit Repair Against Formal Specifications. So far, the focus of ML2 is on propositional and linear-time temporal logic (LTL) and sequence-to-sequence models, such as the Transformer and hierarchical Transformer. ML2 is actively developed at CISPA Helmholtz Center for Information Security.
Note on Docker: For data generation, evaluation, and benchmarking, ML2 uses a variety of research tools (e.g. SAT solvers, model checkers, and synthesis tools). For ease of use, each tool is encapsulated in a separate Docker container that is automatically pulled and launched when the tool is needed. Thus, ML2 requires Docker to be installed and running.
Before installing ML2, please note the Docker requirement.
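If you want to verify this before running any data generation or evaluation command, a minimal check such as the following sketch is sufficient; it merely shells out to the Docker CLI and is not part of the ML2 API.

# Minimal sketch: verify that the Docker CLI exists and the daemon responds.
# This is a local convenience check, not part of ML2 itself.
import shutil
import subprocess

def docker_available() -> bool:
    if shutil.which("docker") is None:
        return False
    return subprocess.run(["docker", "info"], capture_output=True).returncode == 0

if __name__ == "__main__":
    print("Docker ready:", docker_available())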
To install ML2 from source, clone the git repo and install with pip as follows:
git clone https://github.com/reactive-systems/circuit-repair.git && \
cd circuit-repair && \
pip install .
For development, install with pip in editable mode and include the development dependencies as follows:
pip install -e .[dev]
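After either install, a quick sanity check confirms the package is importable; the distribution name ml2 is an assumption based on the upstream library and may differ in your environment.

# Post-install sanity check (the distribution name "ml2" is assumed).
from importlib.metadata import PackageNotFoundError, version

try:
    print("ml2 version:", version("ml2"))
except PackageNotFoundError:
    print("ml2 is not installed in this environment")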
Iterative Circuit Repair Against Formal Specifications (to appear at ICLR'23)
We present a deep learning approach for repairing sequential circuits against formal specifications given in linear-time temporal logic (LTL). Given a defective circuit and its formal specification, we train Transformer models to output circuits that satisfy the corresponding specification. We propose a separated hierarchical Transformer for multimodal representation learning of the formal specification and the circuit. We introduce a data generation algorithm that enables generalization to more complex specifications and out-of-distribution datasets. In addition, our proposed repair mechanism significantly improves the automated synthesis of circuits from LTL specifications with Transformers. It improves the state-of-the-art by 6.8 percentage points on held-out instances and 11.8 percentage points on an out-of-distribution dataset from the annual reactive synthesis competition.
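The iterative repair idea can be summarized in a short sketch. The callables below (synthesize, repair, model_check) are hypothetical stand-ins for the neural synthesis model, the repair model, and a model checker; this is a conceptual illustration, not the API of this repository.

# Conceptual sketch of iterative repair (hypothetical callables, not the ML2 API):
# start from candidate circuits produced by a synthesis model and repeatedly ask
# the repair model for fixes until the model checker accepts a circuit.
from typing import Callable, List, Optional

def iterative_repair(
    spec: str,
    synthesize: Callable[[str], List[str]],
    repair: Callable[[str, str], List[str]],
    model_check: Callable[[str, str], bool],
    max_rounds: int = 2,
) -> Optional[str]:
    candidates = synthesize(spec)  # e.g. beam search over the synthesis model
    for _ in range(max_rounds):
        next_candidates: List[str] = []
        for circuit in candidates:
            if model_check(spec, circuit):
                return circuit  # circuit verified against the LTL specification
            # Condition the repair model on the spec and the defective circuit.
            next_candidates.extend(repair(spec, circuit))
        candidates = next_candidates
    return None  # no verified circuit within the round budget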
A notebook guiding through the data generation can be found in notebooks/repair_datasets_creation.ipynb. A notebook giving an overview of all created datasets can be found in notebooks/datasets.ipynb. We provide a tabular overview at Google Sheets.
To train a separated hierarchical Transformer with default parameters:
python -m ml2.ltl.ltl_repair.ltl_repair_sep_hier_transformer_experiment train -n exp-repair-gen-96 -d scpa-repair-gen-96 --steps 20000 --val-freq 100 -u --tf-shuffle-buffer-size 10000
To evaluate a model on the Repair dataset from our paper, run the following command.
python -m ml2.ltl.ltl_repair.ltl_repair_sep_hier_transformer_experiment eval -n exp-repair-gen-96-0 -u -d val --beam-sizes 16
To iteratively evaluate on the LTL synthesis problem, run the following command.
python -m ml2.ltl.ltl_repair.ltl_repair_sep_hier_transformer_experiment pipe -n exp-repair-gen-96-0 --base-model repair-data-2 --beam-base 16 --beam-repair 16 --repeats 2 --samples 350 -d syntcomp --keep all
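To script these evaluation stages instead of invoking them manually, a thin subprocess wrapper along the following lines is enough; the arguments simply mirror the commands shown above and are not an additional API of this repository.

# Minimal sketch for scripting the CLI commands above with subprocess;
# the arguments mirror the eval and pipe invocations from this section.
import subprocess

MODULE = "ml2.ltl.ltl_repair.ltl_repair_sep_hier_transformer_experiment"

def run(*args: str) -> None:
    subprocess.run(["python", "-m", MODULE, *args], check=True)

# Evaluate on the validation split with beam size 16.
run("eval", "-n", "exp-repair-gen-96-0", "-u", "-d", "val", "--beam-sizes", "16")
# Iteratively evaluate on the LTL synthesis problem.
run("pipe", "-n", "exp-repair-gen-96-0", "--base-model", "repair-data-2",
    "--beam-base", "16", "--beam-repair", "16", "--repeats", "2",
    "--samples", "350", "-d", "syntcomp", "--keep", "all")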
The notebook notebooks/experiments.ipynb guides through the analysis of the evaluation results.
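If you prefer to inspect exported results outside the notebook, a rough sketch along the following lines works; the file path and column names below are placeholders, as the actual output format is documented in the notebook.

# Rough sketch for inspecting exported evaluation results with pandas.
# "eval_results.csv" and the column names are placeholders; see the notebook
# notebooks/experiments.ipynb for the actual output format.
import pandas as pd

df = pd.read_csv("eval_results.csv")
print(df.head())
if {"beam_size", "verified"}.issubset(df.columns):
    # Fraction of verified circuits per beam size (column names assumed).
    print(df.groupby("beam_size")["verified"].mean())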
The results of training on a large selection of our diverse datasets and the results of our hyperparameter study can be found in Google Sheets.
@inproceedings{cosler_iterative_2023,
  title = {Iterative Circuit Repair Against Formal Specifications},
  url = {https://openreview.net/forum?id=SEcSahl0Ql},
  language = {en},
  booktitle = {International Conference on Learning Representations},
  author = {Cosler, Matthias and Schmitt, Frederik and Hahn, Christopher and Finkbeiner, Bernd},
  year = {2023},
  pubstate = {forthcoming}
}