pinned all packages, simplified CLI options
adharshkamath committed Apr 16, 2024
1 parent 1719259 commit d576698
Showing 5 changed files with 15 additions and 22 deletions.
11 changes: 7 additions & 4 deletions README.md
@@ -24,7 +24,7 @@ opam init --compiler 4.14.1 # may take a while
eval $(opam env)

# Install Frama-C (including dependencies)
opam install frama-c
opam install frama-c=27.1
```

#### Install CVC4
@@ -39,7 +39,7 @@ mv cvc4-1.6-{ARCH}-opt cvc4
#### Install Alt-Ergo

```bash
opam install alt-ergo
opam install alt-ergo=2.4.3
```

#### Install Z3
@@ -48,6 +48,7 @@ opam install alt-ergo
wget https://github.com/Z3Prover/z3/releases/download/z3-4.12.2/z3-4.12.2-x64-glibc-2.31.zip
unzip z3-4.12.2-x64-glibc-2.31.zip
ln -s z3-4.12.2-x64-glibc-2.31/bin/z3
# Add z3 to your PATH
```

#### Configure Why3
@@ -117,6 +118,7 @@ export OPENAI_API_VERSION=<your API version>
You should now be able to run the toolchain using the following command:

```bash
cd src/
python3 main.py --config-file <YAML_config_file> --max-benchmarks <max_benchmarks> [options]
```

@@ -130,15 +132,16 @@ If you are hosting a model locally, you can use a local model inference/serving
Extract the dataset from the ZIP file and place it in the root directory of the repository. You can run the toolchain using the following command:

```bash
cd src/
python3 main.py --config-file <YAML_config_file> --max-benchmarks <max_benchmarks> --no-preprocess [options]
```

where options can be any of `--loop-invariants`, `--termination-analysis`, etc. Use `python3 main.py --help` to see the list of available options.

The `--no-preprocess` option is used to skip the pre-processing step, since the datasets are already pre-processed.
If you are using a different dataset whose assertions are not in ACSL syntax, omit this option so that pre-processing runs.
Pre-processing involves removing comments, converting the assertions to ACSL syntax, and filtering out benchmarks that are not supported by the specified `benchmark_features`.
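For intuition, here is a minimal, self-contained sketch of the kind of rewrite this pre-processing performs. The helper below is hypothetical and illustrative only; the toolchain's actual pre-processor lives in `src/` and handles many more cases.

```python
import re

# Illustrative sketch only: not the toolchain's actual pre-processor.
def to_acsl(code: str) -> str:
    code = re.sub(r"/\*.*?\*/", "", code, flags=re.S)  # strip block comments
    code = re.sub(r"//(?!@).*", "", code)              # strip line comments, keep ACSL (//@ ...)
    # rewrite C asserts into ACSL assertion annotations
    return re.sub(r"\bassert\s*\((.*?)\)\s*;", r"//@ assert \1;", code).strip()

print(to_acsl("/* bound check */ assert(x > 0);"))  # prints: //@ assert x > 0;
```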

Use `python3 main.py --help` to see the list of available options.

The YAML configuration file contains the following fields:

```yaml
7 changes: 4 additions & 3 deletions src/benchmark.py
@@ -22,7 +22,7 @@ def __init__(self, benchmarks_file="", features=None, no_preprocess=True):
self.no_preprocess = no_preprocess

@abstractmethod
def preprocess(self, code):
def preprocess(self, code, features=None):
raise NotImplementedError

@abstractmethod
@@ -66,7 +66,7 @@ def extract_loop_invariants(self, code):
def get_variant_expressions(self, completions):
raise NotImplementedError

def validate_inputs(self):
def validate_inputs(self, no_preprocess=False):
if not os.path.exists(self.input_benchmarks):
raise InvalidBenchmarkException(
f"Input file {self.input_benchmarks} not found"
@@ -81,7 +81,8 @@ def validate_inputs(self):
code = None
with open(file) as f:
code = f.read()
self.preprocess(code, self.features)
if not no_preprocess:
self.preprocess(code, self.features)
self.input_file_paths.append(file)
except InvalidBenchmarkException as e:
print(f"Error: {e.message}. File: {file}.")
2 changes: 1 addition & 1 deletion src/boogie.py
@@ -20,7 +20,7 @@ def __init__(self, benchmarks_file="", features=None, no_preprocess=False):
self.features = features
self.input_file_paths = []

def preprocess(self, code):
def preprocess(self, code, features=None):
raise NotImplementedError

def combine_llm_outputs(self, checker_input, llm_outputs, features=None):
11 changes: 3 additions & 8 deletions src/loopy.py
@@ -83,14 +83,9 @@ def set_config(self, config_file, no_preprocess=False):
self.debug,
)

if not no_preprocess:
"""
This call will filter out benchmarks that
do not have features specified in the config file
"""
Logger.log_info("Validating input files")
self.benchmark.validate_inputs()
Logger.log_info(f"Found {len(self.benchmark.input_file_paths)} valid benchmarks")
Logger.log_info("Validating input files")
self.benchmark.validate_inputs(no_preprocess)
Logger.log_info(f"Found {len(self.benchmark.input_file_paths)} valid benchmarks")

return self

6 changes: 0 additions & 6 deletions src/main.py
@@ -125,12 +125,6 @@ def main(args):
k=args.repair_from_k,
num_repairs=args.repair_retries,
)
elif args.dump_dataset_to_file != "":
p.dump_dataset(
max_benchmarks=args.max_benchmarks,
start_index=args.start_index,
path=args.dump_dataset_to_file,
)
else:
raise Exception(f"Invalid input args: {vars(args)}")

