
Tool submission

stanleybak opened this issue · 85 comments

The tool testing and submission instructions are in this document.

Please post your tool information according to the instructions in this topic.

stanleybak avatar Jun 13 '21 17:06 stanleybak

This is the entry for the nnenum tool. The tool is available using git:

TOOL_NAME=nnenum
REPO=https://github.com/stanleybak/nnenum.git 
COMMIT=c93a39cb568f58a26015bd151acafab34d2d4929
SCRIPTS_DIR=vnncomp_scripts

stanleybak avatar Jun 13 '21 17:06 stanleybak

Regarding the tool (and benchmark) submission deadline. I would like to propose to either strictly enforce both or give all participants one week from the last update to any benchmark to submit their tool (i.e., move that deadline) such that all other participants can check that their tools behaves as expected on all other benchmarks and potentially fix any problems coming up.

mnmueller avatar Jun 13 '21 19:06 mnmueller

I think your second option may be more realistic. Maybe we'll have a cutoff for scored benchmarks... if people submit something late and we still want to run them, then they won't be part of the scoring. How does that sound?

stanleybak avatar Jun 13 '21 19:06 stanleybak

I have no problem with either suggestion. While it would be great to be able to score all benchmarks, moving the deadline will obviously leave you (and the rest of the organizing team) with a tighter schedule, so I think this decision should be yours. If we go with excluding some benchmarks from the scoring, it would still be great to be able to test the final (example) benchmarks/networks before we have to submit our tools.

mnmueller avatar Jun 13 '21 20:06 mnmueller

@stanleybak, we intend to compete with a toolkit that hasn't been published yet; how should we proceed with this? We can share it via a Dropbox folder if that's okay with you?

We also depend on the Xpress solver, which offers free academic licenses; alternatively, we could provide the license file (this is all described in our readme). If we are to provide the license file, we need a host-id from the AWS instance, and this host-id changes every time the instance is stopped.

pat676 avatar Jun 25 '21 15:06 pat676

@stanleybak Should we have one result file for each instance run (i.e., one onnx-vnnlib pair), or a single result (.txt) file containing all the instances run?

Also, as I understand it, there will be 3 categories: mnist, cifar10, and acasxu. In that case, if I want to skip a particular benchmark under any of these categories, how should I indicate that? Or will the categories be the benchmark names?

Neelanjana314 avatar Jun 26 '21 04:06 Neelanjana314

This is the entry for the NeuralVerification.jl tool. The tool is available using git and passed the docker test:

TOOL_NAME=NeuralVerification.jl
REPO=https://github.com/intelligent-control-lab/NeuralVerification.jl.git
COMMIT=4e612602ba4b34b42416742d85476d9b0dcdcb51
SCRIPTS_DIR=vnncomp_scripts

Wei-TianHao avatar Jun 28 '21 04:06 Wei-TianHao

@stanleybak, we have emailed you instructions on how to obtain the VeriNet toolkit. Please let us know if you need a different submission format.

pat676 avatar Jun 28 '21 09:06 pat676

@pat676 got it. We'll let you know if there are issues with the licensing.

@Neelanjana314 one result for each instance run. The run_instance.sh script gets passed in a single onnx path and a single vnnlib path, your tool should run on that single instance and then output the result. Our competition scripts will aggregate results.
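The per-instance contract described above can be sketched as follows. This is an editorial sketch, not any tool's actual script: the argument order and the `my_verifier` command are placeholders, assumed for illustration only.

```shell
#!/bin/sh
# Sketch of a run_instance.sh: it receives ONE onnx path and ONE vnnlib
# path, verifies that single instance, and writes ONE verdict to the
# results file. The competition scripts aggregate results afterwards.
ONNX=${1:-net.onnx}
VNNLIB=${2:-prop.vnnlib}
RESULTS=${3:-results.txt}
TIMEOUT=${4:-300}

# "my_verifier" is a hypothetical command; fall back to "unknown" if it
# fails or is unavailable so the results file always gets a verdict.
VERDICT=$(my_verifier --net "$ONNX" --spec "$VNNLIB" --timeout "$TIMEOUT" 2>/dev/null) || VERDICT=unknown
echo "$VERDICT" > "$RESULTS"
```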

stanleybak avatar Jun 28 '21 14:06 stanleybak

This is the submission for the oval framework:

TOOL_NAME=oval
REPO=https://github.com/oval-group/oval-bab.git 
COMMIT=0f39b4d685927c56f9e2c12307cc3d2b19be8bd6
SCRIPTS_DIR=vnncomp_scripts

alessandrodepalma avatar Jun 28 '21 18:06 alessandrodepalma

This is the submission for the ERAN framework. A Gurobi license has to be acquired manually, as detailed in the readme:

TOOL_NAME=ERAN
REPO=https://github.com/mnmueller/eran_vnncomp2021.git
COMMIT=ca42b4afd1ff8cb92ebde5303fcce0db26357b49 (updated later in the topic)
SCRIPTS_DIR=vnncomp_scripts

Edit: ERAN should be run on a GPU instance

mnmueller avatar Jun 28 '21 18:06 mnmueller

@mnmueller great thanks. Were you able to test if / how to get Gurobi working on AWS?

stanleybak avatar Jun 28 '21 19:06 stanleybak

@stanleybak Yes. If you follow the README.md, or uncomment the following block in install_tool_user.sh and copy a license key into the "###" placeholder, everything should work correctly.

#cd bin
#./grbgetkey #################### < ../../
#cd ../../../

mnmueller avatar Jun 28 '21 20:06 mnmueller

The submission for DNNF:

TOOL_NAME=DNNF
REPO=https://github.com/dlshriver/DNNF.git 
COMMIT=e2dafcc0017bdd555a777e8f6ae96d0af5813bfb
SCRIPTS_DIR=scripts/vnncomp

The README for the tool submission is here: https://github.com/dlshriver/DNNF/blob/e2dafcc0017bdd555a777e8f6ae96d0af5813bfb/scripts/vnncomp/README.md

dlshriver avatar Jun 28 '21 21:06 dlshriver

For Debona:

TOOL_NAME=Debona
REPO=https://github.com/ChristopherBrix/Debona
COMMIT=792575d18bb5f83cb8699dda6b9097dc41438e3d
SCRIPTS_DIR=Debona


Thank you for organizing this competition!

ChristopherBrix avatar Jun 28 '21 22:06 ChristopherBrix

TOOL_NAME=nnv
REPO=https://github.com/verivital/nnv.git
COMMIT=de4b327fdf112888a03a0bd51c4f4854e1b8f53b
SCRIPTS_DIR=nnv/code/nnv/examples/Submission/VNN_COMP2021/vnncomp_scripts

Repo: https://github.com/verivital/nnv.git
Readme: https://github.com/verivital/nnv/tree/master/code/nnv/examples/Submission/VNN_COMP2021/vnncomp_scripts/README.md

Neelanjana314 avatar Jun 29 '21 03:06 Neelanjana314

Here is the entry for venus2. Thank you for the organisation of the competition @stanleybak.

TOOL_NAME=venus2
REPO=https://github.com/pkouvaros/venus2_vnncomp21
COMMIT=c13f9bf486a5eaf82a9193836bc09d8e862c48f4
SCRIPTS_DIR=vnncomp_scripts

pkouvaros avatar Jun 29 '21 05:06 pkouvaros

For Marabou

TOOL_NAME=Marabou
REPO=https://github.com/anwu1219/Marabou_private.git
COMMIT=32bc82e785c570523c0af0a0e6e2b77c7e89986f
SCRIPTS_DIR=vnn-comp-scripts

Repo: https://github.com/anwu1219/Marabou_private.git
Readme: https://github.com/anwu1219/Marabou_private/blob/vnn-comp-21/README.md

wu-haoze avatar Jun 29 '21 05:06 wu-haoze

For RPM

TOOL_NAME=RPM
REPO=https://github.com/StanfordMSL/Neural-Network-Reach.git
COMMIT=021a811153ae744bdbc49726809bf5670d9f33a2
SCRIPTS_DIR=vnncomp_scripts

Joe-Vincent avatar Jun 29 '21 07:06 Joe-Vincent

Here is our entry for alpha,beta-CROWN:

TOOL_NAME=alpha-beta-CROWN
REPO=https://github.com/huanzhang12/alpha-beta-CROWN
COMMIT=8144c10a4aa2c182e9556cc302c6654bbf9cbfc3
SCRIPTS_DIR=vnncomp_scripts

Our tool should be run on a GPU instance with Amazon Deep Learning AMI 46.0 (Ubuntu 18.04), and it should run all benchmarks without errors/crashes. It requires a Gurobi license. If you encounter any issues please let us know. Thank you!

huanzhang12 avatar Jun 29 '21 11:06 huanzhang12

Ok, the list of 12 tools is as follows:

alpha-beta-CROWN
Debona
DNNF
eran_vnncomp2021
Marabou_private
Neural-Network-Reach
NeuralVerification.jl
nnenum
nnv
oval-bab
venus2_vnncomp21
VeriNet

Let me know if I missed anyone.

stanleybak avatar Jun 29 '21 14:06 stanleybak

@Wei-TianHao Should NeuralVerification.jl be run on a CPU instance or GPU instance? Any custom installation instructions?

stanleybak avatar Jul 01 '21 16:07 stanleybak

It should be run on a CPU instance. The installation is fully automated.

Wei-TianHao avatar Jul 01 '21 16:07 Wei-TianHao

An issue came up for which I'm going to need further input from everyone. The Amazon Deep Learning Base AMI we're using comes with standard deep learning frameworks like tensorflow and pytorch installed, but you still need to select which one using conda:

Please use one of the following commands to start the required environment with the framework of your choice:
for AWS MX 1.7 (+Keras2) with Python3 (CUDA 10.1 and Intel MKL-DNN): source activate mxnet_p36
for AWS MX 1.8 (+Keras2) with Python3 (CUDA + and Intel MKL-DNN): source activate mxnet_latest_p37
for AWS MX (+AWS Neuron) with Python3: source activate aws_neuron_mxnet_p36
for AWS MX (+Amazon Elastic Inference) with Python3: source activate amazonei_mxnet_p36
for TensorFlow (+Keras2) with Python3 (CUDA + and Intel MKL-DNN): source activate tensorflow_p37
for TensorFlow (+AWS Neuron) with Python3: source activate aws_neuron_tensorflow_p36
for TensorFlow 2 (+Keras2) with Python3 (CUDA 10.1 and Intel MKL-DNN): source activate tensorflow2_p36
for TensorFlow 2.3 with Python3.7 (CUDA + and Intel MKL-DNN): source activate tensorflow2_latest_p37
for PyTorch 1.4 with Python3 (CUDA 10.1 and Intel MKL): source activate pytorch_p36
for PyTorch 1.7.1 with Python3.7 (CUDA 11.1 and Intel MKL): source activate pytorch_latest_p37
for PyTorch (+AWS Neuron) with Python3: source activate aws_neuron_pytorch_p36
for base Python3 (CUDA 10.0): source activate python3

If you are using one of these frameworks, could you specify your tool name and which framework I should activate?

Alternatively, if you need multiple frameworks, I don't think this is going to work, since you have to select a single one (I ran into this with ERAN, @mnmueller, where I needed both pytorch and tensorflow). My idea in that case is to just install the frameworks directly (let me know if you have a better idea). If you don't want me to use the conda environments they provide, please specify your tool name and the commands I should use to install the necessary frameworks.

If you don't need any frameworks you don't need to provide further information.

stanleybak avatar Jul 01 '21 16:07 stanleybak

@stanleybak Our script for alpha-beta-CROWN handles the selection of frameworks, so there is no need to activate one manually. Let us know if you encounter any trouble running our code. Thank you!

Generally, using multiple frameworks (tensorflow+pytorch like the case in ERAN) should be fine as long as people provide their requirements.txt listing all required Python packages (e.g., both tensorflow and pytorch can be listed in requirements.txt), and in the initial setup you can just install all required packages via python -m pip install -r requirements.txt (if that's not already included in install_tool.sh) in any python environment (either any of the preinstalled ones, or the base/vanilla one). This will install any missing packages listed in requirements.txt (if a package is already provided by the environment, it will not be reinstalled).

For example, in a PyTorch environment (like source activate pytorch_latest_p37) you can install tensorflow and all other required packages automatically using the python -m pip install -r requirements.txt command, and in all following experiments you just need to activate this single environment.
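The flow described above can be sketched as follows. The requirements.txt contents here are placeholders (a real tool would ship its own file listing e.g. both torch and tensorflow), and only already-installed packages are listed so the command succeeds without touching the network.

```shell
# Write a placeholder requirements.txt; a real tool's file would list
# all of its Python dependencies (torch, tensorflow, onnx, ...).
cat > requirements.txt <<'EOF'
pip
EOF

# Inside the single activated environment (e.g. pytorch_latest_p37),
# install whatever is missing; packages the environment already provides
# are reported as satisfied rather than reinstalled.
python3 -m pip install -r requirements.txt
```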

huanzhang12 avatar Jul 01 '21 18:07 huanzhang12

While everything @huanzhang12 said is true, the ERAN install script was based on the vanilla python environment, and since that environment might be updated between when I tested things and when you initialize the instances, it is probably better to use a well-defined conda environment. I have updated the install instructions and install script to include every step and to install everything in a custom conda environment:

TOOL_NAME=ERAN
REPO=https://github.com/mnmueller/eran_vnncomp2021.git
COMMIT=808bfa4a1d3660c7e161ab1550f90392c9fdd2ee
SCRIPTS_DIR=vnncomp_scripts

mnmueller avatar Jul 01 '21 19:07 mnmueller

For DNNF, our install_tool.sh script should take care of creating a python virtual environment and installing the required frameworks, and the other scripts should automatically activate this virtual environment, so there shouldn't be a need to manually activate a conda environment.
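The pattern described here — create a virtual environment once at install time, then re-activate it in every later script — might look like the following sketch (assumed for illustration; these are not DNNF's actual scripts, and `toolenv` is a made-up name):

```shell
# install_tool.sh (sketch): create the environment once.
# --without-pip keeps the sketch self-contained even where ensurepip
# is unavailable; a real script would install pip and dependencies too.
python3 -m venv --without-pip toolenv

# prepare_instance.sh / run_instance.sh (sketch): re-activate it, so the
# tool always runs with the interpreter inside toolenv.
. toolenv/bin/activate
python3 -c 'import sys; print(sys.prefix)'   # now points inside toolenv
```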

dlshriver avatar Jul 01 '21 21:07 dlshriver

For VeriNet the install_tool.sh script should install all necessary requirements.

pat676 avatar Jul 02 '21 08:07 pat676

The install_tool.sh script for the oval framework installs all the necessary requirements into a new conda environment, which is then used by the prepare_instance.sh and run_instance.sh scripts. No additional commands should be required.

alessandrodepalma avatar Jul 02 '21 11:07 alessandrodepalma

@huanzhang12 I'm getting an error during install after entering the Gurobi key:

./vnncomp_scripts/install_tool.sh: line 77: /home/ubuntu/anaconda3/envs/pytorch_latest_p37/bin/grbgetkey: No such file or directory

Please modify the install script and provide a new commit hash.
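A guard along these lines (an editorial sketch, not part of the actual install script) would turn the missing-binary failure into a clear diagnostic instead of a "No such file or directory" error:

```shell
# Check that grbgetkey is actually on the PATH before calling it.
if command -v grbgetkey >/dev/null 2>&1; then
    echo "grbgetkey found at: $(command -v grbgetkey)"
else
    echo "error: grbgetkey not on PATH; install the Gurobi binaries first" >&2
fi
```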

stanleybak avatar Jul 02 '21 14:07 stanleybak