
DeepCAD

This repository provides source code for our paper:

DeepCAD: A Deep Generative Network for Computer-Aided Design Models

Rundi Wu, Chang Xiao, Changxi Zheng

ICCV 2021 (camera ready version coming soon)

We also release the Onshape CAD data parsing scripts here: onshape-cad-parser.

Prerequisites

  • Linux
  • NVIDIA GPU + CUDA CuDNN
  • Python 3.7, PyTorch 1.5+

Dependencies

Install python package dependencies through pip:

$ pip install -r requirements.txt

Install pythonocc (OpenCASCADE) with conda:

$ conda install -c conda-forge pythonocc-core=7.5.1
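
To quickly sanity-check the install, importing any OCC.Core module works, for example:

$ python -c "from OCC.Core.gp import gp_Pnt; print('pythonocc OK')"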

Data

Download the data from here (backup) and extract it under the data folder.

  • cad_json contains the original JSON files parsed from Onshape; each file describes a CAD construction sequence.
  • cad_vec contains our vectorized representation of the CAD sequences, which enables fast data loading (see the loading sketch after this list). They can also be obtained using dataset/json2vec.py. TBA.
  • Some of the evaluation metrics we use require ground-truth point clouds. To generate them, run:
    $ cd dataset
    $ python json2pc.py --only_test
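
As a rough sketch of how a vectorized sequence can be inspected with h5py (the file path and the dataset name "vec" below are assumptions; list the keys of an actual file to confirm its layout):

import h5py
import numpy as np

# Hypothetical example path; pick any file from data/cad_vec.
path = "data/cad_vec/0000/00000007.h5"

with h5py.File(path, "r") as fp:
    print(list(fp.keys()))         # inspect the stored datasets
    vec = np.array(fp["vec"])      # "vec" is an assumed dataset name; adjust to what keys() reports
print(vec.shape)                   # (sequence length, per-command vector size)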

The data we use is parsed from Onshape public documents, with links taken from the ABC dataset. We also release our parsing scripts here for anyone interested in parsing their own data.

Training

See all hyper-parameters and configurations under the config folder. To train the autoencoder:

$ python train.py --exp_name newDeepCAD -g 0

For random generation, further train a latent GAN:

# encode all data to latent space
$ python test.py --exp_name newDeepCAD --mode enc --ckpt 1000 -g 0

# train latent GAN (wgan-gp)
$ python lgan.py --exp_name newDeepCAD --ae_ckpt 1000 -g 0
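
For reference, the gradient-penalty term that characterizes WGAN-GP has roughly the following shape in PyTorch. This is a minimal, generic sketch of the technique, not the actual code in lgan.py:

import torch

def gradient_penalty(critic, real_z, fake_z):
    # Interpolate between real and fake latent codes.
    alpha = torch.rand(real_z.size(0), 1, device=real_z.device)
    interp = (alpha * real_z + (1 - alpha) * fake_z).requires_grad_(True)
    score = critic(interp)
    grads = torch.autograd.grad(outputs=score, inputs=interp,
                                grad_outputs=torch.ones_like(score),
                                create_graph=True)[0]
    # Penalize deviation of the gradient norm from 1.
    return ((grads.norm(2, dim=1) - 1) ** 2).mean()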

The trained models and experiment logs will be saved in proj_log/newDeepCAD/ by default.

Testing and Evaluation

Autoencoding

After training the autoencoder, run the model to reconstruct all test data:

$ python test.py --exp_name newDeepCAD --mode rec --ckpt 1000 -g 0

The results will be saved in proj_log/newDeepCAD/results/test_1000 by default, in h5 format (each CAD sequence stored in the vectorized representation).
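
To peek at a result file, any h5 viewer works; a minimal h5py sketch (the file name below is hypothetical, so adjust it to one actually produced by test.py):

import h5py

path = "proj_log/newDeepCAD/results/test_1000/00000000.h5"  # hypothetical file name

with h5py.File(path, "r") as fp:
    for key in fp.keys():
        print(key, fp[key].shape, fp[key].dtype)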

To evaluate the results:

$ cd evaluation
# for command accuracy and parameter accuracy
$ python evaluate_ae_acc.py --src ../proj_log/newDeepCAD/results/test_1000
# for chamfer distance and invalid ratio
$ python evaluate_ae_cd.py --src ../proj_log/newDeepCAD/results/test_1000 --parallel
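
For context, the symmetric Chamfer distance compares each point against its nearest neighbor in the other cloud. A minimal numpy illustration of the metric itself follows; the repository's evaluate_ae_cd.py may sample, scale, or normalize differently:

import numpy as np

def chamfer_distance(p, q):
    # p: (N, 3), q: (M, 3) point clouds.
    d = np.sum((p[:, None, :] - q[None, :, :]) ** 2, axis=-1)  # (N, M) squared distances
    return d.min(axis=1).mean() + d.min(axis=0).mean()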

Random Generation

After training the latent GAN, run the latent GAN and the autoencoder for random generation:

# run latent GAN to generate fake latent vectors
$ python lgan.py --exp_name newDeepCAD --ae_ckpt 1000 --ckpt 200000 --test --n_samples 9000 -g 0

# run the autoencoder to decode into final CAD sequences
$ python test.py --exp_name newDeepCAD --mode dec --ckpt 1000 --z_path proj_log/newDeepCAD/lgan_1000/results/fake_z_ckpt200000_num9000.h5 -g 0

The results will be saved in proj_log/newDeepCAD/lgan_1000/results by default.

To evaluate the results with COV, MMD, and JSD:

$ cd evaluation
$ sh run_eval_gen.sh ../proj_log/newDeepCAD/lgan_1000/results/fake_z_ckpt200000_num9000_dec 1000 0

The script run_eval_gen.sh combines collect_gen_pc.py and evaluate_gen_torch.py. You can also run these two files individually with specified arguments.

Pre-trained models

Download the pretrained model from here (backup) and extract it under proj_log. All testing commands can then be executed directly by specifying --exp_name=pretrained where needed.
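
For example, reconstructing the test set with the released weights would look like the command below (the checkpoint id 1000 is the one used throughout this README; adjust it to whatever checkpoint ships with the download):

$ python test.py --exp_name pretrained --mode rec --ckpt 1000 -g 0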

Visualization and Export

We provide scripts to visualize CAD models and export the results to .step files, which can be loaded by almost all modern CAD software.

$ cd utils
$ python show.py --src {source folder} # visualize with opencascade
$ python export2step.py --src {source folder} # export to step format
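
If you want to load an exported .step file back into Python yourself, a minimal pythonocc sketch (independent of the scripts above; the file path is hypothetical) looks like:

from OCC.Core.STEPControl import STEPControl_Reader
from OCC.Display.SimpleGui import init_display

reader = STEPControl_Reader()
reader.ReadFile("model.step")   # hypothetical path to an exported file
reader.TransferRoots()          # translate all roots into OCC shapes
shape = reader.OneShape()       # merged shape for display

display, start_display, _, _ = init_display()
display.DisplayShape(shape, update=True)
start_display()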

A script to create the CAD modeling sequence in Onshape from generated outputs: TBA.

Acknowledgement

We would like to thank and acknowledge the referenced code from DeepSVG, latent 3d points and PointFlow.

Cite

Please cite our work if you find it useful:

@InProceedings{Wu_2021_ICCV,
    author    = {Wu, Rundi and Xiao, Chang and Zheng, Changxi},
    title     = {DeepCAD: A Deep Generative Network for Computer-Aided Design Models},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2021},
    pages     = {6772-6782}
}
