"Inverse Procedural Modeling: from Sketches to Buildings"
Docker image available! Run inference, load parameters, and render, all in one button click:

- Pull the image (12 to 17 GB in size; we may be able to shrink it in the future):

  ```sh
  docker pull registry.cn-hangzhou.aliyuncs.com/sanbingyouyong/building_dag_st:1.0
  ```

- Run the container:

  ```sh
  docker run --rm -d -p 8502:8502 registry.cn-hangzhou.aliyuncs.com/sanbingyouyong/building_dag_st:1.0
  ```

  or, using the corresponding image id:

  ```sh
  docker run --rm -d -p 8502:8502 <image id>
  ```

- Go to http://localhost:8502/ for the Streamlit-based web page.
- Upload a sketch image (e.g. `./sample.png`).
- Click `Inference & Load Param & Render`.
- Wait for the rendered image and predicted parameters to show up; the process may consume about 10 GB of RAM (CPU-only for now; a GPU version may come with an NVIDIA container soon).
- To close the Docker container: run `docker ps` to find the container id, then `docker stop <container id>`.
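Once the container is running, a quick way to confirm the app is up is to hit the URL above; a minimal check (nothing here is specific to this repo):

```python
import urllib.request

# Request the Streamlit page served by the container (URL from this README);
# expect HTTP 200 once the app has finished starting up.
with urllib.request.urlopen("http://localhost:8502/") as resp:
    print(resp.status)
```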
Setup:

- Install the conda environment from `environment.yml` (e.g. `conda env create -f environment.yml`).
- Install Geometry Script if you wish to experiment with node tree generation from Python; a minimal sketch follows this list.
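If you do install Geometry Script, the workflow is decorator-based; below is a minimal, generic sketch (the `@tree` and `cube` names follow Geometry Script's documented API; this is not a script from this repo):

```python
# Run inside Blender's scripting environment with the Geometry Script
# add-on installed. Illustrative only.
from geometry_script import *

@tree("Toy Block")
def toy_block(size: Vector):
    # A single Cube node; the decorator builds a Geometry Nodes group
    # named "Toy Block", and `size` becomes a group input.
    return cube(size=size)
```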
File overview:

- Node tree generation scripts: `basic_building.py`, `building_mass.py`, `building4distortion.py`, `ledge.py`, `roof.py`, `shape_primitives.py`, `window_with_ledge.py`
- Dataset generation scripts: `dataset_counter.py`, `dataset_gen.py`, `distortion.py`, `generate_dataset.py`, `merge_dataset.py`, `paramgen.py`, `paramload.py`, `params.py`, `render.py`
- Neural network scripts: `nn_*.py`
- Evaluation scripts: `average_performances.py`, `nn_acc.py`, `performance.py`
- User interface scripts: `ui_*.py`
- Blender files: `dataset.blend` for generating synthetic datasets, `interface.blend` for the user interface, `distortion.blend` for rendering distorted sketches, `dataset_distortion.blend` for generating distortion datasets
- `./models/*`: model training output, including checkpoints, loss records, meta info for backup, loss curve visualizations, and a notes file
- `./datasets/*`: dataset directory containing the generated DAGDataset(s) and DAGDatasetDistorted(s)
- In the working directory: `results*.yml` contain model test outputs, `performance*.yml` contain model evaluation results, and `performance*.pdf` visualize model evaluation results; a quick way to inspect the YAML files is sketched after this list
- `./inference/*`: captured sketches and model output files
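For a quick look at the test and evaluation outputs, the YAML files can be loaded generically (a minimal sketch; the internal schema is whatever the repo's evaluation scripts wrote and is not assumed here):

```python
import glob
import yaml  # requires PyYAML

# Load and print any test/evaluation YAML files in the working directory.
for path in sorted(glob.glob("results*.yml") + glob.glob("performance*.yml")):
    with open(path) as f:
        print(path, "->", yaml.safe_load(f))
```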
Usage:

- Generating a dataset: run `dataset_gen.py` or `generate_dataset.py` with command-line args. Grammar: `python generate_dataset.py batch_num sample_num varying_params_num device distortion`; e.g. `python generate_dataset.py 10 10 5 0 0`. A sketch of how the positional arguments map to values follows below.
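  For reference, a minimal illustration of how the documented positional grammar maps to values (illustrative only; not the repo's actual argument handling):

  ```python
  import sys

  # usage: python generate_dataset.py batch_num sample_num varying_params_num device distortion
  # All five positional arguments are integers in the example invocation above.
  batch_num, sample_num, varying_params_num, device, distortion = (
      int(arg) for arg in sys.argv[1:6]
  )
  print(batch_num, sample_num, varying_params_num, device, distortion)
  ```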
- Neural network training: run `nn_driver.py`; modify the config in code as needed.
- User interface: open `interface.blend` with Blender 3.2, go to the `Scripting` tab, and run `ui_interface.py`; the panel should appear under the `Tool` section. For testing without a PyTorch installation and model weight files, switch the import from `ui_external_inference.py` to `ui_mock_inference.py`. Click the pencil icon to draw with Blender's annotation tool; toggle and adjust the camera view as needed. To run inference, make sure proper model weights are in `./models/EncDecModel.pth`, a corresponding meta file is in `./models/meta.yml`, and the `./inference` folder has been created (see the sketch below for a quick sanity check of these files).
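A quick sanity check that the inference prerequisites are in place (paths from this README; the checkpoint's exact contents depend on how training saved it and are not assumed here):

```python
import os

import torch
import yaml  # requires PyYAML

# Verify the files and folder the UI expects before running inference.
assert os.path.isdir("./inference"), "create the ./inference folder first"

with open("./models/meta.yml") as f:
    print("meta:", yaml.safe_load(f))

checkpoint = torch.load("./models/EncDecModel.pth", map_location="cpu")
print("checkpoint type:", type(checkpoint))
```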
- All data used for training and testing is generated by our own dataset generation pipeline.