Docker data processing (#1) #50
base: main
Conversation
* Add Dockerfile and script
* Change path in the volume
* Add folder for the output data
* Add docker file and fix paths to be compatible with python3
Hi @kacperkan! Thank you for your pull request and welcome to our community. We require contributors to sign our Contributor License Agreement, and we don't seem to have you on file. In order for us to review and merge your code, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA. If you have received this in error or have any questions, please contact us at [email protected]. Thanks!
Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Facebook open source project. Thanks!
* Add processing of all shapes
pybind11

RUN cd Pangolin \
    && git submodule init \
Need to add a line saying `&& git checkout v0.6 \` after cd'ing into the Pangolin directory. Running this Dockerfile with the newest Pangolin version causes some problems with GL libraries.
Also, hope this gets pulled in soon, but thanks a lot for this Dockerfile setup, it's super useful.
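For reference, a minimal sketch of what the pinned checkout could look like in the Dockerfile; the clone URL and the `git submodule update` step are assumptions beyond what the diff above shows:

```dockerfile
# Clone Pangolin and pin it to the v0.6 tag before initializing submodules;
# building the newest version reportedly causes problems with the GL libraries.
RUN git clone https://github.com/stevenlovegrove/Pangolin.git \
    && cd Pangolin \
    && git checkout v0.6 \
    && git submodule init \
    && git submodule update
```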
* Add Dockerfile and script
* Change path in the volume
* Add folder for the output data
* Add docker file and fix paths to be compatible with python3
The pull request includes a Dockerfile that allows generating the dataset provided in the DeepSDF work. It runs headlessly without any issues on NVIDIA cards.
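As a rough usage sketch (the image tag, mount path, and entry point here are assumptions, not the exact commands from the included script):

```sh
# Build the image from the repository root (the tag name is illustrative).
docker build -t deepsdf-preprocess .

# Run headlessly on an NVIDIA GPU; --gpus all requires the NVIDIA Container Toolkit.
# The data directory is bind-mounted so the generated samples land on the host.
docker run --rm --gpus all \
    -v "$(pwd)/data:/data" \
    deepsdf-preprocess
```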
I also added a .gitignore for common IDEs (mainly out of laziness, so I don't have to remember to exclude these files when committing the code).
I also fixed the `preprocess_data.py` file, where `os.path.join` is called both on a base path to the data and on a new path that already contains the class and the instance of the object. I don't know why, but running the original code created repeated paths (e.g. `data/ShapeNet/<class>/<instance>/data/ShapeNet ...`). I guess it is due to breaking changes in the `iglob` method that occurred at some point.
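To illustrate the kind of path duplication described above, here is a minimal, hypothetical sketch; the directory layout and the exact join order in `preprocess_data.py` are assumptions:

```python
import glob
import os

base = "data/ShapeNet"  # hypothetical base path to the data

# glob already yields paths that include the base prefix,
# e.g. "data/ShapeNet/<class>/<instance>".
for path in glob.iglob(os.path.join(base, "*", "*")):
    # Joining the base with such a path repeats the prefix:
    broken = os.path.join(base, path)   # "data/ShapeNet/data/ShapeNet/<class>/<instance>"

    # Joining only the part relative to the base avoids the repetition:
    rel = os.path.relpath(path, base)   # "<class>/<instance>"
    fixed = os.path.join(base, rel)     # "data/ShapeNet/<class>/<instance>"
```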