Gaussian Process Implicit Surface (GPIS) Implementation for Touch-GS: Visual-Tactile Supervised 3D Gaussian Splatting
git clone https://github.com/armlabstanford/GPIS.git
cd GPIS
# create conda environment
conda create -n GPIS python=3.8
conda activate GPIS
conda install pytorch torchvision torchaudio pytorch-cuda=11.8 cudatoolkit=11.8 -c pytorch -c nvidia
pip install -r requirements.txt
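To verify the environment, a quick sanity check (assuming the install above succeeded) is to confirm that PyTorch imports and can see the GPU:

```python
# Quick sanity check: run inside the GPIS conda environment.
import torch

print(torch.__version__)          # installed PyTorch version
print(torch.cuda.is_available())  # True if a CUDA-capable GPU is visible
```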
The dataset used in our paper is available on Google Drive. It includes the combined point cloud and normals from DenseTact for the real data; for the synthetic data we include the raw depth images rendered from Blender. Download and extract the data into the data folder (if it does not exist, create it with mkdir data).
├── data
│ ├── real_bunny
│ │ ├── ...
│ ├── syn_bunny
│ │ ├── ...
python real_scene.py {PARAM_FILE}
python syn_scene.py {PARAM_FILE}
python real_scene.py data/real_bunny/params_bunny.json
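The contents of {PARAM_FILE} are defined by real_scene.py and syn_scene.py; in practice, copy and edit one of the shipped parameter files (e.g. data/real_bunny/params_bunny.json). The snippet below is only a hypothetical sketch of generating such a file, and every key name in it is a placeholder, not the repo's actual schema:

```python
# Hypothetical parameter-file sketch. The real keys are whatever
# real_scene.py / syn_scene.py expect; these names are placeholders only.
import json

params = {
    "data_dir": "data/real_bunny",  # placeholder: where the input point cloud lives
    "output_dir": "output",         # placeholder: where depth/var maps are written
    "lengthscale": 0.05,            # placeholder: GP kernel lengthscale
    "noise": 1e-4,                  # placeholder: observation noise
}
with open("params_custom.json", "w") as f:
    json.dump(params, f, indent=2)
```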
Output is written to output/{depth, var} and includes both the GPIS-estimated depths and their uncertainties.
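For readers unfamiliar with GPIS, the following is a minimal conceptual sketch, not the repo's implementation: a Gaussian process is fit to signed-distance observations built from contact points and their normals, and the posterior mean and variance at query points give the estimated surface (depth) and its uncertainty. The toy sphere data, kernel choice, and lengthscale below are all illustrative assumptions.

```python
# Minimal GPIS sketch (illustrative only): fit a GP to signed-distance
# observations from a point cloud with normals, then query posterior
# mean (surface estimate) and variance (uncertainty) at new points.
import numpy as np

def rbf_kernel(A, B, lengthscale=0.05):
    # Squared-exponential kernel between point sets of shape (N, 3) and (M, 3).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

# Toy contact data: points on a unit sphere with outward normals.
rng = np.random.default_rng(0)
pts = rng.normal(size=(200, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
normals = pts.copy()

# Signed-distance observations: 0 on the surface, +eps just outside along the normal.
eps = 0.02
X = np.concatenate([pts, pts + eps * normals])
y = np.concatenate([np.zeros(len(pts)), eps * np.ones(len(pts))])

# GP posterior via Cholesky factorization of the kernel matrix.
noise = 1e-4
K = rbf_kernel(X, X) + noise * np.eye(len(X))
L = np.linalg.cholesky(K)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))

Xq = rng.uniform(-1.2, 1.2, size=(5, 3))              # arbitrary query points
Kq = rbf_kernel(Xq, X)
mean = Kq @ alpha                                       # estimated signed distance
v = np.linalg.solve(L, Kq.T)
var = rbf_kernel(Xq, Xq).diagonal() - (v**2).sum(0)     # predictive variance

print(mean)
print(var)
```

In the actual pipeline, such queries would be taken along camera rays to produce the per-pixel depth and uncertainty maps written to output/.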
@article{swann2024touchgs,
author = {Aiden Swann and Matthew Strong and Won Kyung Do and Gadiel Sznaier Camps and Mac Schwager and Monroe Kennedy III},
title = {Touch-GS: Visual-Tactile Supervised 3D Gaussian Splatting},
journal = {arXiv},
year = {2024},
}