GraspFusion: Realizing Complex Motion by Learning and Fusing Grasp Modalities with Instance Segmentation (ICRA2019)
Shun Hasegawa*, Kentaro Wada*, Shingo Kitagawa, Yuto Uchimi, Kei Okada, Masayuki Inaba (* equal contribution)
- Install the whole jsk_apc repository, then build this package:
catkin build grasp_fusion
Full installation commands, just in case:
# create catkin workspace
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/src
git clone https://github.com/start-jsk/jsk_apc.git
rosdep install --from-path . -i -y -r
sudo -H pip install cupy-cuda101  # for CUDA 10.1; use cupy-cuda92 for CUDA 9.2
cd ~/catkin_ws
source /opt/ros/kinetic/setup.zsh
catkin build grasp_fusion --no-deps
# create catkin workspace (this variant also fetches and builds the source dependencies)
mkdir -p ~/catkin_ws/src
cd ~/catkin_ws/src
git clone https://github.com/start-jsk/jsk_apc.git
wstool init
cat jsk_apc/.travis.rosinstall >> .rosinstall
cat jsk_apc/.travis.rosinstall.kinetic >> .rosinstall
wstool update -j -1
rosdep install --from-path . -i -y -r
sudo -H pip install cupy-cuda101  # for CUDA 10.1; use cupy-cuda92 for CUDA 9.2
cd ~/catkin_ws
source /opt/ros/kinetic/setup.zsh
catkin build grasp_fusion
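After either build, a quick sanity check is to source the workspace and confirm that ROS can resolve the package and that CuPy sees a GPU (a minimal sketch; the workspace path and CUDA wheel follow the commands above):
source ~/catkin_ws/devel/setup.zsh  # or setup.bash, matching your shell
rospack find grasp_fusion
python -c 'import cupy; print(cupy.cuda.runtime.getDeviceCount())'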
# item data (originally provided for the Amazon Picking Challenge)
cd examples/grasp_fusion/instance_segmentation
./view_item_data_all.py
# training data (synthetic), testing data (real)
cd examples/grasp_fusion/instance_segmentation
./view_dataset.py --split train
./view_dataset.py --split test
# object-class-agnostic instance segmentation with Mask R-CNN
cd examples/grasp_fusion/instance_segmentation
./train.py --gpu 0
Figure: Visualization on Test Data (Top: ground truth, Bottom: prediction)
- Full log: https://drive.google.com/open?id=1LLYXlhqyliHJnJmN_BzN7HTTHFUiYqhL
- Model file: https://drive.google.com/uc?id=1rsXuIL-CAhBAzsJvZ2ZbGrbIOhb2dTGk
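The Google Drive links above can also be fetched from the command line with the gdown tool, which is not part of this repository (pip install gdown). A minimal sketch; the output filename is only illustrative, and the same pattern works for the affordance models below:
pip install gdown
gdown "https://drive.google.com/uc?id=1rsXuIL-CAhBAzsJvZ2ZbGrbIOhb2dTGk" -O instance_segmentation_model.npz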
cd examples/grasp_fusion/affordance_segmentation
./get_heightmaps.py
./view_pinch_dataset.py
./view_suction_dataset.py
cd examples/grasp_fusion/affordance_segmentation
./train.py pinch --modal rgb+depth --resolution 15 --gpu 0
Figure: Visualization on Test Data (Top: ground truth, Bottom: prediction)
- Full log: https://drive.google.com/open?id=1wwUwEirUZITgpehvSYsnVj3uBiF8qiyp
- Model file: https://drive.google.com/uc?id=16NNJHVGja4NEW1LYRDYedJ0baqR6JXyZ
cd examples/grasp_fusion/affordance_segmentation
./train.py suction --modal rgb+depth --gpu 0
Figure: Visualization on Test Data (Top: ground truth, Bottom: prediction)
- Full log: https://drive.google.com/open?id=1z39gPIf_yXUnc1Dwpt4IvbZc_tmLUPtm
- Model file: https://drive.google.com/uc?id=1wTrWCPP2IuPzk06XQzn9oLJKk9iS1H7O
cd examples/grasp_fusion/primitive_matching
./get_primitives_poses.py
Figure: Visualization of Primitives (Red: suction, Green: pinch, Yellow: fused suction-pinch)
rosrun grasp_fusion install_data.py
roslaunch grasp_fusion sample_instance_segmentation.launch
Figure: Predicted Instance Masks
roslaunch grasp_fusion sample_affordance_segmentation.launch affordance:=pinch
Figure: Predicted Pinch Points
roslaunch grasp_fusion sample_affordance_segmentation.launch affordance:=suction
Figure: Predicted Suction Points
roslaunch grasp_fusion sample_sole_affordance_segmentation.launch
Figure: Predicted Pinch Points Ignoring Inter-Object Relationship
roslaunch grasp_fusion sample_primitive_matching.launch
Figure: Predicted 3D Suction Points
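To inspect the images published by these sample launch files, rqt_image_view can be opened and the topic selected in its GUI (topic names depend on each launch file, so none are assumed here):
rosrun rqt_image_view rqt_image_view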
# Holistic Integration for Demonstration
roslaunch grasp_fusion baxter.launch
roslaunch grasp_fusion setup_for_stow.launch
roslaunch grasp_fusion stow.launch
# Experiments of Grasp Stability
roslaunch grasp_fusion baxter.launch
roslaunch grasp_fusion setup_for_stow.launch
roslaunch grasp_fusion stow.launch main:=false
roseus `rospack find grasp_fusion`/euslisp/eval-picking.l
# in Euslisp interpreter
(eval-picking-init :ctype :larm-head-controller :moveit t)
(eval-picking-mainloop :larm)
## See the warning messages and the source code for optional settings
@inproceedings{Hasegawa:etal:ICRA2019,
title={{GraspFusion}: Realizing Complex Motion by Learning and Fusing Grasp Modalities with Instance Segmentation},
author={Shun Hasegawa and Kentaro Wada and Shingo Kitagawa and Yuto Uchimi and Kei Okada and Masayuki Inaba},
booktitle={{Proceedings of the IEEE International Conference on Robotics and Automation (ICRA)}},
year={2019},
}
make install # Python3
# make install2 # Python2
source .anaconda/bin/activate
python -c 'import grasp_fusion_lib'
make lint
make test
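If the import check above resolves to an unexpected copy of the library (e.g. from another environment on the PATH), printing the module path makes that easy to spot; a minimal sketch using only the standard __file__ attribute:
source .anaconda/bin/activate
python -c 'import grasp_fusion_lib; print(grasp_fusion_lib.__file__)'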