diff --git a/README.md b/README.md
index 8534112..6f269b5 100644
--- a/README.md
+++ b/README.md
@@ -3,14 +3,6 @@
 - **Project member**: Chanyoung Ahn, [Taeyeoung lee](https://github.com/sen-lee)
 - **Project Result**: [[Video]](https://youtu.be/xjTqSPl0Xxc), [[Image]](https://github.com/cold-young/CS492_Intelligence_Contest/tree/chanyoung/contact_grasp/imgs)
 
-<<<<<<< HEAD
-# CS492_Intelligence_Contest
-#### forked from Deep Grasping ROS (Seunghyeok Back)
-
-- ROS wrapper for DNN based robotic grasping algorithms
-- Support Contact-GraspNet [[paper]](https://arxiv.org/abs/2103.14127) [[code]](https://github.com/NVlabs/contact_graspnet)
-
-=======
 ROS wrapper for DNN based robotic grasping algorithms \
 Support Contact-GraspNet [[paper]](https://arxiv.org/abs/2103.14127) [[code]](https://github.com/NVlabs/contact_graspnet)
 
@@ -18,7 +10,6 @@ Support Contact-GraspNet [[paper]](https://arxiv.org/abs/2103.14127) [[code]](ht
 
->>>>>>> chanyoung/contact_grasp
 
 ## Contact-GraspNet Framework
 ```
 Deep_grasping/src
@@ -28,19 +19,6 @@ Contact_Graspnet_server.py (for ROS)
 - Contact_graspnet_client.py (for ROS)
 /uoais
 - launch/uoais_rs_d435.launch (need to modify the namespace)
-<<<<<<< HEAD
-
-```
-## Env setup
-### 1. Install Repository (CS492_Intelligence_contest/src)
-1. git clone 'CS_492_intelligence_contest' or 'gist-ailab/deep-grasping'
-2. 'cd src', git clone 'contact_graspnet','uoais'
-```
-git clone https://github.com/SeungBack/contact_graspnet.git
-git clone https://github.com/gist-ailab/uoais.git
-```
-+ Enviroment setup(1~7) + Test ROS nodes 1.Realsense D-435 (the namespace will probably need to be changed later)
-=======
-(TOBO) uoais_client.py
 (TODO) namespace
 ```
@@ -53,7 +31,6 @@ git clone https://github.com/cold-young/contact_graspnet.git
 git clone https://github.com/cold-young/uoais.git
 ```
 + Environment setup (steps 1-7) + Test "ROS nodes 1. Realsense D-435" (the namespace will probably need to be changed later)
->>>>>>> chanyoung/contact_grasp
 - (https://github.com/gist-ailab/uoais#environment-setup)
 3. Install `easy_tcp_python2_3`, `open3d-ros-helper` (use a conda env!)
 - https://github.com/SeungBack/open3d-ros-helper (py2.7 env, install with pip2)
@@ -64,11 +41,7 @@ git clone https://github.com/cold-young/uoais.git
 1. contact_graspnet_env (for contact_graspnet_client.py)
 - python 3.7, tensorflow 2.2, CUDA 10.1, cuDNN 7.6.0
 - yml file location: https://github.com/SeungBack/contact_graspnet
-<<<<<<< HEAD
-- Install : easy_tcp_python2_3,
-=======
 - Install: easy_tcp_python2_3
->>>>>>> chanyoung/contact_grasp
 ```
 conda env create -f contact_graspnet_env.yml
@@ -78,44 +51,6 @@ pip install easy-tcp-python2-3
 ```
 
-<<<<<<< HEAD
-2. uoais(for uoais.py)
- - python 3.7, pytorch 1.9.0, torchvision 0.9.0, CUDA 11.1(or 10.1), detectron2 v0.5 or v0.6
-```
-conda create -n uoais python=3.7
-conda activate uoais
-pip install torch torchvision
-pip install shapely torchfile opencv-python pyfastnoisesimd rapidfuzz
-```
-
-- install detectron2...
- (https://detectron2.readthedocs.io/en/latest/tutorials/install.html#requirements)
-
-```
-python -m pip install 'git+https://github.com/facebookresearch/detectron2.git'
-```
-(add --user if you don't have permission)
-
-Or, to install it from a local clone:
-```git clone https://github.com/facebookresearch/detectron2.git
-python -m pip install -e detectron2
-```
-
-## RUN
-### Robot
-- Use Haetae2025. If you use RViz another machine, you have to run 'pick_and_place_demo/demo.launch' on haetae2025
-```
-export ROS_MASTER_URI=http://192.168.0.100:11311
-export ROS_HOST_URI=http://192.168.0.24 ??????
-sudo route add -net 192.168.10.10 netmask 255.255.255.255 gw 192.168.0.100
-
-source ./robot.sh real manip (or source ./robot_zsh.sh real manip)
-roslaunch pick_and_place_demo demo.launch sim:=false perception_src:=none
-```
-
-### RViz
-- If you want to use 'wrist_camera' topics, have to run 'demo.launch'
-=======
 2. uoais (for uoais.py: working)
 - Environment: python 3.7, pytorch 1.8.0+cu111, torchvision 0.9.0+cu111, CUDA 11.1, detectron2 v0.6+cu111+torch 1.8.0
 ```
@@ -162,7 +97,6 @@ bash file ....
 ### RViz (after confirming the Haetae ROS connection)
 - If you want to use 'wrist_camera' topics, you have to run 'demo.launch'
->>>>>>> chanyoung/contact_grasp
 ```
 export ROS_MASTER_URI=http://192.168.0.100:11311
 source navigation_ws/devel/setup.bash
 rosrun rviz rviz
@@ -175,19 +109,12 @@ rosrun rviz rviz
 roscd deep_grasping && python src/contact_grasp_server.py
 ```
 
-<<<<<<< HEAD
-
-### Contact graspnet client
-1. contact graps client.py
-- python == 3.7 please create conda env
-=======
 ### Contact-GraspNet client
 #### **1. contact_grasp_client.py**
 - python == 3.7, please create a conda env (OK)
->>>>>>> chanyoung/contact_grasp
 ```
 conda activate contact_graspnet_env \
-&& roscd deep_grasping/src/contact_graspnet \
+&& roscd deep_grasping_ros/src/contact_graspnet \
 && CUDA_VISIBLE_DEVICES=0 python contact_graspnet/contact_grasp_client.py --local_regions --filter_grasps
 ```
 - Check grasp pose, orientation
@@ -195,18 +122,11 @@ conda activate contact_graspnet_env \
 rosservice call /get_target_grasp_pose
 ```
 
-<<<<<<< HEAD
-2. uoais client.py
-=======
 #### **2. uoais_client.py (the example works)**
->>>>>>> chanyoung/contact_grasp
 ```
 conda activate uoais \
 && roscd deep_grasping/src/uoais \
 && CUDA_VISIBLE_DEVICES=1 python demo/uoais_client.py
-<<<<<<< HEAD
-```
-=======
 ```
 
 We must make 'uoais_client.py' or other files \
@@ -261,4 +181,3 @@ exit()
 conda activate uoais
 roslaunch uoais uoais_rs_d435.launch
 ```
->>>>>>> chanyoung/contact_grasp
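
A usage note on the `/get_target_grasp_pose` check in the README above: the same request can be scripted from a small rospy node instead of the `rosservice` CLI. The sketch below is only illustrative; the service name comes from this README, while the service class `GetTargetGraspPose` and its import path are placeholders for whatever `.srv` type the deep_grasping / deep_grasping_ros package actually defines.

```python
#!/usr/bin/env python
# Minimal sketch: query the grasp server from a ROS node instead of the CLI.
# The service name `/get_target_grasp_pose` comes from the README above;
# `GetTargetGraspPose` is a hypothetical class name -- import the actual .srv
# type defined in the deep_grasping (or deep_grasping_ros) package.
import rospy
from deep_grasping_ros.srv import GetTargetGraspPose  # hypothetical service type

def request_grasp():
    # Block until contact_grasp_server.py has advertised the service.
    rospy.wait_for_service("/get_target_grasp_pose")
    get_grasp = rospy.ServiceProxy("/get_target_grasp_pose", GetTargetGraspPose)
    # Same effect as `rosservice call /get_target_grasp_pose`.
    response = get_grasp()
    rospy.loginfo("grasp response: %s", response)
    return response

if __name__ == "__main__":
    rospy.init_node("grasp_pose_probe")
    request_grasp()
```

Run it while `contact_grasp_server.py` is up; it should print the same response that `rosservice call /get_target_grasp_pose` returns, which makes it easier to check the grasp pose and orientation programmatically.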