
Wheelchair-Arm-Control

A robot arm control library providing instruction interpretation, object recognition, and arm motion planning and execution.

Video demonstration

https://youtu.be/ml04Gt5j70s

Environment Setup

Ubuntu 18.04.1 LTS
Python 2.7
ROS Melodic
ros_control
MoveIt!
gazebo_ros_pkgs
PointNet
pyOpenSSL
MQTT
TensorFlow 1.0
MeshLab

How to run

  1. Clone into your ROS catkin workspace and build the package:
cd ../catkin_ws/src
git clone https://github.com/scottlx/Wheelchair-Arm-Control.git
cd ..
catkin_make
  2. Source the workspace setup file:
source ../catkin_ws/devel/setup.bash

Overview

my_arm

URDF files, mesh files, an RViz model-visualization launch file, and a Gazebo launch file.

Visualize the arm model in RViz:

roslaunch my_arm display.launch

Spawn the model into Gazebo:

roslaunch my_arm gazebo.launch
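The launch file is the normal way to spawn the model, but Gazebo's standard spawn service can also be called directly. Below is a minimal sketch, not part of this package; the model name and URDF path are hypothetical:

```python
#!/usr/bin/env python
# Minimal sketch: spawn a URDF into Gazebo via the standard
# /gazebo/spawn_urdf_model service. The model name and URDF path
# are hypothetical; point them at this package's files.
import rospy
from gazebo_msgs.srv import SpawnModel
from geometry_msgs.msg import Pose

rospy.init_node('spawn_arm')
rospy.wait_for_service('/gazebo/spawn_urdf_model')
spawn = rospy.ServiceProxy('/gazebo/spawn_urdf_model', SpawnModel)

with open('my_arm.urdf') as f:  # hypothetical path to the arm URDF
    urdf_xml = f.read()

pose = Pose()  # identity pose at the world origin

spawn(model_name='my_arm',
      model_xml=urdf_xml,
      robot_namespace='/',
      initial_pose=pose,
      reference_frame='world')
```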

arm_moveit_config

Configuration files generated by the MoveIt! Setup Assistant.

Try the MotionPlanning plugin in RViz:

roslaunch arm_moveit_config execution.launch
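The same planning group can also be driven programmatically through the standard moveit_commander Python API. A minimal sketch, assuming the planning group is named 'arm' (check the actual group name generated in arm_moveit_config):

```python
#!/usr/bin/env python
# Illustrative sketch using the standard moveit_commander API.
# The planning group name 'arm' is an assumption; use the name
# defined by the MoveIt! Setup Assistant in arm_moveit_config.
import sys
import rospy
import moveit_commander

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node('arm_plan_demo')

group = moveit_commander.MoveGroupCommander('arm')

# Plan and execute a motion to a random reachable joint target.
group.set_random_target()
group.go(wait=True)
group.stop()
group.clear_pose_targets()
```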


python_interface

A Python execution script that integrates the whole system:

python MQTT_sub.py
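MQTT_sub.py listens for commands arriving over MQTT. Below is a minimal sketch of that subscriber pattern using the paho-mqtt client; the endpoint, topic name, and certificate paths are placeholders, not this repo's actual values:

```python
#!/usr/bin/env python
# Minimal sketch of an MQTT subscriber in the style of MQTT_sub.py.
# Endpoint, topic, and certificate paths below are placeholders.
import ssl
import paho.mqtt.client as mqtt

def on_connect(client, userdata, flags, rc):
    print('connected with result code %d' % rc)
    client.subscribe('arm/command')  # hypothetical topic name

def on_message(client, userdata, msg):
    # Hand the voice-command payload off to the motion pipeline here.
    print('%s: %s' % (msg.topic, msg.payload))

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
# AWS IoT requires TLS with the device certificate and private key.
client.tls_set(ca_certs='root-CA.crt',
               certfile='device.cert.pem',
               keyfile='device.private.key',
               tls_version=ssl.PROTOCOL_TLSv1_2)
client.connect('YOUR-ENDPOINT.iot.us-east-1.amazonaws.com', 8883, 60)
client.loop_forever()
```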

Object Recognition using PointNet

  1. Place a Kinect in the Gazebo environment.

  2. Get the raw point cloud data and preprocess it: separate the data into small batches, normalize each batch, etc. (a sketch of this step follows the list).

  3. Feed the preprocessed point cloud data into PointNet, which labels every point; the labeled cloud can be visualized by coloring points per class.

  4. Get the location of the object of interest from the labels of its points.
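To make step 2 concrete, here is a minimal preprocessing sketch: it splits the raw cloud into fixed-size batches and normalizes each batch into a unit sphere. The batch size of 1024 points is an assumption, not necessarily this repo's value:

```python
# Sketch of the preprocessing in step 2: split the raw cloud into
# fixed-size batches and normalize each batch into a unit sphere.
# The batch size (1024 points) is an assumption.
import numpy as np

def preprocess(points, batch_size=1024):
    """points: (N, 3) raw Kinect points -> (B, batch_size, 3) batches."""
    n_batches = len(points) // batch_size
    batches = points[:n_batches * batch_size].reshape(-1, batch_size, 3)
    # Center each batch and scale it to fit in a unit sphere,
    # the input range PointNet is typically trained on.
    batches = batches - batches.mean(axis=1, keepdims=True)
    scale = np.linalg.norm(batches, axis=2).max(axis=1)
    return batches / scale[:, None, None]
```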

Alexa

  1. Go to the Alexa developer console and create a new skill: https://developer.amazon.com/alexa/console/ask

  2. Go to Amazon Web Services and create a new Lambda function and IoT service: https://console.aws.amazon.com/console/home?region=us-east-1#

  3. Use Alexa_skill.json to deploy your new skill.

  4. Upload Lambda_arm_control.zip to deploy your Lambda function (a sketch of the publish step it performs follows this list).

  5. Connect the three parts together; you can now see the topic published in the AWS IoT MQTT client whenever you give a voice command to Alexa.
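For reference, the core of the Lambda function is a publish to an AWS IoT topic. A minimal sketch using boto3; the topic name and payload format are assumptions, and the real skill-handling logic lives in Lambda_arm_control.zip:

```python
# Minimal sketch of the Lambda-side publish to AWS IoT using boto3.
# Topic name and payload format are assumptions.
import json
import boto3

iot = boto3.client('iot-data', region_name='us-east-1')

def lambda_handler(event, context):
    # Extract the spoken command from the Alexa request (simplified).
    command = event['request']['intent']['name']
    iot.publish(topic='arm/command',  # hypothetical topic
                qos=0,
                payload=json.dumps({'command': command}))
    return {'version': '1.0',
            'response': {'outputSpeech': {'type': 'PlainText',
                                          'text': 'Command sent.'}}}
```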

Goals to achieve

  1. Apply the system to a real robotic arm.
