
fls_model_standalone

Brian Bingham edited this page Mar 18, 2020 · 30 revisions

Tutorial: FLS Standalone Model

In this example we will demonstrate one aspect of the fls_gazebo ROS package, which includes a Gazebo model of a BlueView P900 forward looking sonar. See creating new models for general instructions on creating models.

The goal is to create a standalone instance of the forward looking sonar model currently used in uuv_simulator to support testing and evaluation.

Quickstart

To run the example, start Gazebo with...

roslaunch fls_gazebo blueview_standalone.launch 

and then start the image viewer with...

rosrun rqt_image_view rqt_image_view

and subscribe to /depth/image_raw_sonar.

Below is an outline of the steps taken to create this example.

Creating a ROS package and a new model

Create the model by creating a fls_gazebo/models/blueview_P900 directory with the following:

  • model.config
  • model.sdf
  • meshes/p900.dae
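A minimal model.config for this directory might look like the following; the author name, email, and description text are placeholders to fill in:

```xml
<?xml version="1.0"?>
<model>
  <name>BlueView P900</name>
  <version>1.0</version>
  <!-- Points Gazebo at the SDF description in this directory -->
  <sdf version="1.6">model.sdf</sdf>
  <author>
    <name>Your Name</name>
    <email>you@example.com</email>
  </author>
  <description>
    BlueView P900 forward looking sonar, using the meshes/p900.dae mesh.
  </description>
</model>
```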

Make sure you have the <export> tags completed in the package.xml file so that the models are available to Gazebo.
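One common way to do this (a sketch, assuming the standard gazebo_ros export mechanism) is to add the following inside the <export> tags of package.xml:

```xml
<export>
  <!-- Prepend this package's models directory to GAZEBO_MODEL_PATH
       when Gazebo is started through roslaunch -->
  <gazebo_ros gazebo_model_path="${prefix}/models"/>
</export>
```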

Run Gazebo with roslaunch so that Gazebo is aware of the models in the ROS package. (Just running gazebo doesn't add the correct model path.)

roslaunch gazebo_ros empty_world.launch 

You should now be able to graphically insert the model and see it in Gazebo. Go to the Insert tab in the GUI and navigate to the blueview_p900 model. If successful, you should see something like this... /tutorials/images/blueview_insert.png

Create a world file with the new model already in place

Following the basic outline of Tutorial: Using roslaunch to start Gazebo, world files and URDF models, create a launch file and a world file to facilitate loading our blueview_standalone.world file with roslaunch.

roslaunch fls_gazebo blueview_standalone.launch 

Also add a few objects within the field of view of the sonar.
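The world file can be sketched as follows; the target model name and the poses are illustrative, not the values used in fls_gazebo:

```xml
<?xml version="1.0"?>
<sdf version="1.6">
  <world name="blueview_standalone">
    <include>
      <uri>model://sun</uri>
    </include>
    <include>
      <uri>model://ground_plane</uri>
    </include>
    <!-- The sonar model from fls_gazebo/models -->
    <include>
      <uri>model://blueview_p900</uri>
      <pose>0 0 0.5 0 0 0</pose>
    </include>
    <!-- A simple target placed within the sonar field of view;
         the model name here is a placeholder -->
    <include>
      <uri>model://cylinder_target</uri>
      <pose>5 0 0.5 0 0 0</pose>
    </include>
  </world>
</sdf>
```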

Adding and configuring the plugin

Generally following the Intermediate: Velodyne tutorial, add the sensor (of type DepthCameraSensor) within the forward_sonar_link link SDF tags in the model.sdf description.

There is not much documentation on the Gazebo depth camera; see Use a Gazebo Depth Camera with ROS and the DepthCameraSensor API reference.
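A minimal sketch of the sensor block, assuming the gazebo_ros_openni_kinect depth-camera plugin is used to bridge the depth image to ROS (the actual fls_gazebo model may use a different plugin, and the camera values here are illustrative, not P900 specifications):

```xml
<!-- Inside the forward_sonar_link <link> element of model.sdf -->
<sensor type="depth" name="forward_sonar">
  <always_on>true</always_on>
  <update_rate>15</update_rate>
  <camera>
    <horizontal_fov>1.047</horizontal_fov>  <!-- roughly 60 degrees -->
    <image>
      <width>512</width>
      <height>512</height>
      <format>R8G8B8</format>
    </image>
    <clip>
      <near>0.1</near>
      <far>100</far>
    </clip>
  </camera>
  <!-- With cameraName "depth", the depth image is published as
       /depth/image_raw_sonar, matching the topic in the Quickstart -->
  <plugin name="sonar_camera_plugin" filename="libgazebo_ros_openni_kinect.so">
    <cameraName>depth</cameraName>
    <imageTopicName>image_raw</imageTopicName>
    <depthImageTopicName>image_raw_sonar</depthImageTopicName>
    <pointCloudTopicName>points</pointCloudTopicName>
    <frameName>forward_sonar_link</frameName>
  </plugin>
</sensor>
```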

Add viewer / visualization

rosrun rqt_image_view rqt_image_view

/tutorials/images/standalone_sidebyside.png

Use the rqt_image_view to look at the other image types to get a sense of the image generation process.

You can also visualize the output using the Gazebo Topic Viewer.

/tutorials/images/blueview_gazeboviewer.png

Then you can use Gazebo Translation and Rotation modes to move around the sonar head or the targets. Here is a video demonstration of what you should see.
