Navigation and SLAM Using the ROS 2 Navigation Stack

In this ROS 2 Navigation Stack tutorial, we will use information obtained from LIDAR scans to build a map of the environment and to localize the robot on that map. The purpose of doing this is to enable our robot to navigate autonomously through both known and unknown environments (navigating unknown environments requires SLAM, i.e. simultaneous localization and mapping).

As noted in the official documentation, the two most commonly used packages for localization are the nav2_amcl package and the slam_toolbox package. Both of these packages publish the map -> odom coordinate transformation, which is necessary for a robot to localize on a map.

This tutorial is the fifth tutorial in my Ultimate Guide to the ROS 2 Navigation Stack (also known as Nav2).

You can get the entire code for this project here if you are using ROS Foxy.

If you are using ROS Galactic or newer, you can get the code here.

Let’s get started!

Prerequisites 

You have completed the first four tutorials of this series:

  1. How to Create a Simulated Mobile Robot in ROS 2 Using URDF
  2. Set Up the Odometry for a Simulated Mobile Robot in ROS 2
  3. Sensor Fusion Using the Robot Localization Package – ROS 2
  4. Set Up LIDAR for a Simulated Mobile Robot in ROS 2

Create a Launch File

Open a new terminal window, and move to your launch folder.

colcon_cd basic_mobile_robot
cd launch
gedit basic_mobile_bot_v5.launch.py

Copy and paste this code into the file.

Save the file, and close it.
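
Before moving on, it helps to see the overall shape of a launch file like this. Below is a rough, hypothetical sketch of how such a launch file can hand a map, a parameters file, and a slam argument to Nav2's bringup_launch.py. This is not the contents of the file linked above; the map file name and defaults are placeholders, and the real launch file also starts Gazebo, the robot state publisher, robot_localization, and RViz.

# Hypothetical sketch (not the linked launch file): pass a map, a Nav2
# parameters file, and a slam flag to Nav2's standard bringup launch file.
import os

from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument, IncludeLaunchDescription
from launch.launch_description_sources import PythonLaunchDescriptionSource
from launch.substitutions import LaunchConfiguration


def generate_launch_description():
    pkg_share = get_package_share_directory('basic_mobile_robot')
    nav2_bringup_dir = get_package_share_directory('nav2_bringup')

    return LaunchDescription([
        # Static map to localize against (file name is a placeholder).
        DeclareLaunchArgument(
            'map',
            default_value=os.path.join(pkg_share, 'maps', 'smalltown_world.yaml')),
        # Nav2 parameters file we add later in this tutorial.
        DeclareLaunchArgument(
            'params_file',
            default_value=os.path.join(pkg_share, 'params', 'nav2_params.yaml')),
        # Set slam:=True on the command line to run SLAM instead of AMCL.
        DeclareLaunchArgument('slam', default_value='False'),
        DeclareLaunchArgument('use_sim_time', default_value='True'),

        # Hand everything to Nav2's bringup launch file.
        IncludeLaunchDescription(
            PythonLaunchDescriptionSource(
                os.path.join(nav2_bringup_dir, 'launch', 'bringup_launch.py')),
            launch_arguments={
                'map': LaunchConfiguration('map'),
                'params_file': LaunchConfiguration('params_file'),
                'slam': LaunchConfiguration('slam'),
                'use_sim_time': LaunchConfiguration('use_sim_time')}.items()),
    ])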

Add a Static Map

We now need to add a static map of our world so our robot can plan an obstacle-free path between two points. 

I already created a map of the world in a previous tutorial, so we’ll use the yaml and pgm file from that tutorial.

Go to your ~/dev_ws/src/basic_mobile_robot/maps folder.

Place this pgm file and this yaml file inside the folder.

Add Navigation Stack Parameters

Let’s add parameters for the ROS 2 Navigation Stack. The official Configuration Guide has a full breakdown of all the tunable parameters. The parameters enable you to do all sorts of things with the ROS 2 Navigation Stack.

The most important parameters are for the Costmap 2D package. You can learn about this package here and here.

A costmap is a map made up of numerous grid cells. Each grid cell has a “cost”. The cost represents the difficulty a robot would have trying to move through that cell. 

For example, a cell containing an obstacle would have a high cost. A cell that has no obstacle in it would have a low cost.

The ROS 2 Navigation Stack uses two costmaps to store information about obstacles in the world. 

  1. Global costmap: This costmap is used to generate long-term plans over the entire environment (for example, to calculate the shortest path from point A to point B on a map).
  2. Local costmap: This costmap is used to generate short-term plans over the environment (for example, to avoid obstacles).
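
To build some intuition for these costs, you can inspect the global costmap directly once the navigation stack is running: Nav2 publishes it as a nav_msgs/OccupancyGrid message. Below is a minimal sketch of a node that counts free, lethal, and unknown cells. The topic name /global_costmap/costmap follows Nav2's default naming and is an assumption about your configuration.

# Minimal sketch: inspect the global costmap as an occupancy grid.
# The topic name below follows Nav2's default naming and may differ.
import rclpy
from rclpy.node import Node
from nav_msgs.msg import OccupancyGrid


class CostmapInspector(Node):
    def __init__(self):
        super().__init__('costmap_inspector')
        self.create_subscription(
            OccupancyGrid, '/global_costmap/costmap', self.callback, 10)

    def callback(self, msg):
        # In the published grid, -1 means unknown, 0 means free,
        # and 100 marks a lethal (obstacle) cell.
        unknown = sum(1 for c in msg.data if c < 0)
        lethal = sum(1 for c in msg.data if c >= 100)
        free = sum(1 for c in msg.data if c == 0)
        self.get_logger().info(
            f'{msg.info.width}x{msg.info.height} cells at '
            f'{msg.info.resolution:.2f} m/cell: '
            f'{free} free, {lethal} lethal, {unknown} unknown')


def main():
    rclpy.init()
    rclpy.spin(CostmapInspector())


if __name__ == '__main__':
    main()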

We will use the AMCL (Adaptive Monte Carlo Localization) algorithm to localize the robot in the world and to publish the coordinate transform from the map frame to the odom frame. 

AMCL localizes the robot in the world using LIDAR scans. It does this by matching real-time scan information to a known map. You can read more about AMCL here and here.
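
AMCL also publishes its current pose estimate, with covariance, on the /amcl_pose topic. A small monitor node like the hedged sketch below is a handy way to see both where AMCL thinks the robot is and how uncertain it is about that estimate.

# Minimal sketch: watch AMCL's pose estimate and its x/y uncertainty.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PoseWithCovarianceStamped


class AmclPoseMonitor(Node):
    def __init__(self):
        super().__init__('amcl_pose_monitor')
        self.create_subscription(
            PoseWithCovarianceStamped, '/amcl_pose', self.callback, 10)

    def callback(self, msg):
        p = msg.pose.pose.position
        cov = msg.pose.covariance  # 6x6 row-major: [0] = x variance, [7] = y variance
        self.get_logger().info(
            f'x={p.x:.2f} m, y={p.y:.2f} m, var_x={cov[0]:.4f}, var_y={cov[7]:.4f}')


def main():
    rclpy.init()
    rclpy.spin(AmclPoseMonitor())


if __name__ == '__main__':
    main()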

Go to your ~/dev_ws/src/basic_mobile_robot/params folder.

Place this nav2_params.yaml file inside the folder.

If you are using ROS 2 Galactic or newer, your code is here.

Create an RViz Configuration File

Go to your rviz folder.

colcon_cd basic_mobile_robot
cd rviz 

Create a new RViz file.

gedit nav2_config.rviz

Copy and paste this code inside the file.

Save the file, and close it.

Update the Plugin Parameters 

I updated the LIDAR plugin parameters inside model.sdf, which is inside the basic_mobile_bot_description folder. 

I also updated the differential drive plugin to use WORLD as the odometry source rather than ENCODER.

cd ~/dev_ws/src/basic_mobile_robot/models/basic_mobile_bot_description
gedit model.sdf

Make sure you copy and paste this code into the model.sdf file, and then save and close it.

Update the Robot Localization Parameters

Inside my ekf.yaml file, I updated the map_frame since we will be using a map. The robot_localization package will not be using the map, but I still want to update this parameter so that it is there if I need it.

cd ~/dev_ws/src/basic_mobile_robot/config
gedit ekf.yaml

Make sure you copy and paste this code into the ekf.yaml file, and then save and close it.

Build the Package

Now build the package by opening a terminal window, and typing the following command:

cd ~/dev_ws
colcon build

Launch the Robot Without SLAM

Open a new terminal window, and type the following command.

colcon_cd basic_mobile_robot

Launch the robot.

ros2 launch basic_mobile_robot basic_mobile_bot_v5.launch.py
1-launch-basic-mobile-robot-v5

Move the Robot From Point A to Point B

Now go to the RViz screen.

Set the initial pose of the robot by clicking the “2D Pose Estimate” button at the top of the RViz screen. (Note: we could also have set the set_initial_pose parameter to True and specified the initial_pose parameters in the nav2_params.yaml file in order to set the initial pose automatically.)

Then click on the map in the estimated position where the robot is in Gazebo.
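
Under the hood, the “2D Pose Estimate” tool publishes a geometry_msgs/PoseWithCovarianceStamped message on the /initialpose topic, which AMCL subscribes to. If you prefer to set the initial pose from a script, a minimal sketch looks like the following (the pose values are placeholders; use your robot's actual starting pose in Gazebo).

# Minimal sketch: set the initial pose by publishing to /initialpose,
# the same topic RViz's "2D Pose Estimate" tool uses.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PoseWithCovarianceStamped


def main():
    rclpy.init()
    node = Node('initial_pose_publisher')
    pub = node.create_publisher(PoseWithCovarianceStamped, '/initialpose', 10)

    msg = PoseWithCovarianceStamped()
    msg.header.frame_id = 'map'
    msg.header.stamp = node.get_clock().now().to_msg()
    msg.pose.pose.position.x = 0.0   # placeholder: robot's x position in the map
    msg.pose.pose.position.y = 0.0   # placeholder: robot's y position in the map
    msg.pose.pose.orientation.w = 1.0  # facing along the map's +x axis

    # Give the publisher a moment to connect to AMCL, then publish once.
    rclpy.spin_once(node, timeout_sec=1.0)
    pub.publish(msg)
    node.get_logger().info('Initial pose published.')
    rclpy.shutdown()


if __name__ == '__main__':
    main()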

Set a goal for the robot to move to. Click the “Navigation2 Goal” button in RViz, and then click on a desired destination on the map. 

You can also request goals through the terminal by using the following command:

ros2 topic pub /goal_pose geometry_msgs/PoseStamped "{header: {stamp: {sec: 0}, frame_id: 'map'}, pose: {position: {x: 5.0, y: -2.0, z: 0.0}, orientation: {w: 1.0}}}"

You will notice that we published the goal to the /goal_pose topic.
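
Publishing to /goal_pose is fire-and-forget: you get no confirmation that the robot reached the goal. If you want programmatic feedback, you can send the same goal through Nav2's navigate_to_pose action instead. Here is a minimal sketch that reuses the goal coordinates from the command above.

# Minimal sketch: send a goal via Nav2's NavigateToPose action and wait
# for the result, instead of publishing to /goal_pose.
import rclpy
from rclpy.action import ActionClient
from rclpy.node import Node
from nav2_msgs.action import NavigateToPose


def main():
    rclpy.init()
    node = Node('simple_goal_sender')
    client = ActionClient(node, NavigateToPose, 'navigate_to_pose')
    client.wait_for_server()

    goal = NavigateToPose.Goal()
    goal.pose.header.frame_id = 'map'
    goal.pose.header.stamp = node.get_clock().now().to_msg()
    goal.pose.pose.position.x = 5.0
    goal.pose.pose.position.y = -2.0
    goal.pose.pose.orientation.w = 1.0

    # Send the goal, then block until Nav2 reports a result
    # (this assumes the goal is accepted by the action server).
    send_future = client.send_goal_async(goal)
    rclpy.spin_until_future_complete(node, send_future)
    result_future = send_future.result().get_result_async()
    rclpy.spin_until_future_complete(node, result_future)

    node.get_logger().info(
        f'Navigation finished with status {result_future.result().status}')
    rclpy.shutdown()


if __name__ == '__main__':
    main()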

The wheeled robot will move to the goal destination.

4-launch-basic-mobile-robot-v5-1

In the bottom left of the RViz screen, you can click the Pause and Reset buttons.

If the robot does not move at all, press CTRL+C in all windows to close everything down. Then try launching the robot again.

The key to getting good performance with the ROS 2 Navigation Stack is to spend a lot of time (it can take me several days) tweaking the parameters in the nav2_params.yaml file we added earlier. Yes, it is super frustrating, but this is the only way to get navigation to work properly. 

Common things you can try changing are the robot_radius and inflation_radius parameters. You can also try changing the expected_planner_frequency, update_frequency, publish_frequency, and the width/height of the rolling window in the local_costmap.

Also, you can try modifying the update_rate in the LIDAR sensor inside your robot model.sdf file.

Don’t change too many things all at once. Just change something, build the package, and then launch the robot again to see what happens. Then change another thing, and watch what happens, etc.

Now let’s check out the coordinate frames. Open a new terminal window, and type:

ros2 run tf2_tools view_frames.py

If you are using ROS 2 Galactic or newer, type:

ros2 run tf2_tools view_frames

In the current working directory, you will have a file called frames.pdf. Open that file.

evince frames.pdf

Here is what my coordinate transform (i.e. tf) tree looks like:

5-view-frames-robot

To see an image of the architecture of our ROS system, open a new terminal window, and type the following command:

rqt_graph

Press CTRL + C on all terminal windows to close down the programs.

Move the Robot Through Waypoints

Open a new terminal window, and type the following command.

colcon_cd basic_mobile_robot

Launch the robot.

ros2 launch basic_mobile_robot basic_mobile_bot_v5.launch.py

Now go to the RViz screen.

Set the initial pose of the robot by clicking the “2D Pose Estimate” button at the top of the RViz screen. 

Then click on the map in the estimated position where the robot is in Gazebo.

Now click the Waypoint mode button in the bottom left corner of RViz. Clicking this button puts the system in waypoint follower mode.

6-waypoints

Click the “Navigation2 Goal” button, and click on the areas of the map where you would like your robot to go (i.e. select your waypoints). Select as many waypoints as you want. 

I chose five waypoints. Each waypoint is labeled wp_#, where # is the number of the waypoint.

7-waypoint-icon

When you’re ready for the robot to follow the waypoints, click the Start Navigation button.

You should see your robot autonomously navigate to all the waypoints. At each waypoint, your robot will stop for 10-20 seconds, and then it will move to the next waypoint.

8-navigate-to-waypoint
9-processing-waypoint0
10-waypoint-succeeded
11-waypoint-succeeded
12-completed-all-waypoint
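
If you would rather send waypoints from code instead of clicking in RViz, Nav2 also exposes a FollowWaypoints action. Below is a minimal sketch; the waypoint coordinates are arbitrary examples, and the action server name (follow_waypoints here) has changed between Nav2 releases, so check yours with ros2 action list.

# Minimal sketch: send a list of waypoints to Nav2's FollowWaypoints action.
# Replace the example coordinates with poses in free space on your map, and
# verify the action name with "ros2 action list" (it varies between releases).
import rclpy
from rclpy.action import ActionClient
from rclpy.node import Node
from nav2_msgs.action import FollowWaypoints
from geometry_msgs.msg import PoseStamped


def make_pose(node, x, y):
    pose = PoseStamped()
    pose.header.frame_id = 'map'
    pose.header.stamp = node.get_clock().now().to_msg()
    pose.pose.position.x = x
    pose.pose.position.y = y
    pose.pose.orientation.w = 1.0
    return pose


def main():
    rclpy.init()
    node = Node('waypoint_sender')
    client = ActionClient(node, FollowWaypoints, 'follow_waypoints')
    client.wait_for_server()

    goal = FollowWaypoints.Goal()
    goal.poses = [make_pose(node, 2.0, 0.0),
                  make_pose(node, 4.0, -1.0),
                  make_pose(node, 5.0, -2.0)]

    # Send the waypoints and wait for the follower to finish
    # (this assumes the goal is accepted by the action server).
    send_future = client.send_goal_async(goal)
    rclpy.spin_until_future_complete(node, send_future)
    result_future = send_future.result().get_result_async()
    rclpy.spin_until_future_complete(node, result_future)

    # missed_waypoints lists the indices of any waypoints that were skipped.
    missed = result_future.result().result.missed_waypoints
    node.get_logger().info(f'Missed waypoints: {list(missed)}')
    rclpy.shutdown()


if __name__ == '__main__':
    main()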

If your robot does not navigate to the waypoints, relaunch the robot and try again. Try selecting different waypoints. 

The ROS 2 Navigation Stack waypoint follower functionality isn’t perfect. Many times, the robot will skip over waypoints or abandon them completely. The most common error I get when this happens is the following:

[bt_navigator]: Action server failed while executing action callback: “send_goal failed”

[bt_navigator]: [navigate_to_pose] [ActionServer] Aborting handle.

This issue is a known problem in ROS 2 Foxy, and it appears to be fixed in the latest version of ROS 2 (i.e. Galactic). We won’t upgrade ROS right now, but this is something to keep in mind if you are using a version of ROS 2 that is newer than ROS 2 Foxy.

In addition, I like to play around with the parameters in the nav2_params.yaml file located inside the params folder of your package. A complete guide to all the parameters is here.

Finally, let’s check out the active ROS 2 topics.

ros2 topic list
13-ros2-topic-list-1
13-ros2-topic-list-2

Launch the Robot With SLAM

Make sure the SLAM toolbox is installed. Open a terminal window, and type:

sudo apt install ros-foxy-slam-toolbox

The syntax is:

sudo apt install ros-<ros2-distro>-slam-toolbox

Open the model.sdf file inside the basic_mobile_robot/models/basic_mobile_bot_description folder, and change the number of LIDAR samples (inside the <samples></samples> tag) to some high number like 120.

To launch the robot with SLAM (simultaneous localization and mapping), open a terminal window, and run the following command:

ros2 launch basic_mobile_robot basic_mobile_bot_v5.launch.py slam:=True

Use the rqt_robot_steering tool to slowly drive the robot around the room. Open a terminal window, and type: 

rqt_robot_steering

If you are using ROS 2 Galactic or newer, first install the tool:

sudo apt-get install ros-galactic-rqt-robot-steering

Where the syntax is:

sudo apt-get install ros-<ros-distribution>-rqt-robot-steering

Then type:

ros2 run rqt_robot_steering rqt_robot_steering --force-discover

The robot will build a map and localize at the same time. You can also send autonomous navigation goals using the RViz buttons, as we did in the previous section.

15-slam-map-building
16-making-a-map

Save the Map (ROS Foxy and Older)

If you are using ROS 2 Foxy or older, follow these instructions to save the map you have built. Note that the setup steps below must be completed before you launch the robot with SLAM. Let’s walk through the process.

Open a new terminal window, and type:

colcon_cd basic_mobile_robot
cd launch

Add the map_saver.launch.py file.

Now go back to the terminal window, and type the following command:

colcon_cd basic_mobile_robot
cd params

Add the map_saver_params.yaml file.

Go back to the terminal window.

Build the package by typing the following commands:

cd ~/dev_ws
colcon build

Launch the robot again with SLAM from your maps directory.

colcon_cd basic_mobile_robot
cd maps
ros2 launch basic_mobile_robot basic_mobile_bot_v5.launch.py slam:=True

Drive the robot around to create the map. In a new terminal window, you will type the following command to pull up the steering controller:

rqt_robot_steering

Execute the launch file once you’re done mapping the environment. Open a new terminal window, and type:

ros2 launch basic_mobile_robot map_saver.launch.py

Ignore any error messages that appear in the terminal window when you type the command above.

In a separate terminal, call the service to generate your map. We will call the map “my_map”:

ros2 service call /map_saver/save_map nav2_msgs/srv/SaveMap "{map_topic: map, map_url: my_map, image_format: pgm, map_mode: trinary, free_thresh: 0.25, occupied_thresh: 0.65}"
18-save-map

Your my_map.pgm and my_map.yaml files will be saved to the maps directory of your basic_mobile_robot package.

17-save-map
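
If you prefer to script this step, the same service can be called from a small Python node. Here is a minimal sketch that mirrors the field values of the command above.

# Minimal sketch: call the /map_saver/save_map service from Python instead
# of the command line. Field values mirror the ros2 service call above.
import rclpy
from rclpy.node import Node
from nav2_msgs.srv import SaveMap


def main():
    rclpy.init()
    node = Node('map_save_client')
    client = node.create_client(SaveMap, '/map_saver/save_map')
    client.wait_for_service()

    request = SaveMap.Request()
    request.map_topic = 'map'
    request.map_url = 'my_map'       # saved as my_map.pgm and my_map.yaml
    request.image_format = 'pgm'
    request.map_mode = 'trinary'
    request.free_thresh = 0.25
    request.occupied_thresh = 0.65

    future = client.call_async(request)
    rclpy.spin_until_future_complete(node, future)
    node.get_logger().info(f'Map saved: {future.result().result}')
    rclpy.shutdown()


if __name__ == '__main__':
    main()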

Save the Map (ROS Galactic and Newer)

If you have ROS Galactic or newer, open a new terminal window, and type:

colcon_cd basic_mobile_robot
cd maps

When you are happy with the map you have built, open a new terminal window, and type the following command to save the map:

ros2 run nav2_map_server map_saver_cli -f my_map

The syntax is:

ros2 run nav2_map_server map_saver_cli -f <map_name>
19-save-map

Your my_map.pgm and my_map.yaml map files will automatically save to the maps directory of your basic_mobile_robot package.

That’s it!

In the next tutorial, we will take a look at how to incorporate GPS data to create better localization. Stay tuned!

Set Up LIDAR for a Simulated Mobile Robot in ROS 2

In this tutorial, we will set up the LIDAR sensor for a simulated mobile robot.

This tutorial is the fourth tutorial in my Ultimate Guide to the ROS 2 Navigation Stack (also known as Nav2).

You can get the entire code for this project here.

If you are using ROS Galactic or newer, you can get the code here.

Let’s get started!

Prerequisites 

You have completed the first three tutorials of this series:

  1. How to Create a Simulated Mobile Robot in ROS 2 Using URDF
  2. Set Up the Odometry for a Simulated Mobile Robot in ROS 2
  3. Sensor Fusion Using the Robot Localization Package – ROS 2

Modify the SDF File for the Robot

Open a new terminal window, and type:

colcon_cd basic_mobile_robot
cd models
cd basic_mobile_bot_description

Type the following command:

gedit model.sdf

Here is my sdf file. You can copy and paste those lines inside your sdf file.

Here is the code for just the LIDAR:

  <!-- ****************************** LIDAR *****************************    -->
  <link name="lidar_link">    
    <inertial>
      <pose>0.215 0 0.13 0 0 0</pose>
      <inertia>
        <ixx>0.001</ixx>
        <ixy>0.000</ixy>
        <ixz>0.000</ixz>
        <iyy>0.001</iyy>
        <iyz>0.000</iyz>
        <izz>0.001</izz>
      </inertia>
      <mass>0.114</mass>
    </inertial>

    <collision name="lidar_collision">
      <pose>0.215 0 0.13 0 0 0</pose>
      <geometry>
        <cylinder>
          <radius>0.0508</radius>
          <length>0.18</length>
        </cylinder>
      </geometry>
    </collision>

    <visual name="lidar_visual">
      <pose>0.215 0 0.13 0 0 0</pose>
      <geometry>
        <cylinder>
          <radius>0.0508</radius>
          <length>0.18</length>
        </cylinder>
      </geometry>
      <material>
        <ambient>0.0 0.0 0.0 1.0</ambient>
        <diffuse>0.0 0.0 0.0 1.0</diffuse>
        <specular>0.0 0.0 0.0 1.0</specular>
        <emissive>0.0 0.0 0.0 1.0</emissive>
      </material>
    </visual>

    <sensor name="lidar" type="ray">
      <pose>0.215 0 0.215 0 0 0</pose>
      <always_on>true</always_on>
      <visualize>true</visualize>
      <update_rate>5</update_rate>
      <ray>
        <scan>
          <horizontal>
            <samples>360</samples>
            <resolution>1.00000</resolution>
            <min_angle>0.000000</min_angle>
            <max_angle>6.280000</max_angle>
          </horizontal>
        </scan>
        <range>
          <min>0.120000</min>
          <max>3.5</max>
          <resolution>0.015000</resolution>
        </range>
        <noise>
          <type>gaussian</type>
          <mean>0.0</mean>
          <stddev>0.01</stddev>
        </noise>
      </ray>
      <plugin name="scan" filename="libgazebo_ros_ray_sensor.so">
        <ros>
          <remapping>~/out:=scan</remapping>
        </ros>
        <output_type>sensor_msgs/LaserScan</output_type>
        <frame_name>lidar_link</frame_name>
      </plugin>
    </sensor>
  </link>

Save the file and close it to return to the terminal.

You’ll notice that we added the LIDAR link, joint, and Gazebo sensor plugin. I also added an inertial section to the front caster wheel.

Make sure that you have this model.config file inside the same folder as your sdf file.

Test Your SDF File

Now let’s run Gazebo so that we can see our model. Open a new terminal window, and type the following command:

gazebo

On the left-hand side, click the “Insert” tab.

On the left panel, find your robot. 

You should see your robot in the empty Gazebo environment. You can place it wherever you want by clicking inside the environment.

1-robot-scan-empty-environment

If you open a new terminal window and check out the ROS topics, you will see our /scan topic.

ros2 topic list
2-ros2-topic-list

To check out this topic, type:

ros2 topic info /scan

You will see this /scan topic has one publisher (i.e. our LIDAR).

3-ros2-topic-info-scan

To see the output of the /scan topic, you type:

ros2 topic echo /scan

You will see that the readings for all 360 beams of the LIDAR are “inf”, which means infinity. These readings make sense since there are no objects in the environment to detect.

4-infinity-scanner-readings
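
Instead of scrolling through the echo output, you can also check the scan programmatically. The hedged sketch below subscribes to /scan with the sensor-data QoS profile (the Gazebo plugin publishes with best-effort reliability) and prints the closest finite range it sees.

# Minimal sketch: subscribe to /scan and report the closest finite reading.
# With no obstacles in the world, every beam returns inf.
import math

import rclpy
from rclpy.node import Node
from rclpy.qos import qos_profile_sensor_data
from sensor_msgs.msg import LaserScan


class ScanChecker(Node):
    def __init__(self):
        super().__init__('scan_checker')
        # The Gazebo ray sensor plugin publishes with best-effort reliability,
        # so use the sensor-data QoS profile for the subscription.
        self.create_subscription(
            LaserScan, '/scan', self.callback, qos_profile_sensor_data)

    def callback(self, msg):
        valid = [r for r in msg.ranges if math.isfinite(r)]
        if valid:
            self.get_logger().info(
                f'Closest obstacle: {min(valid):.2f} m '
                f'({len(valid)} of {len(msg.ranges)} beams returned a range)')
        else:
            self.get_logger().info('All readings are inf (nothing in range).')


def main():
    rclpy.init()
    rclpy.spin(ScanChecker())


if __name__ == '__main__':
    main()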

Once you’re done, press CTRL+C in all windows to close everything down.

Modify the URDF File

Open a new terminal window, and type:

colcon_cd basic_mobile_robot
cd models

Create a new file named basic_mobile_bot_v2.urdf.

gedit basic_mobile_bot_v2.urdf

Inside this file, we will add a link and a joint for the LIDAR. Leave the caster wheel inertial section as-is.

Type this code inside the URDF file.

Save and close the file.

Edit the Launch File

Now that we have our robot with LIDAR, we need to modify the launch file.

colcon_cd basic_mobile_robot
cd launch
gedit basic_mobile_bot_v4.launch.py

Copy and paste this code into the file.

Save the file, and close it.

Build the Package

Now build the package by opening a terminal window, and typing the following command:

cd ~/dev_ws
colcon build

Launch the Robot

Open a new terminal, and launch the robot.

cd ~/dev_ws/
ros2 launch basic_mobile_robot basic_mobile_bot_v4.launch.py

Here is the output for Gazebo:

5-output-for-gazebo-1

Here is the output for RViz.

6-output-for-rviz

Under the TF dropdown in RViz, you should see “Transform OK” for all links.

7-transform-ok

Let’s visualize the sensor readings from the LIDAR. In RViz, make sure the Fixed Frame under Global Options is set to odom.

Click the Add button at the bottom-left part of the RViz window. Then go to the By topic tab, and select the LaserScan option under /scan. Then click OK.

8-laser-scan-topic

Under the Topic dropdown menu under LaserScan in the left panel of RViz, set the Reliability Policy to Best Effort. Set the size to 0.1 m.

9-laser-scan-settings

Now go back to a terminal window, and move the robot forward a bit so that it can start detecting that wall on the front end of the playground in Gazebo.

rqt_robot_steering

If you are using ROS 2 Galactic or newer, first install the tool:

sudo apt-get install ros-galactic-rqt-robot-steering

Where the syntax is:

sudo apt-get install ros-<ros-distribution>-rqt-robot-steering

Then type:

ros2 run rqt_robot_steering rqt_robot_steering --force-discover

You can see how the laser scan readings show up in RViz.

That’s it! 

In the next tutorial, we’ll learn how to localize and create a map using this LIDAR in conjunction with the ROS 2 Navigation Stack. Stay tuned!

Set Up the Odometry for a Simulated Mobile Robot in ROS 2

In this tutorial, I will show you how to set up the odometry for a mobile robot. This tutorial is the second tutorial in my Ultimate Guide to the ROS 2 Navigation Stack (also known as Nav2).

The official tutorial for setting up odometry is on this page, but I will walk you through the entire process, step-by-step.

You can get the entire code for this project here.

Let’s get started!

Prerequisites

Odometry in ROS 2

In robotics, odometry is about using data from sensors to estimate the change in a robot’s position, orientation, and velocity over time relative to some starting point (e.g. x=0, y=0, z=0). Odometry information is normally obtained from sensors such as wheel encoders, an IMU (inertial measurement unit), and LIDAR.

In ROS, the coordinate frame most commonly used for odometry is known as the odom frame. Just like the odometer in your car, which measures wheel rotations to determine the distance your car has traveled from some starting point, the odom frame has its origin at the point in the world where the robot first starts moving.

A robot’s position and orientation within the odom frame becomes less accurate over time and distance because sensors like IMUs (that measure acceleration) and wheel encoders (that measure the number of times each wheel has rotated) are not perfect. 

For example, imagine your robot runs into a wall. Its wheels might spin repeatedly without the robot moving anywhere (we call this wheel slip). The wheel odometry would indicate a further distance traveled by the robot than reality. We have to adjust for these inaccuracies.

To learn all about coordinate frames for mobile robots (including the odom frame), check out this post.

To correct for the inaccuracies of sensors, most ROS applications have an additional coordinate frame called map that provides globally accurate information and corrects drift that happens within the odom frame.

The ROS 2 Navigation Stack requires:

  1. Publishing of nav_msgs/Odometry messages to a ROS 2 topic
  2. Publishing of the odom (parent frame) -> base_link (child frame) coordinate transform.

Getting both of these pieces of data published to the ROS system is our end goal in setting up the odometry for a robot.
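
In this series, the Gazebo plugins and the robot_localization package will take care of both requirements for us, but it helps to see what they look like in code. Below is a minimal, hypothetical sketch of a node that publishes a fixed placeholder pose as a nav_msgs/Odometry message and broadcasts the matching odom -> base_link transform; a real robot would compute the pose from wheel encoders and/or an IMU.

# Minimal sketch: publish nav_msgs/Odometry and broadcast the
# odom -> base_link transform. The pose is a fixed placeholder.
import rclpy
from rclpy.node import Node
from nav_msgs.msg import Odometry
from geometry_msgs.msg import TransformStamped
from tf2_ros import TransformBroadcaster


class OdomPublisher(Node):
    def __init__(self):
        super().__init__('odom_publisher')
        self.odom_pub = self.create_publisher(Odometry, 'odom', 10)
        self.tf_broadcaster = TransformBroadcaster(self)
        self.create_timer(0.1, self.publish_odometry)  # 10 Hz

    def publish_odometry(self):
        now = self.get_clock().now().to_msg()
        x, y = 1.0, 0.5  # placeholder pose, identity orientation

        # 1. The nav_msgs/Odometry message.
        odom = Odometry()
        odom.header.stamp = now
        odom.header.frame_id = 'odom'
        odom.child_frame_id = 'base_link'
        odom.pose.pose.position.x = x
        odom.pose.pose.position.y = y
        odom.pose.pose.orientation.w = 1.0
        self.odom_pub.publish(odom)

        # 2. The matching odom -> base_link coordinate transform.
        t = TransformStamped()
        t.header.stamp = now
        t.header.frame_id = 'odom'
        t.child_frame_id = 'base_link'
        t.transform.translation.x = x
        t.transform.translation.y = y
        t.transform.rotation.w = 1.0
        self.tf_broadcaster.sendTransform(t)


def main():
    rclpy.init()
    rclpy.spin(OdomPublisher())


if __name__ == '__main__':
    main()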

Simulate the Odometry System Using Gazebo

Let’s set up the odometry for the simulated robot we created in the last tutorial. We will use Gazebo, an open-source 3D robotics simulator.

Set Up the Prerequisites

Gazebo is automatically included in ROS 2 installations. To test that Gazebo is installed, open a new terminal window, and type:

gazebo

You should see an empty world:

1-empty-gazebo-world

If you don’t see any output after a minute or two, follow these instructions to install Gazebo.

Close Gazebo by going to the terminal window and pressing CTRL + C.

Now, open a new terminal window, and install the gazebo_ros_pkgs package.

sudo apt-get update
sudo apt-get upgrade
sudo apt install ros-foxy-gazebo-ros-pkgs

The syntax for the above command is:

sudo apt install ros-<ros2-distro>-gazebo-ros-pkgs

...where you replace <ros2-distro> with the ROS 2 distribution that you are using. I am using ROS 2 Foxy Fitzroy, so I use “foxy”.

Create an SDF File for the Robot

Later in this tutorial series, we will be using the robot_localization package to enable the robot to determine where it is in the environment. To use this package, we need to create an SDF file.

Why should we create an SDF file instead of using our URDF File? In working with ROS for many years, I have found that URDFs don’t always work that well with Gazebo. 

For example, when I ran through the official ROS 2 Navigation Stack robot localization demo, I found that the filtered odometry data was not actually generated.

So, we need to use an SDF file for Gazebo stuff and a URDF file for ROS stuff. We will try to make sure our SDF file generates a robot that is as close as possible to the robot generated by the URDF file.

Open a new terminal window, and type:

cd ~/dev_ws/src/basic_mobile_robot/models

If you have colcon_cd set up, you can also type:

colcon_cd basic_mobile_robot
cd models

Create Model.config

Let’s create a folder for the SDF model.

mkdir basic_mobile_bot_description

Move inside the folder.

cd basic_mobile_bot_description

Create a model.config file.

gedit model.config

Type this code inside this model.config file. You can see this file contains fields for the name of the robot, the version, the author (that’s you), your email address, and a description of the robot.

Save the file, and then close it.

Create Model.sdf

Now, let’s create an SDF (Simulation Description Format) file. This file will contain the tags that are needed to create an instance of the basic_mobile_bot model. We want to build it so that it is as close to the URDF robot representation as possible. Our robot has three wheels: two wheels in the back and a caster wheel in the front. 

The official tutorial for creating an SDF file is here (other good tutorials are here and here), but let’s do this together below. I put a lot of comments in the SDF file so that you can see what is going on. Don’t be intimidated by how the file looks. Just go through it slowly, one line at a time, section by section. There is no hurry.

Type the following command:

gedit model.sdf

Here is my sdf file. You can copy and paste those lines inside your sdf file.

Save the file and close it to return to the terminal.

You’ll notice there are some slight differences between the URDF and the SDF file formats.

You can see in the SDF file that we use an IMU sensor plugin to simulate IMU data. And, instead of using wheel encoders to calculate odometry from the motion of the wheels, we use Gazebo’s differential drive plugin and the joint state publisher plugin.

The differential drive plugin will subscribe to velocity commands over the /cmd_vel topic and will publish odometry information to the /wheel/odometry topic. 

Remember, a differential drive robot is a mobile robot whose motion is based on two separately driven wheels that are on either side of the robot’s body.

The joint state publisher plugin will publish the angles of the drive wheel joints, which are continuously changing once the robot starts moving.

In a real robotics project, we would calculate odometry using data from an IMU sensor and wheel encoders. In this real-world project, for example, I used wheel encoder data to generate an odometry message.
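
As a quick sanity check that the two plugins are wired up correctly, you can publish a small velocity command and watch the wheel odometry respond. The hedged sketch below uses the /cmd_vel and /wheel/odometry topic names described above.

# Minimal sketch: creep forward via /cmd_vel and print the pose reported by
# the differential drive plugin on /wheel/odometry.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist
from nav_msgs.msg import Odometry


class PluginCheck(Node):
    def __init__(self):
        super().__init__('plugin_check')
        self.cmd_pub = self.create_publisher(Twist, '/cmd_vel', 10)
        self.create_subscription(Odometry, '/wheel/odometry', self.odom_callback, 10)
        self.create_timer(0.1, self.send_command)
        self.msg_count = 0

    def send_command(self):
        cmd = Twist()
        cmd.linear.x = 0.1  # move forward at 0.1 m/s
        self.cmd_pub.publish(cmd)

    def odom_callback(self, msg):
        self.msg_count += 1
        if self.msg_count % 10 == 0:  # throttle the log output
            p = msg.pose.pose.position
            self.get_logger().info(f'Wheel odometry: x={p.x:.2f} m, y={p.y:.2f} m')


def main():
    rclpy.init()
    rclpy.spin(PluginCheck())


if __name__ == '__main__':
    main()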

Add Meshes Folder

Now let’s have our robot look more realistic by adding a mesh to the base of the robot.

Open the file explorer by clicking on the Files icon on the left side of your terminal.

2-files-icon

Go to your basic_mobile_robot package.

Copy the meshes folder.

3-copy-the-meshes-folder

Paste the meshes folder inside the ~/dev_ws/src/basic_mobile_robot/models/basic_mobile_bot_description folder.

4-paste-meshes

Add the Path of the Model to the Bashrc File

Now we need to add the file path of the model to the bashrc file.

Open a terminal window, and type:

gedit ~/.bashrc

Add the following line to the bottom of the bashrc file:

export GAZEBO_MODEL_PATH=$GAZEBO_MODEL_PATH:/home/focalfossa/dev_ws/src/basic_mobile_robot/models/

focalfossa is my Linux username. Replace it with your own username so that the path points to your dev_ws workspace.

Save the file and close it.

Test Your Robot

Now let’s run Gazebo so that we can see our model. Open a new terminal window, and type the following command:

gazebo

On the left-hand side, click the “Insert” tab.

On the left panel, find your robot. 

5-basic-mobile-bot

You should see your robot in the empty Gazebo environment. You can place it wherever you want by clicking inside the environment.

To test the robot, open the rqt_steering program. In a new terminal window, type the following command:

rqt_robot_steering

A GUI will pop up.

Change the topic on the GUI to /cmd_vel.

6-rqt-robot-steering

Move the sliders to move the robot forward/backward/right/left.

To see a list of ros topics, you can open a new terminal window, and type:

ros2 topic list
8-ros2-topic-list

You can see that our IMU data, wheel odometry data, and GPS data are all being published by the robot.

To see the wheel odometry data for example, you can type:

ros2 topic echo /wheel/odometry

You will see the current position and orientation of the robot relative to its starting point.

9-pose-starting-point

When you’re done, go back to the terminal windows, and type CTRL + C in all of them to close Gazebo and the steering program.

Create an SDF File for the World

Just like an SDF file can be used to define what a robot looks like, we can use an SDF file to define what the robot’s environment should look like. We want to make our robot’s environment look as realistic as possible.

This tutorial here shows you how to create your own Gazebo virtual world by dragging and dropping items into an empty world. We will walk through the process below.

Open a new terminal window, and type:

cd ~/dev_ws/src/basic_mobile_robot/worlds

If you have colcon_cd set up, you can also type:

colcon_cd basic_mobile_robot
cd worlds

Let’s create a folder for the SDF model.

mkdir basic_mobile_bot_world

Move inside the folder.

cd basic_mobile_bot_world

Create a world file.

gedit smalltown.world

Copy and paste this code inside this smalltown.world file. 

Save the file, and then close it.

The world file we just created has the following six sections (from top to bottom):

  1. Place objects in the world
  2. Define the lighting of the world
  3. Define the physics of the world
  4. Define the latitude, longitude, and elevation
  5. Define the user perspective of the world (i.e. aerial view)
  6. Import our own robot

Test Your World

Now let’s run Gazebo so that we can see our world model. Open a new terminal window, and type the following command. Make sure you are inside the worlds directory of your package:

gazebo smalltown.world

You should see the world with the robot inside of it. 

10-test-your-world

If you open a new terminal window, you can verify that the ROS interface launched automatically by typing the following command to see the list of active ROS topics:

ros2 topic list

If you go back to Gazebo, you can click on the World tab, and play around with the settings for GUI (user perspective), Spherical Coordinates (latitude, longitude, elevation), Physics, etc. If you want to save these settings, you will need to record the values and modify your smalltown.world file accordingly (I prefer to do this instead of going to File -> Save World).

12-modify-world-settings

Once you’re finished, go back to the terminal windows, and type CTRL + C in all of them to close Gazebo.

Edit the Launch File

Now that we have our robot and our world file, we need to modify the launch file.

colcon_cd basic_mobile_robot
cd launch
gedit basic_mobile_bot_v2.launch.py

Copy and paste this code into the file.

Save the file, and close it.

Build the Package

Now build the package by opening a terminal window, and typing the following command:

cd ~/dev_ws
colcon build

Launch the Robot

Open a new terminal, and launch the robot.

cd ~/dev_ws/
ros2 launch basic_mobile_robot basic_mobile_bot_v2.launch.py

It could take a while for Gazebo to build and load the virtual world, so be patient. You might also see some warnings that say: “Warning: Invalid frame ID “drivewhl_l_link” passed to canTransform argument source_frame – frame does not exist…”. Ignore this warning. It will go away as soon as Gazebo loads.

If Gazebo is not launching properly, you can terminate Gazebo as follows:

killall gazebo
killall gzserver
killall gzclient

Here is the output once everything is fully loaded:

13-everything-fully-loaded

To see the active topics, open a terminal window, and type:

ros2 topic list
14-ros2-topic-list

To see more information about the topics, execute:

ros2 topic info /imu/data
ros2 topic info /wheel/odometry
15-more-info-on-topics

The /imu/data topic publishes sensor_msgs/Imu messages, and the /wheel/odometry topic publishes nav_msgs/Odometry messages. The information published on these topics comes from the IMU and differential drive Gazebo plugins defined inside our SDF file for the robot.

You will also see that neither topic has any subscribers yet. In the next tutorial, we will create a robot_localization node that subscribes to both of these topics and provides fused, locally accurate, and smooth odometry information for the ROS 2 Navigation Stack.

When you’re finished, press CTRL+C in all terminal windows to stop all processes.

Troubleshooting

If you see an error that says “Warning: TF_OLD_DATA ignoring data from the past for frame”, it means that you need to make sure all nodes are running on simulated time.

It is likely that your robot state publisher does not have the use_sim_time parameter set to True.

You can also type the following command (when ROS 2 is shut down) to check whether any leftover topics are still being published, which indicates that a node is still running that shouldn’t be. Often, you may have multiple joint state publishers that are conflicting with each other.

ros2 topic list

If you have a ROS 2 process that is running, be sure to kill it. Then restart the launch file.

In my launch file basic_mobile_bot_v2.launch.py, you can see how I set the simulated time to true for this node.
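
For reference, here is a hypothetical fragment showing the relevant piece of such a launch file: the use_sim_time parameter is passed to each node that handles TF data (the robot state publisher in this example) so that it follows Gazebo's /clock topic.

# Hypothetical launch file fragment: enable simulated time for a node.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        Node(
            package='robot_state_publisher',
            executable='robot_state_publisher',
            # use_sim_time makes the node follow Gazebo's /clock topic.
            # (The robot_description parameter, omitted here, is also required.)
            parameters=[{'use_sim_time': True}]),
    ])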

That’s it!

In our next tutorial, I will show you how to fuse the information from your IMU sensor and wheel odometry to create a smooth estimate of the robot’s location in the world. We call this sensor fusion.