Create Launch Files to Display URDF Files – ROS 2 Jazzy

In this tutorial, I will guide you through the process of creating custom launch files to display a robotic arm and a mobile robot in RViz.

RViz (short for “ROS Visualization”) is a 3D visualization tool for robots that allows you to view the robot’s sensors, environment, and state. I use it all the time to visualize and debug my robots.

Launch files in ROS 2 are powerful tools that allow you to start multiple nodes and set parameters with a single command, simplifying the process of managing your robot’s complex systems.

Prerequisites

All my code for this project is located here on GitHub (robotic arm) and here on GitHub (mobile robot).

Create the Launch File for the Robotic Arm

Open a terminal window.

Move to your robotic arm directory.

cd ~/ros2_ws/src/mycobot_ros2/mycobot_description/ && mkdir launch && cd launch

Add a file in the launch folder called robot_state_publisher.launch.py.

Add this code.

#!/usr/bin/env python3
#
# Author: Addison Sears-Collins
# Date: November 10, 2024
# Description: Display the robotic arm with RViz
#
# This file launches the robot state publisher, joint state publisher,
# and RViz2 for visualizing the mycobot robot.

from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument
from launch.conditions import IfCondition, UnlessCondition
from launch.substitutions import Command, LaunchConfiguration, PathJoinSubstitution
from launch_ros.actions import Node
from launch_ros.parameter_descriptions import ParameterValue
from launch_ros.substitutions import FindPackageShare

# Define the arguments for the XACRO file
ARGUMENTS = [
    DeclareLaunchArgument('prefix', default_value='',
                          description='Prefix for robot joints and links'),
    DeclareLaunchArgument('add_world', default_value='true',
                          choices=['true', 'false'],
                          description='Whether to add world link'),
    DeclareLaunchArgument('base_link', default_value='base_link',
                          description='Name of the base link'),
    DeclareLaunchArgument('base_type', default_value='g_shape',
                          description='Type of the base'),
    DeclareLaunchArgument('flange_link', default_value='link6_flange',
                          description='Name of the flange link'),
    DeclareLaunchArgument('gripper_type', default_value='adaptive_gripper',
                          description='Type of the gripper'),
    DeclareLaunchArgument('use_gazebo', default_value='false',
                          choices=['true', 'false'],
                          description='Whether to use Gazebo simulation'),
    DeclareLaunchArgument('use_gripper', default_value='true',
                          choices=['true', 'false'],
                          description='Whether to attach a gripper')
]


def generate_launch_description():
    # Define filenames
    urdf_package = 'mycobot_description'
    urdf_filename = 'mycobot_280.urdf.xacro'
    rviz_config_filename = 'mycobot_280_description.rviz'

    # Set paths to important files
    pkg_share_description = FindPackageShare(urdf_package)
    default_urdf_model_path = PathJoinSubstitution(
        [pkg_share_description, 'urdf', 'robots', urdf_filename])
    default_rviz_config_path = PathJoinSubstitution(
        [pkg_share_description, 'rviz', rviz_config_filename])

    # Launch configuration variables
    jsp_gui = LaunchConfiguration('jsp_gui')
    rviz_config_file = LaunchConfiguration('rviz_config_file')
    urdf_model = LaunchConfiguration('urdf_model')
    use_rviz = LaunchConfiguration('use_rviz')
    use_sim_time = LaunchConfiguration('use_sim_time')

    # Declare the launch arguments
    declare_jsp_gui_cmd = DeclareLaunchArgument(
        name='jsp_gui',
        default_value='true',
        choices=['true', 'false'],
        description='Flag to enable joint_state_publisher_gui')

    declare_rviz_config_file_cmd = DeclareLaunchArgument(
        name='rviz_config_file',
        default_value=default_rviz_config_path,
        description='Full path to the RVIZ config file to use')

    declare_urdf_model_path_cmd = DeclareLaunchArgument(
        name='urdf_model',
        default_value=default_urdf_model_path,
        description='Absolute path to robot urdf file')

    declare_use_rviz_cmd = DeclareLaunchArgument(
        name='use_rviz',
        default_value='true',
        description='Whether to start RVIZ')

    declare_use_sim_time_cmd = DeclareLaunchArgument(
        name='use_sim_time',
        default_value='false',
        description='Use simulation (Gazebo) clock if true')

    robot_description_content = ParameterValue(Command([
        'xacro', ' ', urdf_model, ' ',
        'prefix:=', LaunchConfiguration('prefix'), ' ',
        'add_world:=', LaunchConfiguration('add_world'), ' ',
        'base_link:=', LaunchConfiguration('base_link'), ' ',
        'base_type:=', LaunchConfiguration('base_type'), ' ',
        'flange_link:=', LaunchConfiguration('flange_link'), ' ',
        'gripper_type:=', LaunchConfiguration('gripper_type'), ' ',
        'use_gazebo:=', LaunchConfiguration('use_gazebo'), ' ',
        'use_gripper:=', LaunchConfiguration('use_gripper')
    ]), value_type=str)

    # Subscribe to the joint states of the robot, and publish the 3D pose of each link.
    start_robot_state_publisher_cmd = Node(
        package='robot_state_publisher',
        executable='robot_state_publisher',
        name='robot_state_publisher',
        output='screen',
        parameters=[{
            'use_sim_time': use_sim_time,
            'robot_description': robot_description_content}])

    # Publish the joint state values for the non-fixed joints in the URDF file.
    start_joint_state_publisher_cmd = Node(
        package='joint_state_publisher',
        executable='joint_state_publisher',
        name='joint_state_publisher',
        parameters=[{'use_sim_time': use_sim_time}],
        condition=UnlessCondition(jsp_gui))

    # Depending on gui parameter, either launch joint_state_publisher or joint_state_publisher_gui
    start_joint_state_publisher_gui_cmd = Node(
        package='joint_state_publisher_gui',
        executable='joint_state_publisher_gui',
        name='joint_state_publisher_gui',
        parameters=[{'use_sim_time': use_sim_time}],
        condition=IfCondition(jsp_gui))

    # Launch RViz
    start_rviz_cmd = Node(
        condition=IfCondition(use_rviz),
        package='rviz2',
        executable='rviz2',
        name='rviz2',
        output='screen',
        arguments=['-d', rviz_config_file],
        parameters=[{'use_sim_time': use_sim_time}])

    # Create the launch description and populate
    ld = LaunchDescription(ARGUMENTS)

    # Declare the launch options
    ld.add_action(declare_jsp_gui_cmd)
    ld.add_action(declare_rviz_config_file_cmd)
    ld.add_action(declare_urdf_model_path_cmd)
    ld.add_action(declare_use_rviz_cmd)
    ld.add_action(declare_use_sim_time_cmd)

    # Add any actions
    ld.add_action(start_joint_state_publisher_cmd)
    ld.add_action(start_joint_state_publisher_gui_cmd)
    ld.add_action(start_robot_state_publisher_cmd)
    ld.add_action(start_rviz_cmd)

    return ld

Create the Launch File for the Mobile Robot

Open a terminal window.

Move to your mobile robot directory.

cd ~/ros2_ws/src/yahboom_rosmaster/yahboom_rosmaster_description/ && mkdir launch && cd launch

Add a file in the launch folder called robot_state_publisher.launch.py.

Add this code.

#!/usr/bin/env python3
#
# Author: Addison Sears-Collins
# Date: November 11, 2024
# Description: Display the Yahboom (ROSMASTER) robot in RViz
#
# This file launches the robot state publisher, joint state publisher,
# and RViz2 for visualizing the ROSMASTER robot.

from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument
from launch.conditions import IfCondition, UnlessCondition
from launch.substitutions import Command, LaunchConfiguration, PathJoinSubstitution
from launch_ros.actions import Node
from launch_ros.parameter_descriptions import ParameterValue
from launch_ros.substitutions import FindPackageShare

# Define the arguments for the XACRO file
ARGUMENTS = [
    DeclareLaunchArgument('prefix', default_value='',
                          description='Prefix for robot joints and links'),
    DeclareLaunchArgument('use_gazebo', default_value='false',
                          choices=['true', 'false'],
                          description='Whether to use Gazebo simulation')
]


def generate_launch_description():
    # Define filenames
    urdf_package = 'yahboom_rosmaster_description'
    urdf_filename = 'rosmaster_x3.urdf.xacro'
    rviz_config_filename = 'yahboom_rosmaster_description.rviz'

    # Set paths to important files
    pkg_share_description = FindPackageShare(urdf_package)
    default_urdf_model_path = PathJoinSubstitution(
        [pkg_share_description, 'urdf', 'robots', urdf_filename])
    default_rviz_config_path = PathJoinSubstitution(
        [pkg_share_description, 'rviz', rviz_config_filename])

    # Launch configuration variables
    jsp_gui = LaunchConfiguration('jsp_gui')
    rviz_config_file = LaunchConfiguration('rviz_config_file')
    urdf_model = LaunchConfiguration('urdf_model')
    use_rviz = LaunchConfiguration('use_rviz')
    use_sim_time = LaunchConfiguration('use_sim_time')

    # Declare the launch arguments
    declare_jsp_gui_cmd = DeclareLaunchArgument(
        name='jsp_gui',
        default_value='true',
        choices=['true', 'false'],
        description='Flag to enable joint_state_publisher_gui')

    declare_rviz_config_file_cmd = DeclareLaunchArgument(
        name='rviz_config_file',
        default_value=default_rviz_config_path,
        description='Full path to the RVIZ config file to use')

    declare_urdf_model_path_cmd = DeclareLaunchArgument(
        name='urdf_model',
        default_value=default_urdf_model_path,
        description='Absolute path to robot urdf file')

    declare_use_rviz_cmd = DeclareLaunchArgument(
        name='use_rviz',
        default_value='true',
        description='Whether to start RVIZ')

    declare_use_sim_time_cmd = DeclareLaunchArgument(
        name='use_sim_time',
        default_value='false',
        description='Use simulation (Gazebo) clock if true')

    robot_description_content = ParameterValue(Command([
        'xacro', ' ', urdf_model, ' ',
        'prefix:=', LaunchConfiguration('prefix'), ' ',
        'use_gazebo:=', LaunchConfiguration('use_gazebo')
    ]), value_type=str)

    # Subscribe to the joint states of the robot, and publish the 3D pose of each link.
    start_robot_state_publisher_cmd = Node(
        package='robot_state_publisher',
        executable='robot_state_publisher',
        name='robot_state_publisher',
        output='screen',
        parameters=[{
            'use_sim_time': use_sim_time,
            'robot_description': robot_description_content}])

    # Publish the joint state values for the non-fixed joints in the URDF file.
    start_joint_state_publisher_cmd = Node(
        package='joint_state_publisher',
        executable='joint_state_publisher',
        name='joint_state_publisher',
        parameters=[{'use_sim_time': use_sim_time}],
        condition=UnlessCondition(jsp_gui))

    # Depending on gui parameter, either launch joint_state_publisher or joint_state_publisher_gui
    start_joint_state_publisher_gui_cmd = Node(
        package='joint_state_publisher_gui',
        executable='joint_state_publisher_gui',
        name='joint_state_publisher_gui',
        parameters=[{'use_sim_time': use_sim_time}],
        condition=IfCondition(jsp_gui))

    # Launch RViz
    start_rviz_cmd = Node(
        condition=IfCondition(use_rviz),
        package='rviz2',
        executable='rviz2',
        name='rviz2',
        output='screen',
        arguments=['-d', rviz_config_file],
        parameters=[{'use_sim_time': use_sim_time}])

    # Create the launch description and populate
    ld = LaunchDescription(ARGUMENTS)

    # Declare the launch options
    ld.add_action(declare_jsp_gui_cmd)
    ld.add_action(declare_rviz_config_file_cmd)
    ld.add_action(declare_urdf_model_path_cmd)
    ld.add_action(declare_use_rviz_cmd)
    ld.add_action(declare_use_sim_time_cmd)

    # Add any actions
    ld.add_action(start_joint_state_publisher_cmd)
    ld.add_action(start_joint_state_publisher_gui_cmd)
    ld.add_action(start_robot_state_publisher_cmd)
    ld.add_action(start_rviz_cmd)

    return ld

Add the RViz Configuration File

Now that we have written our launch file, let’s add the RViz configuration file.

cd ~/ros2_ws/src/mycobot_ros2/mycobot_description/ && mkdir rviz && cd rviz

Create a file named mycobot_280_description.rviz

Add this code.

Now do the same for the mobile robot:

cd ~/ros2_ws/src/yahboom_rosmaster/yahboom_rosmaster_description/ && mkdir rviz && cd rviz

Create a file named yahboom_rosmaster_description.rviz 

Add this code.

You don’t need to understand all the nitty-gritty details of this configuration file. You can generate one automatically through RViz.

At a high level, RViz configuration files end with the extension .rviz. These files set up an RViz configuration with a grid, a robot model, and coordinate frame visualizations.

This configuration file also enables the camera movement tool and sets the initial camera view to an orbit view, which allows orbiting around a focal point in the scene. 

When RViz is launched with this configuration file, it will display the robot model and allow interaction and visualization of the robot.

Edit CMakeLists.txt

Now we need to edit CMakeLists.txt so the build system can find our new folders, launch and rviz.

cd ~/ros2_ws/src/mycobot_ros2/mycobot_description/ 
gedit CMakeLists.txt

Add this code:

# Copy necessary files to designated locations in the project
install (
  DIRECTORY launch meshes urdf rviz
  DESTINATION share/${PROJECT_NAME}
)
Now do the same for the mobile robot package.

cd ~/ros2_ws/src/yahboom_rosmaster/yahboom_rosmaster_description/ 
gedit CMakeLists.txt

Add this code:

# Copy necessary files to designated locations in the project
install (
  DIRECTORY launch meshes urdf rviz
  DESTINATION share/${PROJECT_NAME}
)

Save the file, and close it.

Now build your workspace.

cd ~/ros2_ws/
colcon build
source ~/.bashrc

Launch the Launch Files

Launch your launch files:

ros2 launch mycobot_description robot_state_publisher.launch.py
ros2 launch yahboom_rosmaster_description robot_state_publisher.launch.py

You can also add launch arguments. To see the available launch arguments, type:

ros2 launch mycobot_description robot_state_publisher.launch.py --show-args

For example, to disable the gripper, type:

ros2 launch mycobot_description robot_state_publisher.launch.py use_gripper:=false

To add a prefix to the robot (e.g. left arm of a dual arm robot), type:

ros2 launch mycobot_description robot_state_publisher.launch.py use_gripper:=false prefix:=left_

If you want to launch the robots with no GUIs, do this:

ros2 launch mycobot_description robot_state_publisher.launch.py jsp_gui:=false use_rviz:=false
ros2 launch yahboom_rosmaster_description robot_state_publisher.launch.py jsp_gui:=false use_rviz:=false

And there you have it…your first custom launch files.

Launch File Walkthrough

Let’s walk through this ROS 2 launch file that visualizes a robotic arm in RViz.

Starting with the imports, we bring in essential ROS 2 launch utilities.

from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument
from launch.conditions import IfCondition, UnlessCondition

The launch file’s core purpose is orchestrating multiple nodes – think of it as a conductor coordinating different musicians in an orchestra. Here, our musicians are:

  • Robot State Publisher: Broadcasts the robot’s current pose
  • Joint State Publisher: Manages the robot’s joint positions
  • RViz: Provides the 3D visualization

The ARGUMENTS list defines the robot’s configurable parameters:

ARGUMENTS = [
    DeclareLaunchArgument('prefix', default_value='',
                          description='Prefix for robot joints and links'),
    DeclareLaunchArgument('add_world', default_value='true',
                          description='Whether to add world link'),
    # ... other arguments
]

These arguments allow users to customize the robot’s configuration without changing the code – like using command-line switches to modify a program’s behavior.
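The analogy holds quite literally: plain Python’s argparse does the same job for ordinary scripts. Here is a minimal sketch of the same pattern outside ROS (just an illustration, not ROS code — the argument name mirrors the launch file’s use_gripper):

```python
import argparse

# A launch argument plays the same role as a command-line switch:
# a default declared in the file, overridable at invocation
# (compare: ros2 launch ... use_gripper:=false).
parser = argparse.ArgumentParser()
parser.add_argument("--use_gripper", default="true", choices=["true", "false"])

# Simulate invoking the script with an override.
args = parser.parse_args(["--use_gripper", "false"])
```

With no arguments, parse_args([]) would fall back to the declared default, exactly like omitting use_gripper:= on the ros2 launch command line.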

The generate_launch_description() function is where the magic happens. First, it sets up the file paths:

urdf_package = 'mycobot_description'
urdf_filename = 'mycobot_280.urdf.xacro'
rviz_config_filename = 'mycobot_280_description.rviz'

The robot’s description is loaded from a XACRO file (an XML macro file) and converted into a robot_description parameter:

robot_description_content = ParameterValue(Command([
    'xacro', ' ', urdf_model, ' ',
    'prefix:=', LaunchConfiguration('prefix'),
    # ... other parameters
]), value_type=str)

Then we create three key nodes:

1. Robot State Publisher:

start_robot_state_publisher_cmd = Node(
    package='robot_state_publisher',
    executable='robot_state_publisher',
    name='robot_state_publisher',
    output='screen',
    parameters=[{
        'use_sim_time': use_sim_time,
        'robot_description': robot_description_content}])

This node takes the robot description and joint states and publishes the 3D poses of all robot links – like a GPS system for each part of the robot.
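As a toy illustration of the kind of math involved, here is the planar forward-kinematics calculation for a single revolute joint. robot_state_publisher performs the full 3D version of this across the entire kinematic tree described by the URDF, broadcasting the result as TF transforms:

```python
import math

def link_pose(joint_angle, link_length):
    """Toy forward kinematics: position of a link's end point given one
    revolute joint at the origin. This is the 2D essence of what
    robot_state_publisher computes (in 3D, for every link) from the
    URDF geometry plus the current joint states."""
    x = link_length * math.cos(joint_angle)
    y = link_length * math.sin(joint_angle)
    return x, y
```

For example, a 1 m link at a joint angle of 0 points straight along x; rotate the joint 90 degrees and the same link end moves onto the y axis.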

2. Joint State Publisher (with optional GUI):

start_joint_state_publisher_cmd = Node(
    package='joint_state_publisher',
    executable='joint_state_publisher',
    name='joint_state_publisher',
    parameters=[{'use_sim_time': use_sim_time}],
    condition=UnlessCondition(jsp_gui))

This publishes joint positions – imagine it as the robot’s muscle control center. The GUI version allows manual control of these joints.

3. RViz:

start_rviz_cmd = Node(
    condition=IfCondition(use_rviz),
    package='rviz2',
    executable='rviz2',
    name='rviz2',
    output='screen',
    arguments=['-d', rviz_config_file])

This is our visualization tool, loading a pre-configured layout specified in the RViz config file.

Finally, everything is assembled into a LaunchDescription and returned:

ld = LaunchDescription(ARGUMENTS)
# Add all the actions...
return ld

This launch file structure is common in ROS 2 applications, providing a clean way to start multiple nodes simultaneously.

That’s it.

Keep building, and I will see you in the next tutorial!

Create and Visualize a Mobile Robot with URDF – ROS 2 Jazzy

In this tutorial, we will create a model of a mobile robot from scratch. Our robot model will be in the standard Unified Robot Description Format (URDF). 

By the end of this tutorial, you will have a complete robot model that you can display in RViz.

We will then visualize the robot in RViz, a 3D visualization tool for ROS 2.

The official tutorial for creating a URDF file is here on the ROS 2 website, but that tutorial only deals with a fictitious robot.

It is far more fun and helpful to show you how to create a URDF file for a real-world robot, like the ones you will work with at your job or at school…like this one, for example…the Kuka omniMove robot used in an Airbus facility in Germany to move aircraft parts around the factory floor:

You can see this Kuka robot has mecanum wheels. The robot we will build in this tutorial will also have mecanum wheels, making it an omnidirectional robot. A mecanum wheel robot uses special wheels with rollers attached at an angle, allowing it to move in any direction by rotating the wheels independently. 

Compared to robots with standard wheels that can only move forward, backward, and turn, mecanum wheel robots have greater maneuverability and can move sideways without changing orientation.
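For intuition, here is a sketch of the standard inverse-kinematics equations for an X-configured mecanum platform — wheel angular velocities from a desired body velocity. The dimensions below are illustrative placeholders, not the actual measurements of the robot in this tutorial:

```python
def mecanum_wheel_speeds(vx, vy, wz, wheel_radius=0.0325, lx=0.08, ly=0.10):
    """Wheel angular velocities (rad/s) for a desired body twist.

    vx: forward speed (m/s), vy: leftward speed (m/s), wz: yaw rate (rad/s).
    lx, ly: half the wheelbase and half the track width (example values).
    Returns (front_left, front_right, rear_left, rear_right).
    """
    l = lx + ly
    fl = (vx - vy - l * wz) / wheel_radius
    fr = (vx + vy + l * wz) / wheel_radius
    rl = (vx + vy - l * wz) / wheel_radius
    rr = (vx - vy + l * wz) / wheel_radius
    return fl, fr, rl, rr
```

Driving straight forward commands all four wheels at the same speed, while a pure sideways (strafing) command spins diagonal wheel pairs in opposite directions — that is the maneuverability the rollers make possible.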

Within ROS 2, defining the URDF file of your mobile robot is important because it allows software tools to understand the robot’s structure, enabling tasks like simulation, motion planning, and sensor data interpretation. It is like giving the robot a digital body that software can interact with.
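To make that concrete, here is a minimal stdlib-only Python sketch that parses a tiny URDF string and pulls out its links and joints — the same structural information RViz and other ROS 2 tools read from your robot description (the robot and link names here are made up for the example):

```python
import xml.etree.ElementTree as ET

# A deliberately tiny, hypothetical URDF: two links joined by one joint.
URDF = """
<robot name="demo_bot">
  <link name="base_link"/>
  <link name="wheel_link"/>
  <joint name="wheel_joint" type="continuous">
    <parent link="base_link"/>
    <child link="wheel_link"/>
  </joint>
</robot>
"""

def describe(urdf_xml):
    """Return the link names and a {joint name: joint type} map."""
    root = ET.fromstring(urdf_xml)
    links = [link.get("name") for link in root.findall("link")]
    joints = {j.get("name"): j.get("type") for j in root.findall("joint")}
    return links, joints

links, joints = describe(URDF)
```

This is essentially the robot’s “digital body”: a tree of links connected by joints, which downstream tools traverse for simulation, motion planning, and visualization.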

I will walk through all the steps below for creating the URDF for the ROSMASTER X3 by Yahboom, a company that makes educational robots. 

Follow along with me click by click, keystroke by keystroke.  

Prerequisites

You can find all the code here on GitHub.


Create a Package

The first step is to create a ROS 2 package to store all your files.

Open a new terminal window, and create a new folder named yahboom_rosmaster.

cd ~/ros2_ws/src
mkdir yahboom_rosmaster
cd yahboom_rosmaster

Now create the package where we will store our URDF file.

ros2 pkg create --build-type ament_cmake --license BSD-3-Clause yahboom_rosmaster_description

Now, let’s create a metapackage.

I discuss the purpose of a metapackage in this post.

A metapackage doesn’t contain anything except a list of dependencies to other packages. You can use a metapackage to make it easier to install multiple related packages at once. 

If you were to make your packages publicly installable via the apt package manager on Ubuntu, for example, a metapackage would let someone automatically install all the ROS 2 packages it references. 

ros2 pkg create --build-type ament_cmake --license BSD-3-Clause yahboom_rosmaster
cd yahboom_rosmaster

Configure your package.xml file.

gedit package.xml

Make your package.xml file look like this:

<?xml version="1.0"?>
<?xml-model href="http://download.ros.org/schema/package_format3.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
<package format="3">
  <name>yahboom_rosmaster</name>
  <version>0.0.0</version>
  <description>ROSMASTER series robots by Yahboom (metapackage).</description>
  <maintainer email="automaticaddison@todo.todo">ubuntu</maintainer>
  <license>BSD-3-Clause</license>

  <buildtool_depend>ament_cmake</buildtool_depend>
  
  <exec_depend>yahboom_rosmaster_description</exec_depend>

  <test_depend>ament_lint_auto</test_depend>
  <test_depend>ament_lint_common</test_depend>

  <export>
    <build_type>ament_cmake</build_type>
  </export>
</package>

Add a README.md to describe what the package is about.

cd ..
gedit README.md
# yahboom_rosmaster #
![OS](https://img.shields.io/ubuntu/v/ubuntu-wallpapers/noble)
![ROS_2](https://img.shields.io/ros/v/jazzy/rclcpp)

I also recommend adding placeholder README.md files to the yahboom_rosmaster folder.

# yahboom_rosmaster #

The yahboom_rosmaster package is a metapackage. It contains lists of dependencies to other packages.

… as well as the yahboom_rosmaster_description folder.

# yahboom_rosmaster_description #

The yahboom_rosmaster_description package contains the robot description files that define the physical aspects of a robot, including its geometry, kinematics, dynamics, and visual aspects.

Now let’s build our new package:

cd ~/ros2_ws
colcon build

Let’s see if our new package is recognized by ROS 2.

Either open a new terminal window or source the bashrc file like this:

source ~/.bashrc
ros2 pkg list

You can see the newly created packages right there at the bottom.


Now let’s create the following folders:

mkdir -p ~/ros2_ws/src/yahboom_rosmaster/yahboom_rosmaster_description/meshes/rosmaster_x3/visual
mkdir -p ~/ros2_ws/src/yahboom_rosmaster/yahboom_rosmaster_description/urdf/control
mkdir -p ~/ros2_ws/src/yahboom_rosmaster/yahboom_rosmaster_description/urdf/mech
mkdir -p ~/ros2_ws/src/yahboom_rosmaster/yahboom_rosmaster_description/urdf/sensors
mkdir -p ~/ros2_ws/src/yahboom_rosmaster/yahboom_rosmaster_description/urdf/robots/

Add the Meshes

Mesh files are what make your robot look realistic in robotics simulation and visualization programs.

Mesh files visually represent the 3D shape of the robot parts. These files are typically in formats such as STL (Stereo Lithography – .stl) or COLLADA (.dae).

The mesh files we are going to use are already available in this GitHub repository, so we don’t have to create them from scratch. I originally got them from the manufacturer’s website.

However, if you want to create your own custom 3D printed robotic arm in the future, for example, you can make your own mesh file. Here is how:

  • Design the robot’s body using CAD programs like Onshape, Fusion 360, AutoCAD, or Solidworks. These tools help you create 3D models of the robot parts.
  • Export the 3D models as mesh files in formats like STL or COLLADA. These files contain information about the robot’s shape, including vertices, edges, and faces.
  • If needed, use a tool like Blender to simplify the mesh files. This makes them easier to use in simulations and visualizations.
  • Add the simplified mesh files to your URDF file to visually represent what the robot looks like.
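As a small practical aside, binary STL files have a simple layout — an 80-byte header followed by a little-endian uint32 triangle count, then 50 bytes per triangle — which makes it easy to sanity-check a mesh from Python. A sketch using a synthetic in-memory file rather than a real mesh from the repository:

```python
import struct

def stl_triangle_count(data: bytes) -> int:
    """Read the triangle count from a binary STL file's contents:
    an 80-byte header is followed by a little-endian uint32 count."""
    return struct.unpack_from("<I", data, 80)[0]

# Demo: build a tiny fake binary STL containing 2 triangles
# (each 50-byte triangle record holds a normal, 3 vertices, and
# 2 attribute bytes; zeros are fine for this structural check).
fake_stl = b"\x00" * 80 + struct.pack("<I", 2) + b"\x00" * (50 * 2)
count = stl_triangle_count(fake_stl)
```

Running the same function over a real file (e.g. `open(path, "rb").read()`) is a quick way to confirm a downloaded mesh isn’t truncated or ASCII-encoded.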

Let’s pull these mesh files off GitHub. 

First, open a new terminal window, and type:

cd ~/Downloads/

Clone the yahboom_rosmaster repository to your machine.

git clone https://github.com/automaticaddison/yahboom_rosmaster.git

Copy the mesh files into your package for the robot we are going to model, and verify they are there:

cp -r yahboom_rosmaster/yahboom_rosmaster_description/meshes/* ~/ros2_ws/src/yahboom_rosmaster/yahboom_rosmaster_description/meshes/
ls ~/ros2_ws/src/yahboom_rosmaster/yahboom_rosmaster_description/meshes/rosmaster_x3/visual/

You can see the mesh files (.stl) for the robot.

Configure CMakeLists.txt

Let’s open Visual Studio Code.

cd ~/ros2_ws/src/yahboom_rosmaster/
code .

Configure the CMakeLists.txt for the yahboom_rosmaster_description package. Make sure it looks like this:

cmake_minimum_required(VERSION 3.8)
project(yahboom_rosmaster_description)
 
# Check if the compiler being used is GNU's C++ compiler (g++) or Clang.
# Add compiler flags for all targets that will be defined later in the 
# CMakeLists file. These flags enable extra warnings to help catch
# potential issues in the code.
# Add options to the compilation process
if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang")
  add_compile_options(-Wall -Wextra -Wpedantic)
endif()
 
# Locate and configure packages required by the project.
find_package(ament_cmake REQUIRED)
find_package(urdf_tutorial REQUIRED)
 
# Copy necessary files to designated locations in the project
install (
  DIRECTORY meshes urdf
  DESTINATION share/${PROJECT_NAME}
)
 
# Automates the process of setting up linting for the package, which
# is the process of running tools that analyze the code for potential
# errors, style issues, and other discrepancies that do not adhere to
# specified coding standards or best practices.
if(BUILD_TESTING)
  find_package(ament_lint_auto REQUIRED)
  # the following line skips the linter which checks for copyrights
  # comment the line when a copyright and license is added to all source files
  set(ament_cmake_copyright_FOUND TRUE)
  # the following line skips cpplint (only works in a git repo)
  # comment the line when this package is in a git repo and when
  # a copyright and license is added to all source files
  set(ament_cmake_cpplint_FOUND TRUE)
  ament_lint_auto_find_test_dependencies()
endif()
 
ament_package()

Configure package.xml

Make sure your package.xml for the yahboom_rosmaster_description package looks like this:

<?xml version="1.0"?>
<?xml-model href="http://download.ros.org/schema/package_format3.xsd" schematypens="http://www.w3.org/2001/XMLSchema"?>
<package format="3">
  <name>yahboom_rosmaster_description</name>
  <version>0.0.0</version>
  <description>Contains the robot description files that define the physical
    aspects of a robot, including its geometry, kinematics, dynamics, and
    visual aspects.</description>
  <maintainer email="automaticaddison@todo.todo">ubuntu</maintainer>
  <license>BSD-3-Clause</license>

  <buildtool_depend>ament_cmake</buildtool_depend>
  <depend>urdf_tutorial</depend>

  <test_depend>ament_lint_auto</test_depend>
  <test_depend>ament_lint_common</test_depend>

  <export>
    <build_type>ament_cmake</build_type>
  </export>
</package>

Build the Package

Now let’s build the package.

cd ~/ros2_ws/
rosdep install -i --from-path src --rosdistro $ROS_DISTRO -y

You should see:

#All required rosdeps installed successfully

If not, type your password, and install the required dependencies.

Open a terminal window, and type:

build

If this command doesn’t work, type these commands:

echo "alias build='cd ~/ros2_ws/ && colcon build && source ~/.bashrc'" >> ~/.bashrc
source ~/.bashrc
build

Create the URDF File

Now let’s create our URDF file. We will actually create it in XACRO format. I will use the terms URDF and XACRO interchangeably going forward.

XACRO files are like blueprints for URDF files, using macros and variables to simplify complex robot descriptions.

Imagine XACRO as the architect drawing up plans, and URDF as the final, ready-to-use construction document. Both file types represent the robotic arm, but XACRO offers more flexibility and organization.

Before a ROS tool or component can use the information in a XACRO file, it must first be processed (translated) into a URDF file. This step allows for the dynamic generation of robot descriptions based on the specific configurations defined in the XACRO file.
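As a toy illustration of the property-substitution part of that translation (the real xacro tool also handles macros, math expressions, and conditionals — this sketch only mimics the ${property} expansion step using Python’s string.Template):

```python
from string import Template

# Properties, as they might be declared with <xacro:property> tags.
properties = {"wheel_radius": "0.0325", "prefix": ""}

# A XACRO-style snippet: ${wheel_radius} gets replaced by its value,
# producing the plain URDF text that ROS tools actually consume.
snippet = Template('<origin xyz="0 0 ${wheel_radius}"/>').safe_substitute(properties)
```

After substitution, snippet holds ordinary URDF XML with the concrete number baked in — which is why every XACRO file must pass through this expansion before robot_state_publisher or RViz can use it.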

Open a terminal window, and type this command to create all the files we need:

touch ~/ros2_ws/src/yahboom_rosmaster/yahboom_rosmaster_description/urdf/mech/{rosmaster_x3_base.urdf.xacro,mecanum_wheel.urdf.xacro} ~/ros2_ws/src/yahboom_rosmaster/yahboom_rosmaster_description/urdf/sensors/{rgbd_camera.urdf.xacro,imu.urdf.xacro,lidar.urdf.xacro} ~/ros2_ws/src/yahboom_rosmaster/yahboom_rosmaster_description/urdf/robots/rosmaster_x3.urdf.xacro

Robot Base

Let’s start with creating our base: rosmaster_x3_base.urdf.xacro. Add this code.

Robot Wheels

Now let’s create a generic mecanum wheel: mecanum_wheel.urdf.xacro. Add this code.

RGBD Camera

Now let’s create the RGBD camera: rgbd_camera.urdf.xacro. An RGBD camera is like a regular digital camera that not only captures colors (RGB) but also measures how far away everything in the scene is from the camera (the D stands for depth). This added depth information allows the camera to create 3D maps of its surroundings.

You can find a big repository of sensors that can be implemented in simulation for Gazebo in this GitHub repository.

Add this code.

Robot LIDAR

Now let’s create the LIDAR: lidar.urdf.xacro. We will add the LIDAR plugin so we can generate simulated LIDAR data in a future tutorial.

Add this code.

Robot Inertial Measurement Unit (IMU)

Now let’s create the IMU: imu.urdf.xacro. An IMU (Inertial Measurement Unit) is a sensor that measures movement, specifically acceleration, rotation, and sometimes magnetic fields, to help determine an object’s position and motion. 

Add this code.

Full Robot

Now let’s create the full robot, bringing together all the components we have created: rosmaster_x3.urdf.xacro. Add this code.

Understanding the URDF

Let’s walk through each file so we can understand what is going on.

rosmaster_x3_base.urdf.xacro

At the top, we start with an XML declaration and define that this is a robot description using xacro, which is like a macro language for robot descriptions:

<?xml version="1.0"?>
<robot xmlns:xacro="http://www.ros.org/wiki/xacro">

Then we have a bunch of properties that define the robot’s dimensions in meters:

<xacro:property name="total_width" value="0.19940" />  <!-- About 20cm wide -->
<xacro:property name="wheel_width" value="0.0304" />   <!-- About 3cm wheel width -->
<xacro:property name="wheel_radius" value="0.0325" />  <!-- About 3.25cm wheel radius -->
3-diagram

The code defines a macro called “rosmaster_x3_base” – think of this like a template for the robot’s base. The base has several important parts.

The base_footprint is like an invisible point on the ground directly below the robot:

<link name="${prefix}base_footprint"/>

The base_link is the main body of the robot. It has three important sections:

  • visual: This defines how the robot looks in simulation using a 3D model file (STL)
  • collision: This is a simplified box shape used to detect when the robot hits things
  • inertial: This defines the robot’s mass and how its weight is distributed
<link name="${prefix}base_link">
    <visual>
        <!-- The robot's appearance -->
        <geometry>
            <mesh filename="file://$(find yahboom_rosmaster_description)/meshes/rosmaster_x3/visual/base_link_X3.STL"/>
        </geometry>
        <!-- Makes it green -->
        <material name="green">
          <color rgba="0 0.7 0 1"/>
        </material>
    </visual>
    <!-- Other parts... -->
</link>

The gazebo tag adds special properties for the Gazebo simulator, like how the material looks in different lighting:

<gazebo reference="${prefix}base_link">
    <visual>
        <material>
            <ambient>0 0.7 0 1</ambient>
            <diffuse>0 0.7 0 1</diffuse>
            <specular>0 0.7 0 1</specular>
        </material>
    </visual>
</gazebo>

Finally, there’s a joint that connects the base_footprint to the base_link. It’s “fixed” which means these parts don’t move relative to each other:

<joint name="${prefix}base_joint" type="fixed">
    <parent link="${prefix}base_footprint"/>
    <child link="${prefix}base_link"/>
    <origin xyz="0 0 ${wheel_radius}" rpy="0 0 0"/>
</joint>

The math in the inertial section (ixx, iyy, izz) describes how the robot resists rotation around different axes – this is important for realistic physics simulation. The formula used is the standard equation for the moment of inertia of a rectangular box.
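If you want to check these numbers yourself, the standard box formula is easy to reproduce. Here is a quick Python sketch (the function name and the dimensions below are made up for illustration, not taken from the robot):

```python
def box_inertia(mass, x_dim, y_dim, z_dim):
    """Moment of inertia of a solid rectangular box about its center (kg*m^2)."""
    ixx = (mass / 12.0) * (y_dim**2 + z_dim**2)
    iyy = (mass / 12.0) * (x_dim**2 + z_dim**2)
    izz = (mass / 12.0) * (x_dim**2 + y_dim**2)
    return ixx, iyy, izz

# Example with made-up values: a 2 kg base, 0.30 m long, 0.20 m wide, 0.08 m tall
print(box_inertia(2.0, 0.30, 0.20, 0.08))
```

Notice that the axis with the largest inertia is the one perpendicular to the two largest dimensions – a long, flat base resists spinning about its vertical axis the most.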

4-weight

mecanum_wheel.urdf.xacro

6-robot-with-mecanum-wheels-and-axes-rviz
I will show you how to launch this later in this tutorial

Unlike regular wheels that can only move forward and backward, mecanum wheels have small rollers arranged at 45-degree angles around their circumference. This unique design allows a robot to move sideways, diagonally, or rotate in place – similar to how a crab can walk sideways.

In this file, we start with the basic properties of our wheel. These measurements define the physical characteristics:

<xacro:property name="wheel_radius" value="0.0325" />      <!-- Wheel is 6.5cm in diameter -->
<xacro:property name="wheel_separation" value="0.169" />   <!-- Distance between left and right wheels -->
<xacro:property name="wheel_width" value="0.03040" />     <!-- How thick the wheel is -->
<xacro:property name="wheel_mass" value="0.1" />          <!-- Wheel weighs 0.1 kg -->
<xacro:property name="wheel_xoff" value="0.08" />         <!-- How far forward/back the wheel is -->
<xacro:property name="wheel_yoff" value="-0.01" />        <!-- Small sideways offset -->

The wheel radius affects how fast the robot moves for a given motor speed. 

The separation between wheels influences turning behavior – wider-set wheels provide more stability but require more torque to turn. 

The mass and dimensions affect the robot’s physics simulation, including momentum and inertia.

Next, we define a macro – think of it as a template – that we can use to create wheels:

<xacro:macro name="mecanum_wheel" params="prefix side x_reflect y_reflect">

This macro is clever – instead of writing separate code for each wheel, we write it once and use parameters to customize it. The prefix parameter helps us name each wheel uniquely (like “front_left_wheel” or “rear_right_wheel”). The x_reflect and y_reflect parameters are either 1 or -1, letting us mirror the wheel’s position and orientation for different corners of the robot.

The visual component defines what we see in the simulator:

<visual>
    <origin xyz="0 0 0" rpy="${pi/2} 0 0"/>   <!-- Rotated 90 degrees around X axis -->
    <geometry>
        <mesh filename="file://$(find yahboom_rosmaster_description)/meshes/rosmaster_x3/visual/${side}_wheel_X3.STL"/>
    </geometry>
    <material name="dark_gray">
        <color rgba="0.2 0.2 0.2 1.0"/>       <!-- Dark gray color -->
    </material>
</visual>

This section loads a 3D model (STL file) of the wheel. The rotation parameters (rpy = roll, pitch, yaw) ensure the wheel is oriented correctly. The dark gray color makes it easy to distinguish from other robot parts.

For physics simulation, we need a collision model. While the visual model can be complex and detailed, the collision model is kept simple for computational efficiency:

<collision>
    <geometry>
        <cylinder radius="${wheel_radius}" length="${wheel_width}"/>
    </geometry>
</collision>

Instead of using the detailed STL model for collision detection, we use a simple cylinder. This significantly speeds up physics calculations while maintaining reasonable accuracy.

The inertial properties define how the wheel behaves physically:

<inertial>
    <mass value="${wheel_mass}"/>
    <inertia
        ixx="${(wheel_mass/12.0) * (3*wheel_radius*wheel_radius + wheel_width*wheel_width)}" 
        iyy="${(wheel_mass/2.0) * (wheel_radius*wheel_radius)}"
        izz="${(wheel_mass/12.0) * (3*wheel_radius*wheel_radius + wheel_width*wheel_width)}"/>
</inertial>

These seemingly complex formulas are based on physics equations for a cylinder’s moment of inertia. They determine how the wheel resists changes in rotation around different axes. 
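Here is a quick Python sketch of those same cylinder formulas, plugging in the wheel mass and dimensions from the properties above (the function name is mine, not from the package):

```python
def wheel_inertia(mass, radius, width):
    """Inertia of a solid cylinder that spins about its Y axis (kg*m^2)."""
    ixx = izz = (mass / 12.0) * (3.0 * radius**2 + width**2)
    iyy = (mass / 2.0) * radius**2
    return ixx, iyy, izz

# Wheel properties from this tutorial: 0.1 kg, 3.25 cm radius, 3.04 cm width
print(wheel_inertia(0.1, 0.0325, 0.0304))
```

The spin axis (iyy) gets a different formula than the other two axes because rolling the wheel about its own axle distributes the mass differently than tumbling it end over end.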

The joint configuration defines how the wheel connects to the robot:

<joint name="${prefix}${side}_wheel_joint" type="continuous">
    <axis xyz="0 1 0"/>                <!-- Spins around Y axis -->
    <parent link="${prefix}base_link"/>
    <child link="${prefix}${side}_wheel_link"/>
    <!-- Position is calculated based on x_reflect and y_reflect to place wheel correctly -->
    <origin xyz="${x_reflect*wheel_xoff} ${y_reflect*(wheel_separation/2+wheel_yoff)} ${-wheel_radius}" rpy="0 0 0"/>
</joint>

A “continuous” joint means it can rotate indefinitely in either direction. The axis specification (0 1 0) means it rotates around the Y axis of the parent link.

The origin calculation uses our x_reflect and y_reflect parameters to position each wheel correctly relative to the robot’s center.

Finally, we have Gazebo-specific settings:

<gazebo reference="${prefix}${side}_wheel_link">
    <mu1>0.01</mu1>    <!-- Friction coefficients -->
    <mu2>0.01</mu2>    <!-- Low values for smooth rolling -->
</gazebo>

The mu1 and mu2 values are friction coefficients. For mecanum wheels, we keep these values low because the rollers should allow easy sideways movement. Higher values would make the wheels grip too much and resist the sliding motion that makes mecanum wheels special.

To implement this in your own robot, you’d call this macro four times, once for each wheel. 

7-robot-with-mecanum-wheels

rgbd_camera.urdf.xacro

An RGBD (Red, Green, Blue + Depth) camera combines a regular color camera with depth sensing capabilities. This allows robots to not just see colors and shapes, but also understand how far away objects are – important for navigation and manipulation tasks.

The code starts by defining a macro called “rgbd_camera” with numerous parameters that make the camera highly configurable:

<xacro:macro name="rgbd_camera" params="
    prefix:=''
    camera_name:='cam_1'
    parent:='base_link'
    mesh_file:='file://$(find yahboom_rosmaster_description)/meshes/intel_realsense/visual/d435.stl'
    xyz_offset:='0.105 0 0.05'
    rpy_offset:='0 -0.50 0'
    ...

The parameters control everything from basic naming and positioning to detailed physics properties. The default values are carefully chosen to match real-world RGBD cameras. 

For instance, the camera’s mass is set to 0.072 kg and includes precise inertial properties that match the Intel RealSense D435’s physical characteristics.

The camera’s physical structure consists of multiple “frames” or coordinate systems, each serving a specific purpose:

<link name="${prefix}${camera_name}_link">
    <visual>
        <origin xyz="${mesh_xyz_offset}" rpy="${mesh_rpy_offset}"/>
        <geometry>
            <mesh filename="${mesh_file}" />
        </geometry>
        ...
    </visual>

The main camera link holds the visual 3D model (loaded from an STL file) and can optionally include collision geometry for physical simulation. The aluminum material gives it a realistic appearance.

What makes this implementation particularly sophisticated is its handling of multiple camera frames:

  • Depth frame: Captures distance information
  • Infrared frames (infra1 and infra2): Used for stereo depth perception
  • Color frame: Regular RGB camera
  • Optical frames: Aligned with standard ROS conventions

Each frame is connected by fixed joints with specific offsets:

<joint name="${prefix}${camera_name}_depth_joint" type="fixed">
    <origin xyz="${depth_frame_xyz_offset}" rpy="${depth_frame_rpy_offset}"/>
    <parent link="${prefix}${camera_name}_link"/>
    <child link="${prefix}${camera_name}_depth_frame" />
</joint>

The Gazebo simulation settings at the end define how the camera operates in simulation:

<gazebo reference="${prefix}${camera_name}_link">
    <sensor name="${prefix}${camera_name}" type="rgbd_camera">
        <camera>
            <horizontal_fov>${horizontal_fov}</horizontal_fov>
            <image>
                <width>${image_width}</width>
                <height>${image_height}</height>
            </image>

imu.urdf.xacro

This file defines an IMU sensor for a robot simulation in ROS. An IMU measures a robot’s orientation, acceleration, and angular velocity. It tells the robot which way is up, how fast it’s moving, and how it’s rotating in space – just like the sensor in your smartphone that knows when you turn the screen.

10-mecanum-wheel-with-imu

The core macro definition starts with configurable parameters that let you customize how the IMU is attached to your robot:

<xacro:macro name="imu_sensor" params="
    prefix:=''
    parent:='base_link'
    frame_id:='imu'
    xyz_offset:='0 0 0'
    rpy_offset:='0 0 0'
    ...

The physical properties of the IMU are carefully defined to match real-world characteristics. The sensor weighs 31 grams (0.031 kg) and measures approximately 39mm × 38mm × 13mm. It updates its measurements 15 times per second (15 Hz) and is rendered in a dark black color to match typical IMU hardware.

One of the most interesting parts is how the code calculates the moment of inertia – a measure of how hard it is to rotate the sensor around different axes:

<xacro:property name="ixx" value="${(mass/12.0) * (height*height + width*width)}" />
<xacro:property name="iyy" value="${(mass/12.0) * (length*length + height*height)}" />
<xacro:property name="izz" value="${(mass/12.0) * (length*length + width*width)}" />

These calculations come from physics equations for rectangular objects. The 1/12 factor appears because of how mass is distributed in a rectangular shape. Each axis needs different calculations because rotating a rectangular object requires different amounts of force depending on which way you’re turning it.
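Plugging the IMU's stated mass and dimensions into these formulas, you can verify the numbers with a few lines of Python (a sketch for illustration):

```python
# IMU properties from this tutorial: 31 g, about 39 mm x 38 mm x 13 mm
mass = 0.031     # kg
length = 0.039   # m (x)
width = 0.038    # m (y)
height = 0.013   # m (z)

ixx = (mass / 12.0) * (height**2 + width**2)
iyy = (mass / 12.0) * (length**2 + height**2)
izz = (mass / 12.0) * (length**2 + width**2)

# izz comes out largest: spinning a flat box about its vertical axis
# involves both of its large dimensions
print(ixx, iyy, izz)
```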

The visual representation keeps things simple with a basic black box shape:

<visual>
    <origin xyz="0 0 0" rpy="0 0 0"/>
    <geometry>
        <box size="${length} ${width} ${height}"/>
    </geometry>
    <material name="${material_name}">
        <color rgba="${material_color}"/>
    </material>
</visual>

The IMU attaches to your robot using a fixed joint, meaning it doesn’t move relative to whatever part of the robot you attach it to:

<joint name="${prefix}${frame_id}_joint" type="fixed">
    <parent link="${prefix}${parent}"/>
    <child link="${prefix}${frame_id}_link" />
    <origin xyz="${xyz_offset}" rpy="${rpy_offset}"/>
</joint>

The Gazebo simulation settings at the end define how the IMU behaves in the virtual world:

<gazebo reference="${prefix}${frame_id}_link">
    <sensor name="${prefix}imu_sensor" type="imu">
        <topic>${prefix}${topic_name}</topic>
        <update_rate>${update_rate}</update_rate>
        <always_on>${always_on}</always_on>
        <visualize>${visualize}</visualize>
        <gz_frame_id>${prefix}${frame_id}_link</gz_frame_id>
    </sensor>
</gazebo>

This section tells Gazebo to create a virtual IMU sensor that publishes its data to a ROS topic named “imu/data” by default. The sensor stays on continuously during simulation and updates 15 times per second. The visualization is turned off by default since you typically don’t need to see sensor data graphics during simulation.

lidar.urdf.xacro

This file defines a LIDAR (Light Detection and Ranging) sensor for robot simulation in ROS, specifically modeling an RPLidar S2. A LIDAR sensor spins around and uses laser beams to measure distances to objects, creating a 2D map of its surroundings.

8-mecanum-wheel-robot-rplidars2-intel-realsense-stl

The macro starts by defining the physical and operational characteristics of the LIDAR:

<xacro:macro name="lidar_sensor" params="
    prefix:=''
    parent:='base_link'
    frame_id:='laser_frame'
    mesh_file:='file://$(find yahboom_rosmaster_description)/meshes/rplidar/rplidar_s2.stl'
    xyz_offset:='0 0 0.0825'
    ...

The physical properties match a real RPLidar S2: a diameter of 77mm, a height of 39.8mm, and a mass of 185 grams. Like most LIDAR sensors, it’s rendered in black to match the real hardware.

The sensor’s moment of inertia calculations treat the LIDAR as a cylinder:

<xacro:property name="radius" value="${lidar_width/2}" />
<xacro:property name="ixx_iyy" value="${(mass/12) * (3 * radius * radius + lidar_height * lidar_height)}" />
<xacro:property name="izz" value="${(mass/2) * (radius * radius)}" />

The visual appearance uses a detailed 3D model scaled down from millimeters to meters:

<visual>
    <origin xyz="${mesh_xyz_offset}" rpy="${mesh_rpy_offset}"/>
    <geometry>
        <mesh filename="${mesh_file}" scale="${mesh_scale}"/>
    </geometry>
    <material name="${material_name}">
        <color rgba="${material_color}"/>
    </material>
</visual>

The most interesting part is the LIDAR sensor configuration in Gazebo:

<sensor name="${prefix}lidar_sensor" type="gpu_lidar">
    <topic>${prefix}${topic_name}</topic>
    <update_rate>${update_rate}</update_rate>
    <ray>
        <scan>
            <horizontal>
                <samples>${ray_count}</samples>
                <resolution>1</resolution>
                <min_angle>${min_angle}</min_angle>
                <max_angle>${max_angle}</max_angle>
            </horizontal>

This LIDAR simulates a full 360-degree scan (from -π to π radians) with 360 individual laser beams, taking 10 scans per second. It can measure distances from 5cm to 30 meters with a resolution of 13mm, and uses GPU acceleration for better performance.
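A quick back-of-the-envelope calculation shows what those scan parameters mean in practice (a sketch; the simulator's exact beam-spacing convention may differ slightly):

```python
import math

# Scan parameters from this tutorial's LIDAR configuration
ray_count = 360
min_angle = -math.pi
max_angle = math.pi
update_rate = 10  # scans per second

# Angular spacing between beams: roughly one beam per degree
increment = (max_angle - min_angle) / ray_count
print(math.degrees(increment))   # about 1 degree per beam
print(ray_count * update_rate)   # range measurements per second
```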

The sensor is configured as a 2D LIDAR, meaning it scans in a single plane. This is clear from the vertical configuration:

<vertical>
    <samples>1</samples>
    <resolution>1</resolution>
    <min_angle>0</min_angle>
    <max_angle>0</max_angle>
</vertical>

The sensor publishes its data to a ROS topic called “scan” and runs continuously during simulation. The GPU acceleration means it uses your computer’s graphics card to process the laser measurements, making the simulation more efficient.

The physical mounting is handled by a fixed joint that typically places the LIDAR about 8.25cm above its parent link:

<joint name="${prefix}${frame_id}_joint" type="fixed">
    <parent link="${prefix}${parent}"/>
    <child link="${prefix}${frame_id}" />
    <origin xyz="${xyz_offset}" rpy="${rpy_offset}"/>
</joint>

When you use this macro in your robot description, you get a realistic LIDAR sensor that creates accurate distance measurements in a 360-degree field of view. It updates 10 times per second, has realistic physical properties for simulation, uses GPU acceleration for efficient processing, and publishes data in the standard ROS laser scan format.

The data from this simulated LIDAR works just like a real LIDAR for navigation, mapping, obstacle avoidance, and localization tasks in your robotics applications.

rosmaster_x3.urdf.xacro

This is the main assembly file for the ROSMASTER X3 robot. Think of it as a blueprint that brings together all the individual components we looked at earlier into a complete robot. Let’s walk through how this file builds the robot piece by piece.

First, it sets up some basic information:

<?xml version="1.0"?>
<robot xmlns:xacro="http://www.ros.org/wiki/xacro" name="rosmaster_x3">
    <xacro:property name="M_PI" value="3.1415926535897931" />
    <xacro:arg name="prefix" default=""/>

The prefix argument lets you create multiple robots in the same simulation by giving each one a unique name prefix.

Next, it imports all the component descriptions we’ve looked at:

<xacro:include filename="$(find yahboom_rosmaster_description)/urdf/mech/rosmaster_x3_base.urdf.xacro"/>
<xacro:include filename="$(find yahboom_rosmaster_description)/urdf/mech/mecanum_wheel.urdf.xacro"/>
<xacro:include filename="$(find yahboom_rosmaster_description)/urdf/sensors/rgbd_camera.urdf.xacro"/>
<xacro:include filename="$(find yahboom_rosmaster_description)/urdf/sensors/lidar.urdf.xacro"/>
<xacro:include filename="$(find yahboom_rosmaster_description)/urdf/sensors/imu.urdf.xacro"/>

Then it starts assembling the robot. First comes the base:

<xacro:rosmaster_x3_base prefix="$(arg prefix)"/>

Next, it adds the four mecanum wheels. Notice how it uses x_reflect and y_reflect to position each wheel correctly:

<xacro:mecanum_wheel
  prefix="$(arg prefix)"
  side="front_left"
  x_reflect="1"
  y_reflect="1"/>

<xacro:mecanum_wheel
  prefix="$(arg prefix)"
  side="front_right"
  x_reflect="1"
  y_reflect="-1"/>

The reflect values work like this:

  • Front wheels have x_reflect="1" because they’re at the front
  • Back wheels have x_reflect="-1" because they’re at the back
  • Left wheels have y_reflect="1"
  • Right wheels have y_reflect="-1"
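To see how these reflect values place the wheels, here is a short Python sketch using the wheel properties from earlier. The back wheel names are my own labels for illustration; only the two front macro calls are shown in the snippet above:

```python
# Wheel placement properties from this tutorial
wheel_xoff = 0.08         # m, forward/backward offset
wheel_separation = 0.169  # m, left-right wheel distance
wheel_yoff = -0.01        # m, small sideways offset

wheels = {
    "front_left":  (1, 1),
    "front_right": (1, -1),
    "back_left":   (-1, 1),
    "back_right":  (-1, -1),
}

# Mirrors the joint origin expression in mecanum_wheel.urdf.xacro
for side, (x_reflect, y_reflect) in wheels.items():
    x = x_reflect * wheel_xoff
    y = y_reflect * (wheel_separation / 2 + wheel_yoff)
    print(f"{side}: x = {x:+.3f} m, y = {y:+.3f} m")
```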

Then it adds the RGBD camera (like a RealSense):

<xacro:rgbd_camera
  prefix="$(arg prefix)"
  camera_name="cam_1"
  xyz_offset="0.105 0 0.05"
  rpy_offset="0 -0.50 0"/>

The camera is positioned 10.5cm forward, 5cm up, and tilted down slightly (0.5 radians) to see the ground in front of the robot.

The LIDAR sensor comes next:

<xacro:lidar_sensor
  prefix="$(arg prefix)"
  parent="base_link"
  frame_id="laser_frame"
  xyz_offset="0 0 0.0825"
  rpy_offset="0 0 0"/>

It’s mounted 8.25cm above the base, perfectly level to scan the horizontal plane around the robot.

Finally, it adds the IMU sensor:

<xacro:imu_sensor
  prefix="$(arg prefix)"
  parent="base_link"
  frame_id="imu"
  xyz_offset="0 0 0.006"
  rpy_offset="0 0 0"/>

The IMU sits very close to the base (0.6cm up) because it needs to measure the robot’s core movement.

This assembly creates a mobile robot that can:

  • Move in any direction using its mecanum wheels
  • See objects and depth with its RGBD camera
  • Scan its surroundings with the LIDAR
  • Track its movement and orientation with the IMU

All these sensors work together – the LIDAR maps the space, the camera identifies objects, and the IMU helps keep track of the robot’s position and movement. The mecanum wheels then let it navigate precisely through its environment.

Build the Package

Now let’s build the package.

build

Visualize the URDF File

Let’s see the URDF file in RViz. 

Launch the URDF file. The conversion from XACRO to URDF happens behind the scenes. Be sure to have the correct path to your XACRO file.

ros2 launch urdf_tutorial display.launch.py model:=/home/ubuntu/ros2_ws/src/yahboom_rosmaster/yahboom_rosmaster_description/urdf/robots/rosmaster_x3.urdf.xacro
12-full-robot

By convention, the red axis is the x-axis, the green axis is the y-axis, and the blue axis is the z-axis.

13-by-convention

Uncheck the TF checkbox to turn off the axes.

You can use the Joint State Publisher GUI pop-up window to move the links around.

On the left panel under Displays, play around by checking and unchecking different options.

For example, under Robot Model, you can see how the mass is distributed for the robot by unchecking “Visual Enabled” and “Collision Enabled” and checking the “Mass” checkbox under “Mass Properties”.

15-mass-mecanum-wheel-robot

You can also see the simplified shapes that simulation engines will use to detect collisions when the robot is commanded to move to a certain point.

Uncheck “Visual Enabled” under Robot Model and check “Collision Enabled.”

14-collision-enabled

You can also see the coordinate frames. 

Open a new terminal window, and type the following commands:

cd ~/Documents/
ros2 run tf2_tools view_frames

To see the coordinate frames, type:

dir
evince frames_YYYY-MM-DD_HH.MM.SS.pdf
16-yahboom-mobile-robot
Sorry the text is so small

To close RViz, press CTRL + C.

So we can quickly visualize our robot in the future, let’s add a bash alias that will enable us to quickly see our URDF.

echo "alias yahboom='ros2 launch urdf_tutorial display.launch.py model:=/home/ubuntu/ros2_ws/src/yahboom_rosmaster/yahboom_rosmaster_description/urdf/robots/rosmaster_x3.urdf.xacro'" >> ~/.bashrc

To confirm it was added, type:

cat ~/.bashrc
build

Going forward, if you want to see your URDF file, type this command in the terminal window:

yahboom

That’s it. Keep building, and I will see you in the next tutorial!

Calculating Wheel Odometry for a Differential Drive Robot

Have you ever wondered how robots navigate their environments so precisely? A key component for many mobile robots is differential drive. This type of robot uses two independently controlled wheels, allowing for maneuvers like moving forward, backward, and turning. 

But how does the robot itself know where it is and how far it’s traveled? This is where wheel odometry comes in.

Wheel odometry is a technique for estimating a robot’s position and orientation based on the rotations of its wheels. By measuring the number of revolutions each wheel makes, we can calculate the distance traveled and any changes in direction. This information is important for tasks like path planning, obstacle avoidance, and overall robot control.

This tutorial will guide you through calculating wheel odometry for a differential drive robot. We’ll explore how to convert raw wheel encoder data – the number of revolutions for each wheel – into the robot’s displacement in the x and y directions (relative to a starting point) and the change in its orientation angle. Buckle up and get ready to dive into the fascinating world of robot self-localization!

Prerequisites

Calculate Wheel Displacements

First, we calculate the distance each wheel has traveled based on the number of revolutions since the last time step. This requires knowing:

1-calculate-wheel-displacement-distance

Calculate the Robot’s Average Displacement and Orientation Change

Next, we determine the robot’s average displacement and the change in its orientation. The average displacement (Davg) is the mean of the distances traveled by both wheels:

2-robot-displacement

The change in orientation (Δθ), measured in radians, is influenced by the difference in wheel displacements and the distance between the wheels (L):

3-change-in-orientation-robot

Calculate Changes in the Global Position

Now, we can calculate the robot’s movement in the global reference frame. Assuming the robot’s initial orientation is θ, and using Davg and Δθ, we find the changes in the x and y positions as follows:

4b-change-in-global-position

You will often see the following equation instead:

4-change-in-global-position

This simplification assumes Δθ is relatively small, allowing us to approximate the displacement direction using the final orientation θnew without significant loss of accuracy. It’s a useful approximation for small time steps or when precise integration of orientation over displacement is not critical.

To find the robot’s new orientation:

5-robot-new-orientation

Note: for robotics projects, it is common to normalize this angle so that it is always between -pi and +pi. Here is what that code would look like:

import math

# Keep angle between -PI and PI
if self.new_yaw_angle > math.pi:
    self.new_yaw_angle = self.new_yaw_angle - (2 * math.pi)
if self.new_yaw_angle < -math.pi:
    self.new_yaw_angle = self.new_yaw_angle + (2 * math.pi)

For ROS 2, you would then convert this new yaw angle into a quaternion.
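For a yaw-only rotation, that conversion is simple: roll and pitch are zero, so only the z and w components of the quaternion are non-trivial. Here is a sketch (in ROS 2 you would copy these values into a geometry_msgs/msg/Quaternion message):

```python
import math

def yaw_to_quaternion(yaw):
    """Convert a yaw angle in radians to a quaternion (x, y, z, w).

    For a rotation purely about the z-axis, roll and pitch are zero,
    so the x and y components are always zero.
    """
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))

print(yaw_to_quaternion(0.0))      # identity rotation: (0, 0, 0, 1)
print(yaw_to_quaternion(math.pi))  # half turn about z: approximately (0, 0, 1, 0)
```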

Update the Robot’s Global Position

Finally, we update the robot’s position in the global reference frame. If the robot’s previous position was (x, y), the new position (xnew, ynew) is given by:

6-update-robot-global-position
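Putting the four steps together, here is a Python sketch of the full update. It uses the simplified θnew form discussed above, assumes no wheel slippage, and follows one common sign convention (a faster right wheel turns the robot counterclockwise); the wheel parameters are example values, not measurements from your robot:

```python
import math

# Example parameters (substitute your own robot's measurements)
WHEEL_RADIUS = 0.0325      # meters
WHEEL_SEPARATION = 0.169   # meters, the distance L between left and right wheels

def update_odometry(x, y, theta, revs_left, revs_right):
    """Update the robot pose from wheel revolutions since the last time step."""
    # Step 1: distance each wheel has rolled
    d_left = 2.0 * math.pi * WHEEL_RADIUS * revs_left
    d_right = 2.0 * math.pi * WHEEL_RADIUS * revs_right

    # Step 2: average displacement and orientation change
    d_avg = (d_right + d_left) / 2.0
    delta_theta = (d_right - d_left) / WHEEL_SEPARATION

    # New orientation, kept between -PI and PI
    theta_new = theta + delta_theta
    if theta_new > math.pi:
        theta_new -= 2.0 * math.pi
    if theta_new < -math.pi:
        theta_new += 2.0 * math.pi

    # Steps 3 and 4: changes in global position (simplified theta_new form),
    # added to the previous position
    x_new = x + d_avg * math.cos(theta_new)
    y_new = y + d_avg * math.sin(theta_new)
    return x_new, y_new, theta_new

# Driving straight: one full revolution of both wheels
print(update_odometry(0.0, 0.0, 0.0, 1.0, 1.0))
```

Driving straight advances the robot by one wheel circumference with no change in heading, while opposite wheel rotations turn it in place without changing its position.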

Practical Example

Let’s apply these steps with sample values:

7-define-values

Using these values, we’ll calculate the robot’s new position and orientation.

Step 1: Wheel Displacements

8-step-1

Step 2: Average Displacement and Orientation Change

9-step-2

Step 3: Changes in Global Position

10-step-3

Step 4: New Global Position and Orientation

Therefore, the new orientation of the robot is 2.52 radians, and the robot is currently located at (x = -4.34, y = 4.80).

11-step-4

Important Note on Assumptions

The calculations for wheel odometry as demonstrated in the example above are made under two crucial assumptions:

  1. No Wheel Slippage: It’s assumed that the wheels of the robot do not slip during movement. Wheel slippage can occur due to loss of traction, often caused by slick surfaces or rapid acceleration/deceleration. When slippage occurs, the actual distance traveled by the robot may differ from the calculated values, as the wheel rotations do not accurately reflect movement over the ground.
  2. Adequate Friction: The calculations also assume that there is adequate friction between the wheels and the surface on which the robot is moving. Adequate friction is necessary for the wheels to grip the surface effectively, allowing for precise control over the robot’s movement. Insufficient friction can lead to wheel slippage, which, as mentioned, would result in inaccuracies in the odometry data.

These assumptions are essential for the accuracy of wheel odometry calculations. In real-world scenarios, various factors such as floor material, wheel material, and robot speed can affect these conditions.

Therefore, while the mathematical model provides a foundational understanding of how to calculate a robot’s position and orientation based on wheel rotations, you should be aware of these limitations and consider implementing corrective measures or additional sensors to account for potential discrepancies in odometry data due to slippage or inadequate friction.