Create a Launch File for a Simulated Robotic Arm – ROS 2

In this tutorial, I will guide you through the process of creating a launch file that displays a simulated robotic arm, described by a URDF (Xacro) file, in RViz.

Launch files in ROS 2 are powerful tools that allow you to start multiple nodes and set parameters with a single command, simplifying the process of managing your robot’s complex systems.

Prerequisites

All my code for this project is located on GitHub.

Directions

Open a terminal window.

Move to your robotic arm directory (e.g. mycobot_ros2).

cd ~/ros2_ws/src/mycobot_ros2/mycobot_description/
mkdir launch
cd launch
gedit mycobot_280_arduino_view_description.launch.py

Add this code.

# Author: Addison Sears-Collins
# Date: March 26, 2024
# Description: Display the robotic arm with RViz

from launch import LaunchDescription
from launch.actions import DeclareLaunchArgument
from launch.conditions import IfCondition, UnlessCondition
from launch.substitutions import Command, LaunchConfiguration, PathJoinSubstitution
from launch_ros.actions import Node
from launch_ros.parameter_descriptions import ParameterValue
from launch_ros.substitutions import FindPackageShare

def generate_launch_description():

    # Define filenames    
    urdf_package = 'mycobot_description'
    urdf_filename = 'mycobot_280_urdf.xacro'
    rviz_config_filename = 'mycobot_280_arduino_view_description.rviz'

    # Set paths to important files
    pkg_share_description = FindPackageShare(urdf_package)
    default_urdf_model_path = PathJoinSubstitution([pkg_share_description, 'urdf', urdf_filename])
    default_rviz_config_path = PathJoinSubstitution([pkg_share_description, 'rviz', rviz_config_filename])

    # Launch configuration variables specific to simulation
    jsp_gui = LaunchConfiguration('jsp_gui')
    rviz_config_file = LaunchConfiguration('rviz_config_file')
    urdf_model = LaunchConfiguration('urdf_model')
    use_rviz = LaunchConfiguration('use_rviz')
    use_sim_time = LaunchConfiguration('use_sim_time')

    # Declare the launch arguments 
    declare_jsp_gui_cmd = DeclareLaunchArgument(
        name='jsp_gui', 
        default_value='true', 
        choices=['true', 'false'],
        description='Flag to enable joint_state_publisher_gui')
    
    declare_rviz_config_file_cmd = DeclareLaunchArgument(
        name='rviz_config_file',
        default_value=default_rviz_config_path,
        description='Full path to the RVIZ config file to use')

    declare_urdf_model_path_cmd = DeclareLaunchArgument(
        name='urdf_model', 
        default_value=default_urdf_model_path, 
        description='Absolute path to robot urdf file')

    declare_use_rviz_cmd = DeclareLaunchArgument(
        name='use_rviz',
        default_value='true',
        description='Whether to start RVIZ')

    declare_use_sim_time_cmd = DeclareLaunchArgument(
        name='use_sim_time',
        default_value='false',
        description='Use simulation (Gazebo) clock if true')
    
    # Specify the actions

    # Publish the joint state values for the non-fixed joints in the URDF file.
    start_joint_state_publisher_cmd = Node(
        package='joint_state_publisher',
        executable='joint_state_publisher',
        name='joint_state_publisher',
        condition=UnlessCondition(jsp_gui))

    # Depending on gui parameter, either launch joint_state_publisher or joint_state_publisher_gui
    start_joint_state_publisher_gui_cmd = Node(
        package='joint_state_publisher_gui',
        executable='joint_state_publisher_gui',
        name='joint_state_publisher_gui',
        condition=IfCondition(jsp_gui))

    # Subscribe to the joint states of the robot, and publish the 3D pose of each link.
    robot_description_content = ParameterValue(Command(['xacro ', urdf_model]), value_type=str)
    start_robot_state_publisher_cmd = Node(
        package='robot_state_publisher',
        executable='robot_state_publisher',
        name='robot_state_publisher',
        output='screen',
        parameters=[{
            'use_sim_time': use_sim_time, 
            'robot_description': robot_description_content}])
      
    # Launch RViz
    start_rviz_cmd = Node(
        condition=IfCondition(use_rviz),
        package='rviz2',
        executable='rviz2',
        name='rviz2',
        output='screen',
        arguments=['-d', rviz_config_file],
        parameters=[{
        'use_sim_time': use_sim_time}])
  
    # Create the launch description and populate
    ld = LaunchDescription()

    # Declare the launch options
    ld.add_action(declare_jsp_gui_cmd)
    ld.add_action(declare_rviz_config_file_cmd)
    ld.add_action(declare_urdf_model_path_cmd)
    ld.add_action(declare_use_rviz_cmd)
    ld.add_action(declare_use_sim_time_cmd)

    # Add any actions
    ld.add_action(start_joint_state_publisher_cmd)
    ld.add_action(start_joint_state_publisher_gui_cmd)
    ld.add_action(start_robot_state_publisher_cmd)
    ld.add_action(start_rviz_cmd)

    return ld

Now add the RViz configuration file.

cd ..
mkdir rviz
cd rviz
gedit mycobot_280_arduino_view_description.rviz 

Add this code.

Panels:
  - Class: rviz_common/Displays
    Name: Displays
  - Class: rviz_common/Views
    Name: Views
Visualization Manager:
  Displays:
    - Class: rviz_default_plugins/Grid
      Name: Grid
      Value: true
    - Alpha: 0.8
      Class: rviz_default_plugins/RobotModel
      Description Topic:
        Value: /robot_description
      Name: RobotModel
      Value: true
    - Class: rviz_default_plugins/TF
      Name: TF
      Value: true
  Global Options:
    Fixed Frame: base_link
  Tools:
    - Class: rviz_default_plugins/MoveCamera
  Value: true
  Views:
    Current:
      Class: rviz_default_plugins/Orbit
      Distance: 1.7
      Name: Current View
      Pitch: 0.33
      Value: Orbit (rviz)
      Yaw: 5.5
Window Geometry:
  Height: 800
  Width: 1200

Edit CMakeLists.txt.

cd ..
gedit CMakeLists.txt

Add this code.

cmake_minimum_required(VERSION 3.8)
project(mycobot_description)

# Check if the compiler being used is GNU's C++ compiler (g++) or Clang.
# Add compiler flags for all targets that will be defined later in the 
# CMakeLists file. These flags enable extra warnings to help catch
# potential issues in the code.
# Add options to the compilation process
if(CMAKE_COMPILER_IS_GNUCXX OR CMAKE_CXX_COMPILER_ID MATCHES "Clang")
  add_compile_options(-Wall -Wextra -Wpedantic)
endif()

# Locate and configure packages required by the project.
find_package(ament_cmake REQUIRED)

# Copy necessary files to designated locations in the project
install (
  DIRECTORY launch meshes rviz urdf
  DESTINATION share/${PROJECT_NAME}
)

# Automates the process of setting up linting for the package, which
# is the process of running tools that analyze the code for potential
# errors, style issues, and other discrepancies that do not adhere to
# specified coding standards or best practices.
if(BUILD_TESTING)
  find_package(ament_lint_auto REQUIRED)
  # the following line skips the linter which checks for copyrights
  # comment the line when a copyright and license is added to all source files
  set(ament_cmake_copyright_FOUND TRUE)
  # the following line skips cpplint (only works in a git repo)
  # comment the line when this package is in a git repo and when
  # a copyright and license is added to all source files
  set(ament_cmake_cpplint_FOUND TRUE)
  ament_lint_auto_find_test_dependencies()
endif()

ament_package()

Build your workspace.

cd ~/ros2_ws/
colcon build
source ~/.bashrc

Now launch your launch file:

ros2 launch mycobot_description mycobot_280_arduino_view_description.launch.py
(Screenshot: the robotic arm displayed in RViz, with the joint state publisher GUI for moving the joints.)
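
You can override any of the declared launch arguments from the command line using the standard name:=value syntax. For example, to start without the joint state publisher GUI:

ros2 launch mycobot_description mycobot_280_arduino_view_description.launch.py jsp_gui:=false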

Calculating Wheel Odometry for a Differential Drive Robot

Have you ever wondered how robots navigate their environments so precisely? A key component for many mobile robots is differential drive. This type of robot uses two independently controlled wheels, allowing for maneuvers like moving forward, backward, and turning. 

But how does the robot itself know where it is and how far it’s traveled? This is where wheel odometry comes in.

Wheel odometry is a technique for estimating a robot’s position and orientation based on the rotations of its wheels. By measuring the number of revolutions each wheel makes, we can calculate the distance traveled and any changes in direction. This information is important for tasks like path planning, obstacle avoidance, and overall robot control.

This tutorial will guide you through calculating wheel odometry for a differential drive robot. We’ll explore how to convert raw wheel encoder data – the number of revolutions for each wheel – into the robot’s displacement in the x and y directions (relative to a starting point) and the change in its orientation angle. Buckle up and get ready to dive into the fascinating world of robot self-localization!

Prerequisites

Calculate Wheel Displacements

First, we calculate the distance each wheel has traveled based on the number of revolutions since the last time step. This requires knowing the wheel radius (r) and the number of revolutions each wheel has made (Nleft and Nright) since the last measurement:

Dleft = 2πr × Nleft
Dright = 2πr × Nright

Calculate the Robot’s Average Displacement and Orientation Change

Next, we determine the robot’s average displacement and the change in its orientation. The average displacement (Davg) is the mean of the distances traveled by both wheels:

Davg = (Dright + Dleft) / 2

The change in orientation (Δθ), measured in radians, is the difference between the wheel displacements divided by the distance between the wheels (L):

Δθ = (Dright - Dleft) / L

Calculate Changes in the Global Position

Now, we can calculate the robot’s movement in the global reference frame. Assuming the robot’s initial orientation is θ, and using Davg and Δθ, we find the changes in the x and y positions as follows:

Δx = Davg × cos(θ + Δθ/2)
Δy = Davg × sin(θ + Δθ/2)

You will often see the following equation instead:

Δx = Davg × cos(θnew)
Δy = Davg × sin(θnew)

This simplification assumes Δθ is relatively small, allowing us to approximate the displacement direction using the final orientation θnew without significant loss of accuracy. It’s a useful approximation for small time steps or when precise integration of orientation over displacement is not critical.

To find the robot’s new orientation:

θnew = θ + Δθ

Note that in robotics projects, it is common to wrap this angle so that it always stays between -π and +π. Here is what that code would look like (PI here is math.pi):

# Keep angle between -PI and PI
if self.new_yaw_angle > PI:
    self.new_yaw_angle -= 2 * PI
if self.new_yaw_angle < -PI:
    self.new_yaw_angle += 2 * PI

For ROS 2, you would then convert this new yaw angle into a quaternion.
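
Here is a minimal sketch of that conversion, assuming a pure yaw rotation (zero roll and pitch) and using the standard yaw-to-quaternion formula. The geometry_msgs Quaternion message is shown only as one common place to store the result; the function name is my own.

import math
from geometry_msgs.msg import Quaternion

def yaw_to_quaternion(yaw):
    """Convert a yaw angle in radians to a quaternion (roll = pitch = 0)."""
    q = Quaternion()
    q.x = 0.0
    q.y = 0.0
    q.z = math.sin(yaw / 2.0)
    q.w = math.cos(yaw / 2.0)
    return q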

Update the Robot’s Global Position

Finally, we update the robot’s position in the global reference frame. If the robot’s previous position was (x, y), the new position (xnew, ynew) is given by:

xnew = x + Δx
ynew = y + Δy
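
To tie these steps together, here is a minimal Python sketch of the full pose update. The function and variable names are my own and are only meant to illustrate the math above; it uses the simplified form that approximates the displacement direction with θnew.

import math

def update_odometry(x, y, theta, revs_left, revs_right, wheel_radius, wheelbase):
    """Update the robot pose (x, y, theta) from wheel revolutions since the last time step."""
    # Step 1: wheel displacements from the number of revolutions
    d_left = 2.0 * math.pi * wheel_radius * revs_left
    d_right = 2.0 * math.pi * wheel_radius * revs_right

    # Step 2: average displacement and change in orientation
    d_avg = (d_right + d_left) / 2.0
    delta_theta = (d_right - d_left) / wheelbase

    # New orientation, wrapped to the range -pi to +pi
    # (atan2 of sin and cos is a compact equivalent of the if-statements above)
    theta_new = math.atan2(math.sin(theta + delta_theta), math.cos(theta + delta_theta))

    # Step 3: changes in the global position (small-angle approximation)
    delta_x = d_avg * math.cos(theta_new)
    delta_y = d_avg * math.sin(theta_new)

    # Step 4: new global position
    return x + delta_x, y + delta_y, theta_new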

Practical Example

Let’s apply these steps with sample values:

(Figure: the sample values used in this example.)

Using these values, we’ll calculate the robot’s new position and orientation.

Step 1: Wheel Displacements

(Figure: Step 1 worked calculation.)

Step 2: Average Displacement and Orientation Change

(Figure: Step 2 worked calculation.)

Step 3: Changes in Global Position

(Figure: Step 3 worked calculation.)

Step 4: New Global Position and Orientation

Therefore, the new orientation of the robot is 2.52 radians, and the robot is now located at (x = -4.34, y = 4.80).

(Figure: Step 4 worked calculation.)

Important Note on Assumptions

The calculations for wheel odometry as demonstrated in the example above are made under two crucial assumptions:

  1. No Wheel Slippage: It’s assumed that the wheels of the robot do not slip during movement. Wheel slippage can occur due to loss of traction, often caused by slick surfaces or rapid acceleration/deceleration. When slippage occurs, the actual distance traveled by the robot may differ from the calculated values, as the wheel rotations do not accurately reflect movement over the ground.
  2. Adequate Friction: The calculations also assume that there is adequate friction between the wheels and the surface on which the robot is moving. Adequate friction is necessary for the wheels to grip the surface effectively, allowing for precise control over the robot’s movement. Insufficient friction can lead to wheel slippage, which, as mentioned, would result in inaccuracies in the odometry data.

These assumptions are essential for the accuracy of wheel odometry calculations. In real-world scenarios, various factors such as floor material, wheel material, and robot speed can affect these conditions.

Therefore, while the mathematical model provides a foundational understanding of how to calculate a robot’s position and orientation based on wheel rotations, you should be aware of these limitations and consider implementing corrective measures or additional sensors to account for potential discrepancies in odometry data due to slippage or inadequate friction.

Calculating Wheel Velocities for a Differential Drive Robot

Have you ever wondered how those robots navigate around restaurants or hospitals, delivering food or fetching supplies? These little marvels of engineering often rely on a simple yet powerful concept: the differential drive robot.

A differential drive robot uses two independently powered wheels and often has one or more passive caster wheels. By controlling the speed and direction of each wheel, the robot can move forward, backward, turn, or even spin in place. 

(Figure: a differential drive robot base with two independently powered wheels and a passive caster wheel.)

This tutorial will delve into the behind-the-scenes calculations that translate desired robot motion (linear and angular velocities) into individual wheel speeds (rotational velocities in revolutions per second).

Real-World Applications

Differential drive robots, with their two independently controlled wheels, are widely used for their maneuverability and simplicity. Here are some real-world applications:

Delivery and Warehousing:

  • Inventory management:  Many warehouse robots use differential drive designs. They can navigate narrow aisles, efficiently locate and transport goods, and perform stock checks.  
  • Last-mile delivery:  Delivery robots on sidewalks and in controlled environments often employ differential drive. They can navigate crowded areas and deliver packages autonomously. 

Domestic and Public Use:

  • Floor cleaning robots:  These robots navigate homes and offices using differential drive. They can efficiently clean floors while avoiding obstacles.
  • Disinfection robots:  In hospitals and public areas, differential drive robots can be used for disinfecting surfaces with UV light or other methods.
  • Security robots:  These robots patrol buildings or outdoor spaces, using differential drive for maneuverability during surveillance.

Specialized Applications:

  • Agricultural robots:  Differential drive robots can be used in fields for tasks like planting seeds or collecting data on crops.
  • Military robots:  Small reconnaissance robots often use differential drive for navigating rough terrain.
  • Entertainment robots:  Robots designed for entertainment or education may utilize differential drive for movement and interaction.
  • Restaurant robots: These robots deliver food from the kitchen to the dining room.

Why Differential Drive for these Applications?

Differential drive robots offer several advantages for these tasks:

  • Simple design:  Their two-wheeled platform makes them relatively easy and inexpensive to build.
  • Maneuverability:  By controlling the speed of each wheel independently, they can turn sharply and navigate complex environments.
  • Compact size:  Their design allows them to operate in tight spaces.

Prerequisites

Convert Commanded Velocities to Wheel Linear Velocities

The robot’s motion is defined by two control inputs: the linear velocity (V) and the angular velocity (ω) of the robot in its reference frame.

(Figure: a differential drive robot shown in its own reference frame, with the linear velocity V, the angular velocity ω, and the wheelbase L labeled.)

The linear velocity is the speed at which you want the robot to move forward or backward, while the angular velocity defines how fast you want the robot to turn (by convention, positive values are counterclockwise and negative values are clockwise).

The linear velocities of the right (Vr) and the left (Vl) wheels can be calculated using the robot’s wheelbase width (L), which is the distance between the centers of the two wheels:

Vr = V + (ω × L) / 2
Vl = V - (ω × L) / 2

Convert Linear Velocity to Rotational Velocity

Once we have the linear velocities of the wheels, we need to convert them into rotational velocities. The rotational velocity of a wheel (ω), in radians per second, is its linear velocity (v) divided by the wheel radius (r):

ω = v / r

Thus, the rotational velocities of the right and left wheels are:

ωr = Vr / r
ωl = Vl / r

Convert Radians per Second to Revolutions per Second

The final step is to convert the rotational velocities from radians per second to revolutions per second (rev/s) for practical use. Since there are 2π radians in a full revolution, the wheel speeds in rev/s (nr and nl) are:

nr = ωr / (2π)
nl = ωl / (2π)

Example

Let’s calculate the wheel velocities for a robot with a wheel radius of 0.05 m, a wheelbase width of 0.30 m, a commanded linear velocity of 1.0 m/s, and an angular velocity of 0.5 rad/s.

Vr = V + (ω × L) / 2 = 1.0 + (0.5 × 0.30) / 2 = 1.075 m/s
Vl = V - (ω × L) / 2 = 1.0 - (0.5 × 0.30) / 2 = 0.925 m/s

ωr = Vr / r = 1.075 / 0.05 = 21.5 rad/s
ωl = Vl / r = 0.925 / 0.05 = 18.5 rad/s

nr = ωr / (2π) ≈ 3.42 rev/s
nl = ωl / (2π) ≈ 2.94 rev/s
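
If you would like to check these numbers in code, here is a small Python sketch of the same formulas. The function and parameter names are my own, chosen for illustration:

import math

def wheel_speeds(v, omega, wheel_radius, wheelbase):
    """Convert commanded linear (m/s) and angular (rad/s) velocity to wheel speeds in rev/s."""
    # Linear velocity of each wheel (m/s)
    v_right = v + (omega * wheelbase) / 2.0
    v_left = v - (omega * wheelbase) / 2.0

    # Rotational velocity of each wheel (rad/s)
    omega_right = v_right / wheel_radius
    omega_left = v_left / wheel_radius

    # Convert radians per second to revolutions per second
    return omega_right / (2.0 * math.pi), omega_left / (2.0 * math.pi)

print(wheel_speeds(1.0, 0.5, 0.05, 0.30))  # roughly (3.42, 2.94) rev/s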