Calculating Wheel Velocities for a Differential Drive Robot

Have you ever wondered how those robots navigate around restaurants or hospitals, delivering food or fetching supplies? These little marvels of engineering often rely on a simple yet powerful concept: the differential drive robot.

A differential drive robot uses two independently powered wheels and often one or more passive caster wheels for stability. By controlling the speed and direction of each wheel, the robot can move forward or backward, turn, or even spin in place.

[Figure: a differential drive robot base with two powered wheels and a passive caster wheel]

This tutorial will delve into the behind-the-scenes calculations that translate desired robot motion (linear and angular velocities) into individual wheel speeds (rotational velocities in revolutions per second).

Real-World Applications

Differential drive robots, with their two independently controlled wheels, are widely used for their maneuverability and simplicity. Here are some real-world applications:

Delivery and Warehousing:

  • Inventory management:  Many warehouse robots use differential drive designs. They can navigate narrow aisles, efficiently locate and transport goods, and perform stock checks.  
  • Last-mile delivery:  Delivery robots on sidewalks and in controlled environments often employ differential drive. They can navigate crowded areas and deliver packages autonomously. 

Domestic and Public Use:

  • Floor cleaning robots:  These robots navigate homes and offices using differential drive. They can efficiently clean floors while avoiding obstacles.
  • Disinfection robots:  In hospitals and public areas, differential drive robots can be used for disinfecting surfaces with UV light or other methods.
  • Security robots:  These robots patrol buildings or outdoor spaces, using differential drive for maneuverability during surveillance.

Specialized Applications:

  • Agricultural robots:  Differential drive robots can be used in fields for tasks like planting seeds or collecting data on crops.
  • Military robots:  Small reconnaissance robots often use differential drive for navigating rough terrain.
  • Entertainment robots:  Robots designed for entertainment or education may utilize differential drive for movement and interaction.
  • Restaurant robots: Robots that deliver food from the kitchen to the dining room.

Why Differential Drive for these Applications?

Differential drive robots offer several advantages for these tasks:

  • Simple design:  Their two-wheeled platform makes them relatively easy and inexpensive to build.
  • Maneuverability:  By controlling the speed of each wheel independently, they can turn sharply and navigate complex environments.
  • Compact size:  Their design allows them to operate in tight spaces.

Convert Commanded Velocities to Wheel Linear Velocities

The robot’s motion is defined by two control inputs: the linear velocity (V) and the angular velocity (ω) of the robot in its reference frame.

[Figure: the robot in its reference frame, showing the linear velocity V, the angular velocity ω, and the wheelbase width L]

The linear velocity is the speed at which you want the robot to move forward or backward, while the angular velocity defines how fast you want the robot to turn clockwise or counterclockwise.

The linear velocities of the right (Vr) and the left (Vl) wheels can be calculated using the robot’s wheelbase width (L), which is the distance between the centers of the two wheels:

Vr = V + (ω × L) / 2
Vl = V − (ω × L) / 2

With the standard convention that a positive ω means a counterclockwise turn, the right wheel must travel faster than the left, which is exactly what these equations give.

Convert Linear Velocity to Rotational Velocity

Once we have the linear velocities of the wheels, we need to convert them into rotational velocities. The rotational velocity of a wheel (ω_wheel) in radians per second is its linear velocity divided by the wheel radius (r):

ω_wheel = V_wheel / r

Thus, the rotational velocities of the right and left wheels are:

ωr = Vr / r
ωl = Vl / r

Convert Radians per Second to Revolutions per Second

The final step is to convert the rotational velocities from radians per second to revolutions per second (rev/s) for practical use. Since there are 2π radians in a full revolution:

nr = ωr / (2π)
nl = ωl / (2π)

where nr and nl are the wheel speeds in revolutions per second.

Example

Let’s calculate the wheel velocities for a robot with a wheel radius of 0.05 m, a wheelbase width of 0.30 m, a commanded linear velocity of 1.0 m/s, and a commanded angular velocity of 0.5 rad/s.

Linear velocities of the wheels:

Vr = 1.0 + (0.5 × 0.30) / 2 = 1.075 m/s
Vl = 1.0 − (0.5 × 0.30) / 2 = 0.925 m/s

Rotational velocities in radians per second:

ωr = 1.075 / 0.05 = 21.5 rad/s
ωl = 0.925 / 0.05 = 18.5 rad/s

Rotational velocities in revolutions per second:

nr = 21.5 / (2π) ≈ 3.42 rev/s
nl = 18.5 / (2π) ≈ 2.94 rev/s
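
To make these steps concrete, here is a minimal Python sketch of the full pipeline. The function name and structure are my own illustration rather than part of any particular robotics library:

import math

def commanded_to_wheel_speeds(v, omega, wheel_radius, wheelbase):
    """Convert commanded robot velocities into wheel speeds in rev/s.

    v            -- commanded linear velocity V (m/s)
    omega        -- commanded angular velocity ω (rad/s)
    wheel_radius -- wheel radius r (m)
    wheelbase    -- wheelbase width L, the distance between wheel centers (m)
    """
    # Step 1: linear velocity of each wheel (m/s)
    v_right = v + (omega * wheelbase) / 2.0
    v_left = v - (omega * wheelbase) / 2.0

    # Step 2: rotational velocity of each wheel (rad/s)
    w_right = v_right / wheel_radius
    w_left = v_left / wheel_radius

    # Step 3: radians per second -> revolutions per second
    return w_right / (2.0 * math.pi), w_left / (2.0 * math.pi)

# The example above: r = 0.05 m, L = 0.30 m, V = 1.0 m/s, ω = 0.5 rad/s
n_right, n_left = commanded_to_wheel_speeds(1.0, 0.5, 0.05, 0.30)
print(f"Right wheel: {n_right:.2f} rev/s")  # ≈ 3.42 rev/s
print(f"Left wheel:  {n_left:.2f} rev/s")   # ≈ 2.94 rev/s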

Sensor Fusion Using the Robot Localization Package – ROS 2

In this tutorial, I will show you how to set up the robot_localization ROS 2 package on a simulated mobile robot. We will use the robot_localization package to fuse odometry data from the /wheel/odometry topic with IMU data from the /imu/data topic to provide locally accurate, smooth odometry estimates. Wheels can slip, so using the robot_localization package can help correct for this.

This tutorial is the third tutorial in my Ultimate Guide to the ROS 2 Navigation Stack (also known as Nav2).

You can get the entire code for this project here.

If you are using ROS Galactic or newer, you can get the code here.

Let’s get started!

Prerequisites 

You have completed the first two tutorials of this series:

  1. How to Create a Simulated Mobile Robot in ROS 2 Using URDF
  2. Set Up the Odometry for a Simulated Mobile Robot in ROS 2

About the Robot Localization Package

We will configure the robot_localization package to use an Extended Kalman Filter (ekf_node) to fuse the data from sensor inputs. These sensor inputs come from the IMU Gazebo plugin and the differential drive Gazebo plugin that are defined in our SDF file.

In a real robotics project, instead of simulated IMU and odometry data, we would use data from an IMU sensor like the BNO055 and wheel encoders, respectively.

The ekf_node will subscribe to the following topics (ROS message types are in parentheses):

  • /wheel/odometry: Position and velocity estimate based on the information from the differential drive Gazebo plugin (in a real robotics project, this would be information drawn from wheel encoder tick counts). The orientation is in quaternion format. (nav_msgs/Odometry)
  • /imu/data: Data from the Inertial Measurement Unit (IMU) sensor Gazebo plugin (sensor_msgs/Imu)

This node will publish data to the following topics:

  • /odometry/filtered: The smoothed odometry information (nav_msgs/Odometry)
  • /tf: Coordinate transform from the odom frame (parent) to the base_footprint frame (child). To learn about coordinate frames in ROS, check out this post.
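
To give you a feel for how a downstream node consumes the fused estimate, here is a minimal rclpy sketch (my own illustration, not part of this tutorial's code) that subscribes to /odometry/filtered and logs the robot's position:

import rclpy
from rclpy.node import Node
from nav_msgs.msg import Odometry

class FilteredOdomListener(Node):
    def __init__(self):
        super().__init__('filtered_odom_listener')
        # Listen to the smoothed odometry published by the ekf_node
        self.create_subscription(Odometry, '/odometry/filtered',
                                 self.callback, 10)

    def callback(self, msg):
        position = msg.pose.pose.position
        self.get_logger().info(f'x: {position.x:.2f} m, y: {position.y:.2f} m')

def main():
    rclpy.init()
    rclpy.spin(FilteredOdomListener())
    rclpy.shutdown()

if __name__ == '__main__':
    main()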

Install the Robot Localization Package

Let’s begin by installing the robot_localization package. Open a new terminal window, and type the following command:

sudo apt install ros-foxy-robot-localization

If you are using a newer version of ROS 2 like ROS 2 Humble, type the following:

sudo apt install ros-humble-robot-localization

Instead of the commands above, you can also type the following command directly into the terminal. It will automatically detect your ROS 2 distribution:

sudo apt install ros-$ROS_DISTRO-robot-localization

If you are using ROS 2 Galactic and want to build from source instead of using the off-the-shelf packages above, you will need to download the robot_localization package to your workspace:

cd ~/dev_ws/src
git clone -b fix/galactic/load_parameters https://github.com/nobleo/robot_localization.git

The reason you need that specific branch is that the Navigation Stack might otherwise throw the following exception:

[ekf_node-1] terminate called after throwing an instance of 'rclcpp::ParameterTypeException'
[ekf_node-1]   what():  expected [string] got [not set]

Then build the workspace:

cd ..
colcon build

Set the Configuration Parameters

We now need to specify the configuration parameters of the ekf_node by creating a YAML file.

Open a new terminal window, and type:

colcon_cd basic_mobile_robot
cd config
gedit ekf.yaml

Copy and paste this code inside the YAML file.

Save and close the file.

You can get a complete description of all the parameters on this page. Also, you can check out this link on GitHub for a sample ekf.yaml file.
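
To give you a sense of what goes into this file, here is an abbreviated, illustrative sketch. The frame and topic names match what we set up earlier in this series, but the particular true/false choices in the configuration arrays are an assumption for illustration, not a copy of the linked file. Each 15-element array tells the filter which state variables to fuse from that sensor, in the order x, y, z, roll, pitch, yaw, followed by their velocities and then linear accelerations:

ekf_filter_node:
    ros__parameters:
        frequency: 30.0                  # filter output rate (Hz)
        two_d_mode: false                # set true to constrain motion to a plane
        publish_tf: true                 # broadcast the odom -> base_footprint transform
        map_frame: map
        odom_frame: odom
        base_link_frame: base_footprint
        world_frame: odom
        odom0: wheel/odometry            # wheel odometry input topic
        odom0_config: [true,  true,  true,
                       false, false, false,
                       false, false, false,
                       false, false, true,
                       false, false, false]
        imu0: imu/data                   # IMU input topic
        imu0_config: [false, false, false,
                      true,  true,  true,
                      false, false, false,
                      false, false, true,
                      true,  true,  true]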

Create a Launch File

Now go to your launch folder. Open a new terminal window, and type:

colcon_cd basic_mobile_robot
cd launch
gedit basic_mobile_bot_v3.launch.py

Copy and paste this code into the file.

If you are using ROS 2 Galactic or newer, your code is here.

Save the file, and close it.
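
To give you an idea of what that launch file contains, its heart is a Node action that starts the ekf_node with the ekf.yaml we just created. Here is a sketch of that portion (the full launch file also starts Gazebo, RViz, and the robot state publisher, which I have elided):

import os
from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    pkg_share = get_package_share_directory('basic_mobile_robot')
    ekf_config = os.path.join(pkg_share, 'config', 'ekf.yaml')
    return LaunchDescription([
        Node(
            package='robot_localization',
            executable='ekf_node',
            name='ekf_filter_node',  # the node name we will inspect later
            output='screen',
            # use_sim_time is assumed here because we are running in Gazebo
            parameters=[ekf_config, {'use_sim_time': True}],
        ),
        # ... Gazebo, RViz, and robot_state_publisher actions go here ...
    ])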

Move to the package directory.

colcon_cd basic_mobile_robot

Open the package.xml file.

gedit package.xml

Copy and paste this code into the file.

Save the file, and close it.

Open the CMakeLists.txt file.

gedit CMakeLists.txt

Copy and paste this code into the file.

Save the file, and close it.

Build the Package

Now build the package by opening a terminal window, and typing the following command:

cd ~/dev_ws
colcon build

Launch the Robot

Open a new terminal, and launch the robot.

cd ~/dev_ws/
ros2 launch basic_mobile_robot basic_mobile_bot_v3.launch.py

It might take a while for Gazebo and RViz to load, so be patient.

To see the active topics, open a terminal window, and type:

ros2 topic list

To see more information about the topics, execute:

ros2 topic info /imu/data
ros2 topic info /wheel/odometry

You should see that both topics have 1 publisher and 1 subscriber.

To see the output of the robot localization package (i.e. the Extended Kalman Filter (EKF)), type:

ros2 topic echo /odometry/filtered

I will move my robot in the reverse direction using the rqt_robot_steering GUI. Open a new terminal window, and type:

rqt_robot_steering

If you are using ROS 2 Galactic or newer, you will first need to install the tool:

sudo apt-get install ros-galactic-rqt-robot-steering

The general syntax is:

sudo apt-get install ros-<ros-distribution>-rqt-robot-steering

Then type:

ros2 run rqt_robot_steering rqt_robot_steering --force-discover

Move the sliders to move the robot.

We can see the output of the odom -> base_footprint transform by typing the following command:

ros2 run tf2_ros tf2_echo odom base_footprint
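
You can also look up this transform from inside a node. Here is a minimal rclpy sketch using tf2_ros (again, my own illustration rather than part of this tutorial's code):

import rclpy
from rclpy.node import Node
from tf2_ros import Buffer, TransformListener, TransformException

class OdomToBaseFootprint(Node):
    def __init__(self):
        super().__init__('odom_to_base_footprint')
        self.tf_buffer = Buffer()
        self.tf_listener = TransformListener(self.tf_buffer, self)
        # Check for the latest transform once per second
        self.create_timer(1.0, self.lookup)

    def lookup(self):
        try:
            t = self.tf_buffer.lookup_transform('odom', 'base_footprint',
                                                rclpy.time.Time())
        except TransformException as ex:
            self.get_logger().warn(f'Transform not available yet: {ex}')
            return
        translation = t.transform.translation
        self.get_logger().info(f'x: {translation.x:.2f} m, y: {translation.y:.2f} m')

def main():
    rclpy.init()
    rclpy.spin(OdomToBaseFootprint())
    rclpy.shutdown()

if __name__ == '__main__':
    main()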

Let’s see the active nodes.

ros2 node list

Let’s check out the ekf_node (named ekf_filter_node).

ros2 node info /ekf_filter_node

Let’s check out the ROS node graph.

rqt_graph

Click the blue circular arrow in the upper left to refresh the node graph. Also select “Nodes/Topics (all)”.

To see the coordinate frames, type the following command in a terminal window.

ros2 run tf2_tools view_frames.py

If you are using ROS 2 Galactic or newer, type:

ros2 run tf2_tools view_frames

In the current working directory, you will have a file called frames.pdf. Open that file.

evince frames.pdf

Here is what my coordinate transform (i.e. tf) tree looks like:

[Figure: the tf tree, with the odom frame at the root]

You can see that the parent frame is the odom frame. The odom frame represents the robot’s initial position and orientation. Every other frame in the tree is a descendant of the odom frame.

Later, we will add a map frame. The map frame will be the parent frame of the odom frame.

Finally, in RViz, under Global Options, change Fixed Frame to odom.

Open the steering tool again.

rqt_robot_steering

If you move the robot around using the sliders, you will see the robot move in both RViz and Gazebo.

That’s it!

In the next tutorial, I will show you how to add LIDAR to your robot so that you can map the environment and detect obstacles.

Revolutionizing Healthcare: The Future Impact of Robots

The healthcare industry has witnessed incredible advancements over the years, from groundbreaking medical discoveries to innovative surgical techniques. However, as we take a look into the future, there is one technological revolution that promises to redefine healthcare as we know it: robots.

Robots are going to transform every aspect of healthcare, from diagnostics and surgery to patient care and beyond. In this blog post, we’ll cover the exciting potential of robots in healthcare and the profound impact they are likely to have on the industry.

Robots in Diagnostics

One of the primary areas where robots are already making significant strides is in diagnostics.

Imagine a robot capable of analyzing vast amounts of patient data, including medical history, genetic information, and real-time monitoring data. Such a robot could quickly identify potential health issues, predict disease risks, and recommend personalized treatment plans. With machine learning and artificial intelligence, these robots can continuously improve their diagnostic accuracy, ultimately saving lives by catching diseases at their earliest stages.

Let’s take a look at some companies that are currently involved in this market:

  • Diagnostic Robotics: This company has developed an AI-powered platform that can be used to diagnose a wide range of diseases, including cancer, heart disease, and diabetes. The platform is currently being used by healthcare providers around the world.
  • Paige AI: This company has developed AI-powered software that can be used to analyze pathology images and detect cancer cells. The software is currently being used by pathologists around the world to help them diagnose cancer more accurately and efficiently.

Robotic-assisted Surgery

Robotic-assisted surgery is another promising frontier in healthcare. Robots equipped with advanced surgical instruments and guided by skilled surgeons can perform intricate procedures with unmatched precision and minimal invasiveness. These practices will reduce patient trauma, speed up recovery times, and lower the risk of complications.

Also, robots can bridge geographical gaps by enabling remote surgery, allowing a surgeon in one location to operate on a patient in another, potentially saving lives in emergency situations and improving access to specialized care in remote areas.

Here are some of the leading companies involved in robotic-assisted surgery:

  • Intuitive Surgical: Intuitive Surgical is the dominant player in the robotic-assisted surgery market, with its da Vinci surgical system being the most widely used robotic surgical system in the world.
  • Medtronic: Medtronic recently entered the robotic-assisted surgery market with its Hugo robotic surgery system.
  • Johnson & Johnson: Johnson & Johnson offers a variety of robotic surgery systems through its Johnson & Johnson Medical Devices division, including the Monarch platform for bronchoscopy, and the Velys platform for orthopedic surgery.
  • Stryker: Stryker offers the Mako robotic-assisted surgery system for orthopedic surgery.
  • Brainlab: Brainlab offers the Curve robotic-assisted surgery system for spinal and neurosurgery.
  • CMR Surgical: CMR Surgical offers the Versius robotic-assisted surgery system for a variety of procedures, including general surgery, gynecologic surgery, and urologic surgery.

Enhanced Rehabilitation

Rehabilitation is a cornerstone of healthcare, especially for patients recovering from injuries, surgeries, or chronic conditions. Robots are playing an increasingly important role in this area, providing consistent, personalized, and intensive therapy.

For example, robotic exoskeletons help patients regain mobility, while robotic arms assist those with limited dexterity. These devices not only enhance the quality of life for patients but also alleviate the strain on healthcare professionals by automating routine tasks, allowing them to focus on more complex aspects of care.

One of the first companies that comes to mind is ReWalk Robotics. ReWalk Robotics develops and manufactures robotic exoskeletons for people with spinal cord injuries.

ReWalk’s most well-known product is the ReWalk exoskeleton, which allows people with paraplegia to walk again. ReWalk Robotics also produces the ReStore exoskeleton, which is used to help people with lower limb disability regain their mobility.

Another company is Cyberdyne. Cyberdyne is a Japanese company that develops and manufactures robotic exoskeletons for medical and industrial use. Its most well-known product is the HAL (Hybrid Assistive Limb) exoskeleton, which is used to help people with disabilities walk, stand, and climb stairs. The HAL exoskeleton is also used in industrial settings to help workers lift heavy objects and perform other physically demanding tasks.

Robots in Patient Care

The future of healthcare also includes robots directly interacting with patients. Social robots equipped with natural language processing capabilities can provide companionship and emotional support to patients, particularly those in long-term care facilities or dealing with mental health issues. These robots can engage in conversations, give reminders when patients need to take medication, and monitor vital signs, ensuring patients receive the attention and care they need.

The company I work for, Xtend AI, is doing just that.

Pharmaceutical Advancements

In the pharmaceutical industry, robots are revolutionizing drug discovery and development. Robotic systems can automate high-throughput screening of compounds, drastically accelerating the process of identifying potential drug candidates.

Additionally, robots can handle complex chemical reactions with precision, leading to the creation of more effective and safe medications. This not only reduces the time and cost of drug development but also opens doors to personalized medicine, where treatments are tailored to individual patients based on their genetic makeup and health data.

One robot that already has a head start in this area is NiCoLA-B. This robot is used at the U.K. Center for Lead Discovery to test more than 300,000 compounds a day in search of promising drug candidates. It uses sound waves to move droplets of potential drugs into miniature wells on assay plates, where they are tested for activity.

Logistics and Supply Chain Management

Efficient logistics and supply chain management are critical in healthcare, ensuring that medications, medical equipment, and supplies are readily available when needed. Robots are already playing a pivotal role in this aspect by automating inventory management, drug dispensing, and even transportation within healthcare facilities. This not only reduces human errors but also optimizes resource allocation and minimizes wastage, ultimately leading to cost savings and improved patient care.

Challenges and Ethical Considerations

While the future of healthcare with robots is incredibly promising, it is not without its challenges and ethical considerations. Privacy concerns, data security, and the potential for bias in AI algorithms must be addressed. I predict there will always be a need for human oversight and expertise to ensure robots operate safely and effectively.

The Best is Yet to Come

The future impact of robots on healthcare is nothing short of transformative. From diagnostics and surgery to patient care and pharmaceutical advancements, robots are set to revolutionize every facet of the healthcare industry. As technology continues to advance, we can look forward to a healthcare system that is more efficient, accessible, and personalized, ultimately leading to better patient outcomes and an improved quality of life for all.

However, it is important we navigate this transformation with careful consideration of the ethical and privacy implications, ensuring that the benefits of healthcare robotics are realized while minimizing potential risks. The future of healthcare is indeed exciting, and robots will undoubtedly be at the forefront of this revolution.

Keep building!