Calculating Wheel Odometry for a Differential Drive Robot

Have you ever wondered how robots navigate their environments so precisely? A key component for many mobile robots is differential drive. This type of robot uses two independently controlled wheels, allowing for maneuvers like moving forward, backward, and turning. 

But how does the robot itself know where it is and how far it’s traveled? This is where wheel odometry comes in.

Wheel odometry is a technique for estimating a robot’s position and orientation based on the rotations of its wheels. By measuring the number of revolutions each wheel makes, we can calculate the distance traveled and any changes in direction. This information is important for tasks like path planning, obstacle avoidance, and overall robot control.

This tutorial will guide you through calculating wheel odometry for a differential drive robot. We’ll explore how to convert raw wheel encoder data – the number of revolutions for each wheel – into the robot’s displacement in the x and y directions (relative to a starting point) and the change in its orientation angle. Buckle up and get ready to dive into the fascinating world of robot self-localization!

Prerequisites

Calculate Wheel Displacements

First, we calculate the distance each wheel has traveled based on the number of revolutions since the last time step. This requires knowing:

  • The wheel radius (R)
  • The number of revolutions each wheel has made since the last time step (Nleft and Nright)

Since one full revolution moves a wheel a distance of 2πR along the ground, the displacement of each wheel is:

dleft = 2πR × Nleft
dright = 2πR × Nright

Calculate the Robot’s Average Displacement and Orientation Change

Next, we determine the robot’s average displacement and the change in its orientation. The average displacement (Davg) is the mean of the distances traveled by both wheels:

Davg = (dright + dleft) / 2

The change in orientation (Δθ), measured in radians, is influenced by the difference in wheel displacements and the distance between the wheels (L):

Δθ = (dright − dleft) / L

Calculate Changes in the Global Position

Now, we can calculate the robot’s movement in the global reference frame. Assuming the robot’s initial orientation is θ, and using Davg and Δθ, we find the changes in the x and y positions as follows:

Δx = Davg × cos(θ + Δθ/2)
Δy = Davg × sin(θ + Δθ/2)

You will often see the following equation instead:

Δx = Davg × cos(θnew)
Δy = Davg × sin(θnew)

This simplification assumes Δθ is relatively small, allowing us to approximate the displacement direction using the final orientation θnew without significant loss of accuracy. It’s a useful approximation for small time steps or when precise integration of orientation over displacement is not critical.

To find the robot’s new orientation:

θnew = θ + Δθ

Note: in robotics projects, it is common to normalize this angle so that it always stays between -π and +π. Here is what that code would look like in Python:

import math

# Keep the angle between -PI and PI
# (assumes the angle is off by at most one full revolution)
if self.new_yaw_angle > math.pi:
    self.new_yaw_angle -= 2 * math.pi
elif self.new_yaw_angle < -math.pi:
    self.new_yaw_angle += 2 * math.pi

For ROS 2, you would then convert this new yaw angle into a quaternion.
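Here is a minimal sketch of that conversion, assuming a planar robot (roll and pitch are zero); the function name is just for illustration:

import math

def yaw_to_quaternion(yaw):
    # For a planar robot, only the yaw angle contributes.
    # Returns (x, y, z, w), the field order of geometry_msgs/Quaternion.
    return (0.0, 0.0, math.sin(yaw / 2.0), math.cos(yaw / 2.0))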

Update the Robot’s Global Position

Finally, we update the robot’s position in the global reference frame. If the robot’s previous position was (x, y), the new position (xnew, ynew) is given by:

xnew = x + Δx
ynew = y + Δy
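Putting all of the steps together, here is a minimal Python sketch of a single odometry update. The function and variable names are illustrative, not from any particular library, and it uses the midpoint approximation from above:

import math

def update_odometry(x, y, theta, n_left, n_right, wheel_radius, wheel_separation):
    # Step 1: wheel displacements from the revolutions since the last time step
    d_left = 2.0 * math.pi * wheel_radius * n_left
    d_right = 2.0 * math.pi * wheel_radius * n_right

    # Step 2: average displacement and change in orientation
    d_avg = (d_right + d_left) / 2.0
    delta_theta = (d_right - d_left) / wheel_separation

    # Step 3: changes in the global position (midpoint approximation)
    delta_x = d_avg * math.cos(theta + delta_theta / 2.0)
    delta_y = d_avg * math.sin(theta + delta_theta / 2.0)

    # Step 4: new global pose, with the angle kept between -PI and PI
    new_theta = theta + delta_theta
    if new_theta > math.pi:
        new_theta -= 2.0 * math.pi
    elif new_theta < -math.pi:
        new_theta += 2.0 * math.pi

    return x + delta_x, y + delta_y, new_theta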

Practical Example

Let’s apply these steps with sample values:

(Image: the sample values: wheel radius R, distance between the wheels L, revolutions of each wheel, and the robot's starting pose x, y, θ)

Using these values, we’ll calculate the robot’s new position and orientation.

Step 1: Wheel Displacements

(Image: Step 1 calculation of the wheel displacements dleft and dright)

Step 2: Average Displacement and Orientation Change

(Image: Step 2 calculation of Davg and Δθ)

Step 3: Changes in Global Position

(Image: Step 3 calculation of Δx and Δy)

Step 4: New Global Position and Orientation

Therefore, the new orientation of the robot is 2.52 radians, and the robot is now located at (x = -4.34, y = 4.80).

(Image: Step 4 calculation of the new global position and orientation)

Important Note on Assumptions

The calculations for wheel odometry as demonstrated in the example above are made under two crucial assumptions:

  1. No Wheel Slippage: It’s assumed that the wheels of the robot do not slip during movement. Wheel slippage can occur due to loss of traction, often caused by slick surfaces or rapid acceleration/deceleration. When slippage occurs, the actual distance traveled by the robot may differ from the calculated values, as the wheel rotations do not accurately reflect movement over the ground.
  2. Adequate Friction: The calculations also assume that there is adequate friction between the wheels and the surface on which the robot is moving. Adequate friction is necessary for the wheels to grip the surface effectively, allowing for precise control over the robot’s movement. Insufficient friction can lead to wheel slippage, which, as mentioned, would result in inaccuracies in the odometry data.

These assumptions are essential for the accuracy of wheel odometry calculations. In real-world scenarios, various factors such as floor material, wheel material, and robot speed can affect these conditions.

Therefore, while the mathematical model provides a foundational understanding of how to calculate a robot’s position and orientation based on wheel rotations, you should be aware of these limitations and consider implementing corrective measures or additional sensors to account for potential discrepancies in odometry data due to slippage or inadequate friction.

Calculating Wheel Velocities for a Differential Drive Robot

Have you ever wondered how those robots navigate around restaurants or hospitals, delivering food or fetching supplies? These little marvels of engineering often rely on a simple yet powerful concept: the differential drive robot.

A differential drive robot uses two independently powered wheels and often has one or more passive caster wheels. By controlling the speed and direction of each wheel, the robot can move forward, backward, turn, or even spin in place. 

(Image: a differential drive robot base with two drive wheels and a caster)

This tutorial will delve into the behind-the-scenes calculations that translate desired robot motion (linear and angular velocities) into individual wheel speeds (rotational velocities in revolutions per second).

Real-World Applications

Differential drive robots, with their two independently controlled wheels, are widely used for their maneuverability and simplicity. Here are some real-world applications:

Delivery and Warehousing:

  • Inventory management:  Many warehouse robots use differential drive designs. They can navigate narrow aisles, efficiently locate and transport goods, and perform stock checks.  
  • Last-mile delivery:  Delivery robots on sidewalks and in controlled environments often employ differential drive. They can navigate crowded areas and deliver packages autonomously. 

Domestic and Public Use:

  • Floor cleaning robots:  These robots navigate homes and offices using differential drive. They can efficiently clean floors while avoiding obstacles.
  • Disinfection robots:  In hospitals and public areas, differential drive robots can be used for disinfecting surfaces with UV light or other methods.
  • Security robots:  These robots patrol buildings or outdoor spaces, using differential drive for maneuverability during surveillance.

Specialized Applications:

  • Agricultural robots:  Differential drive robots can be used in fields for tasks like planting seeds or collecting data on crops.
  • Military robots:  Small reconnaissance robots often use differential drive for navigating rough terrain.
  • Entertainment robots:  Robots designed for entertainment or education may utilize differential drive for movement and interaction.
  • Restaurant robots:  Robots that deliver food from the kitchen to the dining room.

Why Differential Drive for these Applications?

Differential drive robots offer several advantages for these tasks:

  • Simple design:  Their two-wheeled platform makes them relatively easy and inexpensive to build.
  • Maneuverability:  By controlling the speed of each wheel independently, they can turn sharply and navigate complex environments.
  • Compact size:  Their design allows them to operate in tight spaces.

Prerequisites

Convert Commanded Velocities to Wheel Linear Velocities

The robot’s motion is defined by two control inputs: the linear velocity (V) and the angular velocity (ω) of the robot in its reference frame.

(Image: a differential drive robot in its reference frame, showing the linear velocity V, the angular velocity ω, and the wheelbase width L)

The linear velocity is the speed at which you want the robot to move forward or backward, while the angular velocity defines how fast you want the robot to turn clockwise or counterclockwise.

The linear velocities of the right (Vr) and the left (Vl) wheels can be calculated using the robot’s wheelbase width (L), which is the distance between the centers of the two wheels:

Vr = V + (ωL) / 2
Vl = V − (ωL) / 2

Convert Linear Velocity to Rotational Velocity

Once we have the linear velocities of the wheels, we need to convert them into rotational velocities. The rotational velocity (θ̇) in radians per second for each wheel is its linear velocity divided by the wheel radius (R):

θ̇ = V / R

Thus, the rotational velocities of the right and left wheels are:

θ̇right = Vr / R
θ̇left = Vl / R

Convert Radians per Second to Revolutions per Second

The final step is to convert the rotational velocities from radians per second to revolutions per second (rev/s) for practical use. Since there are 2π radians in a full revolution:

vright = θ̇right / (2π)
vleft = θ̇left / (2π)
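The whole conversion fits in a few lines of Python. This is a sketch of the equations above; the function name and argument names are illustrative:

import math

def commanded_to_wheel_rev_per_sec(v, omega, wheel_radius, wheel_separation):
    # Linear velocity of each wheel (m/s)
    v_right = v + (omega * wheel_separation) / 2.0
    v_left = v - (omega * wheel_separation) / 2.0

    # Rotational velocity of each wheel (rad/s)
    theta_dot_right = v_right / wheel_radius
    theta_dot_left = v_left / wheel_radius

    # Convert radians per second to revolutions per second
    return theta_dot_right / (2.0 * math.pi), theta_dot_left / (2.0 * math.pi)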

Example

Let’s calculate the wheel velocities for a robot with a wheel radius of 0.05 m, a wheelbase width of 0.30 m, a commanded linear velocity of 1.0 m/s, and an angular velocity of 0.5 rad/s.

Vr = 1.0 + (0.5 × 0.30) / 2 = 1.075 m/s
Vl = 1.0 − (0.5 × 0.30) / 2 = 0.925 m/s

θ̇right = 1.075 / 0.05 = 21.5 rad/s
θ̇left = 0.925 / 0.05 = 18.5 rad/s

vright = 21.5 / (2π) ≈ 3.42 rev/s
vleft = 18.5 / (2π) ≈ 2.94 rev/s
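As a quick check, calling the sketch from the previous section with these values reproduces the result:

rev_right, rev_left = commanded_to_wheel_rev_per_sec(1.0, 0.5, 0.05, 0.30)
print(f"{rev_right:.2f} rev/s, {rev_left:.2f} rev/s")  # 3.42 rev/s, 2.94 rev/s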

Sensor Fusion Using the Robot Localization Package – ROS 2

In this tutorial, I will show you how to set up the robot_localization ROS 2 package on a simulated mobile robot. We will use the robot_localization package to fuse odometry data from the /wheel/odometry topic with IMU data from the /imu/data topic to provide locally accurate, smooth odometry estimates. Wheels can slip, so using the robot_localization package can help correct for this.

This tutorial is the third tutorial in my Ultimate Guide to the ROS 2 Navigation Stack (also known as Nav2).

You can get the entire code for this project here.

If you are using ROS Galactic or newer, you can get the code here.

Let’s get started!

Prerequisites 

You have completed the first two tutorials of this series:

  1. How to Create a Simulated Mobile Robot in ROS 2 Using URDF
  2. Set Up the Odometry for a Simulated Mobile Robot in ROS 2

About the Robot Localization Package

We will configure the robot_localization package to use an Extended Kalman Filter (ekf_node) to fuse the data from sensor inputs. These sensor inputs come from the IMU Gazebo plugin and the differential drive Gazebo plugin that are defined in our SDF file.

In a real robotics project, instead of simulated IMU and odometry data, we would use data from an IMU sensor like the BNO055 and wheel encoders, respectively.

The ekf_node will subscribe to the following topics (ROS message types are in parentheses):

  • /wheel/odometry :  Position and velocity estimate based on the information from the differential drive Gazebo plugin (in a real robotics project this would be information drawn from wheel encoder tick counts). The orientation is in quaternion format. (nav_msgs/Odometry)
  • /imu/data : Data from the Inertial Measurement Unit (IMU) sensor Gazebo plugin (sensor_msgs/Imu)

This node will publish data to the following topics:

  • /odometry/filtered : The smoothed odometry information (nav_msgs/Odometry)
  • /tf : Coordinate transform from the odom frame (parent) to the base_footprint frame (child). To learn about coordinate frames in ROS, check out this post.

Install the Robot Localization Package

Let’s begin by installing the robot_localization package. Open a new terminal window, and type the following command:

sudo apt install ros-foxy-robot-localization

If you are using a newer version of ROS 2 like ROS 2 Humble, type the following:

sudo apt install ros-humble-robot-localization

Instead of the commands above, you can also type the following command directly into the terminal. It will automatically detect your ROS 2 distribution:

sudo apt install ros-$ROS_DISTRO-robot-localization

If you are using ROS 2 Galactic and want to build from the source instead of using the off-the-shelf packages above, you will need to download the robot_localization package to your workspace.

cd ~/dev_ws/src
git clone -b fix/galactic/load_parameters https://github.com/nobleo/robot_localization.git

The reason you need to download that package above is that the Navigation Stack might throw the following exception if you don’t:

[ekf_node-1] terminate called after throwing an instance of 'rclcpp::ParameterTypeException'
[ekf_node-1] what(): expected [string] got [not set]

Now build the workspace:

cd ..
colcon build

Set the Configuration Parameters

We now need to specify the configuration parameters of the ekf_node by creating a YAML file.

Open a new terminal window, and type:

colcon_cd basic_mobile_robot
cd config
gedit ekf.yaml

Copy and paste this code inside the YAML file.

Save and close the file.

You can get a complete description of all the parameters on this page. You can also check out this link on GitHub for a sample ekf.yaml file.
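For orientation, here is an abbreviated sketch of the kind of parameters such a file contains. This is not the full file from the links above; the frame and topic names are the ones used in this series, and the configuration matrices are illustrative:

ekf_filter_node:
  ros__parameters:
    frequency: 30.0
    two_d_mode: true              # planar robot: ignore z, roll, and pitch
    publish_tf: true              # broadcast the odom -> base_footprint transform
    map_frame: map
    odom_frame: odom
    base_link_frame: base_footprint
    world_frame: odom

    odom0: wheel/odometry
    # Which of [x, y, z, roll, pitch, yaw, vx, vy, vz,
    #           vroll, vpitch, vyaw, ax, ay, az] to fuse from this input
    odom0_config: [true,  true,  false,
                   false, false, false,
                   false, false, false,
                   false, false, true,
                   false, false, false]

    imu0: imu/data
    imu0_config: [false, false, false,
                  false, false, true,
                  false, false, false,
                  false, false, true,
                  false, false, false]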

Create a Launch File

Now go to your launch folder. Open a new terminal window, and type:

colcon_cd basic_mobile_robot
cd launch
gedit basic_mobile_bot_v3.launch.py

Copy and paste this code into the file.

If you are using ROS 2 Galactic or newer, your code is here.

Save the file, and close it.
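For reference, the part of that launch file that starts the EKF typically looks like the sketch below. The full file from the link also launches Gazebo, RViz, and the robot state publisher; this excerpt shows only the robot_localization node, assuming the package layout used in this series:

import os
from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    # Locate the ekf.yaml configuration file inside the package share directory
    pkg_share = get_package_share_directory('basic_mobile_robot')
    ekf_config_path = os.path.join(pkg_share, 'config', 'ekf.yaml')

    # Start the Extended Kalman Filter node from the robot_localization package
    start_robot_localization_cmd = Node(
        package='robot_localization',
        executable='ekf_node',
        name='ekf_filter_node',
        output='screen',
        parameters=[ekf_config_path])

    return LaunchDescription([start_robot_localization_cmd])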

Move to the package directory.

colcon_cd basic_mobile_robot

Open the package.xml file.

gedit package.xml

Copy and paste this code into the file.

Save the file, and close it.

Open the CMakeLists.txt file.

gedit CMakeLists.txt

Copy and paste this code into the file.

Save the file, and close it.

Build the Package

Now build the package by opening a terminal window, and typing the following command:

cd ~/dev_ws
colcon build

Launch the Robot

Open a new terminal, and launch the robot.

cd ~/dev_ws/
ros2 launch basic_mobile_robot basic_mobile_bot_v3.launch.py

It might take a while for Gazebo and RViz to load, so be patient.

To see the active topics, open a terminal window, and type:

ros2 topic list

To see more information about the topics, execute:

ros2 topic info /imu/data
ros2 topic info /wheel/odometry

You should see an output similar to below:

(Image: ros2 topic info output for /imu/data and /wheel/odometry)

Both topics have 1 publisher and 1 subscriber.

To see the output of the robot localization package (i.e. the Extended Kalman Filter (EKF)), type:

ros2 topic echo /odometry/filtered
(Image: output of ros2 topic echo /odometry/filtered)

I will move my robot in the reverse direction using the rqt_robot_steering GUI. Open a new terminal window, and type:

rqt_robot_steering

If you are using ROS 2 Galactic or newer, install the tool first:

sudo apt-get install ros-galactic-rqt-robot-steering

Where the syntax is:

sudo apt-get install ros-<ros-distribution>-rqt-robot-steering

Then type:

ros2 run rqt_robot_steering rqt_robot_steering --force-discover

Move the sliders to move the robot.

(Image: moving the robot with the rqt_robot_steering sliders)

We can see the output of the odom -> base_footprint transform by typing the following command:

ros2 run tf2_ros tf2_echo odom base_footprint

Let’s see the active nodes.

ros2 node list
(Image: ros2 node list output)

Let’s check out the ekf_node (named ekf_filter_node).

ros2 node info /ekf_filter_node
(Image: ros2 node info output for /ekf_filter_node)

Let’s check out the ROS node graph.

rqt_graph

Click the blue circular arrow in the upper left to refresh the node graph. Also select “Nodes/Topics (all)”.

(Image: the ROS node graph in rqt_graph)

To see the coordinate frames, type the following command in a terminal window.

ros2 run tf2_tools view_frames.py

If you are using ROS 2 Galactic or newer, type:

ros2 run tf2_tools view_frames

In the current working directory, you will have a file called frames.pdf. Open that file.

evince frames.pdf

Here is what my coordinate transform (i.e. tf) tree looks like:

(Image: the coordinate transform tree from frames.pdf)

You can see that the parent frame is the odom frame. The odom frame is the initial position and orientation of the robot. Every other frame below that is a child of the odom frame.

Later, we will add a map frame. The map frame will be the parent frame of the odom frame.

Finally, in RViz, under Global Options, change Fixed Frame to odom.

(Image: setting the Fixed Frame to odom in RViz)

Open the steering tool again.

rqt_robot_steering

If you move the robot around using the sliders, you will see the robot move in both RViz and Gazebo.

(Image: the robot moving in both Gazebo and RViz)

That’s it!

In the next tutorial, I will show you how to add LIDAR to your robot so that you can map the environment and detect obstacles.