Sensor Fusion Using the Robot Localization Package – ROS 2

In this tutorial, I will show you how to set up the robot_localization ROS 2 package on a simulated mobile robot. We will use the robot_localization package to fuse odometry data from the /wheel/odometry topic with IMU data from the /imu/data topic to provide locally accurate, smooth odometry estimates. Wheel odometry drifts when the wheels slip, so fusing it with IMU data helps correct for this.

This tutorial is the third tutorial in my Ultimate Guide to the ROS 2 Navigation Stack (also known as Nav2).

You can get the entire code for this project here.

If you are using ROS Galactic or newer, you can get the code here.

Let’s get started!

Prerequisites 

You have completed the first two tutorials of this series:

  1. How to Create a Simulated Mobile Robot in ROS 2 Using URDF
  2. Set Up the Odometry for a Simulated Mobile Robot in ROS 2

About the Robot Localization Package

We will configure the robot_localization package to use an Extended Kalman Filter (ekf_node) to fuse the data from sensor inputs. These sensor inputs come from the IMU Gazebo plugin and the differential drive Gazebo plugin that are defined in our SDF file.

In a real robotics project, instead of simulated IMU and odometry data, we would use data from an IMU sensor like the BNO055 and wheel encoders, respectively.

The ekf_node will subscribe to the following topics (ROS message types are in parentheses):

  • /wheel/odometry : Position and velocity estimates based on information from the differential drive Gazebo plugin (in a real robotics project, this would come from wheel encoder tick counts). The orientation is in quaternion format. (nav_msgs/Odometry)
  • /imu/data : Data from the Inertial Measurement Unit (IMU) sensor Gazebo plugin (sensor_msgs/Imu)

This node will publish data to the following topics:

  • /odometry/filtered : The smoothed odometry information (nav_msgs/Odometry)
  • /tf : Coordinate transform from the odom frame (parent) to the base_footprint frame (child). To learn about coordinate frames in ROS, check out this post.

Install the Robot Localization Package

Let’s begin by installing the robot_localization package. Open a new terminal window, and type the following command:

sudo apt install ros-foxy-robot-localization

If you are using a newer version of ROS 2 like ROS 2 Humble, type the following:

sudo apt install ros-humble-robot-localization

Instead of the commands above, you can also type the following command directly into the terminal. It will automatically detect your ROS 2 distribution:

sudo apt install ros-$ROS_DISTRO-robot-localization

If you are using ROS 2 Galactic and want to build from source instead of using the binary packages above, you will need to download the robot_localization package to your workspace.

cd ~/dev_ws/src
git clone -b fix/galactic/load_parameters https://github.com/nobleo/robot_localization.git

You need that particular fork because the Navigation Stack might otherwise throw the following exception:

[ekf_node-1] terminate called after throwing an instance of 'rclcpp::ParameterTypeException'
[ekf_node-1] what(): expected [string] got [not set]

Then build the workspace:

cd ..
colcon build

Set the Configuration Parameters

We now need to specify the configuration parameters of the ekf_node by creating a YAML file.

Open a new terminal window, and type:

colcon_cd basic_mobile_robot
cd config
gedit ekf.yaml

Copy and paste this code inside the YAML file.
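For reference, here is a minimal sketch of what the ekf.yaml might look like for this robot. It assumes the frame and topic names used in this tutorial; the linked file is the authoritative version, and its exact values may differ.

ekf_filter_node:
    ros__parameters:
        # How often (in Hz) the filter publishes a state estimate
        frequency: 30.0
        # The robot drives on a flat surface, so ignore z, roll, and pitch
        two_d_mode: true
        # Publish the odom -> base_footprint transform on /tf
        publish_tf: true
        map_frame: map
        odom_frame: odom
        base_link_frame: base_footprint
        world_frame: odom
        # Fuse x/y position, yaw, forward velocity, and yaw velocity from
        # the wheel odometry. The 15 booleans correspond to
        # x, y, z, roll, pitch, yaw, vx, vy, vz, vroll, vpitch, vyaw, ax, ay, az.
        odom0: wheel/odometry
        odom0_config: [true,  true,  false,
                       false, false, true,
                       true,  false, false,
                       false, false, true,
                       false, false, false]
        # Fuse yaw, yaw velocity, and linear (x) acceleration from the IMU
        imu0: imu/data
        imu0_config: [false, false, false,
                      false, false, true,
                      false, false, false,
                      false, false, true,
                      true,  false, false]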

Save and close the file.

You can get a complete description of all the parameters on this page. You can also check out this link on GitHub for a sample ekf.yaml file.

Create a Launch File

Now go to your launch folder. Open a new terminal window, and type:

colcon_cd basic_mobile_robot
cd launch
gedit basic_mobile_bot_v3.launch.py

Copy and paste this code into the file.

If you are using ROS 2 Galactic or newer, your code is here.
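For reference, the part of the launch file that this tutorial adds is the ekf_node. Here is a minimal sketch of that portion, assuming the package and file names from this series (the full launch file also starts Gazebo, RViz, and the nodes from the earlier tutorials):

import os
from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    # Locate the EKF configuration file we created earlier
    pkg_share = get_package_share_directory('basic_mobile_robot')
    robot_localization_file_path = os.path.join(pkg_share, 'config/ekf.yaml')

    # Start the robot_localization EKF node with our parameters
    start_robot_localization_cmd = Node(
        package='robot_localization',
        executable='ekf_node',
        name='ekf_filter_node',
        output='screen',
        parameters=[robot_localization_file_path, {'use_sim_time': True}])

    return LaunchDescription([start_robot_localization_cmd])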

Save the file, and close it.

Move to the package.

colcon_cd basic_mobile_robot

Open the package.xml file.

gedit package.xml

Copy and paste this code into the file.
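The key change relative to the previous tutorial is declaring a dependency on robot_localization, which should look something like this:

<exec_depend>robot_localization</exec_depend>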

Save the file, and close it.

Open the CMakeLists.txt file.

gedit CMakeLists.txt

Copy and paste this code into the file.
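The important part for this tutorial is installing the config directory (which contains ekf.yaml) to the package's share folder so the launch file can find it. That stanza should look something like this, assuming the directory layout from this series:

# Install the config and launch directories to the package share folder
install(
  DIRECTORY config launch
  DESTINATION share/${PROJECT_NAME}
)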

Save the file, and close it.

Build the Package

Now build the package by opening a terminal window, and typing the following command:

cd ~/dev_ws
colcon build

Launch the Robot

Open a new terminal, and launch the robot.

cd ~/dev_ws/
ros2 launch basic_mobile_robot basic_mobile_bot_v3.launch.py

It might take a while for Gazebo and RViz to load, so be patient.

To see the active topics, open a terminal window, and type:

ros2 topic list

To see more information about the topics, execute:

ros2 topic info /imu/data
ros2 topic info /wheel/odometry

You should see that both topics have one publisher and one subscriber.

To see the output of the robot localization package (i.e. the Extended Kalman Filter (EKF)), type:

ros2 topic echo /odometry/filtered
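If you want to use the filtered estimate inside your own node rather than just echoing it, here is a minimal rclpy subscriber sketch (the node and callback names are just for illustration):

import rclpy
from rclpy.node import Node
from nav_msgs.msg import Odometry

class FilteredOdomListener(Node):
    def __init__(self):
        super().__init__('filtered_odom_listener')
        # Subscribe to the EKF output published by robot_localization
        self.subscription = self.create_subscription(
            Odometry, '/odometry/filtered', self.odom_callback, 10)

    def odom_callback(self, msg):
        # Log the smoothed x/y position estimate
        position = msg.pose.pose.position
        self.get_logger().info(f'x: {position.x:.2f}, y: {position.y:.2f}')

def main():
    rclpy.init()
    rclpy.spin(FilteredOdomListener())
    rclpy.shutdown()

if __name__ == '__main__':
    main()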

I will move my robot in the reverse direction using the rqt_robot_steering GUI. Open a new terminal window, and type:

rqt_robot_steering

If you are using ROS 2 Galactic or newer, you may need to install the tool first. The syntax is:

sudo apt-get install ros-<ros-distribution>-rqt-robot-steering

For example, on ROS 2 Galactic:

sudo apt-get install ros-galactic-rqt-robot-steering

Then run it with:

ros2 run rqt_robot_steering rqt_robot_steering --force-discover

Move the sliders to move the robot.


We can see the output of the odom -> base_footprint transform by typing the following command:

ros2 run tf2_ros tf2_echo odom base_footprint
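If you would rather query this transform from inside a node instead of using tf2_echo, here is a minimal tf2 lookup sketch (the node name is just for illustration):

import rclpy
from rclpy.node import Node
import tf2_ros

class OdomToBaseFootprint(Node):
    def __init__(self):
        super().__init__('odom_to_base_footprint')
        # Buffer and listener that receive transforms over /tf
        self.tf_buffer = tf2_ros.Buffer()
        self.tf_listener = tf2_ros.TransformListener(self.tf_buffer, self)
        # Check for the transform twice per second
        self.timer = self.create_timer(0.5, self.on_timer)

    def on_timer(self):
        try:
            # Pose of base_footprint expressed in the odom frame
            t = self.tf_buffer.lookup_transform(
                'odom', 'base_footprint', rclpy.time.Time())
            translation = t.transform.translation
            self.get_logger().info(f'x: {translation.x:.2f}, y: {translation.y:.2f}')
        except (tf2_ros.LookupException, tf2_ros.ConnectivityException,
                tf2_ros.ExtrapolationException) as ex:
            self.get_logger().info(f'Transform not available yet: {ex}')

def main():
    rclpy.init()
    rclpy.spin(OdomToBaseFootprint())
    rclpy.shutdown()

if __name__ == '__main__':
    main()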

Let’s see the active nodes.

ros2 node list

Let’s check out the ekf_node (named ekf_filter_node).

ros2 node info /ekf_filter_node

Let’s check out the ROS node graph.

rqt_graph

Click the blue circular arrow in the upper left to refresh the node graph. Also select “Nodes/Topics (all)”.


To see the coordinate frames, type the following command in a terminal window.

ros2 run tf2_tools view_frames.py

If you are using ROS 2 Galactic or newer, type:

ros2 run tf2_tools view_frames

In the current working directory, you will have a file called frames.pdf. Open that file.

evince frames.pdf

In the resulting coordinate transform (i.e. tf) tree, the parent frame is the odom frame. The odom frame represents the robot's initial position and orientation. Every other frame is a child of the odom frame.

Later, we will add a map frame. The map frame will be the parent frame of the odom frame.

Finally, in RViz, under Global Options, change Fixed Frame to odom.


Open the steering tool again.

rqt_robot_steering

If you move the robot around using the sliders, you will see the robot move in both RViz and Gazebo.


That’s it!

In the next tutorial, I will show you how to add LIDAR to your robot so that you can map the environment and detect obstacles.

Revolutionizing Healthcare: The Future Impact of Robots

The healthcare industry has witnessed incredible advancements over the years, from groundbreaking medical discoveries to innovative surgical techniques. However, as we take a look into the future, there is one technological revolution that promises to redefine healthcare as we know it: robots.

Robots are going to transform every aspect of healthcare, from diagnostics and surgery to patient care and beyond. In this blog post, we’ll cover the exciting potential of robots in healthcare and the profound impact they are likely to have on the industry.

Robots in Diagnostics


One of the primary areas where robots are already making significant strides is in diagnostics.

Imagine a robot capable of analyzing vast amounts of patient data, including medical history, genetic information, and real-time monitoring data. This robot can quickly identify potential health issues, predict disease risks, and recommend personalized treatment plans. With machine learning and artificial intelligence, these robots can continuously improve their diagnostic accuracy, ultimately saving lives by catching diseases at their earliest stages.

Let’s take a look at some companies that are currently involved in this market:

  • Diagnostic Robotics: This company has developed an AI-powered platform that can be used to diagnose a wide range of diseases, including cancer, heart disease, and diabetes. The platform is currently being used by healthcare providers around the world.
  • Paige AI: This company has developed AI-powered software that can be used to analyze pathology images and detect cancer cells. The software is currently being used by pathologists around the world to help them diagnose cancer more accurately and efficiently.

Robotic-assisted Surgery


Robotic-assisted surgery is another promising frontier in healthcare. Robots equipped with advanced surgical instruments and guided by skilled surgeons can perform intricate procedures with unmatched precision and minimal invasiveness. These practices will reduce patient trauma, speed up recovery times, and lower the risk of complications.


Also, robots can bridge geographical gaps by enabling remote surgery, allowing a surgeon in one location to operate on a patient in another, potentially saving lives in emergency situations and improving access to specialized care in remote areas.

Here are some of the leading companies involved in robotic-assisted surgery:

  • Intuitive Surgical: Intuitive Surgical is the dominant player in the robotic-assisted surgery market, with its da Vinci surgical system being the most widely used robotic surgical system in the world.
  • Medtronic: Medtronic recently entered the robotic-assisted surgery market with its Hugo robotic surgery system.
  • Johnson & Johnson: Johnson & Johnson offers a variety of robotic surgery systems through its Johnson & Johnson Medical Devices division, including the Monarch platform for bronchoscopy, and the Velys platform for orthopedic surgery.
  • Stryker: Stryker offers the Mako robotic-assisted surgery system for orthopedic surgery.
  • Brainlab: Brainlab offers the Curve robotic-assisted surgery system for spinal and neurosurgery.
  • CMR Surgical: CMR Surgical offers the Versius robotic-assisted surgery system for a variety of procedures, including general surgery, gynecologic surgery, and urologic surgery.

Enhanced Rehabilitation


Rehabilitation is a cornerstone of healthcare, especially for patients recovering from injuries, surgeries, or chronic conditions. Robots are playing an increasingly important role in this area, providing consistent, personalized, and intensive therapy.

For example, robotic exoskeletons help patients regain mobility, while robotic arms assist those with limited dexterity. These devices not only enhance the quality of life for patients but also alleviate the strain on healthcare professionals by automating routine tasks, allowing them to focus on more complex aspects of care.

One of the first companies that comes to mind is ReWalk Robotics. ReWalk Robotics develops and manufactures robotic exoskeletons for people with spinal cord injuries.

ReWalk’s most well-known product is the ReWalk exoskeleton, which allows people with paraplegia to walk again. ReWalk Robotics also produces the ReStore exoskeleton, which is used to help people with lower limb disability regain their mobility.

Another company is Cyberdyne. Cyberdyne is a Japanese company that develops and manufactures robotic exoskeletons for medical and industrial use. Its most well-known product is the HAL (Hybrid Assistive Limb) exoskeleton, which is used to help people with disabilities walk, stand, and climb stairs. The HAL exoskeleton is also used in industrial settings to help workers lift heavy objects and perform other physically demanding tasks.

Robots in Patient Care


The future of healthcare also includes robots directly interacting with patients. Social robots equipped with natural language processing capabilities can provide companionship and emotional support to patients, particularly those in long-term care facilities or dealing with mental health issues. These robots can engage in conversations, give reminders when patients need to take medication, and monitor vital signs, ensuring patients receive the attention and care they need.

The company I work for, Xtend AI, is doing just that.

Pharmaceutical Advancements


In the pharmaceutical industry, robots are revolutionizing drug discovery and development. Robotic systems can automate high-throughput screening of compounds, drastically accelerating the process of identifying potential drug candidates.

Additionally, robots can handle complex chemical reactions with precision, leading to the creation of more effective and safe medications. This not only reduces the time and cost of drug development but also opens doors to personalized medicine, where treatments are tailored to individual patients based on their genetic makeup and health data.

One robot that already has a head start in this area is NiCoLA-B. This robot is used at the U.K. Centre for Lead Discovery to test more than 300,000 compounds a day in search of promising drug candidates. It uses sound waves to move droplets of potential drugs into miniature wells on assay plates, where they are tested for activity.

Logistics and Supply Chain Management


Efficient logistics and supply chain management are critical in healthcare, ensuring that medications, medical equipment, and supplies are readily available when needed. Robots are already playing a pivotal role in this aspect by automating inventory management, drug dispensing, and even transportation within healthcare facilities. This not only reduces human errors but also optimizes resource allocation and minimizes wastage, ultimately leading to cost savings and improved patient care.

Challenges and Ethical Considerations

While the future of healthcare with robots is incredibly promising, it is not without its challenges and ethical considerations. Privacy concerns, data security, and the potential for bias in AI algorithms must be addressed. I predict there will always be a need for human oversight and expertise to ensure robots operate safely and effectively.

The Best is Yet to Come

The future impact of robots on healthcare is nothing short of transformative. From diagnostics and surgery to patient care and pharmaceutical advancements, robots are set to revolutionize every facet of the healthcare industry. As technology continues to advance, we can look forward to a healthcare system that is more efficient, accessible, and personalized, ultimately leading to better patient outcomes and an improved quality of life for all.

However, it is important we navigate this transformation with careful consideration of the ethical and privacy implications, ensuring that the benefits of healthcare robotics are realized while minimizing potential risks. The future of healthcare is indeed exciting, and robots will undoubtedly be at the forefront of this revolution.

Keep building!

How Robots Help Us Explore Extreme Environments

Robots are now being used to explore some of the most dangerous and inhospitable places on Earth, and even beyond.

In this blog post, we will cover some of the ways that robots are helping to explore the unknown. We will also take a look at some of the challenges that need to be overcome in order to develop robots that can safely and effectively explore even the most extreme environments.

Robots in Space

One of the most exciting areas of robotic exploration is space. Robots have been used to explore the Moon, Mars, and other planets in our solar system. They have also been used to repair and service satellites in orbit.

One of the most famous robotic space explorers is the Curiosity rover, which landed on Mars in 2012. Curiosity has been exploring the Gale Crater on Mars for over a decade, and has made many important discoveries about the planet’s past and present environment.


Another notable robotic space explorer is the Perseverance rover, which landed on Mars in 2021. Perseverance is tasked with collecting samples from Mars that will be returned to Earth for analysis. This could help us to learn even more about the Red Planet and its potential for habitability.


Robots in the Deep Sea

Robots are also being used to explore the deep sea. The deep sea is one of the least explored places on Earth, and robots are helping us to learn more about its unique ecosystems and biodiversity.

One example of a robotic deep sea explorer was the remotely operated vehicle (ROV) Nereus. Nereus was capable of diving to depths of over 10,000 meters and was used to explore the Mariana Trench, the deepest point in the ocean, before it was lost at sea in 2014.


Another example of a robotic deep sea explorer is the autonomous underwater vehicle (AUV) Sentry. Sentry can operate independently of a surface ship for extended periods, and has been used to map the seafloor and collect data on marine life.


Robots in Other Extreme Environments

Robots are also being used to explore other extreme environments on Earth, such as volcanoes, caves, and glaciers. These environments can be dangerous for humans to explore, but robots can safely navigate them and collect data.

One example of a robotic extreme environment explorer is the robot submarine Nereid Under Ice (NUI). NUI is a hybrid remotely operated vehicle (ROV) developed by the Woods Hole Oceanographic Institution (WHOI). It is designed to explore and sample under-ice environments, which are difficult to access using traditional methods.


NUI is equipped with a high-definition video camera, a 7-function electro-hydraulic manipulator arm, and a range of acoustic, chemical, and biological sensors. It can operate in water depths of up to 4,000 meters and can be deployed from icebreakers or research vessels.

Challenges and Future Directions

There are still a number of challenges that need to be overcome in order to develop robots that can safely and effectively explore even the most extreme environments.

One challenge is developing robots that are powered by long-lasting batteries. This is especially important for robots that need to operate in remote or inaccessible areas.

Another challenge is developing robots that can withstand harsh environmental conditions. For example, robots that explore volcanoes need to be able to withstand high temperatures and toxic gases.


Finally, robots need to be equipped with sensors and artificial intelligence (AI) that allow them to perceive their surroundings and make decisions autonomously. This is especially important for robots that need to operate in dangerous or unpredictable environments.

Additional Thoughts

Here are some additional thoughts on how robots are helping us to explore the unknown:

  • Robots are being used to explore the human body. For example, robotic surgical systems allow surgeons to perform complex procedures with greater precision and accuracy than would be possible with traditional methods.
  • Robots are being used to explore the past. For example, archaeologists are using robots to excavate ancient ruins and search for lost artifacts.

The possibilities for robotic exploration are endless. As robots become more capable and sophisticated, we can expect them to help us to learn more about the world around us.

Keep building!