In this tutorial, I will show you how to detect an ArUco Marker in a real-time video stream (i.e. my webcam) using OpenCV (Python). I will follow this tutorial.
By the end of this tutorial, you will be able to generate output like this:
Open your favorite code editor, and write the following code. I will name my program detect_aruco_marker.py. This program detects an ArUco marker in a real-time video stream (we’ll use the built-in webcam).
#!/usr/bin/env python
'''
Welcome to the ArUco Marker Detector!
This program:
- Detects ArUco markers using OpenCV and Python
'''
from __future__ import print_function # Python 2/3 compatibility
import sys # Used to exit the program cleanly
import cv2 # Import the OpenCV library
import numpy as np # Import Numpy library
# Project: ArUco Marker Detector
# Date created: 12/18/2021
# Python version: 3.8
# Reference: https://www.pyimagesearch.com/2020/12/21/detecting-aruco-markers-with-opencv-and-python/
desired_aruco_dictionary = "DICT_ARUCO_ORIGINAL"
# The different ArUco dictionaries built into the OpenCV library.
ARUCO_DICT = {
"DICT_4X4_50": cv2.aruco.DICT_4X4_50,
"DICT_4X4_100": cv2.aruco.DICT_4X4_100,
"DICT_4X4_250": cv2.aruco.DICT_4X4_250,
"DICT_4X4_1000": cv2.aruco.DICT_4X4_1000,
"DICT_5X5_50": cv2.aruco.DICT_5X5_50,
"DICT_5X5_100": cv2.aruco.DICT_5X5_100,
"DICT_5X5_250": cv2.aruco.DICT_5X5_250,
"DICT_5X5_1000": cv2.aruco.DICT_5X5_1000,
"DICT_6X6_50": cv2.aruco.DICT_6X6_50,
"DICT_6X6_100": cv2.aruco.DICT_6X6_100,
"DICT_6X6_250": cv2.aruco.DICT_6X6_250,
"DICT_6X6_1000": cv2.aruco.DICT_6X6_1000,
"DICT_7X7_50": cv2.aruco.DICT_7X7_50,
"DICT_7X7_100": cv2.aruco.DICT_7X7_100,
"DICT_7X7_250": cv2.aruco.DICT_7X7_250,
"DICT_7X7_1000": cv2.aruco.DICT_7X7_1000,
"DICT_ARUCO_ORIGINAL": cv2.aruco.DICT_ARUCO_ORIGINAL
}
def main():
"""
Main method of the program.
"""
  # Check that we have a valid ArUco marker dictionary
  if ARUCO_DICT.get(desired_aruco_dictionary, None) is None:
    print("[INFO] ArUco tag of '{}' is not supported".format(
      desired_aruco_dictionary))
    sys.exit(0)
# Load the ArUco dictionary
print("[INFO] detecting '{}' markers...".format(
desired_aruco_dictionary))
this_aruco_dictionary = cv2.aruco.Dictionary_get(ARUCO_DICT[desired_aruco_dictionary])
this_aruco_parameters = cv2.aruco.DetectorParameters_create()
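  # Note: the cv2.aruco API changed in OpenCV 4.7. If you are running a newer
  # version than the one pinned in the install step below, the equivalent setup is:
  #   this_aruco_dictionary = cv2.aruco.getPredefinedDictionary(ARUCO_DICT[desired_aruco_dictionary])
  #   this_aruco_parameters = cv2.aruco.DetectorParameters()
  #   detector = cv2.aruco.ArucoDetector(this_aruco_dictionary, this_aruco_parameters)
  # and the detection call further down becomes: detector.detectMarkers(frame)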
# Start the video stream
cap = cv2.VideoCapture(0)
while(True):
# Capture frame-by-frame
# This method returns True/False as well
# as the video frame.
ret, frame = cap.read()
# Detect ArUco markers in the video frame
(corners, ids, rejected) = cv2.aruco.detectMarkers(
frame, this_aruco_dictionary, parameters=this_aruco_parameters)
# Check that at least one ArUco marker was detected
if len(corners) > 0:
# Flatten the ArUco IDs list
ids = ids.flatten()
# Loop over the detected ArUco corners
for (marker_corner, marker_id) in zip(corners, ids):
        # Extract the marker corners (use a new name so we don't shadow the corners list)
        marker_corners = marker_corner.reshape((4, 2))
        (top_left, top_right, bottom_right, bottom_left) = marker_corners
# Convert the (x,y) coordinate pairs to integers
top_right = (int(top_right[0]), int(top_right[1]))
bottom_right = (int(bottom_right[0]), int(bottom_right[1]))
bottom_left = (int(bottom_left[0]), int(bottom_left[1]))
top_left = (int(top_left[0]), int(top_left[1]))
# Draw the bounding box of the ArUco detection
cv2.line(frame, top_left, top_right, (0, 255, 0), 2)
cv2.line(frame, top_right, bottom_right, (0, 255, 0), 2)
cv2.line(frame, bottom_right, bottom_left, (0, 255, 0), 2)
cv2.line(frame, bottom_left, top_left, (0, 255, 0), 2)
# Calculate and draw the center of the ArUco marker
center_x = int((top_left[0] + bottom_right[0]) / 2.0)
center_y = int((top_left[1] + bottom_right[1]) / 2.0)
cv2.circle(frame, (center_x, center_y), 4, (0, 0, 255), -1)
# Draw the ArUco marker ID on the video frame
# The ID is always located at the top_left of the ArUco marker
cv2.putText(frame, str(marker_id),
(top_left[0], top_left[1] - 15),
cv2.FONT_HERSHEY_SIMPLEX,
0.5, (0, 255, 0), 2)
# Display the resulting frame
    cv2.imshow('frame', frame)
# If "q" is pressed on the keyboard,
# exit this loop
if cv2.waitKey(1) & 0xFF == ord('q'):
break
# Close down the video stream
cap.release()
cv2.destroyAllWindows()
if __name__ == '__main__':
print(__doc__)
main()
Save the file, and close it.
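If you don’t already have a printed ArUco marker to test with, you can generate one to print out or display on your phone. Here is a minimal sketch using the same OpenCV 4.6-era API as the detector above (the marker ID and file name are just examples):
# generate_aruco_marker.py
import cv2
import numpy as np
# Load the original ArUco dictionary
dictionary = cv2.aruco.Dictionary_get(cv2.aruco.DICT_ARUCO_ORIGINAL)
# Allocate a blank 300x300 grayscale image
marker_image = np.zeros((300, 300), dtype=np.uint8)
# Draw marker ID 0 at 300x300 pixels (use cv2.aruco.generateImageMarker in OpenCV >= 4.7)
marker_image = cv2.aruco.drawMarker(dictionary, 0, 300, marker_image, 1)
cv2.imwrite("aruco_marker_0.png", marker_image)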
You need to have opencv-contrib-python installed and not opencv-python. Open a terminal window, and type:
pip3 uninstall opencv-python
pip3 install opencv-contrib-python==4.6.0.66
To run the program in Linux for example, type the following command:
python3 detect_aruco_marker.py
If you want to restore OpenCV to the previous version after you’re finished working with the ArUco markers, type:
pip3 uninstall opencv-contrib-python
pip3 install opencv-python
To make sure the changes take effect, I recommend rebooting your computer.
By the end of this tutorial, you will be able to create this:
ROS 2 Control is a framework for robots that lets you create programs to control their movement. Think of it as a middleman between the robot hardware and the robot software.
ROS 2 Control connects directly to your hardware (e.g. Arduino) to send commands to the hardware from the software and to receive joint states from the hardware (i.e. the current angular position in radians of each motor of the robotic arm) and send that information out to the rest of ROS 2.
A complete block diagram showing an overview of ROS 2 Control is on this page.
If you look at that block diagram, you will see that the two main building blocks for ROS 2 Control are:
Controller Manager (manages the controllers, which calculate the commands needed to make the robot behave as desired)
Resource Manager (manages the hardware interfaces, or communication with the actual robot hardware)
The Controller Manager is responsible for loading, unloading, and managing multiple controllers within the ROS 2 Control framework. It handles the lifecycle of controllers, coordinates their execution, and ensures proper switching between different control modes.
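Once your system is up and running later in this tutorial, you can ask the Controller Manager what it has loaded. Assuming you have the ros2controlcli package installed (it provides the ros2 control command verbs), type:
ros2 control list_controllers
ros2 control list_hardware_interfaces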
ROS 2 has many different controllers, with some of the most popular being:
Joint Position Controller: Inputs a desired joint position and outputs position commands to the hardware.
Joint Trajectory Controller: Accepts a trajectory of joint positions over time as input, from software like MoveIt 2 (think of it as a list of goal positions for your robotic arm), and outputs a series of position, velocity, or effort commands to the hardware interface so that the arm can follow that trajectory.
Diff Drive Controller: Takes velocity commands (linear and angular) from software such as Nav2 as input and outputs wheel velocities or positions for differential drive robots.
These controllers process their respective inputs to generate appropriate outputs, enabling precise control of various robotic systems from manipulators to mobile platforms.
The Resource Manager manages the hardware resources of the robot (i.e. “Hardware Interface”), providing an abstraction layer between the controllers and the actual hardware. It handles the communication with the physical devices (such as the Arduino on the robotic arm), manages state information (e.g. the current angle of each joint of the robotic arm), and provides a unified interface for controllers to interact with various hardware components.
If you are using a virtual machine on Ubuntu, I recommend you use Oracle VirtualBox instead of VMWare. I had a lot of issues on VMWare that prevented me from running Gazebo.
I am using ROS 2 Iron, the most recent version at the time of writing this tutorial, but you are welcome to use other versions of ROS 2 if you prefer.
If you take a look at the file, you will see we have added a <transmission> element.
The purpose of the <transmission> element is to define how the joint commands from the controllers are mapped to the actual joint motions of the robot.
Historically, in ROS 1, the <transmission> element was commonly used to specify the relationship between the joint commands and the actuator commands. It provided a way to define the gearing ratio, mechanical reduction, and other properties related to the transmission of motion from the actuators to the joints.
In general, when using the ros2_control framework in ROS 2, the <transmission> element is not strictly necessary in the XACRO files. The joint-actuator mapping and control-related configurations are typically handled through the controller configuration YAML files and the hardware interface layer.
This XML code defines a Gazebo plugin for a robot named “mycobot_280” using the ROS 2 control framework. It specifies the robot description, robot state publisher node, and the location of the controller configuration YAML file for the mycobot_280 robot.
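As a rough sketch, that plugin block inside the xacro typically looks something like the snippet below. The YAML file name here is illustrative and should match your own configuration file; the plugin names are the ones used by gz_ros2_control (older Ignition-era releases use ign_ros2_control-system and ign_ros2_control::IgnitionROS2ControlPlugin instead):
<gazebo>
  <plugin filename="gz_ros2_control-system" name="gz_ros2_control::GazeboSimROS2ControlPlugin">
    <parameters>$(find mycobot_gazebo)/config/mycobot_280_controllers.yaml</parameters>
  </plugin>
</gazebo>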
This XML code defines the interface between the robot’s hardware and the ROS 2 Control library for a robot named “mycobot_280”. It specifies the joint names, their command and state interfaces, and the minimum and maximum position limits for each joint, including the gripper joints with mimic properties.
If you take a closer look at the file, you can see that ROS 2 Control provides a structured way to handle the hardware interface for different types of robots through a set of standardized components. These components are primarily divided into command interfaces and state interfaces. Here’s an overview of each:
Command Interfaces (write to hardware)
Command interfaces are used for sending commands to hardware components. These commands could be setpoints, desired positions, velocities, or efforts (torque/force) that the hardware should attempt to achieve. Command interfaces are important for motors and are defined based on the type of control you need to exert on a hardware component. Typical command interfaces include:
Position Commands: Used to set a desired position for the motors.
Velocity Commands: Used to set a desired velocity for the motors.
Effort Commands: Used to control motors based on force or torque (e.g. the force to be applied to grasp an object)
These interfaces allow controllers to interact directly with the hardware abstraction layers, issuing commands that the hardware then tries to follow based on its capabilities and feedback mechanisms.
State Interfaces (read from hardware)
State interfaces are used to read the current state of the hardware components. These are essential for monitoring and feedback loops in control systems. State interfaces provide real-time data on various parameters such as position, velocity, effort, and other specific sensor readings. Common types of state interfaces include:
Position State: Provides the current position of actuators or sensors.
Velocity State: Gives the current velocity of moving parts.
Effort State: Indicates the current force or torque being applied by actuators.
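To make this concrete, here is a sketch of what a single joint looks like inside the <ros2_control> tag of the URDF. The joint name matches our robot, but the limit values are illustrative, and the hardware plugin name is the one used by gz_ros2_control (older releases use ign_ros2_control/IgnitionSystem):
<ros2_control name="GazeboSimSystem" type="system">
  <hardware>
    <plugin>gz_ros2_control/GazeboSimSystem</plugin>
  </hardware>
  <joint name="link1_to_link2">
    <command_interface name="position">
      <param name="min">-2.88</param>
      <param name="max">2.88</param>
    </command_interface>
    <state_interface name="position"/>
  </joint>
</ros2_control>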
In practice, to use ros2_control, you define these interfaces in a YAML configuration file (see next section) where you specify each joint or hardware component along with its corresponding command and state interfaces. This configuration is then loaded and managed by a controller manager, which handles the life cycle and updates of each controller in real-time.
This framework allows developers to focus more on the higher-level control strategies rather than the specifics of each hardware component, promoting reusability and scalability in robotics applications.
Create a YAML Configuration File for the Controller Manager
In this section, we will create a YAML configuration file that defines the controllers and their properties for our robotic arm. The controller manager in ROS 2 Control uses this file to load and configure the appropriate controllers during runtime.
The YAML file allows us to specify the joint controllers, their types, and associated parameters. By creating a well-structured configuration file, we can easily manage and control the behavior of our robotic arm in the Gazebo simulation environment.
We will walk through the process of creating the YAML file step by step, explaining each part and its significance. By the end of this section, you will have a complete configuration file ready to be used with the controller manager in your ROS 2 Control setup.
cd ~/ros2_ws/src/mycobot_ros2/mycobot_gazebo/config
# controller_manager provides the necessary infrastructure to manage multiple controllers efficiently and robustly.
controller_manager:
ros__parameters:
update_rate: 10 # update_rate specifies how often (in Hz) the controllers should be updated.
# The JointTrajectoryController allows you to send joint trajectory commands to a group
# of joints on a robot. These commands specify the desired positions for each joint.
arm_controller:
type: joint_trajectory_controller/JointTrajectoryController
grip_controller:
type: joint_trajectory_controller/JointTrajectoryController
# Responsible for publishing the current state of the robot's joints to the /joint_states
# ROS 2 topic
joint_state_broadcaster:
type: joint_state_broadcaster/JointStateBroadcaster
# Define the parameters for each controller
arm_controller:
ros__parameters:
joints:
- link1_to_link2
- link2_to_link3
- link3_to_link4
- link4_to_link5
- link5_to_link6
- link6_to_link6flange
# The controller will expect position commands as input for each of these joints.
command_interfaces:
- position
    # Tells the controller that it should expect to receive position data as the state
    # feedback from the hardware interface.
state_interfaces:
- position
    # When set to true, the controller will not use any feedback from the system
    # (e.g., joint positions, velocities, efforts) to compute the control commands.
open_loop_control: true
# When set to true, it allows the controller to integrate the trajectory goals it receives.
# This means that if the goal trajectory only specifies positions, the controller will
# numerically integrate the positions to compute the velocities and accelerations required
# to follow the trajectory.
allow_integration_in_goal_trajectories: true
grip_controller:
ros__parameters:
joints:
- gripper_controller
command_interfaces:
- position
state_interfaces:
- position
open_loop_control: true
allow_integration_in_goal_trajectories: true
Here’s a brief explanation of the different parts of the YAML file:
1. controller_manager: This section defines the configuration for the controller manager, which is responsible for managing multiple controllers in the system.
2. arm_controller and grip_controller: These sections define the configuration for the arm and grip controllers, respectively. Both controllers are of type joint_trajectory_controller/JointTrajectoryController, which allows sending joint trajectory commands to a group of joints.
3. joint_state_broadcaster: This section defines the configuration for the joint state broadcaster, which is responsible for publishing the current state of the robot’s joints to the /joint_states ROS 2 topic. It is of type joint_state_broadcaster/JointStateBroadcaster.
4. ros__parameters under each controller:
joints: Specifies the list of joints that the controller should control. In this case, the arm controller controls joints from `link1_to_link2` to `link6_to_link6flange`, while the grip controller controls the `gripper_controller` joint.
command_interfaces: Specifies the type of command interface the controller expects. In this case, both controllers expect position commands.
state_interfaces: Specifies the type of state feedback the controller expects from the hardware interface. Here, both controllers expect position feedback.
open_loop_control: When set to true, the controller will not use any feedback from the system (e.g., joint positions, velocities, efforts) to compute the control commands. It will solely rely on the provided commands.
allow_integration_in_goal_trajectories: When set to true, the controller will integrate the trajectory goals it receives. If the goal trajectory only specifies positions, the controller will numerically integrate the positions to compute the velocities and accelerations required to follow the trajectory.
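For reference, controllers defined in this YAML file are typically brought up at runtime with the spawner executable from the controller_manager package (a launch file usually takes care of this step for you):
ros2 run controller_manager spawner arm_controller
ros2 run controller_manager spawner grip_controller
ros2 run controller_manager spawner joint_state_broadcaster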
Create a Launch File
cd ~/ros2_ws/src/mycobot_ros2/mycobot_gazebo/launch
Take a look at the file. The main body of the robot consists of a base and six arm segments. These segments are joined together in a way that allows them to rotate, much like your own arm can rotate at different points. At the end of the arm, there’s a part called a flange, which is where the gripper is attached.
The gripper has several small parts that work together to allow it to open and close, mimicking the action of fingers grasping an object.
For each part of the robot, the file describes both how it looks (its “visual” properties) and its physical shape for the purpose of detecting collisions with other objects. Many of these descriptions use 3D mesh files to define complex shapes accurately.
The file also includes information about the physical properties of each part, such as its mass and how that mass is distributed. This is important for simulating how the robot would move and interact with objects in the real world.
The joints that allow the robot’s parts to move have defined limits on how far and how fast they can rotate. This ensures that the simulated robot behaves like its real-world counterpart, preventing unrealistic movements.
Finally, the file includes some information about how the robot should appear in the Gazebo simulation environment, specifying colors and materials for different parts.
Now let’s create some more code. This file is one of the files that we imported in the URDF file we just created. Open a terminal window, and type this:
GazeboSimROS2ControlPlugin:
Manages the overall integration between Gazebo and ROS 2 control
Loads the Controller Manager
GazeboSimSystem:
Implements the specific hardware interfaces for joints and sensors
Handles the direct read/write operations with the simulated hardware
Pay careful attention to the remapping of the topics. The controller manager subscribes to the URDF via the /controller_manager/robot_description topic by default. You need to remap this subscription to the /robot_description topic, which is published by the robot state publisher.
GazeboSimROS2ControlPlugin loads the Controller Manager, while GazeboSimSystem implements the specific hardware interface that allows ROS 2 controllers (which are managed by the Controller Manager) to send and receive data from the simulated robotic arm in Gazebo. They work together to provide a complete bridge between Gazebo simulation and ROS 2 control systems.
Create a Launch File
cd ~/ros2_ws/src/mycobot_ros2/mycobot_gazebo/launch
Let’s create a ROS 2 node that can loop through a list of trajectories to simulate the arm moving from the home position to a goal location and then back to home.
Python
Let’s start by creating a script using Python.
cd ~/ros2_ws/src/mycobot_ros2/mycobot_gazebo/scripts
gedit example_joint_trajectory_publisher.py
Add this code and save the file.
#! /usr/bin/env python3
"""
Description:
ROS 2: Executes a sample trajectory for a robotic arm in Gazebo
-------
Publishing Topics (ROS 2):
Desired goal pose of the robotic arm
/arm_controller/joint_trajectory - trajectory_msgs/JointTrajectory
Desired goal pose of the gripper
/grip_controller/joint_trajectory - trajectory_msgs/JointTrajectory
-------
Author: Addison Sears-Collins
Date: April 29, 2024
"""
import rclpy # Python client library for ROS 2
from rclpy.node import Node # Handles the creation of nodes
from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint
from builtin_interfaces.msg import Duration
from std_msgs.msg import Header # Import the Header message
# Define constants
arm_joints = [ 'link1_to_link2',
'link2_to_link3',
'link3_to_link4',
'link4_to_link5',
'link5_to_link6',
'link6_to_link6flange']
gripper_joints = ['gripper_controller']
class ExampleJointTrajectoryPublisherPy(Node):
"""This class executes a sample trajectory for a robotic arm
"""
def __init__(self):
""" Constructor.
"""
# Initialize the class using the constructor
super().__init__('example_joint_trajectory_publisher_py')
# Create the publisher of the desired arm and gripper goal poses
self.arm_pose_publisher = self.create_publisher(JointTrajectory, '/arm_controller/joint_trajectory', 1)
self.gripper_pose_publisher = self.create_publisher(JointTrajectory, '/grip_controller/joint_trajectory', 1)
        self.timer_period = 5.0 # seconds
self.timer = self.create_timer(self.timer_period, self.timer_callback)
self.frame_id = "base_link"
# Desired time from the trajectory start to arrive at the trajectory point.
# Needs to be less than or equal to the self.timer_period above to allow
# the robotic arm to smoothly transition between points.
self.duration_sec = 2
self.duration_nanosec = 0.5 * 1e9 # (seconds * 1e9)
# Set the desired goal poses for the robotic arm.
self.arm_positions = []
self.arm_positions.append([0.0, 0.0, 0.0, 0.0, 0.0, 0.0]) # Home location
self.arm_positions.append([-1.345, -1.23, 0.264, -0.296, 0.389, -1.5]) # Goal location
self.arm_positions.append([-1.345, -1.23, 0.264, -0.296, 0.389, -1.5])
self.arm_positions.append([0.0, 0.0, 0.0, 0.0, 0.0, 0.0]) # Home location
self.gripper_positions = []
self.gripper_positions.append([0.0]) # Open gripper
self.gripper_positions.append([0.0])
self.gripper_positions.append([-0.70]) # Close gripper
self.gripper_positions.append([-0.70])
# Keep track of the current trajectory we are executing
self.index = 0
def timer_callback(self):
"""Set the goal pose for the robotic arm.
"""
# Create new JointTrajectory messages
msg_arm = JointTrajectory()
msg_arm.header = Header()
msg_arm.header.frame_id = self.frame_id
msg_arm.joint_names = arm_joints
msg_gripper = JointTrajectory()
msg_gripper.header = Header()
msg_gripper.header.frame_id = self.frame_id
msg_gripper.joint_names = gripper_joints
# Create JointTrajectoryPoints
point_arm = JointTrajectoryPoint()
point_arm.positions = self.arm_positions[self.index]
point_arm.time_from_start = Duration(sec=int(self.duration_sec), nanosec=int(self.duration_nanosec)) # Time to next position
msg_arm.points.append(point_arm)
self.arm_pose_publisher.publish(msg_arm)
point_gripper = JointTrajectoryPoint()
point_gripper.positions = self.gripper_positions[self.index]
point_gripper.time_from_start = Duration(sec=int(self.duration_sec), nanosec=int(self.duration_nanosec))
msg_gripper.points.append(point_gripper)
self.gripper_pose_publisher.publish(msg_gripper)
# Reset the index
if self.index == len(self.arm_positions) - 1:
self.index = 0
else:
self.index = self.index + 1
def main(args=None):
# Initialize the rclpy library
rclpy.init(args=args)
# Create the node
example_joint_trajectory_publisher_py = ExampleJointTrajectoryPublisherPy()
# Spin the node so the callback function is called.
rclpy.spin(example_joint_trajectory_publisher_py)
# Destroy the node
example_joint_trajectory_publisher_py.destroy_node()
# Shutdown the ROS client library for Python
rclpy.shutdown()
if __name__ == '__main__':
main()
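C++
Now let’s create the same node in C++ (I will assume the file goes in a src folder inside the package).
cd ~/ros2_ws/src/mycobot_ros2/mycobot_gazebo/src
gedit example_joint_trajectory_publisher.cpp
Add this code and save the file.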
/**
* @file example_joint_trajectory_publisher.cpp
* @brief ROS 2: Executes a sample trajectory for a robotic arm in Gazebo
*
* Publishing Topics (ROS 2):
* Desired goal pose of the robotic arm
* /arm_controller/joint_trajectory - trajectory_msgs::msg::JointTrajectory
*
* Desired goal pose of the gripper
* /grip_controller/joint_trajectory - trajectory_msgs::msg::JointTrajectory
*
* @author Addison Sears-Collins
* @date April 29, 2024
*/
#include <chrono>
#include <functional>
#include <memory>
#include <string>
#include "rclcpp/rclcpp.hpp"
#include "trajectory_msgs/msg/joint_trajectory.hpp"
#include "std_msgs/msg/header.hpp"
using namespace std::chrono_literals;
// Define constants
const std::vector<std::string> arm_joints = {
"link1_to_link2",
"link2_to_link3",
"link3_to_link4",
"link4_to_link5",
"link5_to_link6",
"link6_to_link6flange"
};
const std::vector<std::string> gripper_joints = {
"gripper_controller"
};
class ExampleJointTrajectoryPublisherCpp : public rclcpp::Node
{
public:
ExampleJointTrajectoryPublisherCpp()
: Node("example_joint_trajectory_publisher_cpp")
{
// Create the publisher of the desired arm and gripper goal poses
arm_pose_publisher_ = create_publisher<trajectory_msgs::msg::JointTrajectory>("/arm_controller/joint_trajectory", 1);
gripper_pose_publisher_ = create_publisher<trajectory_msgs::msg::JointTrajectory>("/grip_controller/joint_trajectory", 1);
// Create a timer to periodically call the timerCallback function
timer_ = create_wall_timer(5s, std::bind(&ExampleJointTrajectoryPublisherCpp::timerCallback, this));
frame_id_ = "base_link";
// Desired time from the trajectory start to arrive at the trajectory point.
// Needs to be less than or equal to the timer period above to allow
// the robotic arm to smoothly transition between points.
duration_sec_ = 2;
duration_nanosec_ = 0.5 * 1e9; // (seconds * 1e9)
// Set the desired goal poses for the robotic arm.
arm_positions_ = {
{0.0, 0.0, 0.0, 0.0, 0.0, 0.0}, // Home location
{-1.345, -1.23, 0.264, -0.296, 0.389, -1.5}, // Goal location
{-1.345, -1.23, 0.264, -0.296, 0.389, -1.5},
{0.0, 0.0, 0.0, 0.0, 0.0, 0.0} // Home location
};
gripper_positions_ = {
{0.0}, // Open gripper
{0.0},
{-0.70}, // Close gripper
{-0.70}
};
// Keep track of the current trajectory we are executing
index_ = 0;
}
private:
void timerCallback()
{
// Create new JointTrajectory messages for arm and gripper
auto msg_arm = trajectory_msgs::msg::JointTrajectory();
msg_arm.header.frame_id = frame_id_;
msg_arm.joint_names = arm_joints;
auto msg_gripper = trajectory_msgs::msg::JointTrajectory();
msg_gripper.header.frame_id = frame_id_;
msg_gripper.joint_names = gripper_joints;
// Create JointTrajectoryPoints for arm and gripper
auto point_arm = trajectory_msgs::msg::JointTrajectoryPoint();
point_arm.positions = arm_positions_[index_];
point_arm.time_from_start = rclcpp::Duration(duration_sec_, duration_nanosec_);
msg_arm.points.push_back(point_arm);
arm_pose_publisher_->publish(msg_arm);
auto point_gripper = trajectory_msgs::msg::JointTrajectoryPoint();
point_gripper.positions = gripper_positions_[index_];
point_gripper.time_from_start = rclcpp::Duration(duration_sec_, duration_nanosec_);
msg_gripper.points.push_back(point_gripper);
gripper_pose_publisher_->publish(msg_gripper);
// Reset the index
if (index_ == arm_positions_.size() - 1) {
index_ = 0;
} else {
index_++;
}
}
// Publishers for arm and gripper joint trajectories
rclcpp::Publisher<trajectory_msgs::msg::JointTrajectory>::SharedPtr arm_pose_publisher_;
rclcpp::Publisher<trajectory_msgs::msg::JointTrajectory>::SharedPtr gripper_pose_publisher_;
// Timer for periodic callback
rclcpp::TimerBase::SharedPtr timer_;
// Frame ID for the joint trajectories
std::string frame_id_;
// Duration for each trajectory point
int duration_sec_;
int duration_nanosec_;
// Desired goal poses for the robotic arm and gripper
std::vector<std::vector<double>> arm_positions_;
std::vector<std::vector<double>> gripper_positions_;
// Index to keep track of the current trajectory point
size_t index_;
};
int main(int argc, char * argv[])
{
// Initialize the ROS 2 client library
rclcpp::init(argc, argv);
// Create an instance of the ExampleJointTrajectoryPublisherCpp node
auto node = std::make_shared<ExampleJointTrajectoryPublisherCpp>();
// Spin the node to execute the callbacks
rclcpp::spin(node);
// Shutdown the ROS 2 client library
rclcpp::shutdown();
return 0;
}
Update your CMakeLists.txt file and package.xml file.
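As a sketch, the CMakeLists.txt additions might look something like this (assuming the C++ file lives in src, the Python script in scripts, and that rclcpp, trajectory_msgs, and std_msgs are already declared as dependencies with find_package):
add_executable(example_joint_trajectory_publisher src/example_joint_trajectory_publisher.cpp)
ament_target_dependencies(example_joint_trajectory_publisher rclcpp trajectory_msgs std_msgs)
install(TARGETS
  example_joint_trajectory_publisher
  DESTINATION lib/${PROJECT_NAME})
install(PROGRAMS
  scripts/example_joint_trajectory_publisher.py
  DESTINATION lib/${PROJECT_NAME})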
In this tutorial, I will guide you through the process of simulating and performing basic control of a robotic arm in Gazebo. By the end of this tutorial, you will be able to build this:
Gazebo is a robotics simulator that enables the testing and development of robots in a virtual environment. It supports a wide range of robots and integrates seamlessly with ROS 2, facilitating the transition from simulation to real-world application. This makes Gazebo an essential tool for roboticists aiming to prototype and refine algorithms efficiently.
Before we begin, I should advise you that Gazebo has been going through a lot of changes over the last several years. These changes are discussed here. They have changed the name several times and have not fully ported all the ROS plugins from the old classic Gazebo to the new Gazebo simulation engine (some folks still call the new Gazebo, “Ignition”, although it is no longer called Ignition).
For this reason, I have split this tutorial into two sections: Gazebo (new version) and Gazebo Classic (old version). I will show you how to launch and perform basic control of a robotic arm using both Gazebo versions.
If you are using a virtual machine on Ubuntu, I recommend you use Oracle VirtualBox instead of VMWare. I had a lot of issues on VMWare that prevented me from running Gazebo.
I am using ROS 2 Iron, the most recent version at the time of writing this tutorial, but you are welcome to use other versions of ROS 2 if you prefer.
I am using ROS 2 Iron, but regardless of the ROS 2 version you are using, you will need to open a terminal window, and type this command to install the package:
sudo apt-get install ros-${ROS_DISTRO}-ros-gz
Type Y and press Enter to install the package.
The command above installs ros_gz and the correct version of Gazebo for your ROS 2 version.
Now install Numpy, a scientific computing library for Python.
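pip3 install numpy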
Now let’s create our world. This first world we will create is an empty world.
cd worlds
gedit empty.world
Add this code.
<?xml version="1.0" ?>
<sdf version="1.6">
<world name="default">
<!-- Plugin for simulating physics -->
<plugin
filename="gz-sim-physics-system"
name="gz::sim::systems::Physics">
</plugin>
<!-- Plugin for handling user commands -->
<plugin
filename="gz-sim-user-commands-system"
name="gz::sim::systems::UserCommands">
</plugin>
<!-- Plugin for broadcasting scene updates -->
<plugin
filename="gz-sim-scene-broadcaster-system"
name="gz::sim::systems::SceneBroadcaster">
</plugin>
<!-- To add realistic gravity, do: 0.0 0.0 -9.8, otherwise do 0.0 0.0 0.0 -->
<gravity>0.0 0.0 -9.8</gravity>
<!-- Include a model of the Sun from an external URI -->
<include>
<uri>
https://fuel.gazebosim.org/1.0/OpenRobotics/models/Sun
</uri>
</include>
<!-- Include a model of the Ground Plane from an external URI -->
<include>
<uri>
https://fuel.gazebosim.org/1.0/OpenRobotics/models/Ground Plane
</uri>
</include>
<!-- Define scene properties -->
<scene>
<shadows>false</shadows>
</scene>
</world>
</sdf>
Save the file, and close it.
This file describes a simulated world using the Simulation Description Format (SDF) version 1.6. It defines a world named “default”.
The world includes three important plugins:
The physics plugin simulates how objects interact and move in the world.
The user commands plugin allows people to control things in the simulation.
The scene broadcaster plugin sends out updates about what’s happening in the world.
Gravity is set to mimic Earth’s gravity, pulling objects downward.
Two models are brought into the world from external sources:
Sun, which provides light
Ground Plane, which gives a flat surface for objects to rest on.
The scene settings specify that shadows should not be rendered in this world.
This file serves as a basic template for creating a simulated environment, providing the fundamental elements needed for a functional simulation.
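If you want to sanity-check the world file on its own before writing any launch files, one way is to launch it directly with the ros_gz_sim package we installed earlier (run this from the worlds folder):
ros2 launch ros_gz_sim gz_sim.launch.py gz_args:=empty.world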
This code describes a simulated world using the Simulation Description Format (SDF) version 1.6. It defines a detailed indoor residential environment. The world is named ‘default’ and includes plugins for physics simulation, user commands, and scene broadcasting. Gravity is set to Earth-like conditions.
The scene properties define ambient and background lighting, with shadows, grid, and origin visual turned off. The world is populated with numerous models representing typical household items and furniture. These include structural elements like walls, windows, doors, and flooring, as well as furniture such as beds, chairs, tables, and sofas.
The environment also includes kitchen appliances, electronic devices, decorative items, and various small objects to create a realistic home setting. Each model is positioned precisely within the world using coordinate systems. Many models are marked as static, meaning they won’t move during the simulation.
Lighting is carefully set up with multiple light sources, including a sun for overall illumination and various point and spot lights to simulate indoor lighting. These lights are placed strategically to create a realistic lighting environment, with some casting shadows and others providing ambient illumination.
Create a models folder. These models are physical objects that will exist inside your house world.
cd ..
mkdir models
cd models
Make sure you put these models inside your models folder.
Let’s walk through one of those models. We will take a look at the refrigerator model.
The refrigerator model is defined using the Simulation Description Format (SDF) version 1.6. Here’s a breakdown of its structure:
The model is named “aws_robomaker_residential_Refrigerator_01”. It contains a single link, which represents the main body of the refrigerator.
The link has three main components:
Inertial properties: A mass of 1 unit is specified, which affects how the object behaves in physical simulations.
Collision geometry: This defines the shape used for collision detection. It uses a mesh file named “aws_Refrigerator_01_collision.DAE” located in the model’s meshes directory. This mesh is a simplified version of the refrigerator’s shape for efficient collision calculations.
Visual geometry: This defines how the refrigerator looks in the simulation. It uses a separate mesh file named “aws_Refrigerator_01_visual.DAE”. This mesh is more detailed than the collision mesh to provide a realistic appearance.
Both the collision and visual meshes are scaled to 1:1:1, meaning they use their original size.
The visual component also includes a metadata tag specifying it belongs to layer 1, which is used for rendering or interaction purposes in the simulation environment.
Finally, the model is set to static, meaning it won’t move during the simulation. This is appropriate for a large appliance like a refrigerator that typically stays in one place.
This structure allows the simulation to efficiently handle both the physical interactions and visual representation of the refrigerator in the Gazebo virtual environment.
Oh and one other thing I should mention. DAE (Digital Asset Exchange) files, also known as COLLADA (COLLAborative Design Activity) files, can be created by various 3D modeling and animation software.
The DAE format is widely used in game development, virtual reality, augmented reality, and simulation environments because it’s an open standard that can store 3D assets along with their associated metadata.
For the specific refrigerator model mentioned in the SDF file, it was likely created using a 3D modeling software as part of the AWS RoboMaker residential models set. The modelers would have created both a detailed visual mesh and a simplified collision mesh, exporting each as separate DAE files for use in the simulation environment.
The model.sdf and model.config files are typically created manually using a basic text editor.
Create a URDF File
Now let’s create our URDF file. A URDF (Unified Robot Description Format) file is an XML format file used to describe the physical configuration and properties of a robot in a structured and standard way.
This line specifies that we’re describing a robot named “mycobot_280”, and it uses the xacro namespace, which allows for more dynamic URDF creation.
World Link
<link name="world"/>
This defines a “world” link, which represents the fixed world frame.
If this robotic arm was attached to a humanoid robot, the “world” link would likely be replaced by a link representing the part of the humanoid robot to which the arm is attached, such as a “torso_link”.
In this case, “torso_link” would be the parent link to which the arm is attached. The xyz values in the joint would define where on the torso the arm is mounted, and the rpy (roll, pitch, yaw) values would define its orientation relative to the torso.
This change would effectively integrate the robotic arm into the larger structure of the humanoid robot, allowing for coordinated movement between the arm and the rest of the robot’s body.
This means the joint mimics another joint’s movement, useful for synchronized gripper finger movement.
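In the URDF, a mimic joint looks something like the sketch below (the multiplier and offset values are illustrative):
<joint name="gripper_base_to_gripper_left2" type="revolute">
  <!-- parent, child, axis, and limit tags omitted for brevity -->
  <mimic joint="gripper_controller" multiplier="1.0" offset="0.0"/>
</joint>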
Gazebo Plugins
You will notice I added plugins at the end of the URDF File. Here is a list of plugins that can be added to our URDF file to create extra functionality.
The two plugins I added are the JointStatePublisher and the JointPositionController.
The JointStatePublisher publishes the state of all joints in a robot to the “/joint_states” topic, including positions (radians or meters), velocities (radians per second or meters per second), and efforts (Nm or N) as a sensor_msgs/JointState message.
The JointPositionController subscribes to target joint angles (i.e. positions) as a std_msgs/Float64 message (i.e. a floating-point number like 0.36).
What Would Happen on a Real Robot?
In a real robotic arm like the myCobot 280 or Kinova Gen3 lite, you wouldn’t need to add Gazebo plugins to your URDF if you’re not using Gazebo for simulation. Instead, you would interface directly with the real hardware. Here’s how the system would typically work:
Robotic Arm Physical Components (myCobot 280 robotic arm with Arduino is the example here)
The robotic arm has servo motors for each joint (i.e. the movable pieces of the robotic arm). For example, the myCobot 280 uses custom servos, while an arm like Kinova Gen3 lite uses proprietary actuators.
Each servo motor has a built-in encoder that measures the joint’s position (e.g. in ticks, degrees, or radians).
The servo positions are controlled by firmware running on a microcontroller (e.g. Arduino).
The robotic arm also has a physical motor controller (i.e. an electronic circuit) that controls the voltage and current supplied to move the motor to reach a target angular position based on the commands it receives from the Arduino.
A USB or Ethernet connection links the microcontroller (e.g. Arduino) to your computer.
Microcontroller (e.g. Arduino):
Receives commands from the computer and sends back to the computer the “state” or position (typically in radians) of each joint.
ROS 2 Control Hardware Interface:
ROS 2 Control is a software component that bridges between ROS 2 and your myCobot 280’s hardware (via the Arduino).
You would create a custom hardware interface package for your specific arm.
This ROS 2 Control hardware interface would communicate with the Arduino over USB, sending commands to the Arduino and receiving joint states from the Arduino:
Defines command interfaces for each joint (for sending new position commands).
Defines state interfaces for each joint (for reading positions).
This hardware interface acts as an important link between the physical robot (controlled by the Arduino) and the ROS 2 control system. It abstracts away the hardware-specific details, allowing the rest of the ROS 2 system to work with standardized interfaces.
Controller Manager:
Moving further away from the hardware and up the software chain, we have the “Controller Manager”.
Provided by the ros2_control package.
Loads and manages the controllers specified in your configuration (e.g., a joint trajectory controller for coordinated joint motion).
Handles the lifecycle of these controllers (configure, activate, deactivate).
Manages the flow of data between the hardware interface and the controllers.
Joint State Broadcaster:
Part of ros2_controllers.
Not actually a controller, but a specialized component that reads joint states from the hardware interface.
The Joint State Broadcaster reads joint states from your custom hardware interface and reports them on the /joint_states topic for components like MoveIt 2 to understand the current state of the robotic arm.
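Once the robot is running, you can watch this feedback yourself:
ros2 topic echo /joint_states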
Joint Trajectory Controller:
Also from ros2_controllers.
JointTrajectoryController is one of many available ROS 2 controllers. This software controller receives desired trajectories for the arm joints.
It generates appropriate commands and sends them to the physical motor controllers (i.e. Arduino) through the ROS 2 control hardware interface.
Responsible for:
Receiving desired joint trajectories (e.g., from MoveIt 2 or a custom node) on a topic like /joint_trajectory_controller/joint_trajectory as a trajectory_msgs/msg/JointTrajectory message. Each message contains a series of waypoints, each specifying joint positions (and optionally velocities and accelerations) at specific time points.
Interpolating between trajectory points to create smooth motion.
Sending position commands to the hardware interface at a high frequency (typically 100-1000 Hz).
These commands are then sent to the Arduino, which controls the actual servo motors.
MoveIt 2:
A motion planning framework that integrates with ROS 2 Control.
Uses the published joint states to maintain an internal model of the robot’s current state.
For the myCobot 280, it would:
Receive a goal pose for the gripper (e.g., from a user interface or task planner).
Use its knowledge of the robot’s kinematics to plan a collision-free path to the goal pose.
Generate a joint trajectory to achieve this path.
Send this trajectory to the Joint Trajectory Controller for execution.
Summary of the Workflow for an Example Pick and Place System Using ROS 2 Control and MoveIt 2
Here is how it would all work for a real robotic arm with perhaps a depth camera (like the Intel RealSense) involved to see objects in the world.
Hardware Interface Operation:
Polls Arduino via USB (e.g., 100Hz), requesting joint positions.
Receives raw data (degrees), converts to radians, updates internal state.
Topic: N/A (internal communication)
Joint State Broadcasting:
Joint State Broadcaster (50Hz):
Reads latest joint positions from Hardware Interface
Publishes to topic: /joint_states (message type: sensor_msgs/msg/JointState)
Motion Planning with MoveIt 2:
When new goal set (e.g., “move gripper to object detected by camera”):
(Optionally) Processes /camera/depth/color/points for obstacle avoidance
Plans collision-free path using robot state and point cloud data
Generates trajectory (series of time-stamped joint positions)
Publishes to topic: /joint_trajectory_controller/joint_trajectory
Message type: trajectory_msgs/msg/JointTrajectory
Joint Trajectory Controller Execution:
Subscribes to /joint_trajectory_controller/joint_trajectory
Interpolates between trajectory points
High-frequency loop (100Hz):
Calculates desired joint positions
Sends joint position commands to Hardware Interface
Hardware Interface Command Translation:
Receives commands from Joint Trajectory Controller
Converts radians to degrees
Packages commands for Arduino
Sends via USB
Arduino Execution:
Listens for USB commands
Parses data, extracts joint positions
Sets servo motor positions
Gripper Control:
Separate node for gripper control subscribes to custom topic
Topic: /gripper_command
Message type: std_msgs/msg/Float32 (0.0 for fully open, 1.0 for fully closed)
Publishes commands to Hardware Interface for gripper servo
Visual Servoing (optional):
Custom node processes /camera/color/image_raw
Detects objects or fiducial markers
Publishes gripper goals to MoveIt 2
Topic: /move_group/goal
Message type: moveit_msgs/msg/MoveGroupActionGoal
This setup integrates the depth camera data into the control loop, allowing for:
Object detection and localization
Dynamic obstacle avoidance
Visual servoing for precise gripper positioning
The system maintains the abstraction of the myCobot 280’s Arduino control, while adding the capability to react to its environment using the RealSense camera. This makes it suitable for tasks like pick-and-place operations, where the robot can locate objects with the camera and plan motions to grasp them with the gripper.
The first part of the file deals with a topic called joint_states, which is published by the JointState plugin. This means that data about the joint states of the robot is being sent from Gazebo to ROS.
The rest of the file lists topics that the JointPositionController plugin subscribes to. This means that these topics are used to send commands to specific joints of the robot from ROS to Gazebo.
Create the Launch File
Let’s create a launch file to spawn both our world and our robotic arm.
Go to the CMakeLists.txt, and make sure it looks like this:
cd ~/ros2_ws/src/mycobot_ros2/mycobot_gazebo/
gedit CMakeLists.txt
You can comment out or remove any folders or scripts in CMakeLists.txt that we have not yet created in this tutorial (e.g. test_set_joint_position_publisher.py). Be sure to do this before running “colcon build” below.
Here is the output using the house.world file. You might see a pop-up asking if you would like to Force Quit Gazebo. Just wait and be patient for everything to load (house.world is full of SDF models):
To see a list of active topics, type:
ign topic -l
To see the options, you can type:
ign topic --help
Gazebo often doesn’t close cleanly, so if you have trouble loading Gazebo, reboot your computer.
Move the Robotic Arm Manually
I could not get the JointTrajectoryController to work in this new Gazebo. As I mentioned earlier, things are moving fast over at Gazebo, and not all the plugins have been transitioned over. However, I was able to get the JointPositionController to work and make the arm move.
To send joint angles to the robot using ROS 2, type this in a terminal window:
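ros2 topic pub --once /model/mycobot_280/joint/link1_to_link2/cmd_pos std_msgs/msg/Float64 "{data: 0.36}"
The topic name follows the pattern /model/mycobot_280/joint/<joint name>/cmd_pos (the full list appears in the script below), so you can repeat this command for any of the other joints.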
Let’s create a ROS 2 node that can loop through a list of joint positions to simulate the robotic arm moving from the home position to a goal location and then back to home, repeatedly.
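cd ~/ros2_ws/src/mycobot_ros2/mycobot_gazebo/scripts
gedit test_set_joint_position_publisher.py
Add this code and save the file.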
#! /usr/bin/env python3
"""
Description:
ROS 2: Executes a sample trajectory for a robotic arm in Gazebo
-------
Publishing Topics (ROS 2):
Desired goal positions for joints on a robotic arm
/model/mycobot_280/joint/link1_to_link2/cmd_pos - std_msgs/Float64
/model/mycobot_280/joint/link2_to_link3/cmd_pos - std_msgs/Float64
/model/mycobot_280/joint/link3_to_link4/cmd_pos - std_msgs/Float64
/model/mycobot_280/joint/link4_to_link5/cmd_pos - std_msgs/Float64
/model/mycobot_280/joint/link5_to_link6/cmd_pos - std_msgs/Float64
/model/mycobot_280/joint/link6_to_link6flange/cmd_pos - std_msgs/Float64
/model/mycobot_280/joint/gripper_controller/cmd_pos - std_msgs/Float64
/model/mycobot_280/joint/gripper_base_to_gripper_left2/cmd_pos - std_msgs/Float64
/model/mycobot_280/joint/gripper_left3_to_gripper_left1/cmd_pos - std_msgs/Float64
/model/mycobot_280/joint/gripper_base_to_gripper_right3/cmd_pos - std_msgs/Float64
/model/mycobot_280/joint/gripper_base_to_gripper_right2/cmd_pos - std_msgs/Float64
/model/mycobot_280/joint/gripper_right3_to_gripper_right1/cmd_pos - std_msgs/Float64
-------
Author: Addison Sears-Collins
Date: April 18, 2024
"""
import numpy as np
import rclpy # Python client library for ROS 2
from rclpy.node import Node # Handles the creation of nodes
from std_msgs.msg import Float64 # Import the Float64 message
# Define constants
names_of_joints = [ 'link1_to_link2',
'link2_to_link3',
'link3_to_link4',
'link4_to_link5',
'link5_to_link6',
'link6_to_link6flange',
'gripper_controller',
'gripper_base_to_gripper_left2',
'gripper_left3_to_gripper_left1',
'gripper_base_to_gripper_right3',
'gripper_base_to_gripper_right2',
'gripper_right3_to_gripper_right1']
class BasicJointPositionPublisher(Node):
"""This class executes a sample trajectory for a robotic arm
"""
def __init__(self):
""" Constructor.
"""
# Initialize the class using the constructor
super().__init__('basic_joint_position_publisher')
# Create a publisher for each joint
self.position_publishers = [
self.create_publisher(Float64, f'/model/mycobot_280/joint/{name}/cmd_pos', 1)
for name in names_of_joints
]
self.timer_period = 0.05 # seconds
self.timer = self.create_timer(self.timer_period, self.timer_callback)
# Starting position and goal position for the robotic arm joints
self.start_position = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
self.end_position = [-1.345, -1.23, 0.264, -0.296, 0.389, -1.5, -0.7, -0.7, 0.7, 0.7, 0.7, -0.7]
# Number of steps to interpolate between positions
self.num_steps = 50
# Set the desired goal poses for the robotic arm.
self.positions = self.generate_positions(self.start_position, self.end_position)
# Keep track of the current trajectory we are executing
self.index = 0
# Indicate the direction of movement in the list of goal positions.
self.forward = True
def generate_positions(self, start_position, end_position):
"""
Generates positions along a path from start to end positions.
Args:
start_position (list): The starting position of the robotic arm.
end_position (list): The ending position of the robotic arm.
Returns:
list: A complete list of positions including all intermediate steps.
"""
# Example path including start and end, could be expanded to more waypoints
path_positions = [start_position, end_position]
all_positions = []
for i in range(len(path_positions) - 1):
interpolated = self.interpolate_positions(path_positions[i], path_positions[i + 1])
all_positions.extend(interpolated[:-1]) # Exclude the last to avoid duplicates
all_positions.append(path_positions[-1]) # Ensure the last position is included
return all_positions
def interpolate_positions(self, start, end):
"""
Linearly interpolates between two positions.
Args:
start (list): The starting position for interpolation.
end (list): The ending position for interpolation.
Returns:
list: A list of positions including the start, interpolated, and end positions.
"""
interpolated_positions = [start] # Initialize with the start position
step_vector = (np.array(end) - np.array(start)) / (self.num_steps + 1) # Calculate step vector
for step in range(1, self.num_steps + 1):
interpolated_position = np.array(start) + step * step_vector # Compute each interpolated position
interpolated_positions.append(interpolated_position.tolist()) # Append to the list
interpolated_positions.append(end) # Append the end position
return interpolated_positions
def timer_callback(self):
"""Set the goal pose for the robotic arm.
"""
# Publish the current position for each joint
for pub, pos in zip(self.position_publishers, self.positions[self.index]):
msg = Float64()
msg.data = pos
pub.publish(msg)
# Update the trajectory index
if self.forward:
if self.index < len(self.positions) - 1:
self.index = self.index + 1
else:
self.forward = False
else:
if self.index > 0:
self.index = self.index - 1
else:
self.forward = True
def main(args=None):
# Initialize the rclpy library
rclpy.init(args=args)
# Create the node
basic_joint_position_publisher = BasicJointPositionPublisher()
# Spin the node so the callback function is called.
rclpy.spin(basic_joint_position_publisher)
# Destroy the node
basic_joint_position_publisher.destroy_node()
# Shutdown the ROS client library for Python
rclpy.shutdown()
if __name__ == '__main__':
main()
Update your CMakeLists.txt to include this new script.
Now launch the robot. Wait for everything to come up, including RViz.
Run the basic joint position publisher to simulate movement for your robotic arm:
ros2 run mycobot_gazebo test_set_joint_position_publisher.py
Your arm will move from a home position to a goal position over and over again. It won’t be the prettiest movement, but we will fix that in a future tutorial.
Gazebo (classic version)
Now let’s take a look at how to simulate and move a robotic arm using the classic version of Gazebo (which will reach end of life in 2025).
Install gazebo_ros_pkgs
Open a new terminal window, and install the packages that will enable you to use ROS 2 to interface with Gazebo Classic.
The two Gazebo plugins defined in the xacro file that make all this work are libgazebo_ros_joint_state_publisher.so (publishes the joint states from Gazebo to ROS 2) and libgazebo_ros_joint_pose_trajectory.so (subscribes to the desired goal poses for the arm that are sent from ROS 2 to Gazebo). That is all you need for basic arm control.
To close Gazebo, press CTRL + C on your keyboard.
Move the Robotic Arm Using a Script
Let’s create a ROS 2 node that can loop through a list of trajectories to simulate the arm moving from the home position to a goal location and then back to home.
cd ~/ros2_ws/src/mycobot_ros2/mycobot_gazebo/scripts
Run the basic joint trajectory publisher to simulate movement for your robotic arm:
ros2 run mycobot_gazebo test_set_joint_trajectory_publisher.py
The output should look like the animated image at the beginning of this blog post.
Note that even though we have the “mimic” tag in the URDF file for the gripper, we still have to explicitly define all non-fixed joints in the URDF Gazebo plugins section in order for everything to work properly in Gazebo.