How to Make an Autonomous Wheeled Robot Using ROS

In this tutorial, we will build an autonomous, obstacle-avoiding wheeled robot from scratch using ROS (Robot Operating System), the popular robotics development platform. I decided to write this tutorial because many introductory books and tutorials on ROS, including the official ROS tutorials, have you learn ROS by working with robots in simulation, but you never learn how to apply what you have learned to an actual physical robot that senses, thinks, and acts in the real world.

obstacle_avoiding_robot

Our goal is to build the cheapest, most complete robot we can possibly build using ROS.

  • We will use low-cost components to build the robot “body” (I don’t want to spend hundreds of dollars for a robot kit).
  • The “brain” of the robot will be an Arduino. Arduino is a popular microcontroller (think of it as a small computer) for building electronics projects. 
  • The robot’s “nervous system” — the communication lines that enable the brain to transmit signals and sensory information to and from different parts of its body — will be some inexpensive jumper wires and small electronic components. 

All of the parts you need are listed below in the “You Will Need” section. 

There are a lot of steps in this tutorial. Have fun, be patient, and be persistent. Don’t give up! If something doesn’t work the first time around (as is normally the case in robotics), try again. You will learn a lot more by fighting through to the end of this project. Stay relentless!

By the end of this tutorial, you will have rock-solid confidence and will know how to use ROS to design and develop a robot that moves around in the real world (not just on your computer screen).

Without further ado, let’s get started!

Table of Contents

  • Prerequisites
  • You Will Need
  • Assemble the “Body” of the Robot
  • Assemble the “Nervous System” of the Robot
  • Connect the HC-SR05 Ultrasonic Sensor (the “Eyes”)
  • Connect the HC-05 Wireless Bluetooth RF Transceiver (the “Mouth”)
  • Simulate the 3D Model of the Robot Using URDF
  • Program the Arduino (i.e. the “Brain” of the Robot)
  • Find the MAC Address of Your Bluetooth Module
  • Connect the Bluetooth Module to Ubuntu Linux
  • Create a ROS Package
  • Create a ROS Publisher Node
  • Create a ROS Subscriber Node
  • Create a ROS Launch File
  • Grand Finale – Launch Your Autonomous Wheeled Robot

Prerequisites

  • You have ROS running on Ubuntu Linux
    • I’m running my Ubuntu Linux inside a virtual machine on Windows 10. If you have macOS or Linux, that will work just fine. Just make sure you have ROS installed.
  • You have the Arduino IDE (Integrated Development Environment) installed, either on your PC (Windows, macOS, or Linux) or within your VirtualBox virtual machine.
  • If you have experience building a basic wheeled robot using either Arduino or Raspberry Pi, you will find this tutorial easier to follow. If you don’t have that experience, don’t worry. I’ll explain everything as we go.
  • Also, if you did the Hello World ROS project (to create a basic ROS Publisher and Subscriber node), you will find this tutorial easier to follow.

Return to Table of Contents

You Will Need

Here are the components you will need for this project:

Robot’s Body

1-wheeled-robot-parts

Robot’s Brain

arduino-uno
  • Arduino Uno (Elegoo Uno works just fine and is cheaper than the regular Arduino)

Robot’s Nervous System

jumper-wires

Soldering Equipment

soldering_iron

Soldering is a fundamental skill in robotics. It is the process of joining two metal wires or surfaces together using heat and a filler metal called “solder”. 

External Bluetooth Transmitter and Receiver for Your PC

20-bluetooth-module

That’s it! Once you have purchased all the parts above, continue to the next section to build the robot’s body.

Return to Table of Contents

Assemble the “Body” of the Robot

Let’s build the body of our robot, step-by-step. 

First, open up your robot car chassis kit. You won’t be needing the little switch or the 4 x 1.5V AA battery pack that comes with the robot car chassis, so you can set those aside.

Follow this video below to assemble the robot’s frame: 

Below are some photos of the assembly of the frame of my robot:

2-secure-both-motors-to-the-robot-body-using-screws
3-secure-both-motors
4-add-the-wheels
5-add-the-wheels
6-add-round-plastic-disc

Once you have assembled the robot’s frame, mount the 4 x 1.5V AA battery holder with switch (the one that you purchased) to the rear of the robot. The rear of the robot is the end with the single roller wheel. We will secure it with a few layers of Scotch permanent mounting tape.

7-add-battery-pack

Since the leads of the 4×1.5V AA battery pack are kind of short, you can extend the length of them by wrapping each lead with a male-to-male jumper wire. If you know how to solder wires together (just YouTube “How to Solder Wires Together” for some great video tutorials), you can solder these jumper wires to your battery pack leads.

8a-battery-pack-on-off-switch
On-Off Switch of the 4×1.5V AA Battery Pack
8-add-battery-pack
See how I have extended the lead wires on my battery pack by using male-to-male jumper wires

Mount the Arduino (mine is inside a protective case) to the top of the battery pack using Scotch permanent mounting tape or Velcro fasteners.

10-mount-the-arduino
11-mount-the-arduino

Mount the 400-point solderless breadboard to the front of the robot. The back of the solderless breadboard has peel-off tape, but I prefer to use Velcro fasteners so that I can remove the solderless breadboard whenever I want to.

9-add-solderless-breadboard

The next thing to do is to connect two male-to-male jumper wires to one of the motors. One jumper wire will thread through the metallic hole on one side of the motor, and the other jumper wire will thread through the hole on the other end of that same motor.

12-wiring-the-motors

Now wire up the other motor the same way. Connect a male-to-male jumper wire to one of the metallic holes on that motor. Thread another wire through the metallic hole on the other side. 

To make sure the jumper wires stick to the metal leads, I recommend you solder them to the leads. Soldering means joining the wire to the metal lead of the motor with molten solder.

13-soldering-the-motors

Soldering sounds complicated if you have never done it before. It might even seem scary working with hot metal. Don’t worry. I felt the same way before I did my first soldering job. Once you have done one though, you will realize that it is a quick process (lasts no more than a few seconds).

If you have never soldered before, you can check out this video tutorial:

You can also check out my video below where I solder some metal pins to an electronic board. All the soldering equipment used in this video below is listed in the “You Will Need” section earlier in this tutorial: 

Return to Table of Contents

Assemble the “Nervous System” of the Robot

Now that the robot has its brain (Arduino mounted on the back of the robot) and a body, it needs a “nervous system,” communication lines that enable the brain to transmit signals to and from different parts of its body. In the context of this project, those communication lines are the wires that we need to connect between the different parts of the robot we’re building.

Connect the L293D to the Solderless Breadboard

First, we need to connect the L293D motor controller. The job of this component is to control both of your motors. You can think of an L293D motor controller as “air traffic control” for moving electrons. 

In order for a motor to move (or for a light bulb to light…heck, for any device that needs moving electrons (i.e. electricity) to operate), it needs electrons to flow through it. If we move electrons through a motor in one direction, the motor will spin in one direction. If we reverse the direction the electrons travel through the motor, it will spin in the other direction. How can we make electrons change directions? That is the function of the L293D motor controller. 

By sending electrons to different combinations of pins of the L293D motor controller, we can make the robot car’s motors go forwards and reverse. You don’t need to know the details of how all this works, but just on a high level know that an L293D motor controller accepts electric signals (i.e. moving electrons) from your Arduino board as well as your batteries (think of batteries and your Arduino as “electron pumps”) and gets them to your motors in a way that causes them to spin either clockwise or counter-clockwise to make the wheels turn.
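
To make that concrete, here is a small illustrative truth table in Python. This is not code you upload to the robot; it just encodes how one channel of the L293D responds to its enable pin (EN) and its two Arduino-driven input pins (IN1/IN2). Which physical direction counts as “forward” depends on how you wire the motor leads.

# Illustrative sketch of the L293D truth table for one motor channel.
def channel_behavior(en, in1, in2):
    if en == 0:
        return "motor off (channel disabled)"
    if in1 == in2:
        return "motor stopped (both inputs equal)"
    return "spins one way" if (in1, in2) == (0, 1) else "spins the other way"

for pins in [(0, 0, 0), (1, 0, 1), (1, 1, 0), (1, 1, 1)]:
    print(pins, "->", channel_behavior(*pins))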

If you want to deep dive into how H-bridges like the L293D motor controller work, check out this article on Wikipedia.

If you want to understand how electricity (moving electrons) works, check out this video, which covers the basics.

Ok, with that little bit of theory out of the way, let’s start building again.

Sink the 16 pins of the L293D motor controller down into the holes of the solderless breadboard so that the controller straddles the gap that runs the length of the breadboard. If this is the first time you have used a solderless breadboard, check out a quick tutorial on how to read a solderless breadboard. There are a lot of good tutorials on YouTube. Here is one I like:

Here is the diagram of the L293D.

L293D-with-motors-1

Put pin 1 (the pin just to the left of the half-circle notch in the L293D) into pin e3 of the solderless breadboard. You’ll have to bend the legs a bit on the L293D to get it to sink down all the way. 

14-add-the-L293D
16a-l293d_bb

With the L293D settled down firmly into your solderless breadboard, let’s hook everything up. We are going to go from top to bottom on one side of the L293D, and then we will go from top to bottom on the other side of the L293D. We will connect all 16 legs of the L293D, one step at a time, starting from Pin 1. 

There are a lot of connections, and you need to get all of them correct in order to get the motors going, so proceed slowly and carefully to make sure you get everything right. No need to hurry.

Here is the Arduino with its numbered pins.

arduino-uno

Here is L293D.

L293D-with-motors-1

And here is the diagram of all the connections we are about to make (sorry the image is so small…just follow the connections I’ve written below):

16b-connect_l293d
zoom-in-connections
arduino_zoom-in

Connect Side 1 (Left Motor) of the L293D

  • Connect Pin 1 of the L293D to Pin 5 of the Arduino. 
    • Pin 1 is the Enable pin of the L293D. It is like a switch that turns the motor ON. 
    • Pin 1 doesn’t make the motor move directly…it just switches that side of the chip ON so that the motor is able to move when it receives signals from pins 3 and 6.
  • Connect Pin 2 of the L293D to Pin 6 of the Arduino.
    • Pin 2 of the L293D receives an input signal from the Arduino board, either HIGH (5 volts) or LOW (0 volts) voltage.
  • Connect Pin 3 of the L293D to one of the leads of Motor A (doesn’t matter which motor, just keep track which one is A and which one is B)
    • Pin 3 of the L293D outputs a signal to Motor A to make it move.
  • Connect Pin 4 of the L293D to the blue Ground power rail of your solderless breadboard (the one labeled with a negative (-) sign).
    • Pin 4 is connected to electric ground (to make sure that the electric charge in the L293D has somewhere to go and dissipate).
  • Connect Pin 5 of the L293D to the blue Ground power rail of your solderless breadboard (the one labeled with a negative (-) sign).
    • Pin 5 is connected to electric ground (to make sure that the electric charge in the L293D has somewhere to go and dissipate).
  • Connect Pin 6 of the L293D to one of the leads of Motor A.
    • Pin 6 of the L293D outputs a signal to Motor A to make it move.
  • Connect Pin 7 of the L293D to Pin 7 of the Arduino.
    • Pin 7 receives an input signal from the Arduino board, either HIGH (5 volts) or LOW (0 volts) voltage.
  • Connect Pin 8 of the L293D to the red Positive power rail of your solderless breadboard (the one labeled with a positive (+) sign).
    • This pin requires at least a 5V input power supply (which will come from your batteries…more on this later) to power the motors.

Connect Side 2 (Right Motor) of the L293D

  • Connect Pin 16 of the L293D to the positive (red) power rail of the breadboard. Then connect the positive (red) power rail to the 5V pin of the Arduino.
    • This pin is the 5V power supply for the L293D itself. It is not the power supply used to power your motors.
  • Connect Pin 15 of the L293D to Pin 10 of the Arduino.
  • Connect Pin 14 of the L293D to one of the leads of Motor B
  • Connect Pin 13 of the L293D to the blue Ground power rail of your solderless breadboard (the one labeled with a negative (-) sign).
  • Connect Pin 12 of the L293D to the blue Ground power rail of your solderless breadboard (the one labeled with a negative (-) sign).
  • Connect Pin 11 of the L293D to one of the leads of Motor B.
  • Connect Pin 10 of the L293D to Pin 9 of the Arduino.
  • Connect Pin 9 of the L293D to Pin 8 of the Arduino.

Connect the Power Rails

Now we need to connect the power rails of your breadboard.

  • Get a jumper wire and connect both blue Ground negative (-) rails together.
  • Connect the black (negative) lead of the 4×1.5V AA battery pack to the blue Ground rail (note there are two AA batteries in the image…you will need 4).
  • Connect the red (positive) lead of the battery pack to the red positive power rail of the solderless breadboard.
  • Connect the blue Ground (negative) rail to the GND pin on the Arduino.
  • Connect the 5V pin of the Arduino to the red (positive) rail of the solderless breadboard.
15-wire-the-L293D

Here is what the final connection should look like:

16c-final_connection_l293d

Test Your Connections

16-wire-the-L293D

Now let’s test our connections.

Plug in your Arduino to the USB port on your PC.

Open up the Arduino IDE.

We are going to write a program that makes the wheels of your robot go forward, backwards, and then stop. Open a new sketch, and type the following code:

/**
* Bruno Santos, 2013
* feiticeir0@whatgeek.com.pt
* Small code to test DC motors - 2x with an L293D Dual H-Bridge Motor Driver
* Free to share
**/

//Testing the DC Motors with
// L293D

//Define Pins
//Motor A
int enableA = 5;
int MotorA1 = 6;
int MotorA2 = 7;
 
//Motor B
int enableB = 8;
int MotorB1 = 9;
int MotorB2 = 10;

void setup() {
  
  Serial.begin (9600);
  //configure pin modes
  pinMode (enableA, OUTPUT);
  pinMode (MotorA1, OUTPUT);
  pinMode (MotorA2, OUTPUT);  
  
  pinMode (enableB, OUTPUT);
  pinMode (MotorB1, OUTPUT);
  pinMode (MotorB2, OUTPUT);  
  
}

void loop() {
  //enabling motor A and B
  Serial.println ("Enabling Motors");
  digitalWrite (enableA, HIGH);
  digitalWrite (enableB, HIGH);
  delay (3000);
  //do something

  Serial.println ("Motion Forward");
  digitalWrite (MotorA1, LOW);
  digitalWrite (MotorA2, HIGH);

  digitalWrite (MotorB1, LOW);
  digitalWrite (MotorB2, HIGH);

  //3s forward
  delay (3000);
  
  Serial.println ("Motion Backwards");
  //reverse
  digitalWrite (MotorA1,HIGH);
  digitalWrite (MotorA2,LOW);  
  
  digitalWrite (MotorB1,HIGH);
  digitalWrite (MotorB2,LOW);  

  //3s backwards
  delay (3000);

  Serial.println ("Stoping motors");
  //stop
  digitalWrite (enableA, LOW);
  digitalWrite (enableB, LOW);
  delay (3000);
}

Before you upload your code to your Arduino, hold your robot in your hand because the wheels are about to move, and you don’t want your robot to rip away from your computer!

You can now upload the code to your Arduino, and turn the 4×1.5V AA battery pack to the ON position.

When you have had enough, upload a blank, new sketch to your Arduino board (this will stop the program).

Right after you upload the code to your board, the first movement your wheels should make is forward. If a wheel is not moving forward on that first segment of the loop, you need to switch the holes that the two leads from that wheel are connected to. In this case, if it is motor A that is not moving like it should, the leads connected to Pin 3 and Pin 6 of the L293D need to switch places.

Return to Table of Contents

Connect the HC-SR05 Ultrasonic Sensor (the “Eyes”)

17-add-ultrasonic-sensor

Now we need to connect the HC-SR05 ultrasonic sensor to the solderless breadboard in order to be able to detect obstacles in the robot’s path. I recommend you sink the ultrasonic sensor down into available holes of your solderless breadboard. You want the ultrasonic sensor to face the front of your robot.

17a-ultrasonic_sensor_wiring
usound-colors-2
arduino-12-13

Here are the connections:

  • VCC on the sensor connects to the positive (red) rail of the solderless breadboard, which is connected to the 5V pin on the Arduino
  • Echo on the sensor connects to Digital Pin 13 on the Arduino
  • Trig (stands for trigger) on the sensor connects to Digital Pin 12 on the Arduino
  • GND (stands for Ground) on the sensor connects to ground on the solderless breadboard (blue negative rail).
18-add-ultrasonic-sensor

Let’s test the ultrasonic sensor.

Plug in your Arduino to the USB port on your laptop computer.

Open the Arduino IDE.

Upload the following sketch to the Arduino to test the ultrasonic sensor.

/**
 *  This program tests the ultrasonic
 *  distance sensor
 * 
 * @author Addison Sears-Collins
 * @version 1.0 2019-05-13
 */
 
/* Give a name to a constant value before
 * the program is compiled. The compiler will 
 * replace references to Trigger and Echo with 
 * 12 and 13, respectively, at compile time.
 * These defined constants don't take up 
 * memory space on the Arduino.
 */
#define Trigger 12
#define Echo 13
 
/*   
 *  This setup code is run only once, when 
 *  the Arduino is supplied with power.
 */
void setup(){
 
  // Set the baud rate to 9600. 9600 means that 
  // the serial port is capable of transferring 
  // a maximum of 9600 bits per second.
  Serial.begin(9600);
 
  // Define each pin as an input or output.
  pinMode(Echo, INPUT);
  pinMode(Trigger, OUTPUT);
}
 
void loop(){
 
  // Make the Trigger LOW (0 volts) 
  // for 2 microseconds
  digitalWrite(Trigger, LOW);
  delayMicroseconds(2);
 
  // Emit high frequency 40kHz sound pulse
  // (i.e. pull the Trigger) 
  // by making Trigger HIGH (5 volts) 
  // for 10 microseconds
  digitalWrite(Trigger, HIGH);
  delayMicroseconds(10);
  digitalWrite(Trigger, LOW); 
 
  // Detect a pulse on the Echo pin (pin 13). 
  // pulseIn() measures the time in 
  // microseconds until the sound pulse
  // returns back to the sensor.
  int distance = pulseIn(Echo, HIGH);
 
  // Speed of sound is:
  // 13511.811023622 inches per second
  // 13511.811023622/10^6 inches per microsecond
  // 0.013511811 inches per microsecond
  // Taking the reciprocal, we have:
  // 74.00932414 microseconds per inch 
  // Below, we convert microseconds to inches by 
  // dividing by 74 and then dividing by 2
  // to account for the roundtrip time.
  distance = distance / 74 / 2;
 
  // Print the distance in inches
  Serial.println(distance);
 
  // Pause for 100 milliseconds
  delay(100);
}
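
As a quick sanity check of the arithmetic in the comments above: if pulseIn() reports a round-trip echo time of 888 microseconds, the distance works out to 888 / 74 / 2 = 6 inches.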

As soon as uploading is finished and with the USB cable still connected to the Arduino, click on the green magnifying glass in the upper right of the IDE to open the Serial Monitor.

24-magnifying-glass-arduino

Make sure you have the following settings:

  • Autoscroll: selected
  • Line ending: No Line ending
  • Baud: 9600 baud

Place any object in front of the sensor and move it back and forth. You should see the distance readings (in inches) on the Serial Monitor change accordingly.

19-test_ultrasonic_sensorJPG

Return to Table of Contents

Connect the HC-05 Wireless Bluetooth RF Transceiver (the “Mouth”)

Now we need to connect the HC-05 Wireless Bluetooth RF Transceiver (i.e. bluetooth module).

20a-bluetooth-connection-diagram
ble-module
  • Connect the VCC pin of the bluetooth module to the red (positive) power rail of your solderless breadboard (the rail connected to the 5V pin of the Arduino). 
    • Note that the bluetooth module can accept an input power supply of 3.6 to 6V, so we could have also connected it to the rail connected to the 6V battery pack (i.e. 1.5V * 4 batteries).
  • Connect GND to the negative (blue) ground power rail of the solderless breadboard.
  • Connect the TXD pin (transmitter) of the bluetooth module to digital pin 2 (this will be the receiver RX) on the Arduino.
  • Connect the RXD pin (receiver) of the bluetooth module to a 1K ohm resistor. 
    • We have to use a resistor because this pin can only handle 3.3V, but the Arduino generates 5V. We don’t want to burn out our bluetooth module!
  • Connect the 1K ohm resistor to digital pin 3 (this will be the transmitter TX) on the Arduino.
  • Connect the RXD pin (receiver) of the bluetooth module to a 2K ohm resistor. 
    • This whole 1K ohm + 2K ohm resistor setup is used to divide the 5V input voltage from the Arduino. It is formally called a voltage divider (see the worked calculation just after this list).
  • Connect the 2K ohm resistor to the negative (blue) ground power rail of the solderless breadboard.
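
Here is the divider math worked out, using the component values from the list above, so you can see why the RXD pin ends up at a safe voltage (a quick illustrative sketch):

# Voltage divider: the HC-05 RXD pin sees the voltage across the 2K resistor.
vin = 5.0    # volts from Arduino digital pin 3
r1 = 1000.0  # ohms (the 1K resistor between pin 3 and RXD)
r2 = 2000.0  # ohms (the 2K resistor between RXD and ground)
vout = vin * r2 / (r1 + r2)
print(vout)  # 3.33... volts -- close to the 3.3V the RXD pin expects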

There are a lot of wires and components connected. Double check that everything is wired correctly.

20b-add-bluetooth-module

Once you have the HC-05 connected, let’s test it. First, download a bluetooth terminal app on your smartphone. We want to speak with the Arduino via our smartphone. I will download the Serial Bluetooth Terminal app from the Google Play store.

23-serial_bluetooth_terminal_app

Next, we write the following code and upload it to our Arduino board.

#include <SoftwareSerial.h>
SoftwareSerial EEBlue(2, 3); // RX | TX
void setup()
{
 
  Serial.begin(9600);
  EEBlue.begin(9600);  //Default Baud for comm, it may be different for your Module. 
  Serial.println("The bluetooth gates are open.\n Connect to HC-05 from any other bluetooth device with 1234 as pairing key!.");
 
}
 
void loop()
{
 
  // Feed any data from bluetooth to Terminal.
  if (EEBlue.available())
    Serial.write(EEBlue.read());
 
  // Feed all data from terminal to bluetooth
  if (Serial.available())
    EEBlue.write(Serial.read());
}

The sketch starts running as soon as the upload finishes. Click the magnifying glass in the upper right of the IDE to open the Serial Monitor.

Now, on your smartphone, open the Serial Bluetooth Terminal app. 

Turn on Bluetooth on your smartphone.

Pair with the HC-05.

Within the Serial Bluetooth Terminal app, go to the menu on the left-hand side and select Devices.

21c-ble-pairing

Select HC-05. Your smartphone will now connect to your HC-05.

You are now ready to send messages to your Arduino. Type in a message and click the arrow key to send the message to your Arduino.

21b-test-pairing

The message should show up on the Serial Monitor of your Arduino.

21-test_bluetooth_pairing

Congratulations! You have Bluetooth all set up on your Arduino.

Return to Table of Contents

Simulate the 3D Model of the Robot Using URDF

You might be wondering…what the heck does URDF mean? URDF stands for Unified Robot Description Format. URDF is a text-based format (i.e. XML format or Xacro format to be more specific) that is used in ROS to describe all of the parts of a particular robot, including sensors, controllers, actuators, joints, links, etc. 

A URDF file tells a computer what a robot looks like in real life (i.e. its physical description). ROS can use the URDF file to create simulations of a robot before the roboticist builds and deploys the robot in the real world.

In this section, we’re going to focus on how to use a URDF file to simulate your wheeled robot. We will use a ready-made URDF file rather than building one from scratch. 

If you want to learn how to build a URDF file from scratch, check out these tutorials on the ROS website: http://wiki.ros.org/urdf/Tutorials. You don’t need to go through those tutorials now. I do recommend, however, taking a look at this page to see a “hello world” example of URDF in ROS.
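
If you are curious what is actually inside a URDF file before we download one, here is a small stdlib-only Python sketch that lists the links and joints declared in a plain .urdf file. The file name below is a placeholder (point it at a real .urdf file; .xacro files need to be converted to plain URDF first):

# Print the links and joints declared in a URDF file (URDF is plain XML).
import xml.etree.ElementTree as ET

urdf_path = "my_robot.urdf"  # placeholder -- substitute a real .urdf file
root = ET.parse(urdf_path).getroot()

print("Robot name:", root.get("name"))
for link in root.findall("link"):
    print("  link:", link.get("name"))
for joint in root.findall("joint"):
    print("  joint:", joint.get("name"), "type:", joint.get("type"))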

Ok, now we are going to copy a ready-made mobile robot description package (that contains the URDF file we want) into our catkin_ws/src folder. Credit to Lentin Joseph, author of Robot Operating System (ROS) for Absolute Beginners for creating this package.

Open up a new Linux terminal window.

cd ~/catkin_ws/src

Download the mobile_robot_description package from Github to the catkin_ws/src folder.

svn checkout https://github.com/Apress/Robot-Operating-System-Abs-Begs/trunk/chapter_6/mobile_robot_description
25_download_mobile_robot_package

Build the package.

cd ~/catkin_ws
catkin_make

Now, type the following command to see a crude visualization of the wheeled robot in the visualization tool RViz.

roslaunch mobile_robot_description view_robot.launch

Here is what you should see:

26_crude_model_of_robot
27-crude-model-of-robot-2

You can use your mouse to see the robot from different angles. It is kind of clunky trying to figure out how to maneuver about, but it is what it is.

To see the actual code of the launch file we just ran, go to the directory the file is located in.

cd ~/catkin_ws/src/mobile_robot_description/launch
gedit view_robot.launch

So now that you have seen how to run a URDF file, let’s take a look at how we can get our robot to do something useful by writing some code for it.

Return to Table of Contents

Program the Arduino (i.e. the “Brain” of the robot)

Now let’s get our hands dirty with some code. We need to program the Arduino board so that it can:

  1. Read the data from the HC-SR05 ultrasonic sensor.
  2. Control the motion of the robot.
  3. Communicate with us on our PC.

Here is the code:

// Project Name: Autonomous Obstacle-Avoiding Wheeled Robot

// Author: Addison Sears-Collins

// This code is used to drive a two-wheeled differential drive robot. 
// You will need to modify the pins according
// to the wiring connections you made when assembling the robot. 

// Set up Serial connection with Bluetooth module
#include <SoftwareSerial.h>
SoftwareSerial EEBlue(2, 3); // RX | TX (Receiver | Transmitter)

///////////////////////////////////////////////////////////////////////////////

//Module to interface with the ultrasonic sensor
#define TRIGGER_PIN  12  // Arduino pin connected to TRIG on ultrasonic sensor.
#define ECHO_PIN     13  // Arduino pin connected to ECHO on ultrasonic sensor.

///////////////////////////////////////////////////////////////////////////////

//Setup Ultrasonic sensor
void Setup_Ultrasonic()
{

  // Define each pin as an input or output.
  pinMode(ECHO_PIN, INPUT);
  pinMode(TRIGGER_PIN, OUTPUT);
  
}

///////////////////////////////////////////////////////////////////////////////

/* MOTOR PIN DEFINITIONS
   Motor driver pin mappings to the Arduino:

   ARDUINO DIGITAL PIN  ||||  MOTOR DRIVER (L293D PIN)

             5                ENA (Enable 1 - Left Motor)
             6                IN1
             7                IN2
             8                ENB (Enable 2 - Right Motor)
             9                IN3
            10                IN4
*/


#define enableA 5   // Connected to Left Motor
#define MotorA1 6
#define MotorA2 7

#define enableB 8  //Connected to Right Motor
#define MotorB1 9
#define MotorB2 10


///////////////////////////////////////////////////////////////////////////////

//This function initializes the motor pins that are defined as MACROS
void Setup_Motors()
{
  
  // Set up left motor
  pinMode(enableA,OUTPUT);
  pinMode(MotorA1,OUTPUT);
  pinMode(MotorA2,OUTPUT);

  // Set up right motor
  pinMode(enableB,OUTPUT);
  pinMode(MotorB1,OUTPUT);
  pinMode(MotorB2,OUTPUT);
 
  delay(200);     // Pause 200 milliseconds 
 
  go_forward();  // Move forward
  
}

///////////////////////////////////////////////////////////////////////////////

//Setup Serial communication
void Setup_Serial(int baud_rate)
{

  Serial.begin(baud_rate);  
  EEBlue.begin(baud_rate);  //Default baud rate for communications
  
}


///////////////////////////////////////////////////////////////////////////////
// Returns the distance to the obstacle as an integer
int Update_Ultrasonic()
{

  int distance = 0;
  int average = 0;
 
  // Grab four measurements of distance and calculate
  // the average.
  for (int i = 0; i < 4; i++) {
 
    // Make the TRIGGER_PIN LOW (0 volts) 
    // for 2 microseconds
    digitalWrite(TRIGGER_PIN, LOW);
    delayMicroseconds(2); 
     
    // Emit high frequency 40kHz sound pulse
    // (i.e. pull the TRIGGER_PIN) 
    // by making TRIGGER_PIN HIGH (5 volts) 
    // for 10 microseconds
    digitalWrite(TRIGGER_PIN, HIGH);
    delayMicroseconds(10);
    digitalWrite(TRIGGER_PIN, LOW);
      
    // Detect a pulse on ECHO_PIN (pin 13). 
    // pulseIn() measures the time in 
    // microseconds until the sound pulse
    // returns back to the sensor.    
    distance = pulseIn(ECHO_PIN, HIGH);
 
    // Speed of sound is:
    // 13511.811023622 inches per second
    // 13511.811023622/10^6 inches per microsecond
    // 0.013511811 inches per microsecond
    // Taking the reciprocal, we have:
    // 74.00932414 microseconds per inch 
    // Below, we convert microseconds to inches by 
    // dividing by 74 and then dividing by 2
    // to account for the roundtrip time.
    distance = distance / 74 / 2;
 
    // Compute running sum
    average += distance;
 
    // Wait 10 milliseconds between pings
    delay(10);
  }
  
  distance = average / 4;

  Serial.print("u ");
  Serial.print(distance);
  Serial.print("\n"); 

  int distance_copy = distance;

  // Initialize string
  char str[] = "u ";
  char str_dist[10];

  // Convert distance integer into a string
  sprintf(str_dist, "%d", distance_copy);

  // Add a new line
  char add_new_line[] = "\n";

  // Concatenate to produce the new string
  strcat(str_dist, add_new_line);  
  strcat(str, str_dist); 

  // Output data to bluetooth
  EEBlue.write(str);

  return distance;  

}

///////////////////////////////////////////////////////////////////////////////
// The following function controls the motion of the robot

void Move_Robot(int distance)
{
  
  // If obstacle <= 2 inches away
  if (distance >= 0 &amp;&amp; distance <= 2) {    
    go_backwards();   // Move in reverse for 0.5 seconds
    delay(500);
 
    /* Go left or right to avoid the obstacle*/
    if (random(2) == 0) {  // Generates 0 or 1, randomly        
      go_right();  // Turn right for one second
    }
    else {
      go_left();  // Turn left for one second
    }
    delay(1000);
    go_forward();  // Move forward
  }
  delay(50); // Wait 50 milliseconds before pinging again 
}

/*   
 *  Forwards, backwards, right, left, stop.
 */
void go_forward() {
  //enabling motor A and B
  digitalWrite (enableA, HIGH);
  digitalWrite (enableB, HIGH);
  
  // Move forward
  digitalWrite (MotorA1, LOW);
  digitalWrite (MotorA2, HIGH);
  digitalWrite (MotorB1, LOW);
  digitalWrite (MotorB2, HIGH);

}
void go_backwards() {
  //enabling motor A and B
  digitalWrite (enableA, HIGH);
  digitalWrite (enableB, HIGH);
  
  // Go backwards
  digitalWrite (MotorA1,HIGH);
  digitalWrite (MotorA2,LOW);  
  digitalWrite (MotorB1,HIGH);
  digitalWrite (MotorB2,LOW);  
  
}
void go_right() {
  //enabling motor A and B
  digitalWrite (enableA, HIGH);
  digitalWrite (enableB, HIGH);
  
  // Turn right
  digitalWrite (MotorA1, LOW);
  digitalWrite (MotorA2, HIGH);
  digitalWrite (MotorB1,HIGH);
  digitalWrite (MotorB2,LOW); 
}
void go_left() {
  //enabling motor A and B
  digitalWrite (enableA, HIGH);
  digitalWrite (enableB, HIGH);
  
  // Turn left
  digitalWrite (MotorA1,HIGH);
  digitalWrite (MotorA2,LOW);  
  digitalWrite (MotorB1, LOW);
  digitalWrite (MotorB2, HIGH);
}
void stop_all() {
  digitalWrite (enableA, LOW);
  digitalWrite (enableB, LOW);
}

///////////////////////////////////////////////////////////////////////////////
//Read from Serial Function

void Read_From_Serial()
{
  // Read data from Serial terminal of Arduino IDE
  while(Serial.available() > 0)
    {      
     EEBlue.write(Serial.read());        
    } 

   // Read data from Bluetooth module
   //while(EEBlue.available() > 0)
   // {
   //   Serial.write(EEBlue.read());     
   //  int data = Serial.read();       
     
   // } 

}

///////////////////////////////////////////////////////////////////////////////
//Update all
void Update_all()
{
  
  int distance = Update_Ultrasonic();
  
  Read_From_Serial();
  
  Move_Robot(distance);
  
}


///////////////////////////////////////////////////////////////////////////////
// Setup function for Arduino
void setup() {

  // Initializes the pseudo-random number generator
  // Needed for the robot to wander around the room
  randomSeed(analogRead(3));
  
  Setup_Ultrasonic();

  Setup_Serial(9600);

  Setup_Motors();

}
/////////////////////////////////////////////////////////////////////////////////
// This part loops over and over again
void loop() {

  Update_all();

}

Let’s test the code. With your robot connected to your PC via the USB cord, upload the code to your Arduino board.

Unplug the Arduino from your computer.

Arduino can handle an input supply voltage from 7 – 12V, so let’s add a 9V battery to the board using Velcro fasteners. You can also use some multi-purpose black cable ties.

29-9v-battery

Before you plug the battery into your Arduino, make sure your robot is somewhere on the floor with a lot of space. Hardwood or smooth floors work best. The robot’s motors are not powerful enough to move through thick carpet.

Turn on the motors by switching on the 4×1.5V AA battery pack.

Now plug in the Arduino. The Arduino program you burned into your board will start automatically whenever power is supplied.

29c-9v-battery-connection
29b-9v-battery-connection-to-arduino

If your car does not automatically start, put your hand in front of the ultrasonic sensor to get the car started. 

You should see your robot moving around the floor autonomously, avoiding obstacles anytime it gets within two inches of an object. Yay!

30-complete-robot

Now, open up your smartphone, and launch the Serial Bluetooth Terminal App. You should see the distance measurements (prefixed with “u “, which means ultrasonic sensor) being printed to your phone. 

21d-ble-data-flowing

Whew! That was a lot of work. We are not done yet, but we have come a long way so far. 

Return to Table of Contents

Find the MAC Address of Your Bluetooth Module

Now we need to get ROS integrated into our project. Specifically, we want to have our master computer (i.e. PC…desktop computer with Ubuntu Linux installed) “listen” to the raw ultrasonic sensor distance data and publish that data as a message to a ROS topic. All this communication will happen via Bluetooth. 

The first thing we need to do is to find the MAC address (i.e. Bluetooth Address) of our HC-05 bluetooth module.
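
As an aside, once your Bluetooth dongle is working in Ubuntu Linux and python-bluez is installed (both are covered later in this tutorial), you can also discover the MAC address with a small Python sketch like the one below, instead of digging through Windows. This is just an optional alternative; the duration parameter is how long the scan runs, in seconds.

# Scan for nearby Bluetooth devices and print their MAC addresses and names.
import bluetooth

nearby_devices = bluetooth.discover_devices(duration=8, lookup_names=True)
for mac_address, name in nearby_devices:
    print(mac_address, name)  # look for the device named HC-05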

Make sure your Arduino board is powered on (so that the HC-05 Bluetooth light is blinking), and the 4×1.5V AA battery pack is turned off.

If you are on a Windows 10 computer like I am, search for “Bluetooth and other devices” in the Windows settings, and then: 

  • Click Add Bluetooth or other device
  • Click Bluetooth
  • Click HC-05
  • Type the password: 1234
  • Click Connect
  • Click Done and close out all windows.
  • Right-click on the Windows icon in the bottom left of your desktop.
  • Go to the Device Manager.
  • Expand the Bluetooth options
  • Find HC-05
  • Right-click on the Bluetooth device
  • Click Properties
  • Go to the Details tab
  • Under “Property” select “Bluetooth device address”
  • The MAC address for my Bluetooth device is 98:D3:B1:FD:48:FF. Write your own address down. We will need it later. 
dev-manager

Return to Table of Contents

Connect the Bluetooth Module to Ubuntu Linux

Now, to get Bluetooth enabled in Ubuntu Linux, first, unplug any Bluetooth device that is connected to your PC (i.e. your laptop, personal computer). 

Start your PC.

Plug the USB Bluetooth dongle into the USB port on your computer. You cannot use your built-in Bluetooth with VirtualBox. It won’t work. That is why you need the external USB Bluetooth dongle.

Restart your PC.

If you are on a Windows machine like I am, search for the Device Manager on your PC by right-clicking the Windows icon on your desktop.

30a-device-manager-bluetooth

Open up the Bluetooth option.

30b-working-bluetooth-dongle

Make sure the Bluetooth dongle is installed on your computer. There should be no exclamation points or weird error messages near it. If you do see that, restart your PC.

30c-bluetooth_enabled

If your dongle is still showing weird messages, it is likely because the built-in Bluetooth on your computer is conflicting with it. Bluetooth dongles are notoriously hard to set up on PCs. Restart your computer again, but, before you do that, insert your dongle into a different USB port. 

Be persistent in getting your Bluetooth dongle to work (Don’t give up! Robotics requires ironclad persistence and patience to get things working). If everything looks good, it should look like this:

You can also try disabling any Bluetooth options other than the Bluetooth dongle. On a Windows machine, you do this through the Bluetooth option on the Device Manager as well.

Eventually, you will get your dongle enabled. Once you do, disable it by right-clicking on it and clicking “Disable device”. You will see a tiny down arrow over the Bluetooth icon.

30d-bluetooth_disabled

Now, launch Ubuntu Linux in your Virtual Machine.

30e-launch_ubuntu

Go back to the Device Manager in Windows, and enable the Bluetooth adapter (right-click on the Bluetooth device and select “Enable device”).

Now return to Ubuntu Linux and, at the menu at the top, go to Devices -> USB. 

devices

Select the Bluetooth dongle to enable it. Mine is labeled Broadcom Corp BCM.

To make sure everything is working, open a new terminal window and type the following command:

hciconfig -a
31-hci-config

Make sure the output says “UP RUNNING”. That is how you know everything is working properly.

Now that Bluetooth is enabled, we need to pair our Ubuntu Linux with the robot’s HC-05 Bluetooth module. 

Power up your robot (just the Arduino…NOT the motors of your robot).

Open the Bluetooth settings by going to your system settings:

gnome-control-center

Select Bluetooth.

Now that your Bluetooth panel is open, your computer will begin searching for Bluetooth devices to connect to. Wait until it has found “HC-05”, which is the robot’s Bluetooth. It may take a while, and you might need to restart this Bluetooth panel in System Settings multiple times to get it to work. Bluetooth is fickle like that.

34-found_hc_05

Click the device under the Devices list.

Eventually a panel will show up. Type in the PIN and click confirm. The PIN is 1234, and is the same for all HC-05s.

35-confirm-bt-pin

You will establish a brief connection, and then it will show as Disconnected. You can click on the HC-05, and it should say “Paired”. 

36-paired-hc-05

Now, open a new terminal window and download blueman, the Bluetooth Manager. This package helps us double-check that Ubuntu Linux is set up to connect to the robot’s Bluetooth. 

Type:

sudo apt-get install blueman

Next, go to Activities on your Desktop, and search for Bluetooth Manager.

search
37-search_for_bluetooth_manager

Click Install.

Launch the application and look for the HC-05 (make sure your robot is powered on, otherwise it won’t be visible).

Hover your cursor over HC-05, and it should say “Trusted and Bonded”. You should also see a little key on the upper-left of the Bluetooth icon.

38-key-for-bluetooth-manager

Test the Bluetooth Feed

Let’s see if we can read the ultrasonic sensor data transmitting from our robot.

Open a new terminal window in Ubuntu Linux, and create a new directory called sandbox.

mkdir sandbox

Move to that directory.

cd sandbox

Create a new file:

touch bluetooth_test.py

Open the file.

gedit bluetooth_test.py

Add this code. Make sure you modify my code with your own robot_bluetooth_mac_address.

#!/usr/bin/env python

'''
File name: bluetooth_test.py

This program tests the Bluetooth connection between your PC and your robot.
The PC receives messages from the robot via Bluetooth and prints
those messages to your screen.

Modified from https://people.csail.mit.edu/albert/bluez-intro/x232.html

Author: Addison Sears-Collins
'''

import bluetooth # Import the python-bluez library
import time

###############################################################################
# Bluetooth parameters

robot_bluetooth_mac_address = '98:D3:B1:FD:48:FF'
port = 1
pc_bluetooth_handle = None
data_size = 300

###############################################################################
# Connect the PC's Bluetooth to the robot's Bluetooth

def connect():
  global pc_bluetooth_handle	
  
  while(True):    
    try:
      pc_bluetooth_handle = bluetooth.BluetoothSocket(bluetooth.RFCOMM)
      pc_bluetooth_handle.connect((robot_bluetooth_mac_address, port))
      break
    except bluetooth.btcommon.BluetoothError as error:
      pc_bluetooth_handle.close()
      print ("Could not connect: ", error, "; Retrying in 10s...")
      time.sleep(10)
  
  return pc_bluetooth_handle
  
pc_bluetooth_handle = connect() # Try to connect to the robot's Bluetooth
 
###############################################################################
# Main code

# If this file is the main (driver) program you are executing
if __name__ == '__main__': 

  while(True):
    try:
      # Keep reading data from the robot
      incoming_data_from_robot = pc_bluetooth_handle.recv(data_size)
      time.sleep(0.05)
      print(incoming_data_from_robot)

    except bluetooth.btcommon.BluetoothError as error:
      print ("Caught BluetoothError: ", error)
      time.sleep(5)
      pc_bluetooth_handle = connect()
      pass

  pc_bluetooth_handle.close()

Save the file and then go back to the terminal.

Install the following Bluetooth library. This library is called python-bluez. It handles all the Bluetooth functionalities, including accessing the robot’s Bluetooth that is connected to your PC.

sudo apt-get install python-bluez

Now, let’s change the access permissions on the bluetooth_test.py file so that we can run it.

chmod +x bluetooth_test.py

Now, run the program.

python bluetooth_test.py
39-distance-to-object-from-robot

Click on the terminal window, and press CTRL+C at any time to stop the program from running.

40-ctrl-c-to-stop-program

To rerun the program, you can press the up arrow on your keyboard until you find ‘python bluetooth_test.py’. Then press ENTER to rerun it.

Troubleshooting Tips

41-bluetooth-connection-errors

If your program is not working, try the following:

  • Unplug your Arduino and plug it back in.
  • Launch a new terminal, and move to the sandbox folder.
  • Launch bluetooth_test.py (using the python bluetooth_test.py command)
  • In a terminal window, launch Bluetooth settings using this command: gnome-control-center

As I mentioned previously, I have no idea why Bluetooth is so fickle. Just keep trying the steps I’ve outlined above until you get the distance data printed to your screen. 

As far as the data feed is concerned, the u means ultrasonic sensor, and the number after it is the distance to the object in front of the robot, in inches. I’m sure there is a more efficient way to get Bluetooth connected, but this process works for me.

Now, we need to get this program integrated with ROS. We want it to publish that distance data (i.e. a ROS message) to a topic and have a Subscriber node subscribe to that topic so that it can receive the distance data. The setup will be very similar to what we did in the hello world program.

Return to Table of Contents

Create a ROS Package

First, let’s create a new ROS package.

Open a new terminal window, and move to your catkin workspace:

cd ~/catkin_ws/src

Create a new package named “wheeled_robot_arduino”.

catkin_create_pkg wheeled_robot_arduino std_msgs rospy roscpp
42-create-ros-package

Build the package by opening a new terminal window and type:

cd ~/catkin_ws
catkin_make

Now, navigate to that ROS package.

roscd wheeled_robot_arduino

You should now be in your ~/catkin_ws/src/wheeled_robot_arduino folder.

Let’s add a scripts directory, where we will keep all our Python scripts.

mkdir scripts

That’s it for creating a package. Now let’s create a Publisher node.

Return to Table of Contents

Create a ROS Publisher Node

Here we’ll create the publisher (“talker”) node which will continually broadcast a message. In plain English, this is a Python program that will read the incoming distance data from Bluetooth and publish that data to a ROS topic named ‘obstacle_distance’. We will name this Publisher node talker.py.

So that we don’t have to start from scratch, copy bluetooth_test.py into your  ~/catkin_ws/src/wheeled_robot_arduino/scripts folder.

cd
cd sandbox
cp bluetooth_test.py ~/catkin_ws/src/wheeled_robot_arduino/scripts
43-copy-bluetooth-test
roscd wheeled_robot_arduino/scripts
dir

Now let’s rename bluetooth_test.py. Its new name will be talker.py. talker.py will be the Publisher node.

mv bluetooth_test.py talker.py

Now, edit the file.

gedit talker.py

Here is the full code.

#!/usr/bin/env python

import rospy # ROS Python library
from std_msgs.msg import String

import bluetooth # Import the python-bluez library

###############################################################################
# Bluetooth parameters

robot_bluetooth_mac_address = '98:D3:B1:FD:48:FF'
port = 1
pc_bluetooth_handle = None
data_size = 300

###############################################################################
# Publisher List

# Ultrasonic distance sensor data will be published to a ROS topic named
# obstacle_distance using the message type String. Other data types like
# Float32, Int64, etc. are possible in other applications. Here we use String.
ultrasonic_handle = rospy.Publisher('obstacle_distance', String, queue_size=10)

###############################################################################
# Launch the ROS node

rospy.init_node('talker', anonymous=True)
rospy.loginfo("Starting Talker Node")


###############################################################################
# Connect the PC's Bluetooth to the robot's Bluetooth

def connect():
  global pc_bluetooth_handle	
  
  while(True):    
    try:
      pc_bluetooth_handle = bluetooth.BluetoothSocket(bluetooth.RFCOMM)
      pc_bluetooth_handle.connect((robot_bluetooth_mac_address, port))
      break
    except bluetooth.btcommon.BluetoothError as error:
      pc_bluetooth_handle.close()
      rospy.logwarn("Could not connect: ", error, "; Retrying in 10s...")
      rospy.sleep(10)
  
  return pc_bluetooth_handle
  
pc_bluetooth_handle = connect() # Try to connect to the robot's Bluetooth
 
###############################################################################
# Main code

# If this file is the main (driver) program you are executing
if __name__ == '__main__': 

  while not rospy.is_shutdown():
    try:
      # Keep reading data from the robot
      incoming_data_from_robot = pc_bluetooth_handle.recv(data_size)
      rospy.loginfo(incoming_data_from_robot)
      ultrasonic_handle.publish(incoming_data_from_robot)
      rospy.sleep(0.05)

    except bluetooth.btcommon.BluetoothError as error:
      rospy.logerr("Caught BluetoothError: ", error)
      time.sleep(5)
      pc_bluetooth_handle = connect()
      pass

  pc_bluetooth_handle.close()

Save the file and then close the window.

Now, we need to build the node.

cd  ~/catkin_ws
catkin_make

Open a new terminal window.

Plug in the Arduino board on your robot to get Bluetooth started.

Launch ROS.

roscore

Open a new terminal tab and run your ROS publisher node named talker.py.

rosrun wheeled_robot_arduino talker.py

As soon as you run the command above (you have to act within about 10 seconds), open up a new terminal window and type:

gnome-control-center

Make sure you are on your Bluetooth settings. The Bluetooth panel looks like this:

44-bluetooth-panel

You might need to try executing this command numerous times, opening and closing your Bluetooth panel while the code is trying to execute. As I’ve mentioned before in this tutorial, Bluetooth is fickle and doesn’t often work on the first try (but don’t give up! It WILL work).

45-ros-distance-data-flowing-thru

Let’s check out the obstacle_distance ROS topic now to see what messages are publishing to it. While the code is still running, open up a new terminal tab and type:

rostopic echo obstacle_distance

Here is the output. We use the u separator (which stands for ultrasonic) to separate the distance readings. 

46-obstacle-distance-topic

Congratulations! You have built a complete ROS Publisher Node from scratch. 

Now, instead of opening up a new window to check out the obstacle_distance topic using the command above, how about we build a ROS Subscriber node that subscribes to the topic and prints out what it sees? We’ll call this Subscriber node listener.py. Let’s build it now!

Press CTRL+C on all open tabs and windows to kill all processes. You can also disconnect power from the Arduino on your robot.

Return to Table of Contents

Create a ROS Subscriber Node

Open a new terminal, and go to your ~/catkin_ws/src/wheeled_robot_arduino/scripts folder.

Create a new file named listener.py.

gedit listener.py

Type the following code and save.

#!/usr/bin/env python
import rospy
from std_msgs.msg import String

def callback(data):

    # Print the data that is heard from the ROS topic
    rospy.loginfo(rospy.get_caller_id() + " I heard %s", data.data)
    
def listener():

    # Initialize the node
    rospy.init_node('listener', anonymous=True)

    # Subscribe to the obstacle_distance topic
    rospy.Subscriber("obstacle_distance", String, callback)

    # spin() simply keeps python from exiting until this node is stopped
    rospy.spin()

if __name__ == '__main__':
    listener()

Change its permissions.

chmod +x listener.py

Now, we need to build the node.

cd  ~/catkin_ws
catkin_make

Open a new terminal window.

Plug in the Arduino board on your robot to get Bluetooth started.

Launch ROS.

roscore

Open a new terminal tab and run your ROS publisher node named talker.py.

rosrun wheeled_robot_arduino talker.py

Immediately, go to a new terminal window, and open your Bluetooth panel.

gnome-control-center

Now, in a new terminal window, run the ROS subscriber node named listener.py.

rosrun wheeled_robot_arduino listener.py
47-response-from-the-listener

When you are finished, press CTRL+C.

Return to Table of Contents

Create a ROS Launch File

Launching talker.py and listener.py separately can be a bit tedious. How about we start both programs with a single command? Let’s do that. We will use a ROS launch file, which will speed up the launch process of our programs.

Go to your wheeled_robot_arduino package.

roscd wheeled_robot_arduino

Create a folder called ‘launch’.

mkdir launch

Move to the launch folder.

cd launch 

Create a new file called talker_listener.launch.

gedit talker_listener.launch

Type the code below, and save it.  This file will run both Python programs, talker.py and listener.py.  

<launch>
  <node name="talker_node" pkg="wheeled_robot_arduino" type="talker.py" output="screen"/>
  <node name="listener_node" pkg="wheeled_robot_arduino" type="listener.py" output="screen"/>
</launch>

Save the file and go back to the terminal.

Change the permissions of the launch file we just created.

chmod +x talker_listener.launch

Plug in the Arduino on your robot to get Bluetooth started.

Now, in a new terminal window, run the launch file.

roslaunch wheeled_robot_arduino talker_listener.launch

Immediately, go to a new terminal window, and open your Bluetooth panel.

gnome-control-center

Watch the talker.py (ROS publisher node) publishing to the /obstacle_distance topic and listener.py (ROS subscriber node) echoing back what it is hearing. 

39-roslaunch-output-1
40-ros-launch-resultsJPG

You will notice that, unlike when we used the Serial Bluetooth Terminal app, the data isn’t always lined up. This is because the program running inside Ubuntu Linux lags a bit relative to the speed at which the data (e.g. u 5) arrives via Bluetooth. This is perfectly OK for our purposes in this tutorial, but for the perfectionists out there, you can go back to your Arduino code and remove the ‘u’ character that prints just prior to the distance data. That way, only the distance value prints out (i.e. ‘5’ instead of ‘u 5’).
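
Alternatively, you can keep the ‘u’ marker and parse it out on the ROS side. Here is a sketch of a callback you could drop into listener.py in place of the one shown earlier. It assumes each well-formed message fragment looks like “u 5” and simply ignores anything else:

def callback(data):
    # Each well-formed message looks like "u 5": a marker, then inches.
    parts = data.data.split()
    if len(parts) == 2 and parts[0] == "u" and parts[1].isdigit():
        distance_inches = int(parts[1])
        rospy.loginfo("Obstacle is %d inches away", distance_inches)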

Press CTRL+C to stop all processes.

Return to Table of Contents

Grand Finale – Launch Your Autonomous Wheeled Robot

41-grand-finale

Ok, now we are ready to put it all together. 

We will have our robot move around the room autonomously, avoiding obstacles along the way. While it does that, it will feed obstacle distance data back to your PC via Bluetooth. That data is read by the Publisher node, talker.py, which publishes it to the obstacle_distance ROS topic. listener.py is subscribed to the obstacle_distance topic. It ‘hears’ the distance readings and prints them to the screen.
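
At a glance, the whole pipeline looks like this:

Arduino + ultrasonic sensor -> HC-05 Bluetooth -> talker.py (Publisher) -> obstacle_distance topic -> listener.py (Subscriber) -> your terminal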

Plug your Arduino into the 9V battery power source.

29c-9v-battery-connection

Place the robot somewhere in an open space on the floor.

Open a new terminal window in Ubuntu Linux and type:

roslaunch wheeled_robot_arduino talker_listener.launch

Immediately, go to a new terminal window, and open your Bluetooth panel.

gnome-control-center
44-bluetooth-panel

Make sure the data is flowing into your Linux terminal.

Now, turn on your robot’s motors by switching the 4 x 1.5V battery pack to ON.

Watch your robot move around the room! 

Check out the terminal to see the printout of the distance data!

39-roslaunch-output-1

Congratulations! We have come a long way. We have successfully designed and developed an autonomous wheeled robot using ROS…from scratch! 

2019-11-16-201152

Complex robots, like the ones you might have seen being built by companies such as Amazon, iRobot, or Boston Dynamics, have bigger sensors, more code, more wiring, etc…but, at their core, they work using the same fundamentals that drive the autonomous robot we have just built. They all sense the world, think about what they have sensed, and then act.

I hope you have learned a lot along the way. Keep building!

Return to Table of Contents

What is the Difference Between Mathematical Morphology Filters and Convolution Filters?

Answer: Linearity

Convolution filters generate output images in which the brightness value at a particular pixel depends on the weighted sum (i.e. linear combination) of the brightness of the neighboring pixels.

Mathematical morphology filters on the other hand perform nonlinear processing on images. These filters depend only on the relative ordering of pixel values as opposed to their numerical values. This property of mathematical morphology filters makes them really good when applied to binary images (a digital image that can only have two possible values for each pixel).
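
Here is a tiny sketch (using OpenCV and NumPy on a made-up 5×5 image) that makes the distinction concrete: the convolution output at each pixel is a weighted sum of the neighborhood, while erosion keeps only the minimum of the neighborhood, i.e. it depends on ordering rather than arithmetic.

# Compare a convolution (weighted sum) with erosion (neighborhood minimum).
import numpy as np
import cv2

# A small grayscale image: a bright 3x3 square on a dark background.
img = np.zeros((5, 5), dtype=np.uint8)
img[1:4, 1:4] = 200

kernel = np.ones((3, 3), np.float32) / 9.0   # 3x3 averaging (convolution) kernel
averaged = cv2.filter2D(img, -1, kernel)     # each pixel = weighted sum of neighbors
eroded = cv2.erode(img, np.ones((3, 3), np.uint8))  # each pixel = min of neighbors

print(averaged)  # edges blur into intermediate values between 0 and 200
print(eroded)    # only the center pixel keeps 200 -- ordering, not arithmetic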

Types of Convolution and Mathematical Morphology Filters

This page at Harrisgeospatial.com has a good overview of the different convolution filters and morphology filters.

The standard convolution filters are:

  • High Pass
  • Low Pass
  • Laplacian
  • Directional
  • Gaussian Low Pass
  • Gaussian High Pass
  • Median 
  • Sobel
  • Roberts 
  • User-Defined Convolution

The standard mathematical morphology filters are:

  • Dilation
  • Erosion
  • Opening
  • Closing

Noise Reduction Using Mathematical Morphology vs. Convolution Filters

Someone asked me this question the other day: What are the benefits and limitations of applying an image processing application such as noise reduction using mathematical morphology vs. convolution?

Before we get into the pros and cons of mathematical morphology and convolutions filters applied to noise reduction in images, let us take a look at the definitions of these terms.

Mathematical morphology is an image processing technique based on two operations: erosion and dilation. Erosion shrinks objects in an image, while dilation enlarges them.

Convolution filtering involves taking an image as input and generating an output image where each new pixel value is determined by the weighted values of itself and its neighboring pixels.

Noise reduction involves “cleaning up” an image. The goal is to take an image as input and get rid of all the unnecessary elements in that image so that it looks better.
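
As a quick illustration of morphological noise reduction (a sketch with synthetic data): morphological opening, which is an erosion followed by a dilation, wipes out isolated specks while returning larger shapes to their original size.

# Remove single-pixel "salt" noise with morphological opening.
import numpy as np
import cv2

img = np.zeros((40, 40), dtype=np.uint8)
img[10:30, 10:30] = 255                      # the object we want to keep
rng = np.random.default_rng(0)
noise = rng.integers(0, 40, size=(20, 2))    # 20 random noise pixel coordinates
img[noise[:, 0], noise[:, 1]] = 255

kernel = np.ones((3, 3), np.uint8)
opened = cv2.morphologyEx(img, cv2.MORPH_OPEN, kernel)  # erode, then dilate

# The isolated specks are gone; the 20x20 square survives intact.
print(int((img == 255).sum()), int((opened == 255).sum()))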

Below are the pros and cons of doing noise reduction using mathematical morphology vs. convolution filters.

Mathematical Morphology

Pros

  • Simplicity from a theoretical perspective (it is based on basic set theory) 
  • Simplicity from an operational perspective (can be implemented with a few lines of code: https://docs.opencv.org/2.4/doc/tutorials/imgproc/erosion_dilatation/erosion_dilatation.html)
  • Computationally efficient
  • Useful for removing noise in grayscale images
  • Useful for detaching two objects that are connected together (erosion)
  • Useful for connecting an object that is broken apart in an image (dilation)
  • Can remove noise without substantially altering the underlying shape of an object

Cons

  • Results depend heavily on the choice of structuring element (its size and shape)
  • Erosion can delete small objects entirely, and dilation can merge objects that should stay separate
  • Less appropriate when the noise is best separated by its frequency content rather than by its shape or size

Convolution Filters

Pros

  • Built on well-understood linear systems theory
  • Low pass and Gaussian filters are simple to apply and effective against high-frequency noise
  • Efficient to implement as a single pass over the image with a small kernel

Cons

  • More complicated from an operational perspective (so many techniques and kernels to choose from…how does one decide which one is best?)
  • Can remove important image gradients because filter output is proportional to the contrast of a given section of an image
  • Shape of an object can become altered or distorted

How to Apply a Mask to an Image Using OpenCV

In this project, we will learn how to apply a mask to an image using OpenCV. Image masking involves selecting a specific region of an image so that it can be highlighted while the rest of the image is suppressed.

Requirements

  • Develop a program that takes a color image as input and allows the user to apply a mask.
  • When the user presses “r,” the program masks the image and produces an output image which is the image in black and white (i.e. grayscale) with only the masked area in color.

You Will Need 

  • Python 3.7+
  • OpenCV, NumPy, and Matplotlib (the code below imports cv2, numpy, and matplotlib)

Directions

Let’s say you have the following image:

apple

You want to highlight the apple in the image by applying a mask. The desired output is as follows.

apple_output

You also want to see the process it took to get to that output image. In other words, you want the program to output not only the masked image (as above), but also a table that shows all the steps involved: input image -> mask -> output.

apple_table

To implement what I’ve described above, you will require two programs: common.py and image_masking.py.

common.py is a helper program. image_masking.py is the main driver program. To run it, you will type:

python image_masking.py [<image_file_path>]

For example,

python image_masking.py apple.jpg

Here is the code. I recommend copying and pasting both programs into a directory. Then put your input images into that same directory.

image_masking.py

#!/usr/bin/env python

'''
Welcome to the Image Masking Program!

This program allows users to highlight a specific 
object within an image by masking it.

Usage:
  image_masking.py [<image>]

Keys:
  r     - mask the image
  SPACE - reset the inpainting mask
  ESC   - exit
'''

# Python 2/3 compatibility
from __future__ import print_function

import cv2 # Import the OpenCV library
import numpy as np # Import Numpy library
import matplotlib.pyplot as plt # Import matplotlib functionality
import sys # Enables the passing of arguments
from common import Sketcher

# Project: Image Masking Using OpenCV
# Author: Addison Sears-Collins
# Date created: 9/18/2019
# Python version: 3.7
# Description: This program allows users to highlight a specific 
# object within an image by masking it.

# Define the file name of the image
INPUT_IMAGE = "fruits.jpg"
IMAGE_NAME = INPUT_IMAGE[:INPUT_IMAGE.index(".")]
OUTPUT_IMAGE = IMAGE_NAME + "_output.jpg"
TABLE_IMAGE = IMAGE_NAME + "_table.jpg"

def main():
    """
    Main method of the program.
    """
    # Pull system arguments
    try:
        fn = sys.argv[1]
    except:
        fn = INPUT_IMAGE

    # Load the image and store into a variable
    image = cv2.imread(cv2.samples.findFile(fn))

    if image is None:
        print('Failed to load image file:', fn)
        sys.exit(1)

    # Create an image for sketching the mask
    image_mark = image.copy()
    sketch = Sketcher('Image', [image_mark], lambda : ((255, 255, 255), 255))

    # Sketch a mask
    while True:
        ch = cv2.waitKey()
        if ch == 27: # ESC - exit
            break
        if ch == ord('r'): # r - mask the image
            break
        if ch == ord(' '): # SPACE - reset the inpainting mask
            image_mark[:] = image
            sketch.show()

    # define range of white color in HSV
    lower_white = np.array([0,0,255])
    upper_white = np.array([255,255,255])

    # Create the mask
    mask = cv2.inRange(image_mark, lower_white, upper_white)

    # Create the inverted mask
    mask_inv = cv2.bitwise_not(mask)

    # Convert to grayscale image
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

    # Extract the dimensions of the original image
    rows, cols, channels = image.shape
    image = image[0:rows, 0:cols]

    # Bitwise-OR mask and original image
    colored_portion = cv2.bitwise_or(image, image, mask = mask)
    colored_portion = colored_portion[0:rows, 0:cols]

    # Bitwise-OR inverse mask and grayscale image
    gray_portion = cv2.bitwise_or(gray, gray, mask = mask_inv)
    gray_portion = np.stack((gray_portion,)*3, axis=-1)

    # Combine the two images
    output = colored_portion + gray_portion

    # Save the image
    cv2.imwrite(OUTPUT_IMAGE, output)

    # Create a table showing input image, mask, and output
    mask = np.stack((mask,)*3, axis=-1)
    table_of_images = np.concatenate((image, mask, output), axis=1)
    cv2.imwrite(TABLE_IMAGE, table_of_images)

    # Display images, used for debugging
    #cv2.imshow('Original Image', image)
    #cv2.imshow('Sketched Mask', image_mark)
    #cv2.imshow('Mask', mask)
    #cv2.imshow('Output Image', output)
    cv2.imshow('Table of Images', table_of_images)
    cv2.waitKey(0) # Wait for a keyboard event

if __name__ == '__main__':
    print(__doc__)
    main()
    cv2.destroyAllWindows()

common.py

#!/usr/bin/env python

'''
This module contains some common routines used by other samples.
'''

# Python 2/3 compatibility
from __future__ import print_function
import sys
PY3 = sys.version_info[0] == 3

if PY3:
    from functools import reduce

import numpy as np
import cv2 as cv

# built-in modules
import os
import itertools as it
from contextlib import contextmanager

image_extensions = ['.bmp', '.jpg', '.jpeg', '.png', '.tif', '.tiff', '.pbm', '.pgm', '.ppm']

class Bunch(object):
    def __init__(self, **kw):
        self.__dict__.update(kw)
    def __str__(self):
        return str(self.__dict__)

def splitfn(fn):
    path, fn = os.path.split(fn)
    name, ext = os.path.splitext(fn)
    return path, name, ext

def anorm2(a):
    return (a*a).sum(-1)
def anorm(a):
    return np.sqrt( anorm2(a) )

def homotrans(H, x, y):
    xs = H[0, 0]*x + H[0, 1]*y + H[0, 2]
    ys = H[1, 0]*x + H[1, 1]*y + H[1, 2]
    s  = H[2, 0]*x + H[2, 1]*y + H[2, 2]
    return xs/s, ys/s

def to_rect(a):
    a = np.ravel(a)
    if len(a) == 2:
        a = (0, 0, a[0], a[1])
    return np.array(a, np.float64).reshape(2, 2)

def rect2rect_mtx(src, dst):
    src, dst = to_rect(src), to_rect(dst)
    cx, cy = (dst[1] - dst[0]) / (src[1] - src[0])
    tx, ty = dst[0] - src[0] * (cx, cy)
    M = np.float64([[ cx,  0, tx],
                    [  0, cy, ty],
                    [  0,  0,  1]])
    return M


def lookat(eye, target, up = (0, 0, 1)):
    fwd = np.asarray(target, np.float64) - eye
    fwd /= anorm(fwd)
    right = np.cross(fwd, up)
    right /= anorm(right)
    down = np.cross(fwd, right)
    R = np.float64([right, down, fwd])
    tvec = -np.dot(R, eye)
    return R, tvec

def mtx2rvec(R):
    w, u, vt = cv.SVDecomp(R - np.eye(3))
    p = vt[0] + u[:,0]*w[0]    # same as np.dot(R, vt[0])
    c = np.dot(vt[0], p)
    s = np.dot(vt[1], p)
    axis = np.cross(vt[0], vt[1])
    return axis * np.arctan2(s, c)

def draw_str(dst, target, s):
    x, y = target
    cv.putText(dst, s, (x+1, y+1), cv.FONT_HERSHEY_PLAIN, 1.0, (0, 0, 0), thickness = 2, lineType=cv.LINE_AA)
    cv.putText(dst, s, (x, y), cv.FONT_HERSHEY_PLAIN, 1.0, (255, 255, 255), lineType=cv.LINE_AA)

class Sketcher:
    def __init__(self, windowname, dests, colors_func):
        self.prev_pt = None
        self.windowname = windowname
        self.dests = dests
        self.colors_func = colors_func
        self.dirty = False
        self.show()
        cv.setMouseCallback(self.windowname, self.on_mouse)

    def show(self):
        cv.imshow(self.windowname, self.dests[0])

    def on_mouse(self, event, x, y, flags, param):
        pt = (x, y)
        if event == cv.EVENT_LBUTTONDOWN:
            self.prev_pt = pt
        elif event == cv.EVENT_LBUTTONUP:
            self.prev_pt = None

        if self.prev_pt and flags &amp; cv.EVENT_FLAG_LBUTTON:
            for dst, color in zip(self.dests, self.colors_func()):
                cv.line(dst, self.prev_pt, pt, color, 5)
            self.dirty = True
            self.prev_pt = pt
            self.show()


# palette data from matplotlib/_cm.py
_jet_data =   {'red':   ((0., 0, 0), (0.35, 0, 0), (0.66, 1, 1), (0.89,1, 1),
                         (1, 0.5, 0.5)),
               'green': ((0., 0, 0), (0.125,0, 0), (0.375,1, 1), (0.64,1, 1),
                         (0.91,0,0), (1, 0, 0)),
               'blue':  ((0., 0.5, 0.5), (0.11, 1, 1), (0.34, 1, 1), (0.65,0, 0),
                         (1, 0, 0))}

cmap_data = { 'jet' : _jet_data }

def make_cmap(name, n=256):
    data = cmap_data[name]
    xs = np.linspace(0.0, 1.0, n)
    channels = []
    eps = 1e-6
    for ch_name in ['blue', 'green', 'red']:
        ch_data = data[ch_name]
        xp, yp = [], []
        for x, y1, y2 in ch_data:
            xp += [x, x+eps]
            yp += [y1, y2]
        ch = np.interp(xs, xp, yp)
        channels.append(ch)
    return np.uint8(np.array(channels).T*255)

def nothing(*arg, **kw):
    pass

def clock():
    return cv.getTickCount() / cv.getTickFrequency()

@contextmanager
def Timer(msg):
    print(msg, '...',)
    start = clock()
    try:
        yield
    finally:
        print("%.2f ms" % ((clock()-start)*1000))

class StatValue:
    def __init__(self, smooth_coef = 0.5):
        self.value = None
        self.smooth_coef = smooth_coef
    def update(self, v):
        if self.value is None:
            self.value = v
        else:
            c = self.smooth_coef
            self.value = c * self.value + (1.0-c) * v

class RectSelector:
    def __init__(self, win, callback):
        self.win = win
        self.callback = callback
        cv.setMouseCallback(win, self.onmouse)
        self.drag_start = None
        self.drag_rect = None
    def onmouse(self, event, x, y, flags, param):
        x, y = np.int16([x, y]) # BUG
        if event == cv.EVENT_LBUTTONDOWN:
            self.drag_start = (x, y)
            return
        if self.drag_start:
            if flags &amp; cv.EVENT_FLAG_LBUTTON:
                xo, yo = self.drag_start
                x0, y0 = np.minimum([xo, yo], [x, y])
                x1, y1 = np.maximum([xo, yo], [x, y])
                self.drag_rect = None
                if x1-x0 > 0 and y1-y0 > 0:
                    self.drag_rect = (x0, y0, x1, y1)
            else:
                rect = self.drag_rect
                self.drag_start = None
                self.drag_rect = None
                if rect:
                    self.callback(rect)
    def draw(self, vis):
        if not self.drag_rect:
            return False
        x0, y0, x1, y1 = self.drag_rect
        cv.rectangle(vis, (x0, y0), (x1, y1), (0, 255, 0), 2)
        return True
    @property
    def dragging(self):
        return self.drag_rect is not None


def grouper(n, iterable, fillvalue=None):
    '''grouper(3, 'ABCDEFG', 'x') --> ABC DEF Gxx'''
    args = [iter(iterable)] * n
    if PY3:
        output = it.zip_longest(fillvalue=fillvalue, *args)
    else:
        output = it.izip_longest(fillvalue=fillvalue, *args)
    return output

def mosaic(w, imgs):
    '''Make a grid from images.

    w    -- number of grid columns
    imgs -- images (must have same size and format)
    '''
    imgs = iter(imgs)
    if PY3:
        img0 = next(imgs)
    else:
        img0 = imgs.next()
    pad = np.zeros_like(img0)
    imgs = it.chain([img0], imgs)
    rows = grouper(w, imgs, pad)
    return np.vstack(map(np.hstack, rows))

def getsize(img):
    h, w = img.shape[:2]
    return w, h

def mdot(*args):
    return reduce(np.dot, args)

def draw_keypoints(vis, keypoints, color = (0, 255, 255)):
    for kp in keypoints:
        x, y = kp.pt
        cv.circle(vis, (int(x), int(y)), 2, color)

How to Blend Multiple Images Using OpenCV

In this project, we will blend multiple images using OpenCV. “Blending” means that we compute a weighted average of the pixel values for a set of color images which have the same dimensions.
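In symbols, for n images I_1, …, I_n of identical size and weights w_1, …, w_n that sum to 1, each pixel of the blended image is:

dst(x, y) = w_1·I_1(x, y) + w_2·I_2(x, y) + … + w_n·I_n(x, y)

For a plain average, every weight is w_i = 1/n.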

You Will Need 

  • Python 3.7+
  • A bunch of images that you want to blend together.

Directions

Let’s say you have a set of images. You would like to create a new image which is the average of all the images.

For example, I have about 10 images which I obtained from this weather forecast website. I’m interested in seeing the areas which will receive the most snow on average over the next 10 days (i.e. the darkest blues). In order to do that I need to create a single image which blends the weather forecast frames for the next 10 days.

Below is a slide show of the images I would like to blend.

blended-images-movie

Let’s blend all those images so that we create an “average” image. Here is the code:

# Python program for blending multiple images using OpenCV

import glob
import numpy as np
import cv2

# Import all image files with the .jpg extension
files = glob.glob("*.jpg")
image_data = []
for my_file in files:
    this_image = cv2.imread(my_file, 1)
    image_data.append(this_image)

# Calculate blended image
dst = image_data[0]
for i in range(1, len(image_data)):
    alpha = 1.0 / (i + 1)  # weight of the newest image
    beta = 1.0 - alpha     # weight of the running blend
    dst = cv2.addWeighted(image_data[i], alpha, dst, beta, 0.0)

# Save blended image
cv2.imwrite('weather_forecast.png', dst)

What I’m doing above is importing all images in the current directory that have the .jpg extension.

I then put each image into a list.

Each image is then blended in with a weight chosen so that, in the end, every image contributes equally to the result. So, for example, if I have 10 images in total, each image effectively gets multiplied by 1/10.
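If you want to convince yourself that the loop really computes a uniform average, here is a tiny sanity check with fake 1x1 “images” (my own toy example). At step i, dst = (1/(i+1))·image_i + (i/(i+1))·dst, which keeps dst equal to the running mean:

import numpy as np

# Three fake 1x1 "images" with pixel values 30, 60, and 90
image_data = [np.array([[30.0]]), np.array([[60.0]]), np.array([[90.0]])]

dst = image_data[0]
for i in range(1, len(image_data)):
    alpha = 1.0 / (i + 1)  # weight of the newest image
    beta = 1.0 - alpha     # weight of the running blend
    dst = alpha * image_data[i] + beta * dst

print(dst)  # [[60.]] -- the same as (30 + 60 + 90) / 3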

After computing the “average” image, I save it as weather_forecast.png. Here is the result:

weather_forecast

Pretty cool! We can see that the snowiest areas will be in Utah, central Arizona, and southwest portions of Colorado. Now I know where to hit the slopes!

Keep building!

How to Display an Image Using OpenCV

In this project, I will show you how to display an image using OpenCV.

You Will Need 

  • Python 3.7+

Directions

Let’s say you have an image like the one below. The file name is 1.jpg.

1

To display it using OpenCV, go to your favorite IDE or text editor and create the following Python program:

# Display a color image using OpenCV
import numpy as np
import cv2

# Load a color image (flag 1 = color; flag 0 would load grayscale)
img = cv2.imread('1.jpg',1)

cv2.imshow('image',img)
cv2.waitKey(0)
cv2.destroyAllWindows()

Save the program into the same directory as 1.jpg.

Run the file.

run-display-image

The image will display on your screen. Press any key to close the window. That’s it!

display-squirrel-picJPG
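As a side note, the second argument to cv2.imread controls the color mode: 1 loads the image in color, and 0 loads it in grayscale. A quick sketch of the grayscale variant:

import cv2

# Flag 0 (cv2.IMREAD_GRAYSCALE) loads the image in grayscale
img_gray = cv2.imread('1.jpg', 0)

cv2.imshow('grayscale image', img_gray)
cv2.waitKey(0)
cv2.destroyAllWindows()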

How to Install TensorFlow 2 on Windows 10

In this post, I will show you how to install TensorFlow 2 on Windows 10. TensorFlow 2 is a free, open-source software library used for machine learning applications. It comes integrated with Keras, a neural-network library written in Python. If you want to work with neural networks and deep learning, TensorFlow 2 should be your software of choice because of its popularity both in academia and in industry. Let’s get started!

Table of Contents

You Will Need 

Directions

Install TensorFlow 2

Here are the official instructions for downloading TensorFlow 2, but I will walk you through the process step-by-step.

Open an Anaconda command prompt terminal.

1-open-promptJPG

Type the command below to create a virtual environment named tf_2 with the latest version of Python installed. A virtual environment is like an independent Python workspace which has its own set of libraries and Python version installed. For example, you might have a project that needs to run using an older version of Python, like Python 2.7. You might have another project that requires Python 3.7. You can create separate virtual environments for these projects.

conda create -n tf_2 python

Press y and then ENTER.

2-type-yJPG

Wait for the software to download.

3-activate-tensorflow-2JPG

Once the download is finished, activate the virtual environment using this command:

conda activate tf_2

Check which version of Python you have installed on your system. I have Python 3.8.0.

python --version
4-python-versionJPG

Choose a TensorFlow package. I’ll install TensorFlow CPU. Let’s type the following command:

5-choose-a-packageJPG
pip install --upgrade tensorflow

You might see this error:

ERROR: Could not find a version that satisfies the requirement tensorflow (from versions: none)

ERROR: No matching distribution found for tensorflow

If you do, TensorFlow is not yet compatible with your (very new) version of Python, so downgrade Python inside the virtual environment with this command:

conda install python=3.6

Press y and then ENTER.

Check which version of Python you have installed on your system. I have Python 3.6.9 now.

python --version
6-downgrade-pythonJPG

Now install TensorFlow 2.

pip install --upgrade tensorflow

Wait for Tensorflow CPU to finish installing. Once it is finished installing, verify the installation by typing:

python -c "import tensorflow as tf; x = [[2.]]; print('tensorflow version', tf.__version__); print('hello, {}'.format(tf.matmul(x, x)))"

Here is the output:

9-voilaJPG

You should see your TensorFlow version in the output.

You might see this message:

“I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2”

Don’t worry, TensorFlow is working just fine. To get rid of that message, you can set the environment variables inside the virtual environment. Type the following command:

set TF_CPP_MIN_LOG_LEVEL=2

Now run this command:

python -c "import tensorflow as tf; x = [[2.]]; print('tensorflow version', tf.__version__); print('hello, {}'.format(tf.matmul(x, x)))"

Voila! Message gone. 

9-voilaJPG-1

Return to Table of Contents

Create a Basic Neural Network Using TensorFlow 2

To really see what TensorFlow 2 can do, let’s do the following:

  • Build a neural network that classifies images of clothing.
  • Train this neural network.
  • And, finally, evaluate the accuracy of the model.

We are going to roughly follow the TensorFlow beginner tutorial.

First, install the Matplotlib library.

pip install matplotlib

I’m now going to open up a text editor and type a Python program. I will save it to my D drive as fashion_mnist.py. Here is the code:

from __future__ import absolute_import, division, print_function, unicode_literals

# Import the key libraries
import matplotlib.pyplot as plt
import tensorflow as tf
import numpy as np

# Rename tf.keras.layers
layers = tf.keras.layers

# Print the TensorFlow version
print(tf.__version__)

# Load and prepare the MNIST dataset. 
# Convert the samples from integers to floating-point numbers:
mnist = tf.keras.datasets.fashion_mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# Let's plot the data so we can see it
class_names = ['T-shirt/top', 'Trouser', 'Pullover', 'Dress', 'Coat', 'Sandal',
 'Shirt', 'Sneaker', 'Bag', 'Ankle boot']
plt.figure(figsize=(10,10))

for i in range(25):
 plt.subplot(5,5,i+1)
 plt.xticks([])
 plt.yticks([])
 plt.grid(False)
 plt.imshow(x_train[i], cmap=plt.cm.binary)
 plt.xlabel(class_names[y_train[i]])
plt.show()

Within your virtual environment in the Anaconda terminal, navigate to where you saved your code. I will type:

D:

Then:

cd D:\<YOUR_PATH>\install_tensorflow2

Type dir to see if the Python (.py) file is in that directory.

Now run the code:

python fashion_mnist.py

You should see this graphic pop up.

10-fashion-datasetJPG

In the terminal window, press CTRL+C on your keyboard to stop the code from running.

Let’s add to our code. Open up the Python file again in the text editor and type the following code. If you are new to neural networks, don’t worry what everything means at this stage.

from __future__ import absolute_import, division, print_function, unicode_literals

# Import the key libraries
import matplotlib.pyplot as plt
import tensorflow as tf
import numpy as np

# Rename tf.keras.layers
layers = tf.keras.layers

# Print the TensorFlow version
print(tf.__version__)

# Load and prepare the MNIST dataset. 
# Convert the samples from integers to floating-point numbers:
mnist = tf.keras.datasets.fashion_mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0

# Let's plot the data so we can see it
class_names = ['T-shirt/top', 'Trouser', 'Pullover', 'Dress', 'Coat', 'Sandal',
 'Shirt', 'Sneaker', 'Bag', 'Ankle boot']
plt.figure(figsize=(10,10))

for i in range(25):
 plt.subplot(5,5,i+1)
 plt.xticks([])
 plt.yticks([])
 plt.grid(False)
 plt.imshow(x_train[i], cmap=plt.cm.binary)
 plt.xlabel(class_names[y_train[i]])
plt.show()

# Build the neural network layer-by-layer
model = tf.keras.Sequential()
model.add(layers.Flatten()) # Make the input layer one-dimensional
model.add(layers.Dense(64, activation='relu')) # Layer has 64 nodes; Uses ReLU
model.add(layers.Dense(64, activation='relu')) # Layer has 64 nodes; Uses ReLU
model.add(layers.Dense(10, activation='softmax')) # Output layer has 10 nodes (one per class); Uses Softmax

# Choose an optimizer and loss function for training:
model.compile(optimizer='adam',
 loss='sparse_categorical_crossentropy',
 metrics=['accuracy'])
 
# Train and evaluate the model's accuracy
model.fit(x_train, y_train, epochs=5)
model.evaluate(x_test,  y_test, verbose=2)

Run the code:

python fashion_mnist.py

When you see the plot of the clothes appear, just close that window so that the neural network can build and run.

Here is the output.

11-accuracyJPG

The accuracy of classifying the clothing items was 87.5%. Pretty cool huh! Congratulations! You’ve built and run your first neural network on TensorFlow 2.
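If you would like to see the trained network in action, you can append a few lines like these to the end of fashion_mnist.py (a sketch; model, x_test, y_test, np, and class_names are already defined in the script above):

# Predict the class of the first test image
predictions = model.predict(x_test[:1])
predicted_label = np.argmax(predictions[0])
print('Predicted:', class_names[predicted_label])
print('Actual:   ', class_names[y_test[0]])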

To deactivate the virtual environment, type:

conda deactivate

Then to exit the terminal, type:

exit

At this stage, I encourage you to go through the TensorFlow tutorials to get more practice using this really powerful tool.

Return to Table of Contents

Difference Between Supervised and Unsupervised Learning

In this post, I will explain the difference between supervised and unsupervised learning.

Table of Contents

What is Supervised Learning?

new_home_for_sale_4

Imagine you have a computer. The computer is really good at doing math and making complex calculations. You want to “train” your computer to predict the price for any home in the United States. So you search around the Internet and find a dataset. The dataset contains the following information for 100,000 houses that sold during the last 30 days in various cities across the United States:

  1. Square footage
  2. Number of bathrooms
  3. Number of bedrooms
  4. Number of garages
  5. Year the house was constructed
  6. Average size of the house’s windows
  7. Sale price

Variables 1 through 6 above are the dataset’s features (also known as input variables, attributes, independent variables, etc.). Variable 7, the house sale price, is the output variable (or target variable) that we want our computer to get good at predicting. 

So the question is…how do we train our computer to be a good house price predictor?  We need to write a software program. The program needs to take as input the 100,000-house dataset that I mentioned earlier. This program needs to then find a mathematical relationship between variables 1-6 (features) and variable 7 (output variable). Once the program has found a relationship between the features (input) and the output, it can predict the sale price of a house it has never seen before.

1-ipo

Let’s take a look at an analogy. Supervised learning is like baking a cake. 

Suppose you had a cake machine that was able to cook many different types of cake from the same set of ingredients. All you have to do as the cake chef is to just throw the ingredients into the machine, and the cake machine will automatically make the cake.

The “features,” the inputs to the cake machine, are the following ingredients:

  1. Butter
  2. Sugar
  3. Vanilla Extract
  4. Flour
  5. Chocolate
  6. Eggs
  7. Salt

The output is the type of cake:

  1. Vanilla Cake
  2. Pound Cake
  3. Chocolate Cake
  4. Dark Chocolate Cake
cake_cakes_sweet_bake

Different amounts of each ingredient will produce different types of cake. How does the cake machine know what type of cake to produce given a set of ingredients? 

Fortunately, a machine learning engineer has written a software program (containing a supervised learning algorithm) that is running inside the cake machine. The program was pre-trained on a dataset containing 1 million cakes. Each entry (i.e. example or training instance) in this gigantic dataset contained two key pieces of data: 

  1. How much of each ingredient was used in the making of that given cake
  2. The type of cake that was produced

During the training phase of the program, the program found a fairly accurate mathematical relationship between the amount of each ingredient and the cake type. And now, when the cake machine is provided with a new set of ingredients by the chef, it automatically “knows” what type of cake to produce. Pretty cool huh!

2-ipo-2

What I described above is called supervised learning. It is called supervised learning because the input data (which the supervised learning algorithm used to train) is already labeled with the “correct” answers (e.g. the type of cake in our example above, or the sale price values for those 100,000 homes from the earlier example in this post). We supervised the learning algorithm by “telling” it what the output (cake type) should be for 1 million different sets of input values (ingredients).

The fundamental idea of a supervised learning algorithm is to learn a mathematical relationship between inputs and outputs so that it can predict the output value given an entirely new set of input values. 

Let’s take a look at a common supervised learning algorithm: linear regression. The goal of linear regression is to find a line that best fits the relationship between input and output. For example, the learning algorithm for linear regression could be trained on square footage and sale price data for 100,000 homes. It would learn the mathematical relationship (e.g. a straight line in the form y = mx + b) between square footage and the sale price of a home. 

3-square-footage

With this relationship (i.e. line of best fit) in hand, the algorithm can now easily predict the sale price of any home just by being provided with the home’s square footage value. 

For example, let’s say we wanted to find the price of a house that is 2000 ft2. We feed 2,000 ft2 into the algorithm. The algorithm predicts a sale price of $500,000.

4-price-square-footage
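Here is a minimal sketch of that idea using NumPy; the square footage and sale price numbers below are made up purely for illustration:

import numpy as np

# Hypothetical training data: square footage (input), sale price (output)
square_feet = np.array([1000, 1500, 1800, 2400, 3000])
sale_price = np.array([250000, 375000, 450000, 600000, 750000])

# Fit a straight line y = mx + b to the data (a degree-1 polynomial)
m, b = np.polyfit(square_feet, sale_price, 1)

# Predict the sale price of a 2,000 square-foot home
print(m * 2000 + b)  # about $500,000 for this made-up data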

As you can imagine, before we make any predictions using a supervised learning algorithm, it is imperative to train it on a lot of data. Lots and lots of data. The more the merrier.

In the example above, to get that best fit line, we want to feed it with as many examples of houses as possible during training. The more data it has, the better its predictions will be when given new data. This, my friends, is supervised learning, and it is incredibly powerful. In fact, supervised learning is the bread and butter of most of the state-of-the-art machine learning techniques today, such as deep learning.

Now, let’s look at unsupervised learning.

Return to Table of Contents

What is Unsupervised Learning?

Let’s suppose you have the following dataset for a group of 13 people. For each person, we have the following features:

  • Height (in inches)
  • Hair Length (in inches)

Let’s plot the data to see what it looks like:

5-height

In contrast to supervised learning, in this case there is no output value that we are trying to predict (e.g. house sale price, cake type, etc.). All we have are features (inputs) with no corresponding output variables. In machine learning jargon, we say that the data points are unlabeled.

So instead of trying to force the dataset to fit a straight line or some sort of predetermined mathematical model, we let an unsupervised learning algorithm find a pattern all on its own. And here is what we get:

6-hair-length-vs-height

Aha! It looks like the algorithm found some sort of pattern in the data. The data is clustered into two distinct groups. What these clusters mean, we do not know, because the data points are unlabeled. However, given that the attributes are height and hair length, our prior knowledge of this dataset makes us suspect that the blue dots are males and the red dots are females.

What I have described above is known as unsupervised learning. It is called unsupervised because the input dataset is unlabeled. There is no output variable we are trying to predict. There is no prior mathematical model we are trying to fit the data to. All we want to do is let the algorithm find some sort of structure or pattern in the data. We let the data speak for itself. 

Any time you are given a dataset and want to group similar data points into clusters, you’re going to want to use an unsupervised learning algorithm.
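For example, here is a short sketch of clustering with scikit-learn’s k-means algorithm (the height and hair-length values are invented, and scikit-learn is an extra install: pip install scikit-learn):

import numpy as np
from sklearn.cluster import KMeans

# Hypothetical (height in inches, hair length in inches) for 8 people
X = np.array([[70, 2], [72, 1], [69, 3], [71, 2],
              [64, 14], [63, 12], [66, 15], [65, 13]])

# Ask k-means to find 2 clusters in the unlabeled data
kmeans = KMeans(n_clusters=2, n_init=10).fit(X)

print(kmeans.labels_)           # which cluster each person was assigned to
print(kmeans.cluster_centers_)  # the center of each discovered group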

Return to Table of Contents

How to Connect Arduino to ROS

How do we connect ROS to an actual embedded system that operates in the real, physical world? I’ll show you how to do this now using Arduino.

Arduino is a popular microcontroller for building electronics projects. A microcontroller is a small computer on a single chip that accepts data input, performs calculations, and produces output. With respect to robotics development, Arduino would be the “brain” of a robot.

Fortunately, ROS can integrate with Arduino. We will install some software that will enable your Arduino to be a bona fide ROS node that can do everything a normal node can do, such as publish and subscribe to ROS messages.

Here are the official steps for interfacing Arduino with ROS, but I’ll walk you through the process below.

Table of Contents

Directions

How to Install Arduino on Ubuntu Linux

First, let’s download the Arduino IDE (Linux 64 bit version) to our computer. I will follow these instructions for installing the Arduino IDE on Ubuntu. Go to this website, and download the software:

63-download-arduino-file

Save the file. It will be saved as tar.xz format to your Downloads folder.

Open a new terminal window.

Move to the Downloads folder (or wherever you saved the tar.xz file).

cd Downloads
64-change-to-downloads-directory

Run this command to extract the files (substitute FILENAME with the name of the file you just downloaded):

tar xvf <FILENAME>

In my case, I will run:

tar xvf arduino-1.8.10-linux64.tar.xz

You will see a bunch of file names print out to your screen.

If you type the dir command, you will see the new folder. Let’s move that folder to the home directory. You can cut and paste it using the file manager (the file cabinet icon on the left of the screen): go to the Downloads folder, cut the extracted folder, and paste it into the Home directory.

65-downloads-dir
66-arduino-in-home-directory

Now open up a new terminal window and type:

cd arduino-1.8.10

Type dir to see what files are inside.

67-what-files-are-inside

To install the IDE, type:

sudo ./install.sh
68-install-ideJPG

Here is the output. You should see a “done” message. A desktop icon will also be present. You can activate it by clicking on it and allowing permissions at the prompt.

Now, get your Arduino and connect it to the USB port on your computer.

Start Arduino by going into a new terminal window and typing:

arduino

You might see an error message like this:

69-arduino-startupJPG

Failed to load module “canberra-gtk-module” … but already installed

To resolve that error, press CTRL+C and close the terminal window.

Open a new terminal window, and type:

sudo apt-get install libcanberra-gtk-module

Now open a new terminal window, and type:

arduino

Let’s see if everything works by trying to blink the light-emitting diode (LED) on your Arduino board.

Go to File -> Examples -> 01.Basics, and choose Blink.

Try to upload that code to your Arduino by clicking the right arrow on the upper left of your screen.

70-right-arrow-to-uploadJPG

You should see an error about the “Serial port not selected”.

Close out of your Ubuntu Virtual Machine completely. Do not save the state.

Set Up the Serial Port for VirtualBox With Ubuntu

Assuming you are using Windows, go to your Device Manager. Search for that in your computer in the bottom left of your desktop.

Under Device Manager, you should see Ports (COM & LPT). Note which COM port your Arduino is on; mine is COM3. (Make sure your Arduino board is connected to the USB port of your computer.)

71-arduino-com-portJPG

Open your VirtualBox.

Click Settings and go down to Serial Ports.

72-settings-serial-portsJPG

Make sure your settings look exactly like this:

73-settings-after-serial-portJPG

Note that the little box next to “Enable Serial Port” is checked.

Side Note: Any time you unplug your Arduino (say, after using it within Ubuntu Linux, or after you shut down your PC), be sure to disable the “Enable Serial Port” option before restarting Ubuntu Linux in your VirtualBox. Otherwise, your Ubuntu Linux session will NOT launch. I’ve made this mistake numerous times, and it is frustrating.

After you are done, click OK.

Restart the VirtualBox with Ubuntu.

Open a new terminal window and type:

ls -al /dev/ttyS0

Here is the output:

74-serial-port-outputJPG

Now we need to give the IDE permissions to access the device.

In a new terminal window, find out your username. Type:

whoami

Now type the following commands, replacing YOUR_USER_NAME with what you found above:

sudo usermod -a -G dialout YOUR_USER_NAME
sudo chmod 660 /dev/ttyS0
76-add-dialoutJPG

Reboot your machine:

sudo reboot

Open up a terminal window and launch Arduino by typing:

arduino

Go to Tools -> Port, and you should see /dev/ttyS0. This is your Arduino board that is connected to the USB port of your computer. Make sure /dev/ttyS0 is checked.

77-arduino-board-selectedJPG

Now open the Blink sketch again. Go to File -> Examples -> 01.Basics, and choose Blink.

Click the Upload button…the right arrow in the upper left of your screen.

The LED on your Arduino should be blinking! If it is not, go back to the beginning of this tutorial and follow the steps carefully.

To turn off the blinking light, open up a new sketch and upload it to your board. Go to File -> New.

Integrate Arduino With ROS

Now that we know Arduino is working, we need to integrate it with ROS. Here are the official instructions. I’ll walk you through the steps below.

Let’s install the necessary packages. 

Close Arduino. Then type the following commands in a new terminal window (these will take a while to download so be patient):

sudo apt-get install ros-melodic-rosserial-arduino
sudo apt-get install ros-melodic-rosserial

Open the IDE by typing arduino and go to File -> Preferences. Make a note of the Sketchbook location. Mine is:

/home/ros/Arduino

78-sketchbook-locationJPG

Open a new terminal window and go to the sketchbook location you noted above. I’ll type:

cd Arduino
79-libraries-folderJPG

Type the dir command to see the list of folders.

Go to the libraries directory.

cd libraries

Within that directory, run the following command to build the Arduino library that will be used by ROS (don’t leave out that period that comes at the end of the command):

rosrun rosserial_arduino make_libraries.py .

Type the dir command to see the list of folders. You should now see the ros_lib library.

80-ros-lib-libraryJPG

Make sure the Arduino IDE is closed. Now open it again.

You should see some sample code. Now, let’s take a look at the Blink example.

Go to File -> Examples -> ros_lib

81-ros-lib-on-examplesJPG

Return to Table of Contents

How to Blink an LED (Light-Emitting Diode) Using ROS and Arduino

The Blink example is analogous to a “Hello World” program. Blinking an LED is the most basic thing we can do to make sure the hardware is working properly and that it accepts the software we are developing on our laptop. The goal of the Blink example is to toggle an LED on and off. 

In this example, Arduino is going to be considered a Subscriber node. It will subscribe to a topic called toggle_led. Publishing a message to that topic causes the LED to turn on. Publishing a message to the topic again causes the LED to turn off. 

Go to File -> Examples -> ros_lib and open the Blink sketch.

Now we need to upload the code to Arduino. Make sure your Arduino is plugged into the USB port on your computer.

Upload the code to your Arduino using the right arrow button in the upper left of your screen. When you upload the code, your Arduino should flicker a little bit.

Open a new terminal window and type:

roscore

In a new terminal window, launch the ROS serial server. This command is explained here on the ROS website. It is necessary to complete the integration between ROS and Arduino:

rosrun rosserial_python serial_node.py /dev/ttyS0
83-rosrun-rosserial-pythonJPG

Now let’s turn on the LED by publishing a single empty message to the /toggle_led topic. Open a new terminal window and type:

rostopic pub toggle_led std_msgs/Empty --once
84-publish-message-led-onJPG

The LED on the Arduino should turn on. Note the yellow light is on (my Arduino is inside a protective case).

2019-10-21-193316

Now press the Up arrow in the terminal and press ENTER to run this code again. You should see the LED turn off. You might also see a tiny yellow light blinking as well. Just ignore that one…you’re interested in the big yellow light that you’re able to turn off and on by publishing single messages to the /toggle_led topic.

85-publish-message-led-offJPG
2019-10-21-193535
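Instead of typing the rostopic command each time, you could also toggle the LED from a small rospy node. Here is a sketch (assuming roscore and the rosserial server above are still running) that toggles the LED once per second:

#!/usr/bin/env python
# Sketch: toggle the Arduino LED by publishing empty messages
# to the /toggle_led topic at 1 Hz.
import rospy
from std_msgs.msg import Empty

rospy.init_node('led_toggler')
pub = rospy.Publisher('toggle_led', Empty, queue_size=1)
rate = rospy.Rate(1)  # 1 Hz

while not rospy.is_shutdown():
    pub.publish(Empty())
    rate.sleep()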

Return to Table of Contents

That’s it! You have now seen how you can integrate Arduino with ROS. To turn off your Arduino, all you need to do is disconnect it.

How to Launch the TurtleBot3 Simulation With ROS

In this tutorial, we will work with a virtual robot called TurtleBot3. TurtleBot3 is a low-cost, personal robot kit with open-source software. You can read more about TurtleBot here at the ROS website.

TurtleBot3 is designed to run using just ROS and Ubuntu. It is a popular robot for research and educational purposes.

48-turtlebotsJPG

Table of Contents

Directions

I’m assuming you have ROS installed and are using Linux. If you don’t have ROS installed, install ROS now.

Let’s install the TurtleBot3 simulator now.

Open a terminal window and install the dependent packages. Enter the following commands, one right after the other:

cd ~/catkin_ws/src/
git clone https://github.com/ROBOTIS-GIT/turtlebot3_msgs.git
git clone https://github.com/ROBOTIS-GIT/turtlebot3.git
cd ~/catkin_ws && catkin_make

TurtleBot3 has three models, Burger, Waffle, and Waffle Pi, so you have to set which model you want to use before you launch TurtleBot3. Type this command to open the bashrc file to add this setting:

gedit ~/.bashrc

Add this line at the bottom of the file:

export TURTLEBOT3_MODEL=burger
49-update-bash-settingsJPG

Save the file and close it.

Now reload .bashrc so that you do not have to log out and log back in.

source ~/.bashrc

Now, we need to download the TurtleBot3 simulation files.

cd ~/catkin_ws/src/
git clone https://github.com/ROBOTIS-GIT/turtlebot3_simulations.git
50-turtlebot-simulationsJPG
cd ~/catkin_ws && catkin_make
51-run-catkin-makeJPG

Return to Table of Contents

Simulate TurtleBot3 Using RViz

Now that we have the TurtleBot3 simulator installed, let’s launch the virtual robot using RViz. Type this command in your terminal window:

roslaunch turtlebot3_fake turtlebot3_fake.launch

If you want to move TurtleBot3 around the screen, open a new terminal window, and type the following command:

roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch

Click the terminal window and use the keys below to control the movement of your TurtleBot (e.g. press the W key to move forward, the X key to move backward, and the S key to stop).

52-move-turtlebot-aroundJPG
52-turtlebot-rvizJPG
53-move-turtlebotJPG

And remember, use the keyboard to move the robot around.

54-how-to-control-turtlebotJPG

Press CTRL+C in all terminal windows.

Return to Table of Contents

Simulate TurtleBot3 Using Gazebo

Now let’s use Gazebo to do the TurtleBot3 simulation.

First, let’s launch TurtleBot3 in an empty environment. Type this command:

roslaunch turtlebot3_gazebo turtlebot3_empty_world.launch

Wait for Gazebo to load. It could take a while. Here is what your screen should look like:

55-gazebo-simulationJPG

Press CTRL+C and close out all windows.

Return to Table of Contents

How to Change the Simulation Environment for TurtleBot3

Let’s look at our TurtleBot3 in a different environment. This environment is often used for testing SLAM and navigation algorithms. Simultaneous localization and mapping (SLAM) concerns the problem of a robot building or updating a map of an unknown environment while simultaneously keeping track of its location in that environment.

In a new terminal window type:

roslaunch turtlebot3_gazebo turtlebot3_world.launch
56-gazebo-slam-navigation-viewJPG
57-the-burgerJPG

Press CTRL+C and close out all windows.

We can also simulate TurtleBot3 inside a house. Type this command and wait a few minutes for the environment to load.

roslaunch turtlebot3_gazebo turtlebot3_house.launch
58-turtlebot-in-a-houseJPG

To move the TurtleBot with your keyboard, use this command in another terminal tab:

roslaunch turtlebot3_teleop turtlebot3_teleop_key.launch

Press CTRL+C and close out all windows.

Return to Table of Contents

Autonomous Navigation and Obstacle Avoidance With TurtleBot3

Now let’s implement obstacle avoidance for the TurtleBot3 robot. The goal is to have TurtleBot3 autonomously navigate around a room and avoid colliding into objects.

Open a new terminal and type:

roslaunch turtlebot3_gazebo turtlebot3_world.launch

In another terminal window type:

roslaunch turtlebot3_gazebo turtlebot3_simulation.launch

You should see TurtleBot3 autonomously moving about the world and avoiding obstacles along the way.

59-autonomous-navigation-robotJPG

We can open RViz to visualize the LaserScan topic while TurtleBot3 is moving about in the world. In a new terminal tab type:

roslaunch turtlebot3_gazebo turtlebot3_gazebo_rviz.launch
60-open-rviz-visualize-laserscanJPG
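If you would rather inspect the laser data in a terminal than in RViz, a minimal rospy subscriber like this sketch (run it in a new terminal tab while the simulation is up) will print the distance to the nearest obstacle seen on the /scan topic:

#!/usr/bin/env python
# Sketch: print the distance to the closest obstacle in each laser scan.
import rospy
from sensor_msgs.msg import LaserScan

def callback(scan):
    # Keep only valid readings, then report the nearest one
    valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
    if valid:
        rospy.loginfo('Closest obstacle: %.2f m', min(valid))

rospy.init_node('scan_listener')
rospy.Subscriber('/scan', LaserScan, callback)
rospy.spin()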

Press CTRL+C and close out all windows.

Return to Table of Contents

Simulating SLAM With TurtleBot3

Let’s take a look at how we can simulate SLAM with TurtleBot3. As a refresher, simultaneous localization and mapping (SLAM) concerns the problem of a robot building or updating a map of an unknown environment while simultaneously keeping track of its location in that environment.

Install the SLAM module in a new terminal window.

sudo apt install ros-melodic-slam-gmapping

Start Gazebo in a new terminal window.

roslaunch turtlebot3_gazebo turtlebot3_world.launch

Start SLAM in a new terminal tab.

roslaunch turtlebot3_slam turtlebot3_slam.launch slam_methods:=gmapping

Start autonomous navigation in a new terminal tab:

roslaunch turtlebot3_gazebo turtlebot3_simulation.launch

Watch the robot create a map of the environment as it autonomously moves from place to place!

61-slam-using-turtlebot3JPG
62-slam-2-using-turtlebot3JPG
62-slam-3-using-turtlebot3JPG

And that is all there is to it.

When you’ve had enough, press CTRL+C and close out all windows.

Return to Table of Contents

That’s it for TurtleBot3. In case you want to try other commands and learn more, check out the official TurtleBot3 tutorials.