Liberate Your Robot With Ad-Hoc Networking

Don’t let your robot be trapped by its WiFi network… go with Robot Ad-Hoc Networking!

Ad-Hoc Wireless Connection

Robot Ad-Hoc Networking is easy to configure and means that you, your robot, and other pre-configured devices like a laptop or tablet, can go anywhere and not need a WiFi (router / internet) connection. This configuration allows you to enter a new location and operate the robot without access to any local networks.

The Scenario

When going “into the field”, robot operators want:

a static IP address (for the robot)

an Ad-Hoc wireless network

a DHCP server (for connecting clients to the robot, eg. a laptop or tablet)

When back at the office, operators want:

a dynamic IP address

the regular wireless network

the DHCP server disabled

The robot should automatically configure itself on boot up, based on what WiFi signals it finds.

The Solution

The file ‘/etc/rc.local’ calls ‘/home/ubuntu/scripts/go_dhcp_ad_hoc.sh’. This script checks whether the WiFi interface (‘wlan0’) obtained an IP address. If it did, the script does nothing; otherwise it configures Robot Ad-Hoc Networking and starts the DHCP server.

To use this method, install and configure:

  • the DHCP server
  • the ‘go_dhcp_ad_hoc.sh’ script
  • a call to the script from ‘rc.local’

On boot, the system will try to connect to the WiFi; if that fails, it will create the Ad-Hoc network and launch the DHCP server.

Install and Run

Note: please back up your robot software before making any modifications

1. Install required modules (run these commands on the robot):

$ sudo apt-get update

$ sudo apt-get install iw

$ sudo apt-get install isc-dhcp-server

2. Edit ‘/etc/dhcp/dhcpd.conf’, add the following to the end of the file:

subnet 192.168.0.0 netmask 255.255.255.0 {
     range 192.168.0.10 192.168.0.40;
     option broadcast-address 192.168.0.255;
     default-lease-time 600;
     max-lease-time 7200;
     authoritative;
}

3. Create the DHCP server defaults files (a backup of the original, plus a WLAN-specific version):

$ sudo cp /etc/default/isc-dhcp-server /etc/default/isc-dhcp-server.orig

$ sudo cp /etc/default/isc-dhcp-server /etc/default/isc-dhcp-server.wlan

$ sudo vi /etc/default/isc-dhcp-server.wlan

Change the last line of ‘/etc/default/isc-dhcp-server.wlan’ to:

INTERFACES="wlan0"

4. Install script (see below):

$ vi /home/ubuntu/scripts/go_dhcp_ad_hoc.sh

  (copy the content from below)

$ sudo chmod 777 /home/ubuntu/scripts/go_dhcp_ad_hoc.sh

5. Edit ‘/etc/rc.local’ to call the script, by adding the following to the end of the file:

sleep 2
sudo /home/ubuntu/scripts/go_dhcp_ad_hoc.sh

You’re done!

Connecting to the Ad-Hoc Network

If the WiFi network specified in ‘/etc/network/interfaces’ is not found, the robot will auto-configure the Ad-Hoc network and present the WiFi SSID:

  MiniTurtyAdHoc

Connect your laptop or tablet using the Ad-Hoc WiFi connection, and then use the robot in the normal manner. The IP address of the robot in Ad-Hoc mode is:

  192.168.0.1

To disable this functionality (prevent booting into Ad-Hoc mode), simply comment out (using ‘#’) the line that calls the script in the ‘rc.local’ file.

The ‘go_dhcp_ad_hoc.sh’ Script

$ cat ~/scripts/go_dhcp_ad_hoc.sh

#!/bin/sh
# If wlan0 already has an IP address, we are on a regular WiFi
# network and there is nothing to do.
if ! wpa_cli -i wlan0 status | grep -q ip_address
then
  echo 'No IP address, setting up Ad-Hoc'
  # Stop the processes that manage the regular WiFi connection
  if pstree | grep -q wpa_supplicant
  then
    sudo killall wpa_supplicant
  fi
  if pstree | grep -q avahi-daemon
  then
    sudo killall avahi-daemon
  fi
  if pstree | grep -q dhclient
  then
    sudo killall dhclient
  fi
  sleep 1

  # Reconfigure wlan0 as an Ad-Hoc network with a static IP address
  sudo ifconfig wlan0 down
  sudo iwconfig wlan0 mode Ad-Hoc
  sudo iwconfig wlan0 essid MiniTurtyAdHoc
  sudo ifconfig wlan0 192.168.0.1
  sudo ifconfig wlan0 up

  sleep 1
  # Start the DHCP server on wlan0, then restore the original defaults file
  sudo cp /etc/default/isc-dhcp-server.wlan /etc/default/isc-dhcp-server
  sudo systemctl restart isc-dhcp-server
  sudo cp /etc/default/isc-dhcp-server.orig /etc/default/isc-dhcp-server
fi

Need more help?… Ask a question in the comments below!

Navigating Mobile Robot Mapping

To understand the Mini Turty ROS mobile robot mapping operation, we begin with the simple script that is used to start the robot.

The Starting Point

Typically, to do mapping, users will call:

$ ./mini_turty_mapping.sh

The above file is contained in the ubuntu home directory (‘/home/ubuntu’). If you look in that file, you will see a line similar to:

roslaunch mini_turty3 hector_ydlidar_demo.launch

The above is what starts the robot doing mobile robot mapping. Let’s break that down into its pieces. The first part is the ‘roslaunch’ program.

Navigating in a map showing obstacle inflation

This program is part of the ROS system itself. It’s used to launch many different ROS nodes via one or more launch files. In the above case, roslaunch is launching a launch file called ‘hector_ydlidar_demo.launch’, and this launch file is contained in the ROS package called ‘mini_turty3’.

So where is this ‘hector_ydlidar_demo.launch’ file, you may be asking. There are a couple of ways of finding that, but one convenient method is to use the ROS bash shell command ‘roscd’. Type the following into a console:

$ source catkin_ws/devel/setup.bash

$ roscd mini_turty3

This will take you to the directory containing the mini_turty3 package. If you then run ‘pwd’:

$ pwd

You will see:

/home/ubuntu/catkin_ws/src/mini_turty3

So ‘roscd’ has taken us to the directory containing the ROS package ‘mini_turty3’. Run ‘ls’:

$ ls

This will show the directory contents similar to:

CMakeLists.txt include launch package.xml param scripts src

Note that we have a sub-directory called ‘launch’ here. This is where the previously mentioned ‘hector_ydlidar_demo.launch’ launch file exists. If you run:

$ ls launch

You will see output similar to:

amcl_rd_demo.launch    hector_move_base_demo.launch    hector_ydlidar_demo.launch
includes    minimu9-ahrs.launch    mini_turty.urdf    mini_turty3.urdf
robot_pose_ekf.launch    teleop.launch    test.urdf    test.xacro

Which lists all the launch files in the mini_turty3 package, including the one we are currently interested in (‘hector_ydlidar_demo.launch’). Let’s open our launch file and examine its contents:

$ vi launch/hector_ydlidar_demo.launch

The ‘hector_ydlidar_demo.launch’ launch file launches several ROS nodes.

The LiDAR Node

The first ROS node to be launched is:

  <node name="ydlidar_node"  pkg="ydlidar"  type="ydlidar_node"
output="screen">

This node is the ROS driver for the lidar. It handles all interaction with the lidar device and is responsible for starting/stopping, gathering the data from the lidar, and publishing the lidar data to ROS.

The G2 lidar

The Mapping Node

The next node to be launched is:

  <node pkg="hector_mapping" type="hector_mapping" name="hector_mapping"
output="screen">

This node is the “Hector Mapping” node. It’s responsible for the creation of the map, using the data from the lidar. It takes lidar scans and adds them into the map. The map is then published to ROS for other nodes to access.

The Robot Base Node

Moving on, the next important node is:

  <node pkg="mini_turty3" type="mini_turty3" name="mini_turty3"
required="true">

This node is used to control the robot base. It’s responsible for responding to ROS tele-op commands, and for publishing odometry.

Each of the above launched nodes has a ROS package under the ‘catkin_ws’ directory. If you run the command:

$ ls ~/catkin_ws/src/

You will see several ROS packages (for some robots, there could be many more!):

minimu9-ahrs mini_turty3 raspicam_node ydlidar

You may ‘roscd’ into any of these packages. But let’s stay in our ‘mini_turty3’ package and examine the source code of this node:

$ ls src

You will see:

mini_turty3.cpp

This is the source file for the ‘mini_turty3’ base. You could open this using many different editors, but let’s use ‘vi’ again:

$ vi src/mini_turty3.cpp

Here you will find the ‘main()’ program for the mini_turty3 base. Please examine this source code to understand how the base operates. Here’s the code snippet of the main function:

int main(int argc, char *argv[])
{
  int pi;
  int enable_odom_transform;
  
  pi = init_gpio();
  if(pi) {
    printf("Error initializing pigpio!\n");
    return pi;
  }

  ros::init(argc, argv, "mini_turty");
  ros::NodeHandle private_node_handle_("~");
  ros::Rate loop_rate(50);

  private_node_handle_.param("enable_odom_transform", enable_odom_transform, int(0));

  MiniTurtyBase mini(pi);
  mini.setCmdVelTopic("cmd_vel");
  mini.setEnableOdomTransform(enable_odom_transform);

  while (ros::ok()) 
  {
    if (got_ctrl_c) {
      ROS_WARN("Got Ctrl-C");
      break;
    }

    ros::spinOnce();
    loop_rate.sleep();
  }

  // set SLEEP (active low)
  gpio_write(pi, GPIO_SLEEP, 0);
  pigpio_stop(pi);

  ROS_WARN("Exiting.");

  exit(0);
}

Note that most ROS nodes have a ‘src’ directory, and that is where the code for a node is usually kept. For example, you may find the code for the lidar in ‘catkin_ws/src/ydlidar/src’.

Binary Install of ROS Packages

Be aware that many ROS packages are by default ‘binary install’. This means that you currently have pre-built binaries only (to save build time and storage space!). This is the case for ‘hector_mapping’. However, you may access the source code, and even install it onto the robot by going to the github location. To see the source code for hector_mapping, first go to the ROS wiki:

http://wiki.ros.org/hector_mapping

Then click on the source link:

Source: git https://github.com/tu-darmstadt-ros-pkg/hector_slam.git (branch: melodic-devel)

This immediately takes you to the github repository containing the Hector Mapping source code.

If you decide to install any source code, please make sure to back up your SD card first, and then restore it to the original if you find any problem. This will be important if you require support, as it is not possible for us to debug a user-modified installation.

As you can see, navigating your way through the Mini Turty ROS packages is not difficult. Just take it one step at a time and follow the path through in a manner similar to that which is described above. You will soon be getting oriented and making your way in the world of ROS autonomous mobile robot mapping!

Visualizing The LiDAR Data On Your Mapping Robot

You’ve set up your robot and remote PC; now the fun begins. Let’s visualize the robot laser data on our autonomous LiDAR mapping robot!

If you’ve not already done so, power up your robot, initialize it and run the mapping script (see our Tutorials for more details). With the robot now running (and the on-robot lidar device scanning), follow this procedure:

First, make sure that the remote PC is configured to work with your robot. Switch to your remote PC and open up a terminal window in Ubuntu. In the terminal, type:

env | grep ROS

In our example, here is what we got:

ROS environment variables

From this, we can see that our remote PC is currently configured to work with a ROS master that would be located on the same machine. Specifically, the line:

ROS_MASTER_URI=http://localhost:11311

That shows us that this host expects to find its ROS master locally. Well, that’s not how we want to work with our robot (we want to make our robot the ROS master). So let’s go ahead and change that, by running the following command:

export ROS_MASTER_URI=http://ubuntu:11311

The above command tells our remote PC that the ROS master is to be found on the host with the hostname ‘ubuntu’. So now, when we run ROS commands, these commands will go to our robot (whose default name is: ubuntu). If you have renamed your robot, you will want to replace the robot name with whatever name you used.

Examining ROS Topics

Having configured our remote PC to use the robot as the ROS master, let’s quickly examine what ROS topics are being published by our robot:

rostopic list

The above command yields the following response:

ROS topics

This lists out several ROS topics that are running, including the ‘/scan’ topic which is where our laser data is presented. Let’s have a quick look at that laser data, by typing the following:

rostopic echo /scan

Which shows:

Echo topic 'scan'

That’s the tail end of a ROS topic ‘/scan’ showing the data point “intensities”. We’ll skip over the details of just what that is.

Visualizing the LiDAR Data

For now, let’s visualize the scan data in the graphical package RViz, by typing:

rosrun rviz rviz

This will start the ROS RViz program, where we will be visualizing the robot laser data from our autonomous lidar mapping robot…

Visualizing the LiDAR data

If all went well, you should be seeing something similar to the above. If not, you will want to check your ROS connectivity (see our Tutorials for more details). Enjoy!

Register with Rhoeby to Win a Mini-Turty Flex!

Win a ROS navigation-capable robot! Advanced robotics navigation, map building, tele-op, tele-viewing, frontier exploration and computer vision. Little robots with BIG capabilities.

Register for a new account here:

https://rhoeby.com/my-account/

for a chance to win one of these marvelous robots.

Mini-turty Flex Robot

The raffle runs until the end of the Bay Area Maker Faire 2019, so hurry for a chance to win. Winner will be announced in this blog on 5/24/19.

See us and the robots at the Bay Area Maker Faire 2019!


Things You Can Do With The Mini-Turty Flex Robot

There are many things you can do with your Mini-Turty Flex robot, including:

  • Basics: ROS learning
  • Teleop: robot remote control
  • Map Building: make maps of your home or office for the robot to use
  • Navigation: the robot moves autonomously around your home or office
  • Tele-Viewing: see what your robot sees, even from another room
  • Frontier Exploration: the robot autonomously explores unknown terrain
  • Computer Vision: Mini-Turty Flex recognizes objects in its environment

Finding Your Robot IP Address

So your robot is powered up and hopefully it’s properly connected to the WiFi network, but you can’t seem to figure out the IP address.

There are several ways to find out what IP address your robot is using. One involves using a program called “ping”, and requires that you know your robot hostname. Another involves plugging in an HDMI monitor and keyboard to the robot. Yet another requires logging into your router to search for connected devices. We’ll detail all of these techniques here.

Using Ping To Determine Your Robot IP Address

This method is quick and simple assuming you know your robot hostname and your network is properly configured. It goes like this:

1. Open a terminal window on any host connected to the network. For this example, we’ll use a Windows PC and the Command Prompt (but the command is the same regardless). In the terminal window, type:

ping ubuntu

You should see output similar to:

As can be seen from the ping replies, the IP address of this host (hostname: ubuntu) is ‘192.168.0.33’.

Getting The IP Address Using Monitor And Keyboard

This method is often the easiest if you don’t know (or have mislaid!) your robot network hostname. It’s also the most reliable method, as there is no potential for hostname conflicts that might cause issues with the ping or router method. So to begin:

1. Plug your HDMI monitor into the HDMI port on the Raspberry Pi.

2. Then plug in a USB keyboard, and power up your robot. You should be able to observe the Raspberry Pi booting up on the HDMI monitor, after which, it will present you with a login prompt.

3. Login using the usual credentials.

4. Then type:

ifconfig

Your robot will report its IP address.

You can now disconnect the keyboard and monitor and continue to use your robot. Note that most networks provide some “stickiness” for the IP address, meaning that the robot could retain the same address even after shutdown and reboot, if the period between shutdown and reboot is not too long (up to a day is not uncommon).

Getting The IP Address From The Router

This method uses the router itself to find out what devices are connected to the network. Most routers can provide this information along with the hostname of the connected device. Begin by logging into your router (this process is router specific; consult your router’s documentation if needed).

After you log in to your router, navigate to the “Attached Devices” panel (or wherever your particular router stores this information), and check for a wireless device name that matches the hostname of your robot (default: ubuntu). Then simply look up the IP address associated with your robot name, as presented by the router (usually in some sort of lookup table).

Encoder-Free Robot Odometry Using Stepper Motors

Many ground-based robots employ motor shaft encoders to help provide a reasonable estimate of the robot movement along the surface of the ground, also known as robot odometry.

This technique enables robot odometry via feedback from the motor shaft. But what if we could do away with that and adopt a simpler approach? Could similar results be achieved whilst lowering costs and simplifying the system? The answer to that question is a resounding yes, and the solution is to use stepper motors.

Conventional DC Motor and Encoder-Based Systems

Conventional methods for wheeled-robot odometry employ shaft rotation feedback encoders that measure the actual rotation of the wheels on the robot. Typically using hall-effect or optical sensors, data is acquired as a series of pulses from the encoders, the frequency of which increases as the wheel turns faster. Owing to the nature of many electrical motor systems, it’s otherwise not possible to accurately predict how much a wheel will turn, given a certain electro-motive force. Thus shaft encoders are a mandatory feature on most ground-based robots that require odometry.

A typical DC motor with shaft encoder

Wheel encoders typically use 3-4 wires per encoder (plus the motor connections), along with a software-based Proportional-Integral-Derivative (PID) controller to convert the pulses into usable odometry.

On a two-wheel drive robot, that then necessitates up to twelve connections for the data and power lines, and the creation of two PID filters. That’s a lot of resources just to be able to know what your wheels are doing!

Using Stepper Motors

Stepper motors are in common use today in many applications, with one of the most notable examples being that of 3D printers. A 3D printer requires highly accurate positioning abilities and the ability to hold the motor in any given position. That’s a nice feature of stepper motors for 3D printing, but they can also be used to drive robot wheels. Robots that utilize stepper motors for locomotion produce highly predictable and accurate motion, and can also be put to “Sleep” when any given wheel is stationary, thus saving power.

A typical stepper motor

Notable is the fact that stepper motors turn in discrete steps (thus the name!). That is to say that, under the control of a dedicated circuit known as a stepper controller, the motor shaft can be positioned anywhere around the 360-degree rotation, typically with a 1.8-degree step size.

Inferring the Motor Rotation

By providing a continuous pulse train to the stepper controller, the on-robot computer can cause the stepper to produce a continuous rotation, the speed of which is proportional to the frequency of the pulse train. A low frequency pulse train causes the wheel to turn slowly, whilst a high frequency pulse train causes the wheel to turn faster. In a properly designed system, there will be no missed steps, meaning that each pulse will result in a single step of the motor, and so the host computer can infer exactly what the speed of rotation of the wheel is, essentially by counting the pulses it is sending to the motor controller.

The stepper motor controller

Most modern host computers (including the Raspberry Pi) have dedicated hardware that can produce the necessary pulse trains to the stepper motor controllers. This hardware is known as Pulse Width Modulation (PWM), and is commonly used to create the necessary signals. As such, this requires very little intervention by the host computer. Furthermore, once set up, speed control and odometry are achieved with no additional resources. Thus encoder-free odometry using stepper motors is both simple and easy to implement, both at the hardware level (no need for encoder wiring) and at the software level (no need for a PID controller).

Examining the Code

Underside of controller
Underside of controller showing connections

On the Mini-Turty host computer, using a timer callback, we periodically update the speed of the motor according to what is instructed by the robot teleop / velocity commands.

For our differential drive robot, we use stepper motors driven by stepper controllers, which are directly controlled by the Raspberry Pi host computer. To understand how we derive the odometry, consider the following code snippet:

void MiniTurtyBase::timerCallback(const ros::TimerEvent&)
{
  double vx;
  double vth;
  
  updateMotorControl(DIFFDRIVE_MOTOR_LEFT);
  updateMotorControl(DIFFDRIVE_MOTOR_RIGHT);

  vx = (motors[DIFFDRIVE_MOTOR_RIGHT].currentSpeed + 
        motors[DIFFDRIVE_MOTOR_LEFT].currentSpeed) / 2;
  vth = (motors[DIFFDRIVE_MOTOR_RIGHT].currentSpeed - 
         motors[DIFFDRIVE_MOTOR_LEFT].currentSpeed) * 8;

  publishOdom(vx, vth);
}

The above timer callback function is attached to a ROS timer, and hence is called periodically. On each invocation, the callback updates the motor control with the new motor speeds for the left and right motors. It then calculates the velocity ‘vx’, which is the speed that the robot is moving in the forward (or backward) direction. It also calculates the velocity ‘vth’, which is the speed at which the robot is rotating about the z-axis (aka turning!). These two velocities are derived from the current speed that the motors are rotating at, which in turn is derived from the speed that was set. Thus we achieve a quasi-feedback from the motors that is purely software-based. These calculated linear and rotational velocities are then passed to the function ‘publishOdom()’, whose purpose is to send the odometry information back to ROS.

Taking Into Account Slew Rate

At this point, the astute reader might be wondering: why not just use the speed that was set by the incoming velocity command and publish that? Well the answer to that is we need to take into account the fact that stepper motors do not instantaneously change speed. For most velocity alterations, it is necessary to “slew” the rate of the motor. Slewing, in this context, is the process of gradually changing from one speed to another. Thus a motor that is going from say 20 RPM to 100 RPM might make that change as a series of steps, going from 20, to 30, to 40, and so on up to 100. So in order to provide accurate odometry information, it’s important that the actual speed of the motor is used, rather than any target speed that we may be currently slewing toward.

The full code for the Mini-Turty series of robots, including the code for the robot odometry, can be found at our github repository.

Summary

It turns out that practical systems for robot odometry can be achieved through the use of stepper motors as the primary means of locomotion. When combined with modern LiDAR, this then enables robot localization, mapping and navigation. Moreover, this use of stepper motors gives rise to a simpler implementation that’s easier to create, cheaper to realize, and provides a result comparable to that achieved with the more conventional system using DC motors and encoders.

Backup Your Robot Software Regularly

Whenever you get a new robot, or make major changes to the configuration, it’s a good idea to make a backup of the software.

It’s an important step that will allow you to restore to a known good state in case there is a difficult-to-resolve configuration problem, or if the SD card that holds the software gets corrupted. One of the best and most reliable methods is to simply make a copy of the entire SD card. You can do this by purchasing a similar card (the closer the better).

Copy Your SD Card

The card we use on Mini-Turty is the SanDisk Ultra 16GB Ultra Micro SDHC UHS-I/Class 10. You could easily find one online.

How To Backup Your SD Card

To make a copy:

1. Begin by shutting down your robot, switching it off, and then carefully removing the SD card from the Raspberry Pi board

2. Insert the card into a Linux PC with sufficient disk space (you’ll need free space at least the size of the SD card, eg. 16 GB)

3. Check which device your SD card is mounted to, by typing:

df

Show devices

The last two lines show the SD card (in our case, yours may differ), which is mounted as ‘/dev/sdb1’ and ‘/dev/sdb2’.

For additional information and confirmation, directly after you inserted the SD card, you could also type:

dmesg | tail

Show device mount points

Again, this shows that the card was mounted as sdb1 and sdb2. There are two mount points because there are two partitions on the SD card (the boot partition and the Linux partition).

4. Unmount the SD card, by typing:

umount /dev/sdb1

umount /dev/sdb2

The above is just an example. Your SD card may appear as a different device.

5. Begin the copy from the SD card to the PC, by running the following command:

sudo dd if=/dev/sdb of=mini_turty2.img bs=4M count=5000

Be careful to get the above command right, because it could corrupt your PC’s disk if you accidentally target the wrong device. Be patient: the command will take a long time to complete, and it may look as if nothing is happening. Also, you may need to adjust the parameters to suit your system (eg. your SD card might appear as ‘sdc’).

6. Once the copy to the PC is completed, remove the original SD card and insert the new card.

7. Now copy from the PC to the new card by typing:

sudo dd if=mini_turty2.img of=/dev/sdb bs=4M count=5000

When the copy is completed, reboot your robot with the new card to confirm that your backup card is working.

We’ll cover another method to backup your robot software on a Windows-based system in a later post.