Adding ROS Raspberry Pi 4B Camera

This post describes how to get the Raspberry Pi 4B camera working under ROS. Try following these steps to get it working on your Pi 4B-based robot too!

Where to Begin?

The basic config of the robot is a Raspberry Pi 4B running Ubuntu 18.04 and ROS Melodic (see the Install ROS on Raspberry Pi 4B post below).

The previously employed Ubiquity approach failed on the Pi 4, so I fell back to trying an install along the lines of:

https://github.com/fpasteau/raspicam_node,

and,

https://github.com/raspberrypi/userland

The Details

Get the userland code:

git clone https://github.com/raspberrypi/userland.git /home/ubuntu/userland

Then…

cd userland
./buildme --aarch64

The build appeared to succeed.

My quick, lame attempt to see if it worked failed:

raspicam

Command not found.

But,

ubuntu@ubuntu-pi4:~/userland$ sudo find . -name raspicam
./host_applications/linux/apps/raspicam

So the code appears to be there but is perhaps not getting built. Note the comment on ‘https://github.com/raspberrypi/userland’…

“Whilst 64-bit userspace is not officially supported, some of the libraries will work for it.”

There might be a ray of hope here:

https://github.com/6by9/userland/tree/64bit_mmal

Trying it…

git clone https://github.com/6by9/userland.git

Step-by-step commands:

cd src/
git clone https://github.com/6by9/userland.git
cd userland/
git checkout 64bit_mmal
./buildme --aarch64
ls /opt/vc/bin/

The listing now includes 'raspivid', at '/opt/vc/bin/raspivid'. To locate the MMAL libraries that were built:

find . -name libmmal_core.so

This now runs:

LD_PRELOAD="/opt/vc/lib/libmmal_vc_client.so /opt/vc/lib/libvcsm.so /opt/vc/lib/libmmal_core.so /opt/vc/lib/libmmal_util.so" /opt/vc/bin/raspistill -o cam.jpg

But gives errors:

mmal: Cannot read camera info, keeping the defaults for OV5647
mmalipc: mmal_vc_dump_client_components: mmal_vc_dump_client_components: 0 entries in use
mmalipc: mmal_vc_dump_client_contexts: mmal_vc_dump_client_contexts: 0 entries in use
mmal: mmal_vc_component_create: failed to create component 'vc.ril.camera' (1:ENOMEM)
mmal: mmal_component_create_core: could not create component 'vc.ril.camera' (1)
mmal: Failed to create camera component
mmal: main: Failed to create camera component
mmal: Camera is not enabled in this build. Try running "sudo raspi-config" and ensure that "camera" has been enabled

Fixing the Runtime Errors

So next… I added:

start_x=1
gpu_mem=128

to ‘/boot/firmware/config.txt’.

And added:

SUBSYSTEM=="vchiq", GROUP="video", MODE="0660"

to ‘/etc/udev/rules.d/10-vchiq-permissions.rules’.

sudo vi /etc/udev/rules.d/10-vchiq-permissions.rules
sudo usermod -a -G video ubuntu
sudo reboot
LD_PRELOAD="/opt/vc/lib/libmmal_vc_client.so /opt/vc/lib/libvcsm.so /opt/vc/lib/libmmal_core.so /opt/vc/lib/libmmal_util.so" /opt/vc/bin/raspistill -o cam.jpg
LD_PRELOAD="/opt/vc/lib/libmmal_vc_client.so /opt/vc/lib/libvcsm.so /opt/vc/lib/libmmal_core.so /opt/vc/lib/libmmal_util.so" /opt/vc/bin/raspivid -o video.h264 -t 10000

(Note: there are some messages to the console that don’t inspire confidence, but they seem to actually be harmless!)

export LD_LIBRARY_PATH=/opt/vc/lib
/opt/vc/bin/raspivid -o video.h264 -t 10000

sudo ln -s /opt/vc/bin/raspivid /usr/bin/raspivid
raspivid -o video.h264 -t 10000

(added ‘export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/vc/lib’ to .bashrc)

sudo apt install ffmpeg

Direct encode to .mp4:

raspivid -ih -t 0 -o - | ffmpeg -i - -vcodec copy test.mp4

Enable ROS raspicam_node

To enable raspicam_node, do the following steps:

cd catkin_ws/src/
git clone https://github.com/fpasteau/raspicam_node.git raspicam
cd ..
catkin_make
source devel/setup.bash
export LD_PRELOAD="/opt/vc/lib/libmmal_vc_client.so /opt/vc/lib/libvcsm.so /opt/vc/lib/libmmal_core.so /opt/vc/lib/libmmal_util.so"

rosrun raspicam raspicam_node _height:=1280 _width:=960

In another terminal:

rosservice call /camera/start_capture

Success!

With the above in place and working, getting the video presented in ROS follows standard practice.
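For example, one quick way to eyeball the stream from a remote PC is with image_view. The topic name here is an assumption (the node publishes a compressed image, typically under '/camera/image'); confirm it first with 'rostopic list':

rostopic list
rosrun image_view image_view image:=/camera/image _image_transport:=compressed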

Troubleshooting

If you run into problems, it would be best to start from a fresh install of Ubuntu/ROS, see:

Install ROS on Raspberry Pi 4B

Let me know your experience in the comments…

3D LiDAR SLAM – Quick Preview

Check out Rhoeby Dynamics' new video showing recent work done at our lab using 3D LiDAR with 2D mapping (SLAM) on our Mini-Turty II robot platform.

See a quick preview at: 3D LiDAR SLAM.

For more details, to buy a turn-key fully assembled robot/kit, or integrate Rhoeby Dynamics mapping, navigation and frontier exploration solutions into your platform, contact us today!

Rhoeby 3D – Coming soon!

We’re excited about the future and it’s 3D!

Stay tuned.

Install ROS on Raspberry Pi 4B

This post covers getting ROS Melodic up and running on Raspberry Pi 4B using the Ubuntu 18.04 (Bionic) distribution of Linux.

To begin the install of ROS on Raspberry Pi 4B, we need to consider: which version of ROS should we use?

To answer this question, we could consider several factors:

  • most recent available?
  • best supported / documented in the community?
  • most well-tried and tested?
  • supports the hardware?

We also must take into consideration the Ubuntu version.

A quick review of the Ubuntu distros available for the Pi 4B shows that Ubuntu 19.10 is the officially supported version. However, the corresponding ROS version (for that version of Ubuntu) is not yet available from Open Robotics, the creators of ROS.

ROS distributions let developers work against a relatively stable codebase, and the ROS Wiki shows the ROS Melodic distro is targeted at the Ubuntu 18.04 (Bionic) release. For more details of which ROS releases inter-operate with which OS release, see:

http://wiki.ros.org/Distributions

Given the above considerations, Ubuntu 18.04 LTS is the best choice. Note the chain of dependency from hardware to OS to ROS version. Use of the Pi 4 results in the selection of a particular Ubuntu release (18.04), which then implies ROS Melodic.

Pre-built images for Ubuntu and ROS

To keep things simple and minimize effort, it’s preferable to use pre-built images for the ROS and Ubuntu distros (if available). This avoids the need to build ROS or Linux from source, but also raises a couple of questions:

  • 32-bit or 64-bit?
  • which official / unofficial image?

A quick search suggests it would be best to choose a 64-bit image for maximum performance and the ability to access all 4 GB of memory (if you have it!). More about this later.

The table below (shamelessly copied from http://wiki.ros.org/melodic/#installation) lists the platform/architecture combinations for which ROS install binary packages are available.

         Ubuntu Artful   Ubuntu Bionic   Debian Stretch
x86_64         X               X               X
armhf                          X
arm64                          X               X

As can be seen, the ROS distro for Ubuntu Bionic supports arm64! This means that we can use a ROS binary on a 64-bit OS image to install ROS on Raspberry Pi 4B.

Need to update the on-board firmware

As of Raspberry Pi 4, we now have additional boot code considerations. On Pi 4 the boot firmware is stored on a separate EEPROM chip, not the SD card.

The boot firmware that ships with a new Pi 4 will not contain the latest code, because boards ship with the initial release of the firmware installed, which then gets revised later. Consequently, it is highly recommended, well, mandatory really, to update the firmware. See…

https://jamesachambers.com/raspberry-pi-4-bootloader-firmware-updating-recovery-guide/

Unfortunately, we must boot from Raspbian (the Linux distro from the Raspberry Pi Foundation) not Ubuntu, in order to update the firmware! That sucks potatoes, but it is what it is.

Create Raspbian SD card

So the first step towards updating the boot firmware is to create an SD card with Raspbian installed. Fret not, the process is simple and reliable. Get the zip file from:

https://downloads.raspberrypi.org/raspbian_lite_latest

Insert the SD card into a Linux PC and create the Raspbian SD card using the following command:

sudo unzip -p /home/jjordan/tmp/2019-09-26-raspbian-buster-lite.zip | sudo dd of=/dev/sdb bs=4M conv=fsync
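A word of caution: the '/dev/sdb' target above is specific to this machine. Before running dd, it's worth confirming which device node your SD card actually shows up as, for example with lsblk, so you don't overwrite the wrong disk:

lsblk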

If you’re not sure what to do here, or run into difficulties, you could refer to https://rhoeby.com/blog/backup-your-robot-software/ for some additional information.

After the above command completes, remove the card from the PC and re-insert it into your Pi 4. Power up your Pi 4, and it should boot into Raspbian OS!

Update the firmware

To update the firmware, boot with Raspbian SD card and the Pi 4 connected to a TV/monitor, keyboard and ethernet connection with access to the internet. The default login credentials for Raspbian are:

login: pi
password: raspberry

For more complete details of updating the firmware, see…

https://jamesachambers.com/raspberry-pi-4-ubuntu-server-desktop-18-04-3-image-unofficial/

I’ll summarize the steps here. Using the keyboard and monitor attached to the Pi 4, manually type in the following commands:

sudo apt-get update && sudo apt-get dist-upgrade -y

sudo rpi-update

sudo rpi-eeprom-update -a
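If you want to confirm that the EEPROM actually picked up a newer image, you can query the bootloader version before and after the update using Raspbian's vcgencmd tool (the exact output format varies):

vcgencmd bootloader_version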

That’s it!

Create the 64-bit Ubuntu SD card

Now that we’ve gotten the work of updating the firmware out of the way, we’re free to move forward with getting Ubuntu installed. We’re going to do this on a separate SD card, as we would like to keep our Raspbian OS for any future firmware updates.

Begin by getting the unofficial Ubuntu image from:

https://github.com/TheRemote/Ubuntu-Server-raspi4-unofficial/releases/download/v22/ubuntu-18.04.3-preinstalled-server-arm64+raspi4.img.xz

If you run into a problem with the ‘.xz’ file, you may need to unzip it and re-zip it to ‘.zip’ file format. I had to do this because my ancient Samsung laptop that I use for low-level tasks like burning SD cards does not understand ‘.xz’ files (nor will it ever!)

Having downloaded the Ubuntu binary image, we then burn it to the SD card:

sudo unzip -p /home/jjordan/tmp/ubuntu-18.04.3-preinstalled-server-arm64+raspi4.zip | sudo dd of=/dev/sdb bs=4M conv=fsync

Transfer the SD card to the Pi 4 and it should boot into Ubuntu!… But, no WiFi yet, of course!

Bring up WiFi

In more recent incarnations of Ubuntu, the way WiFi is configured has changed. If you’re going from Ubuntu 16.04 to 18.04, you will need to get on-board (forgive the pun) with this new method. Not to worry, it’s really simple.

For an overview of this method, known as “netplan”, here’s a good reference:

https://www.idmworks.com/netplan-for-newbies/

To set up the Pi 4 WiFi, we simply edit the following file:

ubuntu@ubuntu:~$ cat /etc/netplan/50-cloud-init.yaml

# This file is generated from information provided by
# the datasource.  Changes to it will not persist across an instance.
# To disable cloud-init's network configuration capabilities, write a file
# /etc/cloud/cloud.cfg.d/99-disable-network-config.cfg with the following:
# network: {config: disabled}
network:
    ethernets:
        eth0:
            dhcp4: true
            match:
                macaddress: dc:a6:32:27:8e:41
            set-name: eth0
    version: 2

    wifis:
        wlan0:
            dhcp4: yes
            dhcp6: no
            access-points:
                NETGEAR20:
                    password: xxx

Ignore the confusing comment about changes not persisting across an instance. It’s not relevant to our situation. Just set the WiFi SSID and password, then reboot. The above example shows how to change the file given the following:

SSID: NETGEAR20
password: xxx

Simply change the above to suit whatever your WiFi credentials happen to be.

Reboot your Pi 4. WiFi is up!
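As an aside, a reboot isn't strictly required: you can usually apply the new netplan configuration in place with:

sudo netplan apply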

Install ROS on Raspberry Pi 4B

We’re finally ready to Install ROS on Raspberry Pi 4B! For more complete information, see…

http://wiki.ros.org/melodic/Installation/Ubuntu

Here’s the quick summary of the steps:

sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'

sudo apt-key adv --keyserver 'hkp://keyserver.ubuntu.com:80' --recv-key C1CF6E31E6BADE8868B172B4F42ED6FBAB17C654

sudo apt update

Deviating from my previous installs (Indigo and Kinetic, where I used ‘sudo apt-get install ros-kinetic-robot’):

sudo apt install ros-melodic-ros-base

sudo rosdep init
rosdep update

echo "source /opt/ros/melodic/setup.bash" >> ~/.bashrc
source ~/.bashrc

sudo apt install python-rosinstall python-rosinstall-generator python-wstool build-essential

sudo apt install ros-melodic-slam-gmapping
sudo apt install ros-melodic-navigation
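At this point, a quick sanity check (not part of the original steps) is to confirm the ROS master starts cleanly; Ctrl-C stops it again:

roscore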

Update (07/03/2020):

On running hector mapping, I had the following problem:

ERROR: cannot launch node of type [joint_state_publisher/joint_state_publisher]: joint_state_publisher

The above is no doubt because I did a ‘base’ install, not a ‘robot’ install!

sudo apt install ros-melodic-robot

… after which, it’s working fine.

That’s all. We’re done!

Liberate Your Robot With Ad-Hoc Networking

Don't let your robot be trapped by its WiFi network… go with Robot Ad-Hoc Networking!

Ad-Hoc Wireless Connection

Robot Ad-Hoc Networking is easy to configure and means that you, your robot, and other pre-configured devices like a laptop or tablet, can go anywhere and not need a WiFi (router / internet) connection. This configuration allows you to enter a new location and operate the robot without access to any local networks.

The Scenario

When going “into-the-field”, robot operators want:

  • a static IP address (for the robot)
  • an Ad-Hoc wireless network
  • a DHCP server (for connecting clients to the robot, eg. laptop or tablet)

When back at the office, operators want:

  • a dynamic IP address
  • the regular wireless network
  • the DHCP server disabled

The robot should automatically configure itself on boot up, based on what WiFi signals it finds.

The Solution

The file ‘rc.local’ calls ‘/home/ubuntu/scripts/go_dhcp_ad_hoc.sh’. This script checks to see if we got an IP address for WiFi (‘wlan0’). If yes, do nothing, otherwise configure Robot Ad-Hoc Networking and the DHCP server.

To use this method, install and configure:

  • DHCP server
  • ‘go_dhcp_ad_hoc.sh’ script
  • call script from rc.local

On boot, the system will try to connect to the WiFi; if that fails, it will create the Ad-Hoc network and launch the DHCP server.

Install and Run

Note: please backup your robot software before making any modifications

1. Install required modules (run these commands on the robot):

$ sudo apt-get update

$ sudo apt install iw

$ sudo apt-get install isc-dhcp-server

2. Edit ‘/etc/dhcp/dhcpd.conf’, add the following to the end of the file:

subnet 192.168.0.0 netmask 255.255.255.0 {
     range 192.168.0.10 192.168.0.40;
     option broadcast-address 192.168.0.255;
     default-lease-time 600;
     max-lease-time 7200;
     authoritative;
}
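Optionally, you can check the file for syntax errors before relying on it at boot (the dhcpd binary is provided by the isc-dhcp-server package installed above):

$ sudo dhcpd -t -cf /etc/dhcp/dhcpd.conf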

3. Create DHCP init scripts:

$ sudo cp /etc/default/isc-dhcp-server /etc/default/isc-dhcp-server.orig

$ sudo cp /etc/default/isc-dhcp-server /etc/default/isc-dhcp-server.wlan

$ sudo vi /etc/default/isc-dhcp-server.wlan

Change the last line of '/etc/default/isc-dhcp-server.wlan' to:

INTERFACES="wlan0"

4. Install script (see below):

$ vi /home/ubuntu/scripts/go_dhcp_ad_hoc.sh

  (copy the content from below)

$ sudo chmod 777 /home/ubuntu/scripts/go_dhcp_ad_hoc.sh

5. Edit '/etc/rc.local' to call the script, by adding the following to the end of the file (before the final 'exit 0' line, if your rc.local has one):

sleep 2
sudo /home/ubuntu/scripts/go_dhcp_ad_hoc.sh
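One assumption here is that '/etc/rc.local' exists and is executable; on some Ubuntu 18.04 images it isn't present by default. If you had to create it (start the file with a '#!/bin/sh' line), make it executable so that systemd will actually run it at boot:

$ sudo chmod +x /etc/rc.local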

You’re done!

Connecting to the Ad-Hoc Network

If the WiFi specified in '/etc/network/interfaces' is not found, the robot will auto-configure the Ad-Hoc network and present the WiFi SSID:

  MiniTurtyAdHoc

Connect your laptop or tablet using the Ad-Hoc WiFi connection, and then use the robot in the normal manner. The IP address of the robot in Ad-Hoc mode is:

  192.168.0.1

To disable this functionality (prevent boot into Ad-Hoc mode), simply comment out (using ‘#’) the line in the ‘rc.local’ file.

The ‘go_dhcp_ad_hoc.sh’ Script

$ cat ~/scripts/go_dhcp_ad_hoc.sh:

#!/bin/sh
# If wlan0 already has an IP address, a known WiFi network was found: do nothing.
if ! wpa_cli -i wlan0 status | grep ip_address
then
  echo 'Not got IP address, setting up Ad-Hoc'
  # Stop anything that might fight us for control of wlan0
  if pstree | grep wpa_supplicant
  then
    sudo killall wpa_supplicant
  fi
  if pstree | grep avahi-daemon
  then
    sudo killall avahi-daemon
  fi
  if pstree | grep dhclient
  then
    sudo killall dhclient
  fi
  sleep 1

  # Bring wlan0 up in Ad-Hoc mode with a static IP
  sudo ifconfig wlan0 down
  sudo iwconfig wlan0 mode Ad-Hoc
  sudo iwconfig wlan0 essid MiniTurtyAdHoc
  sudo ifconfig wlan0 192.168.0.1
  sudo ifconfig wlan0 up

  sleep 1
  # Start the DHCP server on wlan0, then restore the default config file
  sudo cp /etc/default/isc-dhcp-server.wlan /etc/default/isc-dhcp-server
  sudo systemctl restart isc-dhcp-server
  sudo cp /etc/default/isc-dhcp-server.orig /etc/default/isc-dhcp-server
fi
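After a boot where no known WiFi was found, you can verify that the interface really switched over; 'iwconfig' should report Mode:Ad-Hoc and the MiniTurtyAdHoc ESSID (exact output varies by driver):

$ iwconfig wlan0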

Need more help?… Ask a question in the comments below!

Navigating Mobile Robot Mapping

To understand the Mini Turty ROS mobile robot mapping operation, we begin with the simple script that is used to start the robot.

The Starting Point

Typically, for doing mapping users will call:

$ ./mini_turty_mapping.sh

The above file is contained in the ubuntu home directory ('/home/ubuntu'). If you look in that file, you will see a line similar to:

$ roslaunch mini_turty3 hector_ydlidar_demo.launch

The above is what starts the robot doing mobile robot mapping. Let’s break that down into the pieces. The first part is the ‘roslaunch’ program.

Navigating in a map showing obstacle inflation

This program is part of the ROS system itself. It's used to launch many different ROS nodes via one or more launch files. In the above case, roslaunch is launching a launch file called 'hector_ydlidar_demo.launch', and this launch file is contained in the ROS package called 'mini_turty3'.

So where is this ‘hector_ydlidar_demo.launch’ file, you may be asking. There are a couple of ways of finding that, but one convenient method is to use the ROS bash shell command ‘roscd’. Type the following into a console:

$ source catkin_ws/devel/setup.bash

$ roscd mini_turty3

This will take you to the mini_turty3 package directory. If you then run 'pwd':

$ pwd

You will see:

/home/ubuntu/catkin_ws/src/mini_turty3

So 'roscd' has taken us to the directory of the ROS package 'mini_turty3'. Run 'ls':

$ ls

This will show the directory contents similar to:

CMakeLists.txt include launch package.xml param scripts src

Note that we have a sub-directory called ‘launch’ here. This is where the previously mentioned ‘hector_ydlidar_demo.launch’ launch file exists. If you run:

$ ls launch

You will see output similar to:

amcl_rd_demo.launch hector_ydlidar_demo.launch
minimu9-ahrs.launch mini_turty.urdf teleop.launch test.xacro
hector_move_base_demo.launch includes mini_turty3.urdf
robot_pose_ekf.launch test.urdf

Which lists all the launch files in the mini_turty3 package, including the one we are currently interested in (‘hector_ydlidar_demo.launch’). Let’s open our launch file and examine its contents:

$ vi launch/hector_ydlidar_demo.launch

The ‘hector_ydlidar_demo.launch’ launch file launches several ROS nodes.

The LiDAR Node

The first ROS node to be launched is:

  <node name="ydlidar_node"  pkg="ydlidar"  type="ydlidar_node"
output="screen">

This node is the ROS driver for the lidar. It handles all interaction with the lidar device and is responsible for starting/stopping, gathering the data from the lidar, and publishing the lidar data to ROS.

The G2 lidar

The Mapping Node

The next node to be launched is:

  <node pkg="hector_mapping" type="hector_mapping" name="hector_mapping"
output="screen">

This node is the “Hector Mapping” node. It’s responsible for the creation of the map, using the data from the lidar. It takes lidar scans and adds them into the map. The map is then published to ROS for other nodes to access.
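While mapping is running, you can confirm from another terminal that the map is actually being published (hector_mapping publishes a nav_msgs/OccupancyGrid on the '/map' topic by default):

$ rostopic info /map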

The Robot Base Node

Moving on, the next important node is:

  <node pkg="mini_turty3" type="mini_turty3" name="mini_turty3"
required="true">

This node is used to control the robot base. It’s responsible for responding to ROS tele-op commands, and for publishing odometry.

Each of the above launched nodes has a ROS package under the ‘catkin_ws’
directory. If you run the command:

$ ls ~/catkin_ws/src/

You will see several ROS packages (for some robots, there could be many more!):

minimu9-ahrs mini_turty3 raspicam_node ydlidar

You may ‘roscd’ into any of these packages. But let’s stay in our ‘mini_turty3’ package and examine the source code of this node:

$ ls src

You will see:

mini_turty3.cpp

This is the source file for the ‘mini_turty3’ base. You could open this using many different editors, but let’s use ‘vi’ again:

$ vi src/mini_turty3.cpp

Here you will find the ‘main()’ program for the mini_turty3 base. Please examine this source code to understand how the base operates. Here’s the code snippet of the main function:

int main(int argc, char *argv[])
{
  int pi;
  int enable_odom_transform;
  
  pi = init_gpio();
  if(pi) {
    printf("Error initializing pigpio!\n");
    return pi;
  }

  ros::init(argc, argv, "mini_turty");
  ros::NodeHandle private_node_handle_("~");
  ros::Rate loop_rate(50);

  private_node_handle_.param("enable_odom_transform", enable_odom_transform, int(0));

  MiniTurtyBase mini(pi);
  mini.setCmdVelTopic("cmd_vel");
  mini.setEnableOdomTransform(enable_odom_transform);

  while (ros::ok()) 
  {
    if (got_ctrl_c) {
      ROS_WARN("Got Ctrl-C");
      break;
    }

    ros::spinOnce();
    loop_rate.sleep();
  }

  // set SLEEP (active low)
  gpio_write(pi, GPIO_SLEEP, 0);
  pigpio_stop(pi);

  ROS_WARN("Exiting.");

  exit(0);
}

Note that most ROS nodes have a ‘src’ directory, and that is where the code for a node is usually kept. For example, you may find the code for the lidar in ‘catkin_ws/src/ydlidar/src’.

Binary Install of ROS Packages

Be aware that many ROS packages are by default ‘binary install’. This means that you currently have pre-built binaries only (to save build time and storage space!). This is the case for ‘hector_mapping’. However, you may access the source code, and even install it onto the robot by going to the github location. To see the source code for hector_mapping, first go
to the ROS wiki:

http://wiki.ros.org/hector_mapping

Then click on the:

Source: git https://github.com/tu-darmstadt-ros-pkg/hector_slam.git (branch: melodic-devel)

This immediately takes you to the github repository containing the Hector Mapping source code.
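If you do decide to build it from source rather than use the binary package, the usual catkin workflow applies (a sketch, using the repository and branch listed above):

$ cd ~/catkin_ws/src
$ git clone -b melodic-devel https://github.com/tu-darmstadt-ros-pkg/hector_slam.git
$ cd ~/catkin_ws
$ rosdep install --from-paths src --ignore-src -y
$ catkin_make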

If you decide to install any source code, please make sure to back up your SD card first, and then restore it to the original if you find any problem. This will be important if you require support, as it is not possible for us to debug a user-modified installation.

As you can see, navigating your way through the Mini Turty ROS packages is not difficult. Just take it one step at a time and follow the path through in a manner similar to that which is described above. You will soon be getting oriented and making your way in the world of ROS autonomous mobile robot mapping!

Visualizing The LiDAR Data On Your Mapping Robot

You’ve setup your robot and remote PC, now the fun begins. Let’s visualize the robot laser data on our autonomous LiDAR mapping robot!

If you’ve not already done so, power up your robot, initialize it and run the mapping script (see our Tutorials for more details). With the robot now running (and the on-robot lidar device scanning), follow this procedure:

First, make sure that the remote PC is configured to work with your robot. Switch to your remote PC and open up a terminal window in Ubuntu. In the terminal, type:

env | grep ROS

In our example, here is what we got:

ROS environment variables

From this, we can see that our remote PC is currently configured to work with a ROS master that would be located on the same machine. Specifically, the line:

ROS_MASTER_URI=http://localhost:11311

That shows us that this host expects to find its ROS master locally. Well, that’s not how we want to work with our robot (we want to make our robot the ROS master). So let’s go ahead and change that, by running the following command:

export ROS_MASTER_URI=http://ubuntu:11311

The above command tells our remote PC that the ROS master is to be found on the host with the hostname ‘ubuntu’. So now, when we run ROS commands, these commands will go to our robot (whose default name is: ubuntu). If you have renamed your robot, you will want to replace the robot name with whatever name you used.
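If you don't want to repeat that export in every new terminal, you can append it to the remote PC's '.bashrc' (the same pattern used for setup.bash during the ROS install), assuming the robot still has the default 'ubuntu' hostname:

echo "export ROS_MASTER_URI=http://ubuntu:11311" >> ~/.bashrc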

Examining ROS Topics

Having configured our remote PC to use the robot as the ROS master, let’s quickly examine what ROS topics are being published by our robot:

rostopic list

The above command yields the following response:

ROS topics

This lists out several ROS topics that are running, including the ‘/scan’ topic which is where our laser data is presented. Let’s have a quick look at that laser data, by typing the following:

rostopic echo /scan

Which shows:

Echo topic 'scan'

That’s the tail end of a ROS topic ‘/scan’ showing the data point “intensities”. We’ll skip over the details of just what that is.

Visualizing the LiDAR Data

For now, let’s visualize the scan data in the graphical package RViz, by typing:

rosrun rviz rviz

This will start the ROS RViz program, where we will be visualizing the robot laser data from our autonomous lidar mapping robot…

Visualizing the LiDAR data

If all went well, you should be seeing something similar to the above. If not, you will want to check your ROS connectivity (see our Tutorials for more details). Enjoy!

Register with Rhoeby to Win a Mini-Turty Flex!

Win a ROS navigation-capable robot! Advanced robotics navigation, map building, tele-op, tele-viewing, frontier exploration and computer vision. Little robots with BIG capabilities.

Register for a new account here:

https://rhoeby.com/my-account/

for a chance to win one of these marvelous robots.

Mini-turty Flex Robot

The raffle runs until the end of the Bay Area Maker Faire 2019, so hurry for a chance to win. Winner will be announced in this blog on 5/24/19.

See us and the robots at the Bay Area Maker Faire 2019!

Things You Can Do With The Mini-Turty Flex Robot

There are many things you can do with your Mini-Turty Flex robot, including:

  • Basics: ROS learning
  • Teleop: robot remote control
  • Map Building: make maps of your home or office for the robot to use
  • Navigation: the robot moves autonomously around your home or office
  • Tele-Viewing: see what your robot sees, even from another room
  • Frontier Exploration: the robot autonomously explores unknown terrain
  • Computer Vision: Mini-Turty Flex recognizes objects in its environment

Finding Your Robot IP Address

So your robot is powered up and hopefully it’s properly connected to the WiFi network, but you can’t seem to figure out the IP address.

There are several ways to find out what IP address your robot is using. One involves using a program called “ping”, and requires that you know your robot hostname. Another involves plugging in an HDMI monitor and keyboard to the robot. Yet another requires logging into your router to search for connected devices. We’ll detail all of these techniques here.

Using Ping To Determine Your Robot IP Address

This method is quick and simple assuming you know your robot hostname and your network is properly configured. It goes like this:

1. Open a terminal window on any host connected to the network. For this example, we’ll use a Windows PC and the Command Prompt (but the command is the same regardless). In the terminal window, type:

ping ubuntu

You should see ping replies that report the IP address of the robot. In our example, the IP address of this host (hostname: ubuntu) is '192.168.0.33'.

Getting The IP Address Using Monitor And Keyboard

This method is often the easiest if you don’t know (or have mislaid!) your robot network hostname. It’s also the most reliable method, as there is no potential for hostname conflicts that might cause issues with the ping or router method. So to begin:

1. Plug in your HDMI monitor into the HDMI port on the Raspberry Pi.

2. Then plug in a USB keyboard, and power up your robot. You should be able to observe the Raspberry Pi booting up on the HDMI monitor, after which, it will present you with a login prompt.

3. Login using the usual credentials.

4. Then type:

ifconfig

Your robot will report its IP address.

You can now disconnect the keyboard and monitor and continue to use your robot. Note that most networks provide some "stickiness" for the IP address, meaning that the robot could retain the same address even after shutdown and reboot, provided the period between shutdown and reboot is not too long (up to a day is not uncommon).

Getting The IP Address From The Router

This method uses the router itself to find out what devices are connected to the network. Most routers can provide this information along with the hostname of the connected device. Begin by logging into your router (this process is router specific, but you could find one example here).

After you login to your router, navigate to the “Attached Devices” panel (or where-ever your particular router stores this information), and check for a wireless device name that matches the hostname of your robot (default is: ubuntu). Then simply lookup the IP address associated with your robot name, as presented by the router (usually in some sort of lookup table).

Encoder-Free Robot Odometry Using Stepper Motors

Many ground-based robots employ motor shaft encoders to help provide a reasonable estimate of the robot movement along the surface of the ground, also known as robot odometry.

This technique enables robot odometry via feedback from the motor shaft. But what if we could do away with that and adopt a simpler approach? Could similar results be achieved whilst lowering costs and simplifying the system? The answer to that question is a resounding yes, and the solution is to use stepper motors.

Conventional DC Motor and Encoder-Based Systems

Conventional methods for wheeled-robot odometry employ shaft rotation feedback encoders that measure the actual rotation of the wheels on the robot. Typically using hall-effect or optical sensors, data is acquired as a series of pulses from the encoders, the frequency of which increases as the wheel turns faster. Owing to the nature of many electrical motor systems, it's otherwise not possible to accurately predict how much a wheel will turn, given a certain electro-motive force. Thus shaft encoders are a mandatory feature on most ground-based robots that require odometry.

A typical DC motor with shaft encoder

Wheel encoders typically use 3-4 wires per encoder (plus the motor connections), along with the use of a software-based Proportional Integral Derivative (PID) controller to convert that into a usable odometry.

On a two-wheel drive robot, that then necessitates up to twelve connections for the data and power lines, and the creation of two PID filters. That’s a lot of resources just to be able to know what your wheels are doing!

Using Stepper Motors

Stepper motors are in common use today in many applications, with one of the most notable examples being that of 3D printers. A 3D printer requires highly accurate positioning abilities and the ability to hold the motor in any given position. That’s a nice feature of stepper motors for 3D printing, but they can also be used to drive robot wheels. Robots that utilize stepper motors for locomotion produce highly predictable and accurate motion, and can also be put to “Sleep” when any given wheel is stationary, thus saving power.

A typical stepper motor

Notable is the fact that stepper motors turn in discrete steps (thus the name!). That is to say, under the control of a dedicated circuit known as a stepper controller, the motor shaft can be positioned anywhere around the 360 degree rotation, typically with a 1.8 degree step size.

Inferring the Motor Rotation

By providing a continuous pulse train to the stepper controller, the on-robot computer can cause the stepper to produce a continuous rotation, the speed of which is proportional to the frequency of the pulse train. A low frequency pulse train causes the wheel to turn slowly, whilst a high frequency pulse train causes the wheel to turn faster. In a properly designed system, there will be no missed steps, meaning that each pulse will result in a single step of the motor, and so the host computer can infer exactly what the speed of rotation of the wheel is, essentially by counting the pulses it is sending to the motor controller. For example, with full 1.8 degree steps (200 steps per revolution), a 1 kHz pulse train turns the wheel at exactly 5 revolutions per second.

The stepper motor controller

Most modern host computers (including the Raspberry Pi) have dedicated hardware that can produce the necessary pulse trains to the stepper motor controllers. This hardware is known as Pulse Width Modulation (PWM), and is commonly used to create the necessary signals. As such, this requires very little intervention by the host computer. Furthermore, once set up, speed control and odometry are achieved with no additional resources. Thus encoder-free odometry using stepper motors is both simple and easy to implement, both at the hardware level (no need for encoder wiring) and at the software level (no need for PID controller).
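As a rough illustration (hypothetical GPIO pin and frequency, using the 'pigs' command-line utility that ships with the pigpio library the robot code already relies on, and assuming the pigpio daemon is running), a hardware-timed pulse train can be produced like this:

# 1 kHz pulse train at 50% duty cycle on GPIO 18 (pigs duty cycle is out of 1,000,000)
pigs hp 18 1000 500000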

Examining the Code

Underside of controller showing connections

On the Mini-Turty host computer, using a timer callback, we periodically update the speed of the motor according to what is instructed by the robot teleop / velocity commands.

For our differential drive robot, we use stepper motors driven by stepper controllers, which are directly controlled by the Raspberry Pi host computer. To understand how we derive the odometry, consider the following code snippet:

void MiniTurtyBase::timerCallback(const ros::TimerEvent)
{
  double vx;
  double vth;
  
  updateMotorControl(DIFFDRIVE_MOTOR_LEFT);
  updateMotorControl(DIFFDRIVE_MOTOR_RIGHT);

  vx = (motors[DIFFDRIVE_MOTOR_RIGHT].currentSpeed + 
        motors[DIFFDRIVE_MOTOR_LEFT].currentSpeed) / 2;
  vth = (motors[DIFFDRIVE_MOTOR_RIGHT].currentSpeed - 
         motors[DIFFDRIVE_MOTOR_LEFT].currentSpeed) * 8;

  publishOdom(vx, vth);
}

The above timer callback function is attached to a ROS timer, and hence is called periodically. On each invocation, the callback updates the motor control with the new motor speeds for the left and right motors. It then calculates the velocity ‘vx’, which is the speed that the robot is moving in the forward (or backward) direction. It also calculates the velocity ‘vth’, which is the speed at which the robot is rotating about the z-axis (aka turning!). These two velocities are derived from the current speed that the motors are rotating at, which in turn is derived from the speed that was set. Thus we achieve a quasi-feedback from the motors that is purely software-based. These calculated linear and rotational velocities are then passed to the function ‘publishOdom()’, whose purpose is to send the odometry information back to ROS.

Taking Into Account Slew Rate

At this point, the astute reader might be wondering: why not just use the speed that was set by the incoming velocity command and publish that? Well the answer to that is we need to take into account the fact that stepper motors do not instantaneously change speed. For most velocity alterations, it is necessary to “slew” the rate of the motor. Slewing, in this context, is the process of gradually changing from one speed to another. Thus a motor that is going from say 20 RPM to 100 RPM might make that change as a series of steps, going from 20, to 30, to 40, and so on up to 100. So in order to provide accurate odometry information, it’s important that the actual speed of the motor is used, rather than any target speed that we may be currently slewing toward.

The full code for the Mini-Turty series of robots, including the code for the robot odometry, can be found at our github repository.

Summary

It turns out that practical systems for robot odometry can be achieved through the use of stepper motors as the primary means of locomotion. When combined with modern LiDAR, this then enables robot localization, mapping and navigation. Moreover, this use of stepper motors gives rise to a simpler implementation that's easier to create, cheaper to realize, and provides a result comparable to that achieved with the more conventional system using DC motors and encoders.