Disinfecting Killer Robot!

Frontier Exploration For Disinfectant Robot

An in-depth look at the mapping, navigation and frontier exploration capabilities of the Magni robot, with enhanced capabilities from Rhoeby Dynamics. Two case studies are shown: one in a commercial premises and one in a domestic setting. See the robot explore unknown space whilst creating the map. The video includes a disinfectant* robot “head” using a UV lamp.

Robot intro and spec, including: lidar and sonar integrated into mapping and navigation, and a UV lamp (*simulated disinfectant lamp, for demo purposes only)

Robot Frontier Exploration:

  • explores previously unknown environments autonomously
  • new frontiers are established as the robot progresses forward
  • explores biggest frontiers first
  • makes smooth turns
  • navigates through center of doorways
  • navigates tight spaces
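The core idea can be sketched in a few lines of code: a frontier cell is a free cell that borders unknown space, and frontier cells are grouped into regions so the biggest can be targeted first. Here is a minimal, hypothetical Python sketch (not the Rhoeby implementation), using the ROS occupancy-grid convention of 0 = free, 100 = occupied, -1 = unknown:

```python
# Hypothetical frontier-detection sketch (not the Rhoeby implementation).
# Grid cells follow the ROS OccupancyGrid convention:
# 0 = free, 100 = occupied, -1 = unknown.
from collections import deque

def find_frontiers(grid):
    """Return frontier regions as lists of (row, col) cells, biggest first."""
    rows, cols = len(grid), len(grid[0])

    def neighbors(r, c):
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                yield nr, nc

    # A frontier cell is a free cell bordering at least one unknown cell
    frontier = {
        (r, c)
        for r in range(rows) for c in range(cols)
        if grid[r][c] == 0
        and any(grid[nr][nc] == -1 for nr, nc in neighbors(r, c))
    }

    # Group frontier cells into connected regions with a BFS flood fill
    regions, seen = [], set()
    for cell in frontier:
        if cell in seen:
            continue
        seen.add(cell)
        region, queue = [], deque([cell])
        while queue:
            r, c = queue.popleft()
            region.append((r, c))
            for n in neighbors(r, c):
                if n in frontier and n not in seen:
                    seen.add(n)
                    queue.append(n)
        regions.append(region)

    # "Explores biggest frontiers first": largest region is the next goal
    return sorted(regions, key=len, reverse=True)
```

A planner would then drive the robot toward (say) the centroid of the largest region, re-running the search as the map grows.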

For more details: Contact Us

Announcing Mini Turty Pro!

Rhoeby Dynamics is pleased to announce the new Mini Turty Pro robot. Our best mini robot yet, it’s the ultimate in a full-featured ROS navigation-capable machine!

The Mini Turty Pro is a compact and powerful robot, and the well-informed user’s robot of choice for mapping, navigation, frontier exploration and much, much more!

Mini Turty Pro

Why Choose The Mini Turty Pro Robot?

You want to own a robot that fully supports your current and future requirements, with features including:

  • Raspberry Pi 4B: faster builds, lower max. CPU loading (<25%) compared to Pi 3, so things like navigation and frontier exploration work smoothly, and work concurrently, leaving ample processing power for additional functions
  • High-quality IMU: full AHRS support, 9-DoF sensor fusion
  • Better planar kinematics: robot moves faster and smoother
  • High-capacity battery: 2200 mAh provides up to 6 hours of continuous operation
  • Better LiDAR: finer angular resolution, more accurate range sensor builds better maps
  • Low center of gravity: not top-heavy like some other robots
  • Castor wheel: a wheel rather than a ball, so the robot can go over uneven surfaces (e.g. carpet or rug)
  • Choice of LiDAR: including G2, G4, and SICK TiM 571*
  • Video camera: 2+ hours of internal video recording, remote monitoring, with timestamped position logging
  • Web-browser API: built-in web server, control the robot using your browser
  • High-capacity voltage regulator: 12V / 5V @ 5A provides plenty of power for the robot and the Raspberry Pi 4
  • Charger included: we give you everything you need

Visit our shop to order your robot today!

Adding ROS Raspberry Pi 4B Camera

This post describes how to get the Raspberry Pi 4B camera working under ROS. Try following these steps to get it working on your Pi 4B-based robot too!

Where to Begin?

The basic config of the robot is:

The previously employed Ubiquity approach failed on the Pi 4, so I fell back to trying an install similar to:

https://github.com/fpasteau/raspicam_node,

and,

https://github.com/raspberrypi/userland

The Details

Get the userland code:

git clone https://github.com/raspberrypi/userland.git /home/ubuntu/userland

Then…

cd userland
./buildme --aarch64

The build appeared to succeed.

My quick, lame attempt to see if it worked failed:

raspicam

Command not found.

But,

ubuntu@ubuntu-pi4:~/userland$ sudo find . -name raspicam
./host_applications/linux/apps/raspicam

So the code appears to be there but is perhaps not getting built. Note the comment on ‘https://github.com/raspberrypi/userland’…

“Whilst 64-bit userspace is not officially supported, some of the libraries will work for it.”

There might be a ray of hope here:

https://github.com/6by9/userland/tree/64bit_mmal

Trying it…

git clone https://github.com/6by9/userland.git

Step-by-step commands:

cd src/
git clone https://github.com/6by9/userland.git
cd userland/
git checkout 64bit_mmal
./buildme --aarch64
ls /opt/vc/bin/

This time the build produces ‘raspivid’ (at ‘/opt/vc/bin/raspivid’), and ‘find . -name libmmal_core.so’ confirms the library was built.

This now runs:

LD_PRELOAD="/opt/vc/lib/libmmal_vc_client.so /opt/vc/lib/libvcsm.so /opt/vc/lib/libmmal_core.so /opt/vc/lib/libmmal_util.so" /opt/vc/bin/raspistill -o cam.jpg

But gives errors:

mmal: Cannot read camera info, keeping the defaults for OV5647
mmalipc: mmal_vc_dump_client_components: mmal_vc_dump_client_components: 0 entries in use
mmalipc: mmal_vc_dump_client_contexts: mmal_vc_dump_client_contexts: 0 entries in use
mmal: mmal_vc_component_create: failed to create component 'vc.ril.camera' (1:ENOMEM)
mmal: mmal_component_create_core: could not create component 'vc.ril.camera' (1)
mmal: Failed to create camera component
mmal: main: Failed to create camera component
mmal: Camera is not enabled in this build. Try running "sudo raspi-config" and ensure that "camera" has been enabled

Fixing the Runtime Errors

So next… I added:

start_x=1
gpu_mem=128

to ‘/boot/firmware/config.txt’.

And added:

SUBSYSTEM=="vchiq",GROUP="video",MODE="0660"

to ‘/etc/udev/rules.d/10-vchiq-permissions.rules’.

sudo vi /etc/udev/rules.d/10-vchiq-permissions.rules
sudo usermod -a -G video ubuntu
sudo reboot
LD_PRELOAD="/opt/vc/lib/libmmal_vc_client.so /opt/vc/lib/libvcsm.so /opt/vc/lib/libmmal_core.so /opt/vc/lib/libmmal_util.so" /opt/vc/bin/raspistill -o cam.jpg
LD_PRELOAD="/opt/vc/lib/libmmal_vc_client.so /opt/vc/lib/libvcsm.so /opt/vc/lib/libmmal_core.so /opt/vc/lib/libmmal_util.so" /opt/vc/bin/raspivid -o video.h264 -t 10000

(Note: there are some messages to the console that don’t inspire confidence, but they seem to actually be harmless!)

export LD_LIBRARY_PATH=/opt/vc/lib
/opt/vc/bin/raspivid -o video.h264 -t 10000

sudo ln -s /opt/vc/bin/raspivid /usr/bin/raspivid
raspivid -o video.h264 -t 10000

(added ‘export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/vc/lib’ to .bashrc)

sudo apt install ffmpeg

Direct encode to .mp4:

raspivid -ih -t 0 -o - | ffmpeg -i - -vcodec copy test.mp4

Enable ROS raspicam_node

To enable raspicam_node, do the following steps:

cd catkin_ws/src/
git clone https://github.com/fpasteau/raspicam_node.git raspicam
cd ..
catkin_make
source devel/setup.bash
export LD_PRELOAD="/opt/vc/lib/libmmal_vc_client.so /opt/vc/lib/libvcsm.so /opt/vc/lib/libmmal_core.so /opt/vc/lib/libmmal_util.so"

rosrun raspicam raspicam_node _height:=1280 _width:=960

In another terminal:

rosservice call /camera/start_capture

Success!

With the above in place and working, presenting the video in ROS follows standard practice.

Troubleshooting

If you run into problems, it would be best to start from a fresh install of Ubuntu/ROS, see:

Install ROS on Raspberry Pi 4B

Let me know your experience in the comments…

3D LiDAR SLAM – Quick Preview

Check out Rhoeby Dynamics’ new video showing recent work done at our lab using 3D LiDAR with 2D mapping (SLAM) on our Mini-Turty II robot platform.

See a quick preview at: 3D LiDAR SLAM.

For more details, to buy a turn-key fully assembled robot/kit, or integrate Rhoeby Dynamics mapping, navigation and frontier exploration solutions into your platform, contact us today!

Rhoeby 3D – Coming soon!

We’re excited about the future and it’s 3D!

Stay tuned.

Install ROS on Raspberry Pi 4B

This post covers getting ROS Melodic up and running on Raspberry Pi 4B using the Ubuntu 18.04 (Bionic) distribution of Linux.

To begin the install of ROS on Raspberry Pi 4B, we need to consider: which version of ROS should we use?

To answer this question, we could consider several factors:

  • most recent available?
  • best supported / documented in the community?
  • most well-tried and tested?
  • supports the hardware?

We also must take into consideration the Ubuntu version.

A quick review of the Ubuntu distros available for the Pi 4B shows that Ubuntu 19.10 is the officially supported version. However, the corresponding ROS version (for that version of Ubuntu) is not yet available from Open Robotics, the creators of ROS.

ROS distributions let developers work against a relatively stable codebase, and the ROS Wiki shows the ROS Melodic distro is targeted at the Ubuntu 18.04 (Bionic) release. For more details of which ROS releases inter-operate with which OS release, see:

http://wiki.ros.org/Distributions

Given the above considerations, Ubuntu 18.04 LTS is the best choice. Note the chain of dependency from hardware to OS to ROS version. Use of the Pi 4 results in the selection of a particular Ubuntu release (18.04), which then implies ROS Melodic.

Pre-built images for Ubuntu and ROS

To keep things simple and minimize effort, it’s preferable to use pre-built images for the ROS and Ubuntu distros (if available). This avoids the need to build ROS or Linux from source, but also raises a couple of questions:

  • 32-bit or 64-bit?
  • which official / unofficial image?

A quick search suggests it would be best to choose a 64-bit image for maximum performance and the ability to access all 4 GB of memory (if you have it!). More about this later.

The table below (shamelessly copied from http://wiki.ros.org/melodic/#installation) lists the platform/architecture combinations for which ROS binary install packages are available.

         Ubuntu Artful   Ubuntu Bionic   Debian Stretch
x86_64        X               X                X
armhf                         X
arm64                         X                X

As can be seen, the ROS distro for Ubuntu Bionic supports arm64! This means that we can use a ROS binary on a 64-bit OS image to install ROS on Raspberry Pi 4B.

Need to update the on-board firmware

As of Raspberry Pi 4, we now have additional boot code considerations. On Pi 4 the boot firmware is stored on a separate EEPROM chip, not the SD card.

The boot firmware that ships with a new Pi 4 will not contain the latest code, because boards ship with the initial release of the firmware installed, which then gets revised later. Consequently, it is highly recommended, well, mandatory really, to update the firmware. See…

https://jamesachambers.com/raspberry-pi-4-bootloader-firmware-updating-recovery-guide/

Unfortunately, we must boot from Raspbian (the Linux distro from the Raspberry Pi Foundation) not Ubuntu, in order to update the firmware! That sucks potatoes, but it is what it is.

Create Raspbian SD card

So the first step towards updating the boot firmware is to create an SD card with Raspbian installed. Fret not, the process is simple and reliable. Get the zip file from:

https://downloads.raspberrypi.org/raspbian_lite_latest

Insert the SD card into a Linux PC and create the Raspbian SD card using the following command:

sudo unzip -p /home/jjordan/tmp/2019-09-26-raspbian-buster-lite.zip | sudo dd of=/dev/sdb bs=4M conv=fsync

If you’re not sure what to do here, or run into difficulties, you could refer to https://rhoeby.com/blog/backup-your-robot-software/ for some additional information.

After the above command completes, remove the card from the PC and re-insert it into your Pi 4. Power up your Pi 4, and it should boot into the Raspbian OS!

Update the firmware

To update the firmware, boot with Raspbian SD card and the Pi 4 connected to a TV/monitor, keyboard and ethernet connection with access to the internet. The default login credentials for Raspbian are:

login: pi
password: raspberry

For more complete details of updating the firmware, see…

https://jamesachambers.com/raspberry-pi-4-ubuntu-server-desktop-18-04-3-image-unofficial/

I’ll summarize the steps here. Using the keyboard and monitor attached to the Pi 4, manually type in the following commands:

sudo apt-get update && sudo apt-get dist-upgrade -y

sudo rpi-update

sudo rpi-eeprom-update -a

That’s it!

Create the 64-bit Ubuntu SD card

Now that we’ve gotten the work of updating the firmware out of the way, we’re free to move forward with getting Ubuntu installed. We’re going to do this on a separate SD card, as we would like to keep our Raspbian OS for any future firmware updates.

Begin by getting the unofficial Ubuntu image from:

https://github.com/TheRemote/Ubuntu-Server-raspi4-unofficial/releases/download/v22/ubuntu-18.04.3-preinstalled-server-arm64+raspi4.img.xz

If you run into a problem with the ‘.xz’ file, you may need to unzip it and re-zip it to ‘.zip’ file format. I had to do this because my ancient Samsung laptop that I use for low-level tasks like burning SD cards does not understand ‘.xz’ files (nor will it ever!)

Having downloaded the Ubuntu binary image, we then burn it to the SD card:

sudo unzip -p /home/jjordan/tmp/ubuntu-18.04.3-preinstalled-server-arm64+raspi4.zip | sudo dd of=/dev/sdb bs=4M conv=fsync

Transfer the SD card to the Pi 4 and it should boot into Ubuntu!… But, no WiFi yet, of course!

Bring up WiFi

In more recent incarnations of Ubuntu, the way WiFi is configured has changed. If you’re going from Ubuntu 16.04 to 18.04, you will need to get on-board (forgive the pun) with this new method. Not to worry, it’s really simple.

For an overview of this method, known as “netplan”, here’s a good reference:

https://www.idmworks.com/netplan-for-newbies/

To set up the Pi 4 WiFi, we simply edit the following file:

ubuntu@ubuntu:~$ cat /etc/netplan/50-cloud-init.yaml

# This file is generated from information provided by
# the datasource.  Changes to it will not persist across an instance.
# To disable cloud-init's network configuration capabilities, write a file
# /etc/cloud/cloud.cfg.d/99-disable-network-config.cfg with the following:
# network: {config: disabled}
network:
    ethernets:
        eth0:
            dhcp4: true
            match:
                macaddress: dc:a6:32:27:8e:41
            set-name: eth0
    version: 2

    wifis:
        wlan0:
            dhcp4: yes
            dhcp6: no
            access-points:
                NETGEAR20:
                    password: xxx

Ignore the confusing comment about changes not persisting across an instance. It’s not relevant to our situation. Just set the WiFi SSID and password, then reboot. The above example shows how to change the file given the following:

SSID: NETGEAR20
password: xxx

Simply change the above to suit whatever your WiFi credentials happen to be.

Reboot your Pi 4. WiFi is up!

Install ROS on Raspberry Pi 4B

We’re finally ready to Install ROS on Raspberry Pi 4B! For more complete information, see…

http://wiki.ros.org/melodic/Installation/Ubuntu

Here’s the quick summary of the steps:

sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu $(lsb_release -sc) main" > /etc/apt/sources.list.d/ros-latest.list'

sudo apt-key adv --keyserver 'hkp://keyserver.ubuntu.com:80' --recv-key C1CF6E31E6BADE8868B172B4F42ED6FBAB17C654

sudo apt update

Deviating from my previous installs (Indigo and Kinetic, where I used ‘sudo apt-get install ros-kinetic-robot’):

sudo apt install ros-melodic-ros-base

sudo rosdep init
rosdep update

echo "source /opt/ros/melodic/setup.bash" >> ~/.bashrc
source ~/.bashrc

sudo apt install python-rosinstall python-rosinstall-generator python-wstool build-essential

sudo apt install ros-melodic-slam-gmapping
sudo apt install ros-melodic-navigation

Update (07/03/2020):

On running hector mapping, I had the following problem:

ERROR: cannot launch node of type [joint_state_publisher/joint_state_publisher]: joint_state_publisher

The above is no doubt because I did a ‘base’ install, not a ‘robot’ install!

sudo apt install ros-melodic-robot

… after which, it’s working fine.

That’s all. We’re done!

Liberate Your Robot With Ad-Hoc Networking

Don’t let your robot be trapped by its WiFi network… go with Robot Ad-Hoc Networking!

Ad-Hoc Wireless Connection

Robot Ad-Hoc Networking is easy to configure and means that you, your robot, and other pre-configured devices like a laptop or tablet, can go anywhere and not need a WiFi (router / internet) connection. This configuration allows you to enter a new location and operate the robot without access to any local networks.

The Scenario

When going “into-the-field”, robot operators want:

  • a static IP address (for the robot)
  • an Ad-Hoc wireless network
  • a DHCP server (for connecting clients to the robot, eg. laptop or tablet)

When back at the office, operators want:

  • a dynamic IP address
  • a regular wireless network
  • the DHCP server disabled

The robot should automatically configure itself on boot up, based on what WiFi signals it finds.

The Solution

The file ‘rc.local’ calls ‘/home/ubuntu/scripts/go_dhcp_ad_hoc.sh’. This script checks to see if we got an IP address for WiFi (‘wlan0’). If yes, do nothing, otherwise configure Robot Ad-Hoc Networking and the DHCP server.

To use this method, install and configure:

  • DHCP server
  • ‘go_dhcp_ad_hoc.sh’ script
  • call script from rc.local

On boot, the system will try to connect to the WiFi; if that fails, it will create the Ad-Hoc network and launch the DHCP server.

Install and Run

Note: please backup your robot software before making any modifications

1. Install required modules (run these commands on the robot):

$ sudo apt-get update

$ sudo apt install iw

$ sudo apt-get install isc-dhcp-server

2. Edit ‘/etc/dhcp/dhcpd.conf’, add the following to the end of the file:

subnet 192.168.0.0 netmask 255.255.255.0 {
    range 192.168.0.10 192.168.0.40;
    option broadcast-address 192.168.0.255;
    default-lease-time 600;
    max-lease-time 7200;
    authoritative;
}

3. Create DHCP init scripts:

$ sudo cp /etc/default/isc-dhcp-server /etc/default/isc-dhcp-server.orig

$ sudo cp /etc/default/isc-dhcp-server /etc/default/isc-dhcp-server.wlan

$ sudo vi /etc/default/isc-dhcp-server.wlan

Change the last line of ‘/etc/default/isc-dhcp-server.wlan’ to:

INTERFACES="wlan0"

4. Install script (see below):

$ vi /home/ubuntu/scripts/go_dhcp_ad_hoc.sh

  (copy the content from below)

$ sudo chmod 777 /home/ubuntu/scripts/go_dhcp_ad_hoc.sh

5. Edit ‘/etc/rc.local’ to call the script, by adding the following to the end of the file:

sleep 2
sudo /home/ubuntu/scripts/go_dhcp_ad_hoc.sh

You’re done!

Connecting to the Ad-Hoc Network

If the WiFi specified in ‘/etc/network/interfaces’ is not found, the robot will auto-configure the Ad-Hoc network and present the WiFi SSID:

  MiniTurtyAdHoc

Connect your laptop or tablet using the Ad-Hoc WiFi connection, and then use the robot in the normal manner. The IP address of the robot in Ad-Hoc mode is:

  192.168.0.1

To disable this functionality (prevent boot into Ad-Hoc mode), simply comment out (using ‘#’) the line in the ‘rc.local’ file that calls the script.

The ‘go_dhcp_ad_hoc.sh’ Script

$ cat ~/scripts/go_dhcp_ad_hoc.sh:

#!/bin/sh
if ! wpa_cli -i wlan0 status | grep ip_address
then
  echo 'Not got IP address, setting up Ad-Hoc'
  if pstree | grep wpa_supplicant
  then
    sudo killall wpa_supplicant
  fi
  if pstree | grep avahi-daemon
  then
    sudo killall avahi-daemon
  fi
  if pstree | grep dhclient
  then
    sudo killall dhclient
  fi
  sleep 1

  sudo ifconfig wlan0 down
  sudo iwconfig wlan0 mode Ad-Hoc
  sudo iwconfig wlan0 essid MiniTurtyAdHoc
  sudo ifconfig wlan0 192.168.0.1
  sudo ifconfig wlan0 up

  sleep 1
  sudo cp /etc/default/isc-dhcp-server.wlan /etc/default/isc-dhcp-server
  sudo systemctl restart isc-dhcp-server
  sudo cp /etc/default/isc-dhcp-server.orig /etc/default/isc-dhcp-server
fi

Need more help?… Ask a question in the comments below!

Navigating Mobile Robot Mapping

To understand the Mini Turty ROS mobile robot mapping operation, we begin from the simple script that is used to start the robot.

The Starting Point

Typically, to do mapping, users will call:

$ ./mini_turty_mapping.sh

The above file is contained in the ubuntu user’s home directory (‘/home/ubuntu’). If you look in that file, you will see a line similar to:

$ roslaunch mini_turty3 hector_ydlidar_demo.launch

The above is what starts the robot doing mobile robot mapping. Let’s break that down into pieces. The first part is the ‘roslaunch’ program.

Navigating in a map showing obstacle inflation

This program is part of the ROS system itself. It’s used to launch many different ROS nodes via one or more launch files. In the above case, roslaunch is launching a launch file called ‘hector_ydlidar_demo.launch’, and this launch file is contained in the ROS package called ‘mini_turty3’.

So where is this ‘hector_ydlidar_demo.launch’ file, you may be asking. There are a couple of ways of finding that, but one convenient method is to use the ROS bash shell command ‘roscd’. Type the following into a console:

$ source catkin_ws/devel/setup.bash

$ roscd mini_turty3

This will take you to the directory containing the mini_turty3 package. If
you then run ‘pwd’:

$ pwd

You will see:

/home/ubuntu/catkin_ws/src/mini_turty3

So ‘roscd’ has taken us to the directory containing the ROS package ‘mini_turty3’. Run ‘ls’:

$ ls

This will show the directory contents similar to:

CMakeLists.txt include launch package.xml param scripts src

Note that we have a sub-directory called ‘launch’ here. This is where the previously mentioned ‘hector_ydlidar_demo.launch’ launch file exists. If you run:

$ ls launch

You will see output similar to:

amcl_rd_demo.launch           mini_turty3.urdf
hector_move_base_demo.launch  minimu9-ahrs.launch
hector_ydlidar_demo.launch    robot_pose_ekf.launch
includes                      teleop.launch
mini_turty.urdf               test.urdf
                              test.xacro

Which lists all the launch files in the mini_turty3 package, including the one we are currently interested in (‘hector_ydlidar_demo.launch’). Let’s open our launch file and examine its contents:

$ vi launch/hector_ydlidar_demo.launch

The ‘hector_ydlidar_demo.launch’ launch file launches several ROS nodes.
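Schematically, a launch file is just XML that lists the nodes to start. A stripped-down sketch (illustrative only, not the actual contents of ‘hector_ydlidar_demo.launch’; the port parameter is a hypothetical example) looks like this:

```xml
<launch>
  <!-- Lidar driver node -->
  <node name="ydlidar_node" pkg="ydlidar" type="ydlidar_node" output="screen">
    <param name="port" value="/dev/ttyUSB0"/>  <!-- hypothetical parameter -->
  </node>

  <!-- Mapping node -->
  <node pkg="hector_mapping" type="hector_mapping" name="hector_mapping" output="screen"/>
</launch>
```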

The LiDAR Node

The first ROS node to be launched is:

  <node name="ydlidar_node"  pkg="ydlidar"  type="ydlidar_node"
output="screen">

This node is the ROS driver for the lidar. It handles all interaction with the lidar device and is responsible for starting/stopping, gathering the data from the lidar, and publishing the lidar data to ROS.

The G2 lidar

The Mapping Node

The next node to be launched is:

  <node pkg="hector_mapping" type="hector_mapping" name="hector_mapping"
output="screen">

This node is the “Hector Mapping” node. It’s responsible for the creation of the map, using the data from the lidar. It takes lidar scans and adds them into the map. The map is then published to ROS for other nodes to access.
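Conceptually, “adding scans into the map” means tracing each lidar ray through the grid: cells the beam passes through become free, and the cell where it hits becomes occupied. A toy Python sketch of that idea (Hector’s real algorithm also performs scan matching and probabilistic log-odds updates; this is only the intuition):

```python
# Toy sketch of folding one lidar ray into an occupancy grid (intuition
# only; Hector mapping really uses scan matching and log-odds updates).
# Cells: -1 = unknown, 0 = free, 100 = occupied.

def bresenham(r0, c0, r1, c1):
    """Integer grid cells on the line from (r0, c0) to (r1, c1)."""
    cells = []
    dr, dc = abs(r1 - r0), -abs(c1 - c0)
    sr = 1 if r0 < r1 else -1
    sc = 1 if c0 < c1 else -1
    err = dr + dc
    while True:
        cells.append((r0, c0))
        if r0 == r1 and c0 == c1:
            break
        e2 = 2 * err
        if e2 >= dc:
            err += dc
            r0 += sr
        if e2 <= dr:
            err += dr
            c0 += sc
    return cells

def add_ray(grid, robot_cell, hit_cell):
    """Mark cells between the robot and the lidar hit free, the hit occupied."""
    cells = bresenham(robot_cell[0], robot_cell[1], hit_cell[0], hit_cell[1])
    for r, c in cells[:-1]:
        grid[r][c] = 0                        # beam passed through: free space
    grid[hit_cell[0]][hit_cell[1]] = 100      # beam endpoint: obstacle
    return grid
```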

The Robot Base Node

Moving on, the next important node is:

  <node pkg="mini_turty3" type="mini_turty3" name="mini_turty3"
required="true">

This node is used to control the robot base. It’s responsible for responding to ROS tele-op commands, and for publishing odometry.
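The odometry side of this node boils down to dead-reckoning from wheel motion. A hypothetical differential-drive update, in Python for clarity (the function name and wheel_base parameter are assumptions, and the actual mini_turty3 node is C++, as shown later in this post):

```python
# Hypothetical differential-drive odometry update (not the mini_turty3
# source). Given how far each wheel rolled since the last update,
# integrate the robot pose (x, y, heading).
import math

def update_odometry(x, y, theta, d_left, d_right, wheel_base):
    """Return the new pose after the wheels travel d_left / d_right meters."""
    d_center = (d_left + d_right) / 2.0        # forward travel of the base
    d_theta = (d_right - d_left) / wheel_base  # change in heading (radians)
    # Integrate at the midpoint heading for a better small-arc approximation
    x += d_center * math.cos(theta + d_theta / 2.0)
    y += d_center * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta) % (2.0 * math.pi)
    return x, y, theta
```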

Each of the above launched nodes has a ROS package under the ‘catkin_ws’
directory. If you run the command:

$ ls ~/catkin_ws/src/

You will see several ROS packages (for some robots, there could be many more!):

minimu9-ahrs mini_turty3 raspicam_node ydlidar

You may ‘roscd’ into any of these packages. But let’s stay in our ‘mini_turty3’ package and examine the source code of this node:

$ ls src

You will see:

mini_turty3.cpp

This is the source file for the ‘mini_turty3’ base. You could open this using many different editors, but let’s use ‘vi’ again:

$ vi src/mini_turty3.cpp

Here you will find the ‘main()’ program for the mini_turty3 base. Please examine this source code to understand how the base operates. Here’s the code snippet of the main function:

int main(int argc, char *argv[])
{
  int pi;
  int enable_odom_transform;
  
  pi = init_gpio();
  if(pi) {
    printf("Error initializing pigpio!\n");
    return pi;
  }

  ros::init(argc, argv, "mini_turty");
  ros::NodeHandle private_node_handle_("~");
  ros::Rate loop_rate(50);

  private_node_handle_.param("enable_odom_transform", enable_odom_transform, int(0));

  MiniTurtyBase mini(pi);
  mini.setCmdVelTopic("cmd_vel");
  mini.setEnableOdomTransform(enable_odom_transform);

  while (ros::ok()) 
  {
    if (got_ctrl_c) {
      ROS_WARN("Got Ctrl-C");
      break;
    }

    ros::spinOnce();
    loop_rate.sleep();
  }

  // set SLEEP (active low)
  gpio_write(pi, GPIO_SLEEP, 0);
  pigpio_stop(pi);

  ROS_WARN("Exiting.");

  exit(0);
}

Note that most ROS nodes have a ‘src’ directory, and that is where the code for a node is usually kept. For example, you may find the code for the lidar in ‘catkin_ws/src/ydlidar/src’.

Binary Install of ROS Packages

Be aware that many ROS packages are by default ‘binary install’. This means that you currently have pre-built binaries only (to save build time and storage space!). This is the case for ‘hector_mapping’. However, you may access the source code, and even install it onto the robot by going to the github location. To see the source code for hector_mapping, first go
to the ROS wiki:

http://wiki.ros.org/hector_mapping

Then click on the:

Source: git https://github.com/tu-darmstadt-ros-pkg/hector_slam.git (branch: melodic-devel)

This immediately takes you to the github repository containing the Hector Mapping source code.

If you decide to install any source code, please make sure to back up your SD card first, and then restore it to the original if you find any problem. This will be important if you require support, as it is not possible for us to debug a user-modified installation.

As you can see, navigating your way through the Mini Turty ROS packages is not difficult. Just take it one step at a time and follow the path through in a manner similar to that which is described above. You will soon be getting oriented and making your way in the world of ROS autonomous mobile robot mapping!

Visualizing The LiDAR Data On Your Mapping Robot

You’ve set up your robot and remote PC, now the fun begins. Let’s visualize the robot laser data on our autonomous LiDAR mapping robot!

If you’ve not already done so, power up your robot, initialize it and run the mapping script (see our Tutorials for more details). With the robot now running (and the on-robot lidar device scanning), follow this procedure:

First, make sure that the remote PC is configured to work with your robot. Switch to your remote PC and open up a terminal window in Ubuntu. In the terminal, type:

env | grep ROS

In our example, here is what we got:

ROS environment variables

From this, we can see that our remote PC is currently configured to work with a ROS master that would be located on the same machine. Specifically, the line:

ROS_MASTER_URI=http://localhost:11311

That shows us that this host expects to find its ROS master locally. Well, that’s not how we want to work with our robot (we want to make our robot the ROS master). So let’s go ahead and change that, by running the following command:

export ROS_MASTER_URI=http://ubuntu:11311

The above command tells our remote PC that the ROS master is to be found on the host with the hostname ‘ubuntu’. So now, when we run ROS commands, these commands will go to our robot (whose default name is: ubuntu). If you have renamed your robot, you will want to replace the robot name with whatever name you used.

Examining ROS Topics

Having configured our remote PC to use the robot as the ROS master, let’s quickly examine what ROS topics are being published by our robot:

rostopic list

The above command yields the following response:

ROS topics

This lists out several ROS topics that are running, including the ‘/scan’ topic which is where our laser data is presented. Let’s have a quick look at that laser data, by typing the following:

rostopic echo /scan

Which shows:

Echo topic ‘scan’

That’s the tail end of a ROS topic ‘/scan’ showing the data point “intensities”. We’ll skip over the details of just what that is.
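Each entry in the scan’s ‘ranges’ array pairs with an angle, so turning a scan into 2D points for display is simple trigonometry. A standalone Python sketch (the parameter names mirror ROS’s sensor_msgs/LaserScan fields, but nothing here is robot-specific):

```python
# Standalone sketch of how a visualizer turns a laser scan into 2D points.
# Ray i has angle angle_min + i * angle_increment; each valid range r
# becomes the point (r*cos(theta), r*sin(theta)) in the laser frame.
import math

def scan_to_points(angle_min, angle_increment, ranges, range_max=10.0):
    """Convert ranges (meters) to (x, y) points in the laser frame."""
    points = []
    for i, r in enumerate(ranges):
        if not (0.0 < r < range_max):  # drop invalid or out-of-range returns
            continue
        theta = angle_min + i * angle_increment
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points
```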

Visualizing the LiDAR Data

For now, let’s visualize the scan data in the graphical package RViz, by typing:

rosrun rviz rviz

This will start the ROS RViz program, where we will be visualizing the robot laser data from our autonomous lidar mapping robot…

Visualizing the LiDAR data

If all went well, you should be seeing something similar to the above. If not, you will want to check your ROS connectivity (see our Tutorials for more details). Enjoy!

Register with Rhoeby to Win a Mini-Turty Flex!

Win a ROS navigation-capable robot! Advanced robotics navigation, map building, tele-op, tele-viewing, frontier exploration and computer vision. Little robots with BIG capabilities.

Register for a new account here:

https://rhoeby.com/my-account/

for a chance to win one of these marvelous robots.

Mini-turty Flex Robot

The raffle runs until the end of the Bay Area Maker Faire 2019, so hurry for a chance to win. Winner will be announced in this blog on 5/24/19.

See us and the robots at the Bay Area Maker Faire 2019!

Things You Can Do With The Mini-Turty Flex Robot

There are many things you can do with your Mini-Turty Flex robot, including:

  • Basics: ROS learning
  • Teleop: robot remote control
  • Map Building: make maps of your home or office for the robot to use
  • Navigation: the robot moves autonomously around your home or office
  • Tele-Viewing: see what your robot sees, even from another room
  • Frontier Exploration: the robot autonomously explores unknown terrain
  • Computer Vision: Mini-Turty Flex recognizes objects in its environment