Laboratory One Research Blog

Chefbot Build Notes

February 25, 2018

This blog post is a record of my build of Chefbot, a project detailed in Learning Robotics using Python by Lentin Joseph. Chefbot is an autonomous robot for food delivery in a restaurant. My version of Chefbot deviates from the one detailed by Lentin, so I will mostly cover my trouble spots.

Chefbot Front

Chefbot Side

Chefbot Top

Chefbot Top - Lid On


Robot Spec

The book specifies the following set of hardware requirements to be met by this robot:

  • The robot should have a provision to carry food
  • The robot should be able to carry a maximum payload of 5kg
  • The robot should travel at a speed between 0.25 m/s and 1 m/s
  • The ground clearance of the robot should be greater than 3 cm
  • The robot must be able to work for 2 hours continuously
  • The robot should be able to move and supply food to any table avoiding obstacles
  • The robot height should be between 40 cm and 1 meter
  • The robot should be of low cost

I didn’t want to build that robot, so I strapped the hardware to a plastic container. I made some other modifications. For starters, I’m not going to use Chefbot for food delivery, so it doesn’t need a provision to carry food or be a meter tall.

Simulation

CAD Model

Because I’ve deviated from Lentin’s spec, my version of Chefbot can have a much smaller footprint. I’ve modelled what I’ve built in Autodesk’s Fusion 360. It’s not perfect, but that’s ok. I intend to rebuild Chefbot’s chassis after its software is complete.

The first step is to make a pencil-and-paper drawing (how arcane).

Drawing of Chefbot - 1

Drawing of Chefbot - 2

Let’s start modelling now that we have a rough idea of what we’re going to model.

Fusion 360 Drawing of Chefbot

Ok, we’ve got the drawing, so let’s render a few views as well.

Rendered CAD Model - Home

Rendered CAD Model - Front

Rendered CAD Model - Side

Rendered CAD Model - Top

Rendered CAD Model - Bottom

Rendered CAD Model - Back

Gazebo Simulation

Start hotel world environment

We will use Gazebo to run ROS simulations. After installing the ROS integration, we can start the environment.

  1. Start ROS master node: roscore
  2. Start Gazebo using ROS: rosrun gazebo_ros gazebo

Chefbot uses the Turtlebot stack. Install it, then test Turtlebot in an empty world with keyboard teleop and monitor its values (see the sketch after this list).

  1. Start the Turtlebot in Gazebo: roslaunch turtlebot_gazebo turtlebot_world.launch
  2. Start the keyboard teleop node: roslaunch turtlebot_teleop keyboard_teleop.launch
  3. Look at the ROS topics: rostopic list
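A minimal rospy sketch for monitoring those values while driving; it subscribes to the simulated Turtlebot's default odometry topic and prints the robot's position:

import rospy
from nav_msgs.msg import Odometry

def on_odom(msg):
    # print the robot's current position in the odometry frame
    p = msg.pose.pose.position
    rospy.loginfo("x=%.2f  y=%.2f", p.x, p.y)

rospy.init_node("odom_monitor")
rospy.Subscriber("odom", Odometry, on_odom)
rospy.spin()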

Now that we know our stack is correctly working, we can proceed to test Chefbot.

  • Run Chefbot in Gazebo: roslaunch chefbot_gazebo chefbot_playground.launch
  • Run Chefbot in Gazebo in an empty world: rosrun gazebo_ros gazebo
  • Run Chefbot in Gazebo in the hotel world: roslaunch chefbot_gazebo chefbot_hotel_world.launch
  • Start the keyboard teleop node: roslaunch turtlebot_teleop keyboard_teleop.launch

Chefbot in Gazebo hotel world environment

GMAPPING

Gmapping is a SLAM package that builds a map of an environment from sensor data. We can map the hotel simulation with the following process:

  1. Run the gmapping process: roslaunch chefbot_gazebo gmapping_demo.launch
  2. Visualize the processing of ROS sensor data with Rviz: roslaunch turtlebot_rviz_launchers view_navigation.launch
  3. Use keyboard teleoperation to map the environment: roslaunch turtlebot_teleop keyboard_teleop.launch
  4. Save the generated map: rosrun map_server map_saver -f ~/hotel_world (the output is inspected in the sketch below)
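map_saver writes a .pgm image plus a .yaml metadata file. A minimal Python sketch to inspect the metadata, assuming the hotel_world map saved above:

import yaml

with open("hotel_world.yaml") as f:
    meta = yaml.safe_load(f)

# image file, resolution in meters per pixel, and the map origin pose
print(meta["image"], meta["resolution"], meta["origin"])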

AUTONOMOUS NAVIGATION

Now that we have a map of our environment, we can simulate Chefbot’s autonomous navigation with the following steps:

  1. Run Chefbot in Gazebo with the hotel world environment: roslaunch chefbot_gazebo chefbot_hotel_world.launch
  2. Run the AMCL demo: roslaunch chefbot_gazebo amcl_demo.launch map_file:=/home/robot/hotel_world.yaml
  3. Start Rviz to view visualization: roslaunch turtlebot_rviz_launchers view_navigation.launch
  4. We can use Rviz’s 2D Nav Goal tool to direct Chefbot to autonomously navigate to a target.

The Build

Bill of Materials

The Tiva C LaunchPad was nice, but mine didn’t work properly, so I used an Arduino Duemilanove (I had an extra).

Bill of materials

Hardware

Circuit Schematic

Circuit Breadboard

Microcontroller

Microcontroller ATmega168
Operating Voltage 5V
Input Voltage (recommended) 7-12V
Input Voltage (limits) 6-20V
Digital I/O Pins 14 (of which 6 provide PWM output)
Analog Input Pins 6
DC Current per I/O Pin 40 mA
DC Current for 3.3V Pin 50 mA
Flash Memory 16 KB (ATmega168) or 32 KB (ATmega328) of which 2 KB used by bootloader
SRAM 1 KB (ATmega168) or 2 KB (ATmega328)
EEPROM 512 bytes (ATmega168) or 1 KB (ATmega328)
Clock Speed 16 MHz

We had to port Lentin’s Tiva code to work with the Arduino. Luckily, both the Tiva and the Arduino use a derivative of Processing/Wiring C.

First, update the libraries to work with Arduino:

  • Messenger: needs to be switched for the Arduino version.
  • WProgram.h: needs to be switched to Arduino.h.

Second, add other libraries to ensure the code runs on our microcontroller:

  • PinChangeInterrupt: I required an additional pin-change interrupt for the MPU6050.
  • MPU6050: Ensure that the MPU6050 library works for your model.
  • I2Cdev: Required to use the I2C interface.

Third, adjust the pins from the Tiva C GPIO mapping to the Arduino GPIO mapping.

Finally, adjust the Tiva C code to ensure it runs as desired. This means hooking in the additional libraries.

Link to our ported code with tests

Motors

We need to select motors for this project. These are the calculations from Lentin, documented here for future reference. If we use 9 cm as the wheel diameter, then the RPM to meet our spec (assuming a required speed of 0.35 m/s) should be at least 74 RPM. We will round up and use 80 RPM.

RPM Calculation
RPM = (60 * Speed) / (pi * Wheel Diameter)
RPM = (60 * 0.35 m/s) / (3.14 * 0.09 m)
RPM = 21 / 0.2826
RPM = 74.3 => round up to 80

Motor Torque
n_wheels = 4
n_motors = 2
coefficient of friction (u) = 0.6
total weight of robot = weight of robot + payload = 15 kg, so N = m * g = 15 * 9.8 ≈ 150 N
maximum torque (to overcome friction when stationary): u * N * r - T = 0, so T = u * N * r
load per wheel = N / n_wheels = 150 / 4 = 37.5 N
T = 0.6 * 37.5 N * 0.045 m = 1.0125 N-m
Motor Torque = 1.0125 N-m ≈ 10.32 kg-cm
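A quick sanity check of these numbers (a Python sketch; rounding mg to 150 N follows the calculation above):

import math

speed = 0.35               # m/s, required speed
wheel_diameter = 0.09      # m
rpm = (60 * speed) / (math.pi * wheel_diameter)
print("required RPM: %.1f" % rpm)                   # ~74.3 -> round up to 80

mu = 0.6                   # coefficient of friction
weight = 150.0             # N, mg for 15 kg, rounded as above
n_wheels = 4
radius = wheel_diameter / 2
torque_nm = mu * (weight / n_wheels) * radius
print("torque per wheel: %.4f N-m" % torque_nm)     # 1.0125 N-m
print("torque per wheel: %.2f kg-cm" % (torque_nm / 9.81 * 100))  # ~10.32 kg-cm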

The two motors interface with the motor driver.

Motor Driver

One motor needs to be inverted (reversed polarity), since the two motors are mounted mirrored to each other.

IMU

The IMU uses I2C to interface with the microcontroller. We can test that it works with the raw sample program.

Next, test with the DMP sample program.

Kinect 360

This is our makeshift depth camera, RGB camera, and microphone. It enables the really smart stuff. It interfaces with the computer.

We need to test that it’s wired correctly. The Kinect sensor needs to be powered and plugged in via USB. After ensuring that it works correctly, splice the Kinect power adapter so we can hook it into our battery.

The white wire is the ground, and the brown wire is the positive.

Power supply

We need a power supply for our robot. The robot will be powering a lot of electronics, including a laptop. Let’s first determine the load and power requirements:

Component                                Voltage   Maximum Current (Amperes)
MacBook Pro                              12 V      3 A
Kinect                                   12 V      1 A
Motors                                   12 V      0.7 A
Motor drivers                            12 V      < 0.5 A
Motor drivers, ultrasonic sensor, IMU    5 V       < 0.5 A
Total                                              ~5.2 A

So we got a battery which sources 12 V. It’s going to power the computer, Kinect, and motors. Then I’ll use 5 V to power everything else, except the IMU, which will be powered directly by the Arduino’s 3.3 V pin.

3.3 V   5 V                  12 V
IMU     Motor drivers        Computer
        Ultrasonic sensor    Kinect
        Encoders             Motors

Circuit board and Arduino

Circuit board

Interfacing with ROS

Chefbot uses Ubuntu 14.04 on both the onboard computer and the remote computer. Similarly, they both use ROS Lunar.

Chefbot ROS Standalone

The following software stack needs to be installed:

  • Gazebo (for 3D simulations)
  • Turtlebot robot packages (the main components of Chefbot. Installation using synaptic works well)
  • OpenCV (for computer vision)

    • Test the installation by reading in an image and then displaying it (see the sketch after this list).
  • Arduino (for managing and building microcontroller code)
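A minimal sketch for that OpenCV test (test.jpg is a placeholder for any image on disk):

import cv2

image = cv2.imread("test.jpg")     # returns None if the file can't be read
if image is None:
    raise IOError("could not read test.jpg")

cv2.imshow("OpenCV test", image)
cv2.waitKey(0)                     # close on any key press
cv2.destroyAllWindows()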

Now that we have the stack installed, set up the Chefbot code.

Build the Chefbot ROS package (adapted code). Next, verify and upload the microcontroller code, then test the microcontroller. You may need to manually ‘Add Files’ for the libraries. You may also need to give permissions on the serial port: sudo chmod 666 /dev/ttyACM0 (the port may be named something else, like /dev/ttyUSB0).

Then test the ROS package by running chefbot_bringup launch files.

Interface Readout

To test the robot’s ROS interface, run the Arduino Chefbot program, which sends serial information to the computer. The data readout is as follows:

u	9
s	0.00	0.00
i	0.52	0.43	0.41	-0.62
t	9011792	0.01
e	0	1

u: ultrasonic sensor distance reading in cm
s: wheel speeds
i: IMU readings
t: time elapsed in milliseconds and in seconds
e: encoder readings
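A small pyserial sketch that reads and labels this stream (the port name and baud rate are assumptions; match them to your setup):

import serial

LABELS = {"u": "ultrasonic (cm)", "s": "wheel speeds", "i": "IMU",
          "t": "time (ms, s)", "e": "encoders"}

with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as port:
    while True:
        line = port.readline().decode("ascii", errors="ignore").strip()
        if not line:
            continue
        tag, _, values = line.partition("\t")   # fields are tab-separated
        print("%s: %s" % (LABELS.get(tag, tag), values))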

Kinect Software

Set up the Kinect ROS stack and test it.

You may need to manually install the following to make it work:

  • PCL
  • OpenNI

Test the OpenNI installation: roslaunch freenect_launch openni.launch

If you can’t get the Kinect to work, it’s likely that you need to provide more current or voltage. To check whether the Kinect is connected correctly, run lsusb; there should be three Microsoft devices listed.

ROS Operations

To operate Chefbot, we must start the ROS master node and the Chefbot ROS standalone node. These should be run on the Chefbot computer, in separate bash sessions:

  1. roscore: the master ROS node. Run this on Chefbot
  2. roslaunch chefbot_bringup robot_standalone.launch: launch robot driver nodes. Run this on Chefbot

Note: You may need to give permissions to the USB device: chmod 777 /dev/ttyUSB0

Keyboard Teleoperation

We can control Chefbot from the remote computer by running: roslaunch chefbot_bringup keyboard_teleop.launch.

Note: teleoperation always takes precedence over other control inputs.
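Under the hood, the teleop nodes publish geometry_msgs/Twist velocity commands. A minimal rospy sketch that nudges Chefbot forward (the topic name comes from the mux routing described in the joystick section below; the test speed is an assumption):

import rospy
from geometry_msgs.msg import Twist

rospy.init_node("nudge_forward")
pub = rospy.Publisher("cmd_vel_mux/input/teleop", Twist, queue_size=1)
rospy.sleep(1.0)          # give the publisher time to register

msg = Twist()
msg.linear.x = 0.2        # m/s forward; an assumed test speed within spec
pub.publish(msg)
rospy.sleep(0.5)          # keep the node alive long enough for delivery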

Joystick Teleoperation

We can also control Chefbot from the remote computer with a variety of joystick devices. Turtlebot provides bindings for some of these devices.

If you have an Xbox 360 controller, you can use it with the Gazebo simulation via: roslaunch turtlebot_teleop xbox360_teleop.launch.

I’m using a wireless controller with a wireless receiver plugged into the remote PC.

I had to port the Turtlebot xbox360 teleop node to Chefbot. It was a challenge, as the code is written in C++ as opposed to Python, which introduces a build step defined by the CMake file. Then I routed the node’s output to cmd_vel_mux/input/teleop.

With this port, we can use an Xbox 360 controller to operate Chefbot using: roslaunch chefbot_bringup xbox360_teleop.launch

3D Mapping

We can use the Kinect to map our environment. Chefbot can then use this map to localize itself and navigate autonomously.

Activate the Kinect ROS node to start taking in sensor data: roslaunch chefbot_bringup 3dsensor.launch

View sensor data: roslaunch chefbot_bringup view_robot.launch

Now that we’ve established that our system is up and running, we can start the 3D mapping process.

The following steps should be followed to accomplish this:

  1. Start the robot driver (robot): roslaunch chefbot_bringup robot_standalone.launch

  2. Start the gmapping process (robot): roslaunch chefbot_bringup gmapping_demo.launch

  3. Start teleop: roslaunch chefbot_bringup xbox360_teleop.launch

  4. Bring up a view of the map being built: roslaunch chefbot_bringup view_navigation.launch

  5. Build the map the same way you did in the simulation section

  6. Save the map (robot): rosrun map_server map_saver -f ~/test_map

Localization and navigation

Now that we have a map built and saved, we can use it to perform autonomous navigation.

The following steps are used to direct the robot:

  1. Start the robot driver (robot): roslaunch chefbot_bringup robot_standalone.launch

  2. Start localization and navigation with the stored map: roslaunch chefbot_bringup amcl_demo.launch map_file:=/home/robot/test_map.yaml

  3. View navigation: roslaunch chefbot_bringup view_navigation.launch

We may need to specify the initial pose with Rviz’s 2D Pose Estimate tool. After that, we can direct the robot to a target with 2D Nav Goal, as in the simulation section.
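We can also send a goal programmatically instead of clicking in Rviz. A minimal rospy sketch using the standard move_base action interface (the target coordinates are placeholders; take them from your own map):

import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node("send_nav_goal")
client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
client.wait_for_server()

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = "map"
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 1.0     # placeholder target in map frame
goal.target_pose.pose.position.y = 0.5
goal.target_pose.pose.orientation.w = 1.0  # face along the map x axis
client.send_goal(goal)
client.wait_for_result()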

Calibration

Intrinsic Calibration of Kinect:

To perform intrinsic calibration of the Kinect, install the OpenNI driver packages and the camera calibration packages for ROS.

Next, print an 8 x 6 checkerboard with 0.108 m squares and perform the following:

  1. Start the OpenNI driver: roslaunch openni_launch openni.launch

  2. Run calibration code

RGB: rosrun camera_calibration cameracalibrator.py image:=/camera/rgb/image_raw camera:=/camera/rgb --size 8x6 --square 0.108

IR: rosrun camera_calibration cameracalibrator.py image:=/camera/ir/image_raw camera:=/camera/ir --size 8x6 --square 0.108

  3. Move the checkerboard and collect image samples.
  4. Press Calibrate when the system is ready.
  5. On successful calibration, adjust the size.
  6. Commit the calibration.

Repeat the above for the Depth camera.

The committed files will be used by ROS when running the Kinect.

Note: The depth camera seems to not trigger well under fluorescent lighting. Ensure the checkerboard is lit by an infrared source.

Wheel Odometry calibration

Wheel Odometry calibration is required to reduce navigational errors. We will calibrate the MPU6050 for this.

  1. Load the MPU6050 calibration sketch and place the MPU6050 on a flat surface.
  2. Run the sketch for 5-10 minutes and take note of the gyro values.
  3. Set the offset values in the Chefbot Arduino program to those readings.
  4. Repeat until we get a reading of 0 on all gyro axes.

We can use an automated calibration sketch to do this more efficiently: https://www.i2cdevlib.com/forums/topic/96-arduino-sketch-to-automatically-calculate-mpu6050-offsets/

Graphical User Interface

To create a GUI, install qt-sdk, rqt, and PyQt. Next, design the GUI with: designer-qt4.

After building the GUI in Designer, we generate the Python bindings from the .ui file.
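A bare-bones PyQt4 sketch of this kind of GUI (the window and button are placeholders; in the real GUI the button callback would send a navigation goal for the chosen table):

import sys
from PyQt4 import QtGui

class ChefbotGui(QtGui.QWidget):
    def __init__(self):
        super(ChefbotGui, self).__init__()
        self.setWindowTitle("Chefbot Control")
        layout = QtGui.QVBoxLayout(self)
        button = QtGui.QPushButton("Deliver to table 1")
        button.clicked.connect(self.on_deliver)
        layout.addWidget(button)

    def on_deliver(self):
        # placeholder: this is where a navigation goal would be sent
        print("would send the table 1 navigation goal here")

if __name__ == "__main__":
    app = QtGui.QApplication(sys.argv)
    gui = ChefbotGui()
    gui.show()
    sys.exit(app.exec_())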



Written by Peter Chau, a Canadian Software Engineer building AIs, APIs, UIs, and robots.

peter@labone.tech
