As may be obvious by now, given my posting frequency, writing is not something that comes easy to me, especially not in English, so this will definitely be a surprising post. In August, a couple of friends and I published a book about the Robot Operating System (ROS), a robotics framework which we've been using for a long time and which formed the basis of our software stack.

This is the second edition of the original Learning ROS for Robotics Programming, which was written by two of the same authors and reviewed by one of the authors of this second edition. Unfortunately, as life would have it, I couldn't be involved with the first edition, so I couldn't pass up the opportunity to participate in this iteration.

This second edition improves upon the first by providing updated content, as the latest versions of ROS are significantly different from those used in the first edition. Be aware that even though the book was originally written to support ROS Hydro, we also provide support for Indigo and Jade in our GitHub repository. We have also improved the content of the existing chapters in general, with up-to-date examples and better explanations. Finally, we have replaced the last chapter with two new chapters covering point clouds and robotic arms, which we consider a great addition to an already extensive book. The layout of this second edition is as follows:

  • Chapter 1 – Getting Started with ROS Hydro: as the name suggests, this first chapter goes through the installation process of ROS Hydro on Ubuntu, also covering how to do so in a VirtualBox virtual machine as well as on a BeagleBone Black.
  • Chapter 2 – ROS Architecture and Concepts: the second chapter covers most of the bits and pieces ROS is made of, including practical examples in which the user will learn how to create nodes and packages that interact with each other, as well as with the famous turtlesim.
  • Chapter 3 – Visualization and Debug Tools: there are many situations in software development where we have to deal with the unexpected. This chapter explains how to use common debugging tools, as well as other tools provided by ROS, to debug and visualize our nodes, the data, and the interactions between the different elements of our system.
  • Chapter 4 – Using Sensors and Actuators with ROS: a very important part of robotics is dealing with hardware. This chapter covers the usage of common and inexpensive sensors and actuators supported by ROS, as well as more complex, and not as cheap, devices such as the Kinect or laser rangefinders. Finally, it also covers how to use an Arduino with ROS to expand our hardware possibilities even further.
  • Chapter 5 – Computer Vision: from connecting a USB or FireWire camera and publishing images, to performing visual odometry with stereo cameras or RGBD sensors, passing through the image pipeline to perform corrections and transformations on our images, this chapter provides an overview of the computer vision tools provided by ROS and OpenCV.
  • Chapter 6 – Point Clouds: this chapter explores a different approach to 3D sensor data communication and processing by using the Point Cloud Library (PCL), a library tailored to the processing of 3D data (or point clouds) that is well integrated with the latest versions of ROS, providing message abstractions and other facilities.
  • Chapter 7 – 3D Modelling and Simulation: in many situations, working in robotics means working without access to the robot itself, and in other cases the number of tests required to validate a system makes using the real robot impractical. In those situations, the roboticist's best bet is simulation with accurate 3D models. Since simulations are an indispensable tool for any serious robotics project, this chapter covers the whole process, from creating an accurate 3D model of our robot to simulating it and its environment with Gazebo.
  • Chapter 8 – The Navigation Stack – Robot Setups: this chapter introduces the navigation stack, an incredibly powerful set of tools provided by ROS to combine sensor and actuator information to navigate a robot through the world. This introduction goes through the basics of the navigation stack, explains how to understand and create our own transformations, and covers odometry using a laser rangefinder or Gazebo.
  • Chapter 9 – The Navigation Stack – Beyond Setups: as a continuation of the previous chapter, and using all the concepts explained throughout the book, this chapter finalises the configuration of our robot to make full use of the navigation stack.
  • Chapter 10 – Manipulation with MoveIt!: the final chapter of the book covers the integration between ROS and MoveIt!, which provides a set of tools to control robotic arms in order to perform manipulation tasks such as grasping, picking and placing, or simple motion planning with inverse kinematics.

The authors of the book, whom I consider amongst my best friends and most trusted colleagues, are Enrique Fernández, Ph.D. in Computer Engineering from the University of Las Palmas de Gran Canaria and currently working as a Senior Autonomy Engineer at Clearpath Robotics; Aaron Martínez, M.Sc. in Computer Engineering and co-founder of Subsea Mechatronics; Luis Sánchez, M.Sc. in Electronics and Telecommunications and also co-founder of Subsea Mechatronics; and of course yours truly, M.Sc. in Computer Science and currently a Software Engineer at Dell SecureWorks (I know, unrelated to robotics).

Come on, stop talking and tell us where we can buy the book…

I know, I know, you're an impatient bunch; right after this paragraph I've included a non-exhaustive list of places where the book is currently sold. If you're not too sure yet, remember that Christmas is very close and books are always a great gift for friends and family, and who doesn't want to have a grandma who programs robots* as a hobby instead of knitting?

Barnes & Noble | O'Reilly | Safari Books

We’d like to hear your opinions, so don’t forget to comment if you’ve already read the book or even if you haven’t, and spread the word!

* The authors do not claim this book can teach your grandma to program robots.

One day, while looking for cheap sensors on eBay, I found this interesting board, which contained everything I was looking for. It basically consists of a 3-axis accelerometer (ADXL345), a 3-axis magnetometer (HMC5883L), a 3-axis gyroscope (L3G4200D) and a barometric pressure sensor (BMP085). My plan is to build an Inertial Measurement Unit (IMU) (or maybe I should call it an Attitude and Heading Reference System (AHRS)) and, in the process, learn how to interact with and interpret the information all of these sensors provide. I do have some experience using IMUs, since I used one on a couple of previous projects, but those came preprogrammed, and there is not much point in working with the raw sensor data unless you want to improve the measurements or give them another use.

For this project I am also using an Arduino Duemilanove; for that reason I wanted to call it ArduIMU, but that name is already taken, so I will have to find another one (suggestions would be appreciated). Connecting the sensor board to the Arduino is pretty straightforward: every sensor has an I²C interface, so you can access each of them using the Arduino Wire library. The drawing was done using Fritzing, for which I created the corresponding custom part for this board, although I did something wrong and it does not conform to the Fritzing graphic standards.

This will be the first of a series of posts I plan to write about this project, since there are several steps I need to take in order to fully understand each sensor and several more to combine them in order to improve accuracy. In this post I want to talk about the accelerometer and how to obtain the roll and pitch angles from it, which is a process that can also be called tilt sensing.

Accelerometers are devices capable of measuring the acceleration they experience relative to free fall, the same acceleration living beings feel. As a consequence, an accelerometer cannot measure the acceleration of gravity itself, but it can measure the upward acceleration that counters gravity when at rest. This acceleration is measured as 1g on the z-axis when both the pitch and roll angles are zero, but when the sensor is tilted, the x-axis or the y-axis experiences a component of that upward acceleration, whose magnitude depends on the tilt angle.

Pitch & Roll estimation

Obtaining the pitch and roll angles is then a matter of reading the accelerometer, converting those readings to g units (1g = 9.8 m/s²), and applying the corresponding equations. The process of obtaining and converting the accelerometer readings depends on the accelerometer you are using; in my case, the ADXL345 in its basic configuration provides 10-bit resolution for ±2g, but it supports several other ranges (±2g, ±4g, ±8g, ±16g) and resolutions (from 10 to 13 bits depending on the range). Generalizing, the formula used to calculate the acceleration from the accelerometer readings is:

G_{Accel} = Raw_{Accel} \cdot \dfrac{Range}{2^{Resolution - 1}}
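To make the conversion concrete, here is a minimal sketch (my own example, not part of the library): it assumes the ADXL345's default configuration of ±2g range at 10-bit resolution, and the rawToG name and constants are mine.

const float RANGE = 2.0;    // Assumed: +/-2g range (ADXL345 default)
const int RESOLUTION = 10;  // Assumed: 10-bit resolution

// G = Raw * Range / 2^(Resolution - 1)
float rawToG(int raw)
{
        return raw * RANGE / (float)(1 << (RESOLUTION - 1));
}

For the ±2g, 10-bit case this gives 2 / 512 ≈ 0.0039 g per count, which matches the 3.9 mg/LSB scale factor from the datasheet.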

Once we have the correct acceleration components, we can proceed to calculate the different angles using the following equations:

pitch = \arctan{\left(\dfrac{G_x}{\sqrt{G_{y}^2 + G_{z}^2}}\right)}     roll = \arctan{\left(\dfrac{-G_y}{G_{z}}\right)}

For more information about where these equations come from, you can read the documentation I include at the end of this post. As you can see, the denominator of the pitch equation is defined to be always positive, so the equation itself only provides the [-90, 90] range, which is exactly what is expected for the pitch angle. In contrast, the roll equation provides the [-180, 180] range. It is important to take into account that when the pitch angle is 90°, the surge axis (roll) is directly aligned with the gravity vector, so we can no longer measure the roll angle; this is what is called gimbal lock.

Also, be aware that the roll equation is undefined when both G_y and G_z are equal to zero, and that for each possible value computed inside the arctan function there are two valid solutions, not only in the roll but also in the pitch equation. These problems can easily be solved in code by using the atan2 function, which eliminates the angle calculation ambiguity by taking the quadrant into account.
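To see the ambiguity concretely, here is a tiny test sketch (the readings are hypothetical values of my own, not from the project):

void setup()
{
        Serial.begin(9600);
        // Upright-ish (Gy = 0.5, Gz = 0.8) vs. inverted (Gy = -0.5, Gz = -0.8):
        Serial.println(atan2(-0.5, 0.8) * 180.0 / M_PI);  // ~ -32.0
        Serial.println(atan2(0.5, -0.8) * 180.0 / M_PI);  // ~ 148.0
        // atan only sees the ratio, so it collapses both into the same angle:
        Serial.println(atan(-0.5 / 0.8) * 180.0 / M_PI);  // ~ -32.0
        Serial.println(atan(0.5 / -0.8) * 180.0 / M_PI);  // ~ -32.0
}

void loop() {}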

Removing short-term fluctuations using a Low-Pass filter

At this point we already have a fully functional pitch & roll estimation system, but if we experiment with it, we will discover that the readings fluctuate quite a bit, which may be very annoying for some applications. These short-term fluctuations can be removed by means of what is called a low-pass filter. This type of filter attenuates the higher frequencies of the signal, thus providing a smoother reading. The low-pass filter is easily implemented using the following equation:

y_{t} = \alpha \cdot x_{t} + (1 - \alpha) \cdot y_{t - 1}

Where y_t is our filtered signal, y_{t-1} the previous filtered signal, x_t the accelerometer reading and \alpha the smoothing factor. It may seem obvious, but the filtering should be applied to the accelerometer readings before calculating the angles, not to the angles themselves. Regarding the smoothing factor: the lower we set it, the longer the angle will take to stabilize, so we should not set it too low, because then we could lose real-time behaviour. By this I mean that the reading would not correspond to the real angle until it stabilizes, which could take some time.
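As a side note (this relation is not derived in this post, it comes from the standard RC low-pass analogy): if we assume a fixed sampling interval \Delta t, the discrete filter above behaves like an analog RC low-pass filter with a cutoff frequency of approximately

f_c = \dfrac{\alpha}{2 \pi (1 - \alpha) \Delta t}

so with \alpha = 0.5 and the 10 ms delay used in the code below, frequencies above roughly 16 Hz are attenuated.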

The source code & the ADXL345 library

I developed a small library to interface with the accelerometer; even though at the moment I have only implemented the basic functionality, I plan on supporting all of the device's features. You can find it in my GitHub account, where you can also find the Processing code I used for the video example below. Thanks to the library, the code is pretty straightforward: it just reads the sensor accelerations, which are already converted into gs by the library, applies the low-pass filter and then uses the roll and pitch equations to calculate the angles.

#include <Wire.h>    // Arduino I2C library
#include <ADXL345.h> // the accelerometer library described above (header name assumed)

const float alpha = 0.5; // low-pass filter smoothing factor

double fXg = 0;
double fYg = 0;
double fZg = 0;

ADXL345 acc;

void setup()
{
        acc.begin();
        Serial.begin(9600);
        delay(100);
}

void loop()
{
        double pitch, roll, Xg, Yg, Zg;
        acc.read(&Xg, &Yg, &Zg);

        //Low Pass Filter
        fXg = Xg * alpha + (fXg * (1.0 - alpha));
        fYg = Yg * alpha + (fYg * (1.0 - alpha));
        fZg = Zg * alpha + (fZg * (1.0 - alpha));

        //Roll & Pitch Equations
        roll  = (atan2(-fYg, fZg)*180.0)/M_PI;
        pitch = (atan2(fXg, sqrt(fYg*fYg + fZg*fZg))*180.0)/M_PI;

        Serial.print(pitch);
        Serial.print(":");
        Serial.println(roll);

        delay(10);
}

The result

For a more interactive visualization of the data, I also developed an example using Processing, which consists of a rotating 3D cube. You can see the results in the following video.

In the next post about my Arduino IMU, I will talk about how gyroscopes work and how to interpret the information they provide.

I finally did it: officially, I am now an M.Sc. in Computer Science. It took long years of very hard work and sleepless nights, but also of living with the comfortable feeling of always being busy and the certainty of what was to come. But that's it, this moment had to come, and my days as a university student are over; I believe I'm ready for what lies ahead.

My Master's Thesis consisted of the design and development of a software architecture for monitoring and controlling a remotely operated underwater vehicle (ROV). It comprised two software blocks: the control system and the operation system. The control system is the main software architecture, designed to allow multiple modules to work in parallel, connected with one another, each of them controlled by a supervisor which guarantees that the system is always working and deals with software and hardware errors. The operation system, on the other hand, allows the user to connect to the control system, visualize the sensory data and operate the vehicle.

The software itself is not very complex, but the design of the architecture is focused on offering efficiency, robustness, reliability and flexibility. One of the main goals of the design is to give the developer the ability to adapt the software architecture to different control models, and even to different types of vehicles or robotic systems, such as an autonomous underwater vehicle (AUV). You can read more about it in the documentation, although it is in Spanish.

I had to give a presentation, in which I explained the different aspects of the project and demonstrated the results. It went quite well; although it took a little longer than expected, I was finally given the highest grade. Overall, I am certainly going to miss being a university student.

Glider School

After returning from the SAUC-E competition, the Oceanic Platform of the Canary Islands offered some of us the opportunity to attend a two-week course on gliders. By that time, only two of us had not already made plans; in my particular case, I had a lot of work to do on my master's thesis and was not planning on going anywhere.

The course was divided into two main areas: one concerning the technological aspects of the vehicles, and the other their scientific applications. For the first part of the course, we were introduced to the physics of the glider's movement, which basically describe the change in density needed to alter the buoyancy, depending on several parameters such as temperature and salinity, both of which affect the average density of seawater. We also dedicated some time to learning about the electronic components used in gliders for computation, communication and navigation, and were introduced to the software for remote operation.
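To sketch the idea behind that physics (my own summary, not the course material): the net vertical force on a glider depends on the difference between the density of the surrounding seawater and the average density of the vehicle,

F_{net} = (\rho_{sea} - \rho_{glider}) \cdot V \cdot g

where V is the volume of the vehicle. Since \rho_{sea} varies with temperature and salinity, the buoyancy engine changes the vehicle's volume, and therefore its average density, to make F_{net} point up or down, which produces the characteristic sawtooth gliding motion.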

During this first part we also learnt about the main aspects of glider operation, such as how the different paths and parameters are set to define the glider's flight and how to properly ballast the vehicle, and we even got the chance to deploy (and recover) a glider. The deployment of the glider was quite interesting: we went out on a boat to the open sea and gently dropped the glider into the water. When the glider was ready, the operator commanded the vehicle to submerge to a certain depth, and it certainly tried, but the vehicle detected that the desired depth was unreachable because the sea floor was shallower than that, and it came up again. The vehicle is quite slow in its operation, and watching it dive and come up again can be quite boring, but then again, speed is not required for the applications of these vehicles. The recovery was definitely not easy; I did not directly participate, but I watched the entire scene as it happened.

The course also included opening and examining the interior of several gliders, such as the Slocum, the Spray and the SeaGlider. The first two gliders are quite easy to open, and the operator has full freedom to do so, but in the case of the SeaGlider, opening it voids the warranty. The Slocum glider has a very polished design and seems very comfortable to work with. In contrast, the Spray glider looks handmade, and many things about it reminded us of the design of the Avora AUV. Finally, for the first part of the course we also had presentations from Bluefin Robotics on the Spray glider, from Liquid Robotics on the Wave Glider and from ACSA-Alcen on the SeaExplorer. The Wave Glider is a very interesting vehicle, although some people do not consider it a glider, and the SeaExplorer is the first European glider.

The second part of the course was dedicated to the scientific applications of the gliders. We learned about the different types of sensors used to analyze the properties of seawater, such as salinity, temperature, pressure, turbidity, dissolved oxygen, etc. We also had quite a lot of presentations from several groups and universities explaining the different sensors, the applications of the gliders, the results they had obtained, and more. This part of the course was certainly interesting, but for a technical person such as myself, a little difficult to follow.

I was very impressed by the presentation given by Dr. Oscar Schofield from Rutgers University on how the melting of the ice cap affects quite a number of parameters of the global seawater, producing a chain reaction that ends with several polar species being unable to feed. Another thing I enjoyed about the course was the presentation on the Wave Glider; it is an incredible technology and a demonstration of how to intelligently harvest energy from the environment for long-term operation.

AVORA and the SAUC-E’12 Challenge

As you may already know, I was involved in the construction of an autonomous underwater vehicle (AUV) to participate in the Students AUV Challenge – Europe, which was held in La Spezia (Italy), at the Centre for Maritime Research & Experimentation, from July 6 to 13. It was a great experience to be surrounded by top students from all over Europe and Canada, sharing ideas, conceptions and visions about underwater vehicles and robotics.

The sea basin was divided into two equal arenas, so that at most two teams could be working at the same time. The visibility conditions were quite rough, and the water currents at the surface were noticeable. The organization provided us with two different workspaces, one outside, beside the competition arena, and the other inside a warehouse. The combination of heat and humidity made it quite complicated to work, even though we were provided with several fans.

The first five days were allocated for practice runs, but the truth is that some of the teams, including us, used this time to finish the construction of their vehicles. During our first few days we did some recordings with an underwater camera, which we used to fine-tune our detection algorithms. We also finished the construction and did some preliminary tests in the pools. Unfortunately, when everything was ready, the vehicle suffered a leak because of an incorrectly sealed connector, which made us lose more than a day cleaning everything up, but at least none of the electronic components were damaged.

After repairing the damage, we repeated the tests and verified that everything was working as expected. By then the qualification period had started, so we were now running against the clock. When everything was ready again, we proceeded to adjust the navigation algorithms directly in the competition arena, which took longer than expected because one of the arenas was being used for the qualification rounds. On the last day of the qualification rounds, we ran some simulations of the qualification mission and finished programming it, but in the end, since we had not tested the mission enough, we decided not to put the vehicle at risk and gave up our qualification slot.

We all felt a little demoralized for not being able to qualify, but not everything was lost: we still had our chance in the "Impress the judges" category, and we sure came prepared for that one. A while back, while working on our AUV, a member of the team had brought in a pair of "virtual reality" glasses that he had used for another project. These glasses had an external inertial measurement unit attached, so that the computer could track the operator's head motion. Since our vehicle was equipped with a pan-tilt camera system, we developed software that combined the camera and the pan-tilt system with the glasses and the gyroscope, so that the user could look around and see the surroundings of the vehicle.

The judges were quite impressed with our telepresence system, and it was kind of fun to see them taking turns to try the glasses. They were also quite interested in some of our innovations, such as our pan-tilt camera system and the use of a bend sensor for water velocity measurement. The award ceremony was kind of a surprise: we won first prize in the "Impress the judges" category, which was much more than we expected after four months' work, competing against teams with years of experience and very mature vehicles. After the award ceremony we had a small good-bye party in Lerici, which was shorter than expected, for some of us at least, because of transportation issues.

During these days I had the opportunity to see some of the most incredible vehicles I have ever seen, not only because of their design, but because they were built by students. The vehicle I liked the most was the Canadian one, from Team SONIA, with a robust and flexible design. The team was very well prepared, and it felt like they had every situation under control, which is a testament to their years of experience participating in the RoboSub competition. Suffice it to say they won this year's SAUC-E and took third place at RoboSub, quite a feat!

I was also impressed by the design of the SMART-E vehicle, from the University of Luebeck, even though I think it might present a painful challenge for autonomous navigation. This vehicle was shaped like a UFO and was equipped with three thrusters, each of which had an additional rotational axis so as to achieve vertical motion. The main hull was transparent, so they took advantage of this to build the strobe light required by the competition using LEDs all around it. This, combined with its shape, made it look like a real UFO, or should I say UCO (Unidentified Cruising Object)?

Overall, it was a worthwhile experience, not only competing but also building an autonomous underwater vehicle from scratch, and I surely recommend it to any student. It is an opportunity to gain more knowledge and to test the knowledge you already have, but, more importantly, to gain experience in a real-life project.

You can read more about our vehicle in our journal paper.

Avora AUV SAUC-E’12

Recently, I have been working on a very interesting project, consisting of the design and development of an Autonomous Underwater Vehicle (AUV) to participate in the Students AUV Challenge – Europe, held in La Spezia (Italy) at the NATO Undersea Research Centre (NURC). We are a team of 8 students from different areas (Computer Science, Telecommunications, Electronics, Naval Engineering), in which I have the great honor of being the team leader and lead developer. The name of the AUV (and the team) is AVORA, which stands for Autonomous Vehicle for Operation and Research in Aquatic Environments and was intentionally picked as a reference to an ancient deity from the Canary Islands.

The goal of the competition is to perform a series of tasks autonomously, without external information sources and within a fixed, though sufficiently large, time frame. Taking into account the broad spectrum of missions an AUV can accomplish, we can see that each of the tasks tries to emulate, in a limited fashion, situations that arise in real life. The tasks are:

  1. Passing through a validation gate constructed of 2 orange buoys on a rope, 4 meters apart.
  2. Performing an underwater structure inspection. This underwater structure is basically a pipeline of cylinders.
  3. Searching and informing another autonomous vehicle about a mid-water target.
  4. Surveying a wall.
  5. Tracking and following a moving autonomous surface vehicle (ASV).
  6. Surfacing in the surface zone.
  7. Impressing the judges! In this task, the teams are encouraged to be creative and demonstrate interesting features about their vehicles.

As one can see, completing all of these tasks requires certain types of sensors, such as sonar, cameras, depth and pressure sensors, inertial measurement units, etc., as well as a great amount of hard work and time. Our AUV is on its way; since this is our first time, the vehicle has to be constructed from the ground up, and keeping water from getting inside everything is not an easy job.

Another problem with not having the vehicle built from the beginning is that most of the work has to be done with each sensor in isolation, and artificial datasets have to be created in order to fine-tune the algorithms. Some of the algorithms require large amounts of data to be validated, so in those cases we are trying to use datasets provided by others, unrelated to the competition. But as I said, being our first time, it's difficult to know what to expect.

From now on I will try to post regularly about our progress. Wish us luck!