Learning ROS For Robotics Programming

Transcription


Learning ROS for Robotics Programming
Second Edition

Your one-stop guide to the Robot Operating System

Enrique Fernández
Luis Sánchez Crespo
Anil Mahtani
Aaron Martinez

BIRMINGHAM - MUMBAI

Learning ROS for Robotics Programming
Second Edition

Copyright © 2015 Packt Publishing

All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews.

Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the authors, nor Packt Publishing, and its dealers and distributors will be held liable for any damages caused or alleged to be caused directly or indirectly by this book.

Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.

First published: September 2013
Second edition: August 2015

Production reference: 1120815

Published by Packt Publishing Ltd.
Livery Place
35 Livery Street
Birmingham B3 2PB, UK.

ISBN 978-1-78398-758-0

www.packtpub.com

Credits

Authors
Enrique Fernández
Luis Sánchez Crespo
Anil Mahtani
Aaron Martinez

Reviewers
Piotr Gródek
Akihiko HONDA
Matthieu Keller
Aridane J. Sarrionandia de León

Commissioning Editor
Usha Iyer

Acquisition Editor
Richard Brookes-Bland

Content Development Editor
Adrian Raposo

Technical Editor
Parag Topre

Copy Editors
Sarang Chari
Sonia Mathur

Project Coordinator
Sanchita Mandal

Proofreader
Safis Editing

Indexer
Mariammal Chettiyar

Graphics
Sheetal Aute
Jason Monteiro
Abhinash Sahu

Production Coordinator
Arvindkumar Gupta

Cover Work
Arvindkumar Gupta

About the Author

Enrique Fernández has a PhD in computer engineering from the University of Las Palmas de Gran Canaria and is a senior robotics engineer currently working at Clearpath Robotics, Inc. He did his MSc thesis in 2009 on SLAM. Enrique addresses the problem of path planning for autonomous underwater gliders (AUGs) in his PhD thesis, which was presented in 2013. During that period, he also worked on computer vision, AI, and other robotics topics, such as inertial navigation systems and visual SLAM, at the CIRS/ViCOROB Research Lab of the University of Girona for AUVs. He also participated in the Student Autonomous Underwater Challenge, Europe (SAUC-E) in 2012 and collaborated in the 2013 edition; in the 2012 edition, he was awarded a prize.

After his PhD, Enrique joined PAL Robotics in June 2013 as a senior robotics engineer in the Autonomous Navigation department. There, he developed software for REEM, REEM-C, and mobile-based robots, and also for corresponding projects, such as Stockbot, using the ROS framework intensively. He worked on motion planning (path planning and control for mobile robots), robot localization, and SLAM. Recently, in 2015, he joined Clearpath Robotics, Inc. to work as a senior autonomy developer on SLAM, within the Autonomy department.

From an academic perspective, Enrique has published several conference papers and publications, two of them at the International Conference on Robotics and Automation (ICRA), in 2011. He is also an author of chapters of a few books and of a previous book about ROS, Learning ROS for Robotics Programming by Packt Publishing. His MSc thesis was about the FastSLAM algorithm for indoor robots using a SICK laser scanner and the wheel odometry of a Pioneer differential platform. His PhD thesis contributed path planning algorithms and tools for AUGs. He also has experience with electronics and embedded systems such as PC104 and Arduino. His background covers SLAM, computer vision, path planning, optimization, and robotics and artificial intelligence in general.

Acknowledgments

I would like to thank the coauthors of this book for the effort put into writing and developing the code for the countless examples provided. I also want to say thanks to the members of the research groups where I did my PhD thesis: the University Institute of Intelligent Systems and Computational Engineering (SIANI) and the Center of Underwater Robotics Research (CIRS/ViCOROB). Also, a big thanks goes to my ex-colleagues at PAL Robotics, where I learned a lot about ROS, robotics for mobile and humanoid biped robots, not only software but also electronics and hardware design. Finally, I would like to thank my family and friends for their help and support.

About the Author

Luis Sánchez Crespo completed his dual master's degree in electronics and telecommunication engineering at the University of Las Palmas de Gran Canaria. He has collaborated with different research groups at the Institute for Technological Development and Innovation (IDETIC), the Oceanic Platform of the Canary Islands (PLOCAN), and the Institute of Applied Microelectronics (IUMA), where he currently researches super-resolution imaging algorithms.

His professional interests lie in computer vision, signal processing, and electronic design applied to robotics systems. For this reason, he joined the AVORA team, a group of young engineers and students working on the development of autonomous underwater vehicles (AUVs) from scratch. In this project, Luis started developing acoustic and computer vision systems, extracting information from different sensors, such as hydrophones, sonar, and cameras.

With a strong background gained in marine technology, Luis cofounded Subsea Mechatronics, a young start-up, where he works on developing remotely operated and autonomous vehicles for underwater environments.

Here's what Dario Sosa Cabrera, a marine technologies engineer and entrepreneur (and the cofounder and maker of LPA Fabrika: Gran Canaria Maker Space), has to say about Luis:

"He is very enthusiastic and an engineer in multiple disciplines. He is responsible for his work. He can manage himself and can take up responsibilities as a team leader, as was demonstrated at the SAUC-E competition, where he directed the AVORA team. His background in electronics and telecommunications allows him to cover a wide range of expertise, from signal processing and software to electronic design and fabrication."

Luis participated as a technical reviewer of the previous edition of Learning ROS for Robotics Programming by Packt Publishing.

Acknowledgments

First, I have to acknowledge Aaron, Anil, and Enrique for inviting me to participate in this book. It has been a pleasure to return to work with them. Also, I want to thank the Subsea Mechatronics team for the great experience working with heavy, underwater robots; we have grown together during these years. I have to mention LPA Fabrika: Gran Canaria Maker Space for their enthusiasm in preparing and teaching educational robotics and technological projects; sharing a workspace with kids can be really motivating.

Finally, I have to thank my family and my girlfriend for their big support and encouragement in every project I'm involved in. I want to dedicate my contribution in this book to them.

About the Author

Anil Mahtani is a computer scientist who has been working for the past 5 years on underwater robotics. He first started working in the field with his master's thesis, where he developed a software architecture for a low-cost ROV. During the development of his thesis, he also became the team leader and lead developer of AVORA, a team of university students that designed and developed an autonomous underwater vehicle for the Students Autonomous Underwater Challenge – Europe (SAUC-E) in 2012. That same year, he completed his thesis and his MSc in computer science at the University of Las Palmas de Gran Canaria, and shortly thereafter, he became a software engineer at SeeByte Ltd, a world leader in smart software solutions for underwater systems.

During his tenure at SeeByte Ltd, Anil was key to the development of several semi-autonomous and autonomous underwater systems for the military and the oil and gas industries. In those projects, he was heavily involved in the development of autonomous systems, the design of distributed software architectures, and low-level software development, and has also contributed to providing computer vision solutions for front-looking sonar imagery. At SeeByte Ltd, he has also achieved the position of project manager, managing a team of engineers developing and maintaining the internal core C libraries.

His professional interests lie mainly in software engineering, algorithms, distributed systems, networks, and operating systems. Anil's main role in robotics is to provide efficient and robust software solutions, addressing not only the current problems at hand but also foreseeing future problems or possible enhancements. Given his experience, he is also an asset when dealing with computer vision, machine learning, and control problems. Anil is interested in DIY and electronics, and he has developed several Arduino libraries that he has contributed back to the community.

Acknowledgments

First of all, I would like to thank my family and friends for their support and for always being there when I've needed them. I would also like to thank David Rubio Vidal, Emilio Migueláñez Martín, and John Brydon for being the most supportive colleagues and friends, who have taught me so much personally and professionally. I would also like to thank my colleagues at SeeByte and the AVORA team, from whom I've learned and experienced so much over the years. Finally, a special thank you to Jorge Cabrera Gámez, whose guidance and advice shaped my career in a way I could have never imagined.

About the Author

Aaron Martinez is a computer engineer, entrepreneur, and expert in digital fabrication. He did his master's thesis in 2010 at the Instituto Universitario de Ciencias y Tecnologias Ciberneticas (IUCTC) at the University of Las Palmas de Gran Canaria. He prepared his master's thesis in the field of telepresence using immersive devices and robotic platforms. After completing his academic career, he attended an internship program at The Institute for Robotics at the Johannes Kepler University in Linz, Austria. During his internship program, he worked as part of a development team of a mobile platform using ROS and the navigation stack. After that, he was involved in projects related to robotics; one of them is the AVORA project at the University of Las Palmas de Gran Canaria. In this project, he worked on the creation of an autonomous underwater vehicle (AUV) to participate in the Student Autonomous Underwater Challenge-Europe (SAUC-E) in Italy. In 2012, he was responsible for manufacturing this project; in 2013, he helped adapt the navigation stack and other algorithms from ROS to the robotic platform.

Recently, Aaron cofounded a company called SubSeaMechatronics, SL. This company works on projects related to underwater robotics and telecontrol systems; it also designs and manufactures subsea sensors. The main purpose of the company is to develop custom solutions for R&D prototypes and heavy-duty robots.

Aaron has experience in many fields, such as programming, robotics, mechatronics, and digital fabrication, and with devices such as Arduino, BeagleBone, servers, and LIDAR. Nowadays, he is designing robotics platforms for underwater and aerial environments at SubSeaMechatronics SL.

Acknowledgments

I would like to thank my girlfriend, who supported me while writing this book and gave me the motivation to continue growing professionally. I also want to thank Donato Monopoli, head of the Biomedical Engineering department at the Canary Islands Institute of Technology (ITC), and all the staff there. Thanks for teaching me all that I know about digital fabrication, machinery, and tissue engineering. I spent the best years of my life in your workshop.

Thanks to my colleagues from the university, especially Alexis Quesada, who gave me the opportunity to create my first robot in my master's thesis. I have learned a lot about robotics working with them.

Finally, thanks to my family and friends for their help and support.

About the Reviewer

Piotr Gródek is a C programmer interested in computer vision and image processing. He has worked as an embedded programmer and now works in banking. He is a developer of open source games and of a self-driving car. In his free time, he enjoys running, playing squash, and reading.

About the Reviewer

Akihiko HONDA is an engineer of space robotics. He did his master's thesis in 2012 at the Tokyo Institute of Technology (Tokyo Tech). He is currently a PhD student at Tokyo Tech.

His research interests include the teleoperation and automation of space robots that interact with flexible or deformable materials. His goal is to improve the performance and stability of spacecraft by developing a much better operation and automation system. In his previous research, he worked on an earth observation satellite with a large solar array paddle and on a space robotic arm used to capture the ISS supplier. Currently, he is planning to apply his research results to the Space Solar Power System, planetary exploration rovers, and so on. He received an award for the best entry and an award from the Astronomical Society of Japan in JSF's Satellite Design Contest for proposing a new exploration spacecraft based on his research.

Through his research at university, he has also participated in several projects conducted by the Japan Aerospace Exploration Agency (JAXA). In the Robot Experiment on JEM (REX-J) project, he played a role in supporting operations for the experiment facility in orbit and got inspiration for his research. He also joined a project to develop a wearable manipulator for astronauts and developed systems for human control. He is currently working on two exploration robot projects. In one of them, a transformable rover named "KENAGE" is being developed to overcome the extra-rough terrain on the Moon and Mars. The rover is now being examined for the possibility of using a Gazebo simulator prepared by him. In the other project, he is developing an environment recognition system for Jumping Scouter.

In 2013, he participated in the SMART rover project at the University of Surrey and contributed to developing an environment protection and recognition system. He also played a role in a field test to check the practical utility of the rover in a real environment.

Acknowledgments

I would like to thank Hiroki KATO from JAXA for opening the door to ROS for me and giving precious suggestions for my research. I would also like to thank Professor Mitsushige ODA, Professor Hiroki NAKANISHI, and my colleagues in the Space Robotics Lab at Tokyo Tech. They share wonderful future visions about cool robots working in space with me, give suggestions, and support my research to realize them using ROS. I would also like to thank my professors and colleagues at STAR Lab at the University of Surrey for providing me with important advice about how to use ROS in a real environment. I would especially like to thank my friends from Gran Canaria who introduced me to this exciting work.

Finally, big thanks go to my family, Yoshihiko, Nobuko, and Ayaka, who have supported my life and my dream, and my girlfriend, who understands me.

About the Reviewers

Matthieu Keller is a French engineer who loves technology and computer science. His education took him through computing and robotics, which have now become a hobby. He reviewed the first edition of this book.

Aridane J. Sarrionandia de León studied computer sciences and has always had a great interest in robotics and autonomous vehicles. His degree project is about underwater mapping using sonar, for which he has worked with an autonomous underwater vehicle with ROS. He has experience with autonomous systems and ROS. He is familiar with OpenCV and PCL and is currently working on the development of the control system of an autonomous surface vehicle.

I would like to thank Luis and Aaron for giving me the opportunity to review this book. Also, I would like to thank the AVORA team from the University of Las Palmas de Gran Canaria, especially Aaron, Luis, and Enrique, for introducing me to the wonders of ROS and helping me discover the world of autonomous vehicles, and my tutor, Jorge Cabrera Gámez, who gave me the opportunity to be a part of the AVORA team.

Finally, I would like to thank my family and friends, who supported me through the bugs in my life. Special thanks to Eva for dealing with all my gibberish.

www.PacktPub.com

Support files, eBooks, discount offers, and more

For support files and downloads related to your book, please visit www.PacktPub.com.

Did you know that Packt offers eBook versions of every book published, with PDF and ePub files available? You can upgrade to the eBook version at www.PacktPub.com and, as a print book customer, you are entitled to a discount on the eBook copy. Get in touch with us at service@packtpub.com for more details.

At www.PacktPub.com, you can also read a collection of free technical articles, sign up for a range of free newsletters, and receive exclusive discounts and offers on Packt books and eBooks.

Do you need instant solutions to your IT questions? PacktLib is Packt's online digital book library. Here, you can search, access, and read Packt's entire library of books.

Why subscribe?
- Fully searchable across every book published by Packt
- Copy and paste, print, and bookmark content
- On demand and accessible via a web browser

Free access for Packt account holders

If you have an account with Packt at www.PacktPub.com, you can use this to access PacktLib today and view 9 entirely free books. Simply use your login credentials for immediate access.

Table of Contents

Preface

Chapter 1: Getting Started with ROS Hydro
    PC installation
    Installing ROS Hydro – using repositories
        Configuring your Ubuntu repositories
        Setting up your source.list file
        Setting up your keys
        Installing ROS
        Initializing rosdep
        Setting up the environment
        Getting rosinstall
    How to install VirtualBox and Ubuntu
        Downloading VirtualBox
        Creating the virtual machine
    Installing ROS Hydro in BeagleBone Black (BBB)
        Prerequisites
        Setting up the local machine and source.list file
        Setting up your keys
        Installing the ROS packages
        Initializing rosdep for ROS
        Setting up the environment in BeagleBone Black
        Getting rosinstall for BeagleBone Black
    Summary

Chapter 2: ROS Architecture and Concepts
    Understanding the ROS Filesystem level
        Messages
        Services
    Understanding the ROS Computation Graph level
        Nodes and nodelets
        Topics
        Services
        Messages
        Bags
        The ROS master
        Parameter Server
    Understanding the ROS Community level
    Tutorials to practice with ROS
        Navigating by ROS Filesystem
        Creating our own workspace
        Creating a ROS package and metapackage
        Building an ROS package
        Playing with ROS nodes
        Learning how to interact with topics
        Learning how to use services
        Using Parameter Server
        Creating nodes
        Building the node
        Creating msg and srv files
        Using the new srv and msg files
        The launch file
        Dynamic parameters
    Summary

Chapter 3: Visualization and Debug Tools
    Debugging ROS nodes
        Using the GDB debugger with ROS nodes
        Attaching a node to GDB while launching ROS
        Profiling a node with valgrind while launching ROS
        Enabling core dumps for ROS nodes
    Logging messages
        Outputting a logging message
        Setting the debug message level
        Configuring the debugging level of a particular node
        Giving names to messages
        Conditional and filtered messages
        Showing messages in the once, throttle, and other combinations
    Using rqt_console and rqt_logger_level to modify the debugging level on the fly
    Inspecting what is going on
        Listing nodes, topics, services, and parameters
        Inspecting the node's graph online with rqt_graph
        Setting dynamic parameters
    When something weird happens
        Visualizing node diagnostics
    Plotting scalar data
        Creating a time series plot with rqt_plot
    Image visualization
        Visualizing a single image
    3D visualization
        Visualizing data in a 3D world using rqt_rviz
        The relationship between topics and frames
        Visualizing frame transformations
    Saving and playing back data
        What is a bag file?
        Recording data in a bag file with rosbag
        Playing back a bag file
        Inspecting all the topics and messages in a bag file
    Using the rqt_gui and rqt plugins
    Summary

Chapter 4: Using Sensors and Actuators with ROS
    Using a joystick or a gamepad
        How does joy_node send joystick movements?
        Using joystick data to move a turtle in turtlesim
    Using a laser rangefinder – Hokuyo URG-04lx
        Understanding how the laser sends data in ROS
        Accessing the laser data and modifying it
        Creating a launch file
    Using the Kinect sensor to view objects in 3D
        How does Kinect send data from the sensors, and how do we see it?
        Creating an example to use Kinect
    Using servomotors – Dynamixel
        How does Dynamixel send and receive commands for the movements?
        Creating an example to use the servomotor
    Using Arduino to add more sensors and actuators
        Creating an example program to use Arduino
    Using an ultrasound range sensor with Arduino
        How distance sensors send messages
        Creating an example to use the ultrasound range
    Using the IMU – Xsens MTi
        How does Xsens send data in ROS?
        Creating an example to use Xsens
    Using a low-cost IMU – 10 degrees of freedom
        Downloading the library for the accelerometer
        Programming Arduino Nano and the 10 DOF sensor
        Creating an ROS node to use data from the 10 DOF sensor
    Using a GPS system
        How GPS sends messages
        Creating an example project to use GPS
    Summary

Chapter 5: Computer Vision
    Connecting and running the camera
        FireWire IEEE1394 cameras
        USB cameras
    Writing your own USB camera driver with OpenCV
    Using OpenCV and ROS images with cv_bridge
    Publishing images with image_transport
    Using OpenCV in ROS
        Visualizing the camera input images
    Calibrating the camera
        Stereo calibration
    The ROS image pipeline
        The image pipeline for stereo cameras
    ROS packages useful for Computer Vision tasks
    Using visual odometry with viso2
        Camera pose calibration
        Running the viso2 online demo
        Running viso2 with our low-cost stereo camera
    Performing visual odometry with an RGBD camera
        Installing fovis
        Using fovis with the Kinect RGBD camera
    Computing the homography of two images
    Summary

Chapter 6: Point Clouds
    Understanding the point cloud library
        Different point cloud types
        Algorithms in PCL
    The PCL interface for ROS
    My first PCL program
        Creating point clouds
        Loading and saving point clouds to the disk
        Visualizing point clouds
        Filtering and downsampling
        Registration and matching
        Partitioning point clouds
    Summary

Chapter 7: 3D Modeling and Simulation
    A 3D model of our robot in ROS
    Creating our first URDF file
        Explaining the file format
        Watching the 3D model on rviz
        Loading meshes to our models
        Making our robot model movable
        Physical and collision properties
    Xacro – a better way to write our robot models
        Using constants
        Using math
        Using macros
    Moving the robot with code
    3D modeling with SketchUp
    Simulation in ROS
        Using our URDF 3D model in Gazebo
        Adding sensors to Gazebo
        Loading and using a map in Gazebo
        Moving the robot in Gazebo
    Summary

Chapter 8: The Navigation Stack – Robot Setups
    The navigation stack in ROS
    Creating transforms
        Creating a broadcaster
        Creating a listener
        Watching the transformation tree
    Publishing sensor information
        Creating the laser node
    Publishing odometry information
        How Gazebo creates the odometry
        Creating our own odometry
    Creating a base controller
        Using Gazebo to create the odometry
        Creating our base controller
    Creating a map with ROS
        Saving the map using map_server
        Loading the map using map_server
    Summary

Chapter 9: The Navigation Stack – Beyond Setups
    Creating a package
    Creating a robot configuration
    Configuring the costmaps – global costmap and local costmap
        Configuring the common parameters
        Configuring the global costmap
        Configuring the local costmap
        Base local planner configuration
    Creating a launch file for the navigation stack
    Setting up rviz for the navigation stack
        The 2D pose estimate
        The 2D nav goal
        The static map
        The particle cloud
        The robot's footprint
        The local costmap
        The global costmap
        The global plan
        The local plan
        The planner plan
        The current goal
    Adaptive Monte Carlo Localization
    Modifying parameters with rqt_reconfigure
    Avoiding obstacles
    Sending goals
    Summary

Chapter 10: Manipulation with MoveIt!
    The MoveIt! architecture
        Motion planning
        The planning scene
        Kinematics
        Collision checking
    Integrating an arm in MoveIt!
        What's in the box?
        Generating a MoveIt! package with the setup assistant
        Integration into RViz
        Integration into Gazebo or a real robotic arm
    Simple motion planning
        Planning a single goal
        Planning a random target
        Planning a predefined group state
        Displaying the target motion
    Motion planning with collisions
        Adding objects to the planning scene
        Removing objects from the planning scene
    Motion planning with point clouds
    The pick and place task
        The planning scene
        The target object to grasp
        The support surface
        Perception
        Grasping
        The pickup action
        The place action
        The demo mode
    Simulation in Gazebo
    Summary

Index

Preface

Learning ROS for Robotics Programming, Second Edition gives you a comprehensive review of ROS tools. ROS is the Robot Operating System framework, which is used nowadays by hundreds of research groups and companies in the robotics industry. But it is also the painless entry point to robotics for nonprofessionals. You will see how to install ROS, you will start playing with its basic tools, and you will end up working with state-of-the-art computer vision and navigation tools.

The content of the book can be followed without any special devices, and each chapter comes with a series of source code examples and tutorials that you can run on your own computer. This is the only thing you need to follow the book. However, we also show you how to work with hardware so that you can connect your algorithms with the real world. Special care has been taken in choosing devices that are affordable for amateur users, but at the same time, the most typical sensors and actuators in robotics research are covered.

Finally, the potential of ROS is illustrated with the ability to work with whole robots in a simulated environment. You will learn how to create your own robot and integrate it with the powerful navigation stack. Moreover, you will be able to run everything in simulation by using the Gazebo simulator. We will end the book by providing an example of how to use the MoveIt! package to perform manipulation tasks with robotic arms. At the end of the book, you will see that you can work directly with a ROS robot and understand what is going on under the hood.

What this book covers

Chapter 1, Getting Started with ROS Hydro, shows the steps you must follow in order to have a working installation of ROS. You will see how to install ROS on different platforms, and you will use ROS Hydro throughout the rest of the book. This chapter describes how to install from Debian packages, compile the sources, and make installations in virtual machines and on an ARM CPU.
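As a taste of the Debian-package route that Chapter 1 walks through, a typical ROS Hydro installation on Ubuntu comes down to a handful of shell commands. This is only a sketch under stated assumptions: the "precise" Ubuntu codename is an example (substitute your release), and the exact repository and key setup for your machine is covered step by step in Chapter 1.

```shell
# Sketch of installing ROS Hydro from Debian packages.
# Assumes Ubuntu 12.04 "precise"; replace the codename to match your release.

# Add the ROS package repository to apt's sources.
sudo sh -c 'echo "deb http://packages.ros.org/ros/ubuntu precise main" > /etc/apt/sources.list.d/ros-latest.list'

# Add the repository key so apt trusts the ROS packages.
wget http://packages.ros.org/ros.key -O - | sudo apt-key add -

# Install the full desktop variant of ROS Hydro.
sudo apt-get update
sudo apt-get install ros-hydro-desktop-full

# Initialize rosdep and source the ROS environment in every new shell.
sudo rosdep init
rosdep update
echo "source /opt/ros/hydro/setup.bash" >> ~/.bashrc
```

The same pattern (repository, key, package, rosdep, environment) repeats for the VirtualBox and BeagleBone Black installations later in the chapter.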

Chapter 2, ROS Architecture and Concepts, is concerned with the concepts and tools provided by the ROS framework. We will introduce you to nodes, topics, and services, and you will also learn how to use them. Through a series of examples, we will illustrate how to debug a node and visualize the messages published through a topic.

Chapter 3, Visualization and Debug Tools, goes a step further in order to show you powerful tools to debug your nodes and visualize the information that goes through the node's graph along with the topics. ROS provides a logging API that allows you to diagnose node problems easily. In fact, we will see some powerful graphical tools, such as rqt_console and rqt_graph, as well as visualization interfaces, such as rqt_plot and rviz. Finally, this chapter explains how to record and play back messages using rosbag and rqt_bag.

Chapter 4, Using Sensors and Actuators with ROS, literally connects ROS with the real world. This chapter goes through a number of common sensors and actuators that are supported in ROS, such as range lasers, servomotors, cameras, RGB-D sensors, GPS, and much more. Moreover, we explain how to use embedded systems with microcontrollers, similar to the widely known Arduino boards.

Chapter 5, Computer Vision, shows the support for cameras and computer vision tasks in ROS. This chapter starts with the drivers available for FireWire and USB cameras so that you can connect them to your computer and capture images. You will then be able to calibrate your camera using the ROS calibration tools. Later, you will be able to use the image pipeline, which is explained in detail. Then, you will see how to use several APIs for vision and integrate OpenCV. Finally, the installation and usage of a visual odometry software is described.

Chapter 6, Point Clouds, shows how to use the Point Cloud Library (PCL) in your ROS nodes. This chapter starts with the basic utilities, such as reading or writing a PCL snippet, and the conversions needed to publish or subscribe to these messages. Then, you will create a pipeline with different nodes to process 3D data, and you will downsample, filter, and search for features using PCL.

Chapter 7, 3D Modeling and Simulation, constitutes one of the first steps
