PART 1: WHY ADAS AND AUTONOMOUS VEHICLES NEED THERMAL INFRARED CAMERAS


THERMAL FOR ADAS AND AV
PART 1: WHY ADAS AND AUTONOMOUS VEHICLES NEED THERMAL INFRARED CAMERAS

A CHALLENGING REQUIREMENT CALLS FOR ADVANCED TECHNOLOGY

Visible cameras, sonar, and radar are already in use on production vehicles today at SAE automation level 2. SAE automation level 3 and 4 test platforms have added light detection and ranging (LIDAR) to their sensor suites. Each of these technologies has strengths and weaknesses. Tragically, as recent Uber and Tesla accidents show, the current sensors at SAE levels 2 and 3 do not adequately detect cars or pedestrians.

Safe advanced driver assist system (ADAS) vehicles and autonomous vehicles (AV) require sensors that deliver scene data adequate for detection and classification algorithms to navigate autonomously under all conditions at SAE automation level 5.1 This is a challenging requirement for engineers and developers.

The Governors Highway Safety Association reports that the number of pedestrian fatalities in the U.S. has grown substantially faster than all other traffic deaths in recent years; pedestrians now account for a larger proportion of traffic fatalities than at any time in the past 33 years. Pedestrians are especially at risk after dark, when 75% of the 5,987 U.S. pedestrian fatalities occurred in 2016.2 Thermal, or longwave infrared (LWIR), cameras can detect and classify pedestrians in darkness and through most fog conditions, and they are unaffected by sun glare, delivering improved situational awareness that results in more robust, reliable, and safe ADAS and AV.

Figure 1. Recent Uber and Tesla accidents show the need for a higher-performance ADAS sensor suite in SAE levels 2 and greater. (SAE automation levels: 00 None; 01 Driver Assistance; 02 Partial, at which a Tesla hit a police car while allegedly on Autopilot, May 2018; 03 Conditional, at which a self-driving Uber car was involved in a fatal accident, March 2018; 04 High; 05 Full.)

Figure 2. 2016 pedestrian fatalities by light level (dark, daylight, dawn/dusk). Source: Governors Highway Safety Association.

1. SAE International, J3016: Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems, www.sae.org/misc/pdfs/automated_driving.pdf
2. Richard Retting and Sam Schwartz, Governors Highway Safety Association, Pedestrian Traffic Fatalities by State (2017 Preliminary Data).

MATCH THE RIGHT TECHNOLOGY WITH THE OPTIMAL APPLICATION

ADAS and AV platforms use several technologies (Table 1), and the core approach is to detect and subsequently classify objects to determine a course of action. For example, radar and LIDAR systems generate a point-density cloud from the reflections they gather and calculate an object's range and closing speed. To generate the amount of data needed for object classification in a cost-effective and reliable solution, radar and LIDAR are fused with the output from visible and thermal cameras to cover all driving conditions.

Classification is challenging in poor lighting conditions, nighttime driving, blinding sun glare, and inclement weather. Thermal sensors improve the ability to see in darkness and through most fog and sun glare, and they reliably classify vehicles, people, animals, and other objects in these common driving conditions. Furthermore, thermal cameras perform equally well in daytime driving, offering redundancy for a visible camera. Low-light visible cameras, coupled with LIDAR and radar, provide baseline nighttime performance, but at ranges beyond approximately 165 feet (50 meters), thermal cameras significantly outperform low-light visible cameras and deliver more consistent imagery in all lighting conditions.

The NTSB report3 on the Uber incident in Tempe, Arizona (in which a pedestrian was fatally struck by a developmental, SAE level 3 autonomous car using LIDAR, radar, and visible sensors) revealed that the pedestrian was first classified as an unknown object, then a car, and then a bicycle before finally being classified as a person. FLIR re-created this accident using a wide field of view (FOV) FLIR ADK and a basic classifier. The thermal camera system classified the pedestrian at approximately 280 feet (85.4 meters), more than twice the required "fast-reaction" stopping distance for a human driving at 43 mph4 (126 feet or 38.4 meters). Additional testing with narrower-FOV thermal cameras has demonstrated pedestrian classification at greater than 200 meters, four times farther than typical headlights illuminate and visible cameras can see.

Table 1. Detector technologies and application summary, mapping applications (adaptive cruise control, emergency brake assist, pedestrian/animal detection and classification, night vision, lane departure warning, traffic sign recognition, front and rear cross traffic alert, blind spot detection, rear collision warning, rear AEB, collision avoidance, mapping/location, park assist, and surround view) to long-range radar, forward LIDAR, camera, short/medium-range radar, ultrasound, and thermal imaging.

Figure 3. Thermal sensors add reliability and improve performance of the ADAS and AV sensor suites (360-degree coverage: thermal 360, surround view, blind spot detection, front and rear cross traffic alert, pedestrian detection, face and eye tracking, park assist).

Figure 4. A wide-FOV FLIR ADK classified a person at 280 feet, twice the needed stopping distance, in the re-creation of an Uber accident in Tempe, Arizona.

Figure 5. Fog-tunnel testing demonstrates significant visibility improvement with thermal versus visible cameras.

4. www.brakingdistances.com
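The "fast-reaction" stopping distance cited above can be approximated with basic kinematics: the distance covered during the driver's reaction time plus the braking distance. The sketch below is illustrative only; the 0.75 s reaction time and 0.8 g deceleration are assumed values chosen to land near the cited figure, not parameters taken from brakingdistances.com.

```python
def fast_reaction_stopping_distance_m(speed_mph: float,
                                      reaction_time_s: float = 0.75,
                                      deceleration_g: float = 0.8) -> float:
    """Reaction distance plus braking distance for a human driver.

    The reaction time and braking deceleration are assumed values,
    not figures from the source document.
    """
    g = 9.81                       # gravitational acceleration, m/s^2
    v = speed_mph * 0.44704        # convert mph to m/s
    reaction = v * reaction_time_s              # distance before brakes engage
    braking = v ** 2 / (2 * deceleration_g * g)  # kinematics: v^2 = 2*a*d
    return reaction + braking

# At 43 mph this comes out near the cited 126 ft (38.4 m) stopping distance.
d = fast_reaction_stopping_distance_m(43)
```

With these assumptions the result is roughly 38 m, consistent with the figure quoted in the text.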

"SEEING" HEAT INSTEAD OF RELYING ON LIGHT

As recent events and our own driving experience demonstrate, it can be challenging to see pedestrians day or night. There are numerous cases where the combination of driving conditions and environment illustrates the need for thermal cameras. In fact, there are times when a thermal camera may be the only detection and classification technology that works.

Visible-light cameras depend on light from the sun, streetlights, or headlights reflecting off objects and reaching the sensor. LIDAR sensors emit laser light and process the reflected energy by measuring the time of flight of the illumination source. Radar emits radio signals and processes the return signal. Thermal imaging takes advantage of the fact that all objects emit thermal energy and therefore eliminates reliance on an illumination source.

A radar or LIDAR signal from a pedestrian can be camouflaged by a nearby vehicle's competing signal. If a pedestrian is crossing between two cars or is partially obstructed by foliage, there will be little to no reflected signal with which to detect the pedestrian. In such cases, as in search-and-rescue or military applications, thermal cameras can see through light foliage. Because thermal sensors see heat, not visible shapes, they have an advantage over visible cameras in classifying partially occluded people and animals. The heat from a person or animal makes them stand out from a cluttered background or partially obstructing foreground, as shown in Figure 6.

LIGHT TRANSMISSION THROUGH FOG

Thermal sensing boosts the performance of ADAS and AV platforms in inclement weather, including dust, smog, fog, and light rain. LWIR thermal sensors are completely passive, a key advantage over visible cameras, LIDAR, and radar. Target reflectivity and atmospheric effects can create variables in sensor performance, particularly at the limits of their operating range. Figure 5 illustrates how passive sensing benefits thermal over visible sensors in light to moderate fog, where a thermal camera can see at least four times farther than a visible camera per the TACOM Thermal Image Model (TTIM).

Figure 5. Passive thermal sensing can detect pedestrians at distances four times farther than a visible sensor through light to moderate fog, day or night, per the TACOM Thermal Image Model (TTIM). (Comparison of visible and IR in fog: A, visible light from the object is scattered and absorbed by fog; B, headlight illumination from active sources is reflected by fog; C, infrared light undergoes less absorption by fog. Sidebar: light transmission through fog for fog categories 1 through 4.)

READY FOR ALL DRIVING CONDITIONS

The primary challenge for ADAS and AV platforms is being prepared for all driving conditions. The road is full of complex, unpredictable situations, and cars must be equipped with cost-effective sensor suites capable of collecting as much information as possible to make the right decision every time. The current standard sensor suite does not completely address the requirements for SAE level 3 and greater. Thermal cameras can see pedestrians up to four times farther away than a visible camera in darkness and through sun glare and most fog.
They provide an excellent orthogonal and redundant sensing modality and further improve the reliability and safety of ADAS and AV platforms.

To learn more about thermal technology for ADAS and AV platforms, visit www.FLIR.com/adas to download the following solution briefs:
- Technical Advantages of Thermal Imaging in ADAS and AV Platforms
- The Pathway to Affordable, Scalable Automotive Integration
- Overcoming Technological and Logistical Thermal Imaging Automotive Integration Challenges

Figure 6. Thermal cameras see heat, reducing the impact of occlusion on classification of pedestrians.
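The fog comparison above boils down to attenuation: fog extinguishes visible light far more strongly than LWIR, and sensing range scales inversely with the extinction coefficient. A minimal Beer-Lambert sketch makes this concrete; the extinction coefficients below are illustrative assumptions, not TTIM values.

```python
import math

def transmission(extinction_per_m: float, path_m: float) -> float:
    """Beer-Lambert law: fraction of radiation surviving a fog path."""
    return math.exp(-extinction_per_m * path_m)

def range_for_min_transmission(extinction_per_m: float, t_min: float) -> float:
    """Longest path over which at least t_min of the signal survives."""
    return -math.log(t_min) / extinction_per_m

# Assumed extinction coefficients for a moderate fog (illustrative only):
# visible light is scattered far more strongly than LWIR.
VIS_EXTINCTION = 0.020   # per meter
LWIR_EXTINCTION = 0.005  # per meter

vis_range = range_for_min_transmission(VIS_EXTINCTION, 0.05)
lwir_range = range_for_min_transmission(LWIR_EXTINCTION, 0.05)
# With these assumed coefficients the LWIR range is 4x the visible range,
# mirroring the "four times farther" comparison in the text.
```

Because range scales as 1/extinction, any fog in which LWIR extinction is a quarter of the visible extinction yields the same 4x ratio regardless of the detection threshold chosen.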

THERMAL FOR ADAS AND AV
PART 2: TECHNICAL ADVANTAGES OF THERMAL CAMERAS IN ADAS AND AV PLATFORMS

SAFETY CHALLENGES THAT REQUIRE REAL TECHNOLOGY SOLUTIONS

Until recently, the SAE automation level 2 (partial automation) and level 3 (conditional automation) vehicles that drive our roads did not include thermal, or infrared (IR), imaging in the sensor suite. High-profile accidents involving both Uber and Tesla vehicles have increased scrutiny of sensor performance and safety. Although many test vehicles perform admirably under ideal test conditions, their actual performance must stand up to the rigors of real-life driving conditions.

Thermal sensors perform well in conditions where other technologies in the sensor suite are challenged. Developers are taking advantage of the opportunity to integrate FLIR's automotive development kit, the FLIR ADK, into vehicles to add thermal imaging to their sensor suite. Thermal sensors can detect and classify people and animals in darkness and through sun glare and most fog at distances greater than four times the illumination distance of typical headlights. Key advantages: reliably classify people and animals; enhanced long-range imaging; see through obscurants; see in total darkness.

Figure 1. Thermal imagers use infrared energy (between visible light and microwaves on the electromagnetic spectrum, which spans gamma rays to radio waves) to detect, classify, and measure temperature from a distance.

Figure 2. The FLIR ADK with VGA resolution can "see" precise details, including roadway surface markings, day and night.

SENSING MINUTE DIFFERENCES IN TEMPERATURE

Thermal, or longwave infrared (LWIR), energy is emitted, reflected, or transmitted by everything on or near a roadway. Thermal cameras can clearly distinguish a human body (and other living things) from inanimate objects and background clutter, making them an essential technology for detecting pedestrians.

FLIR thermal imaging cameras are sensitive to temperature differences as small as 0.05 °C. With this precise sensitivity, VGA thermal cameras (640 x 512 pixels) can clearly show nearly everything in a scene, even the centerline on a roadway. Figure 2 (a screen capture of video from a FLIR re-creation of the Uber accident in Tempe, Arizona) clearly shows roadway surface details such as paint while detecting and classifying the pedestrian at over twice the required "fast-reaction" stopping distance for a human driving at 43 mph1 (126 feet or 38.4 meters).

"SEEING" HEAT THROUGH FOG INSTEAD OF RELYING ON LIGHT

The 2016 AWARE (All Weather All Roads Enhanced) vision project tested a suite of cameras that could potentially enhance vision in challenging-visibility conditions such as night, fog, rain, and snow. To identify the technologies providing the best all-weather vision, the project evaluated four bands of the electromagnetic spectrum: visible RGB, near infrared (NIR), shortwave infrared (SWIR), and LWIR (thermal). The project measured pedestrian detection at various fog densities (Table 1) and drew three conclusions:2

- The LWIR camera penetrated fog better than the NIR and SWIR cameras; the visible camera had the lowest fog-piercing capability.
- The LWIR camera was the only sensor that detected pedestrians in full darkness.
- The LWIR camera also proved more resilient to glare caused by oncoming headlamps in the fog; the visible RGB, SWIR, and NIR cameras sometimes missed a pedestrian because he or she was hidden by headlamp glare.

Table 1. Fog thickness for pedestrian detection at 25 meters (glare cases not included) indicates LWIR superiority for pedestrian detection in fog.2

Camera         | Fog density for pedestrian detection
Visible RGB    | Moderate (visibility range 47 ± 10 m)
Extended NIR   | High (visibility range 28 ± 7 m)
Extended SWIR  | High (visibility range 25 ± 3 m)
LWIR           | Extreme (visibility range 15 ± 4 m)

Figure 3. Example images recorded in a fog tunnel with thermal (LWIR), visible RGB, shortwave infrared (SWIR), and near infrared (NIR) cameras. Copyright SIA Vision 2016.

1. http://www.brakingdistances.com
2. Nicolas Pinchon, M. Ibn-Khedher, Olivier Cassignol, A. Nicolas, Frédéric Bernardin, et al., "All-weather vision for automotive safety: which spectral band?", SIA Vision 2016 - International Conference Night Drive Tests and Exhibition, Oct 2016, Paris, France. Société des Ingénieurs de l'Automobile (SIA), 7 p. hal-01406023
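The 0.05 °C sensitivity figure cited above (the sensor's noise-equivalent temperature difference, or NETD) can be read as a noise floor: the larger a scene's temperature contrast relative to it, the easier detection becomes. The sketch below is a simple illustration; the pedestrian-versus-road contrast used is an assumed example value, not a measurement from the source.

```python
def thermal_contrast_snr(scene_delta_t_c: float, netd_c: float = 0.05) -> float:
    """Ratio of a scene's temperature contrast to the sensor noise floor.

    netd_c defaults to the 0.05 degC sensitivity cited in the text;
    the scene contrast passed in is an assumption for illustration.
    """
    return scene_delta_t_c / netd_c

# A pedestrian assumed ~8 degC warmer than the nighttime road background
# stands about 160 noise-equivalent steps above the sensor noise floor.
snr = thermal_contrast_snr(8.0)
```

Even modest living-body contrasts therefore sit far above the noise floor, which is why warm bodies separate so cleanly from background clutter in LWIR imagery.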

ON DETECTION, CLASSIFICATION, AND FIELDS OF VIEW

Detection and classification are key performance metrics within advanced driver assist system (ADAS) and AV sensor suites. Detection lets a system know that there is an object ahead. Classification determines the class of the object (person, dog, bicycle, car, other vehicle, etc.) and indicates the classification confidence level.

In photography and thermal imaging, the field of view (FOV) is the part of a scene that is visible through the camera at a particular position and orientation in space. The narrower the FOV, the farther a camera can see; a wider FOV cannot see as far but provides a greater angle of view. FOV affects the distance at which a thermal camera can detect and classify an object, so multiple cameras may be required: a narrow-FOV sensor to see far ahead of the vehicle on a rural highway, and a wide-FOV sensor for optimal use in city driving.

Current artificial-intelligence-based classification systems typically require a target to fill 20 by 8 pixels to reliably (>90% confidence) classify a given object. For example, to classify a human with reliable confidence, the human needs to be approximately 20 pixels tall, as shown in Figure 4. Table 2 lists classification distances for different thermal camera horizontal fields of view and indicates that a FLIR ADK can classify a 6-foot-tall human at a distance greater than 600 feet (186 meters) with a narrow-FOV lens configuration. Detection, which requires fewer pixels on an object, means that a 6-foot-tall human can be detected at greater than 200 meters using the FLIR ADK.

Figure 4. Thermal cameras require only 20 by 8 pixels to reliably classify an object.

Table 2. Classification distance for the FLIR ADK by horizontal FOV, day or night (classification distance in feet and meters for each lens option).

Figure 5. The narrower the horizontal FOV, the farther a thermal camera can "see."

BETTER SITUATIONAL AWARENESS RESULTS IN MORE INFORMED DECISIONS

Thermal imaging is a highly sensitive, passive imaging technology that can be a key enabler for safer ADAS and AV platforms. Thermal sensors can detect and classify people and animals in darkness and through sun glare and most fog at distances greater than four times the distance that typical headlights illuminate and visible cameras can see. FLIR thermal cameras complement existing technologies in the sensor suite and help these systems make better, safer decisions based on improved situational awareness.

To learn more about thermal technology for ADAS and AV platforms, visit www.FLIR.com/ADAS to download the following solution briefs:
- Why ADAS and Autonomous Vehicles Need Thermal Imaging Sensors
- The Pathway to Affordable, Scalable Automotive Integration
- Overcoming Technological and Logistical Thermal Imaging Automotive Integration Challenges

The images displayed may not be representative of the actual resolution of the camera shown. Images are for illustrative purposes only.
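The relationship between FOV and classification distance described above follows from simple angular geometry: each pixel subtends an angle (the IFOV) equal to the FOV divided by the pixel count, and classification requires the target to span about 20 pixels. The sketch below uses a small-angle approximation; the 18-degree lens is an assumed illustrative value, not a quoted FLIR ADK option.

```python
import math

def classification_distance_m(hfov_deg: float,
                              h_pixels: int = 640,
                              target_height_m: float = 1.83,
                              min_pixels: int = 20) -> float:
    """Distance at which a target spans min_pixels (small-angle approximation).

    Assumes square pixels, so the per-pixel angle (IFOV) is the horizontal
    FOV divided by the horizontal pixel count of a VGA (640 x 512) core.
    """
    ifov_rad = math.radians(hfov_deg) / h_pixels
    return target_height_m / (min_pixels * ifov_rad)

# With an assumed 18-degree narrow-FOV lens, a 6-foot (1.83 m) person reaches
# the 20-pixel classification criterion at roughly 186 m (about 610 ft),
# in line with the narrow-FOV figure quoted in the text.
d = classification_distance_m(18)
```

Halving the FOV doubles the classification distance, which is why the text recommends pairing a narrow-FOV camera for highway ranges with a wide-FOV camera for city driving.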

THERMAL FOR ADAS AND AV
PART 3: THE PATHWAY TO AFFORDABLE, SCALABLE AUTOMOTIVE INTEGRATION

BREAKTHROUGHS TO AFFORDABILITY

Mass adoption of SAE automation level 3 (conditional automation) and above depends on affordable sensor technologies, the compute power required to process the incoming sensor data, and the artificial intelligence needed to execute driving commands that deliver safe and reliable transportation in real-world conditions. Advanced driver assist system (ADAS) and autonomous vehicle (AV) sensor technologies include cameras, ultrasonic sensors, radar, and light detection and ranging (LIDAR), but this combination of sensors falls short of creating a truly safe and comprehensive solution.

Thermal sensors, with their unique ability to detect the longwave infrared (LWIR) energy given off by everything, can see through darkness, smoke, and most fog, and they are unaffected by sun glare. Their ability to detect and classify pedestrians in highly cluttered environments makes them a key technology for reducing pedestrian deaths, which totaled 5,987 in the U.S. in 2016, of which 4,490, or 75%, occurred after dark.1 A common misconception is that thermal sensors, with their background in military use, are too expensive for automotive integration. Thanks to advances in thermal imaging technology, improved manufacturing techniques, and increased manufacturing volume, it is becoming possible to mass-produce affordable thermal sensors2 for SAE automation level 2 and higher.

FLIR is focused on the development of cost-effective thermal imaging technology.
As the world leader in thermal imaging, FLIR has delivered millions of thermal sensors, including more than 500,000 sensors in driver warning systems installed on several automotive nameplates, including General Motors, Volkswagen, Audi, Peugeot, BMW, and others. Working closely with our tier-one automotive customer, Veoneer, FLIR has driven down the cost of thermal imaging technology for the automotive market as well as for other emerging consumer markets, including mobile phones3 and consumer drones.4

Until recently, thermal cameras with VGA resolution or higher cost thousands of dollars each. Now they are an order of magnitude lower in price due to volume and technology improvements. FLIR continues to innovate and further reduce camera costs, enabling ADAS and AV developers and engineers to add affordable thermal imaging to their sensor suites.

1. Richard Retting and Sam Schwartz, Governors Highway Safety Association, Pedestrian Traffic Fatalities by State (2017 Preliminary Data).
