
Across the industry we’re throwing a lot of terms around these days as we head down the roadway to autonomous vehicles. Some are pretty basic, but for me, some could use a little refinement. My favorite conundrum starts simply with the word “autonomous.” Are we really talking about autonomous vehicles or automated vehicles? Believe it or not, there is a difference. And, for the most part, nearly everyone is using “autonomous” the wrong way.

To figure this out, I went to my usual source for clarity and definition – dictionary.com. Autonomous means that the vehicle is navigated and maneuvered by a computer, without a need for human control or intervention, under normal road conditions. This is really the driverless vehicle of SAE J3016 Level 5 autonomy (which NHTSA has adopted) – no driver involvement in any situation. Autonomous vehicles may not even have typical driver controls – like a steering wheel, brake pedal, or accelerator. It is a truly driverless vehicle. Examples of this at work already exist today – the monorail at Disneyland or the people movers at major airports like Chicago, Atlanta, or San Francisco. And we’ve all heard about off-road driverless vehicles used in mining and farming operations and, of course, the old DARPA Grand Challenge – from 2004, ‘05, and ‘07 – where autonomous vehicles moved through off-road and urban settings. (Interestingly, no one won the 2004 challenge. And, yes, the challenge series continues, though the challenges are not all autonomous vehicle applications.)

The fact is that until we get to truly driverless vehicles, we’re not talking autonomous. Instead, we’re really referencing automated vehicle applications. Again, back to dictionary.com, where automated is defined as “to operate or control by automation.” Taking the next step, automation is defined as “the technique, method, or system of operating or controlling a process by highly automatic means, as by electronic devices, reducing human intervention to a minimum.” The key here is “reducing human intervention.” Humans are still involved in some part of the process, just at decreasing levels. This represents Levels 2-4 on the SAE and NHTSA scale. It’s important to point out that, yes, you can argue that Level 4 could be driverless – and it is, but not in all circumstances: a lot depends on the operational design domain, as stated in the Federal Automated Vehicles Policy.

The bottom line is that the majority of technologies under discussion for now and the foreseeable future are automated, not autonomous. There will be a human driver involved until there isn’t one. And when that happens, it will be an autonomous vehicle. Of course, when that will happen is still up for debate, but we’ll look at timing in a future blog.

Now that we’ve got that cleared up, let’s talk about some of the elements that will be part of automated – and, eventually, autonomous – vehicle developments.

  • DAS and ADAS: DAS is the acronym for driver assistance systems – everything ranging from ABS and stability control to the current state-of-the-art collision mitigation systems. These basically apply longitudinal control (acceleration and braking) to help mitigate rollovers, loss of control, or collisions. ADAS, on the other hand, is advanced driver assistance systems, which technically means future generations of systems that will do more to help drivers in more types of situations. From my perspective, increasing automated functions – along with lateral (steering) as well as longitudinal control – distinguish these systems from DAS. For example, the Bendix® Wingman® Fusion collision mitigation system represents a DAS system, while Otto’s automated driving system represents ADAS. At Bendix, the move toward automated vehicles is being driven by advances in both DAS and ADAS systems.

  • Sensors: As noted in an earlier blog, getting more information into the system will be critical to delivering advanced automated systems. On the vehicle, sensors will gather the information and deliver it to an ECU (the intelligence), which will determine the need for intervention (either alert, or alert and deceleration). Today we use a radar and camera working together to provide the enhanced collision mitigation capabilities of the Wingman Fusion system. In the future, additional cameras and radars, along with other sensors, will work together to deliver higher levels of performance. Here are just some of the sensors you can expect:

    • Radar: “Radio Detection And Ranging (RADAR)” – a method for detecting the position and velocity of a distant object.1 In practical terms for our discussion, radar is the basic sensor that has been used in forward collision warning, adaptive cruise control, forward collision mitigation, and blind-spot detection on commercial vehicles since the introduction of VORAD in the mid-’90s. How does it work? “The electromagnetic waves emitted by the radar device are rebounded against metal surfaces or other reflecting material and can be picked up again by the radar’s receiving section. The distance to the objects in the sensing range can be measured from the propagation times of these waves. The Doppler Effect (shift in frequency of the radar signal) is used to measure the relative speed. Thanks to its excellent properties in terms of fast and precise measurement of distance and relative speed, the radar sensor is…particularly well suited for use in active and passive safety functions”2 such as collision mitigation and blind-spot detection systems. Radars can vary in frequency, but are typically 24 GHz or 77 GHz in collision mitigation system applications and have a range of 500-600 feet in front of the vehicle. (The simple math behind ranging and Doppler speed is sketched just after this sensor list.)

    • Lidar: We’ve been hearing a lot about lidar sensors lately thanks to the Google car and other automated applications. Lidar is typically the spinning sensor perched atop the Google car, or at points around the Otto truck. Lidar (Light Detection And Ranging)2 is a “device similar to radar in principle and operation but using infrared laser light instead of radio waves and capable of detecting particles and varying physical conditions in the atmosphere. Unlike most radar sensors, LIDAR does not measure the object speed directly. Instead, it is calculated by differentiating the distance signal, which results in a certain delay and reduced signal quality. On the other hand, the good lateral resolution of a scanning Lidar is far superior to the typical radar sensors used today.”3 Constantly spinning, the lidar uses laser beams to generate a 360-degree image of the vehicle’s surroundings. Positioning of the lidar can vary, depending on the vehicle. Thus far we’ve seen applications where a single lidar is placed on the vehicle’s roof (as in the Google car application), or positioned at lower points around the vehicle (as on the front bumper and toward the rear of the Otto truck application).

    • Ultrasonic: If you have a car with reverse sensing, you likely have ultrasonic sensors. These are typically the little circles that dot the back bumper of your vehicle and detect objects in the vehicle’s path as you back up. Ultrasonic sensors are “ultra-short-range sensors with a typical range of 2.5 m (8.2 feet). Parking assist applications typically have further-developed sensors with greater ranges (up to around 4.5 m or 14.8 feet).”2

    • Cameras: “(A camera) uses parallax (“the apparent displacement of an observed object due to a change in the position of the observer”) from multiple images to find the distance to various objects. Cameras also detect traffic lights and signs, and help recognize moving objects like pedestrians and bicyclists.”4 (Distance-from-parallax is included in the sketch below.)
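
    To make the ranging ideas above concrete, here is a minimal sketch of the math in Python. It is purely illustrative – the function names and example numbers are my own assumptions, not values or code from any production sensor – but it shows how time-of-flight sensors (radar, lidar, ultrasonic) turn echo timing into distance, how the Doppler shift yields relative speed, and how a stereo camera turns parallax into range.

```python
# Illustrative distance-and-speed math for the sensors above (radar, lidar,
# ultrasonic, stereo camera). All names and numbers are hypothetical examples.

C_LIGHT = 299_792_458.0   # speed of light, m/s (radar and lidar)
C_SOUND = 343.0           # speed of sound in air, m/s (ultrasonic)

def range_from_echo(round_trip_time_s: float, wave_speed_mps: float) -> float:
    """Time-of-flight ranging: the wave travels out and back, so halve it."""
    return wave_speed_mps * round_trip_time_s / 2.0

def speed_from_doppler(doppler_shift_hz: float, carrier_hz: float) -> float:
    """Relative speed from a radar return's Doppler shift: shift ~= 2*v*f/c."""
    return doppler_shift_hz * C_LIGHT / (2.0 * carrier_hz)

def range_from_parallax(focal_px: float, baseline_m: float,
                        disparity_px: float) -> float:
    """Stereo-camera ranging: distance = focal length * baseline / parallax."""
    return focal_px * baseline_m / disparity_px

print(range_from_echo(1e-6, C_LIGHT))        # 77 GHz radar echo after 1 us: ~150 m
print(speed_from_doppler(5133.0, 77e9))      # 5,133 Hz shift at 77 GHz: ~10 m/s
print(range_from_echo(0.015, C_SOUND))       # ultrasonic echo after 15 ms: ~2.6 m
print(range_from_parallax(800.0, 0.3, 8.0))  # 8 px disparity: ~30 m to the object
```

    Real systems run this arithmetic continuously, in dedicated hardware, across many targets at once – the sketch only shows the underlying math.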

  • Connectivity: There are many definitions around connectivity – from computer systems to business operating units. One of the better definitions comes from “Business Dictionary(.com),” which defines connectivity as “(The) Measure of the extent to which the components (nodes) of a network are connected to one another and the ease (speed) with which they can ‘converse.’” Putting this into context for our use, think of nodes as vehicles, infrastructure, or other elements that are connected together via a means of communications so they can easily and quickly talk to each other.

    Connectivity exists today – your truck’s ability to communicate back to your office via telematics to deliver data about loads, routes, safety incidents, etc., is a form of connectivity. In the future, connectivity will tie to vehicles sharing information with each other and their surroundings (infrastructure, pedestrians) to enhance vehicle safety, as well as help improve the efficiency of the transportation system and provide environmental benefits, such as improved fuel economy and reduced emissions. Connectivity is the cornerstone of what the U.S. government calls the “Intelligent Transportation System,” or ITS. As we explore connectivity, let’s review a few terms to keep in mind regarding the ITS and future autonomy.

    • Proposed FMVSS (Federal Motor Vehicle Safety Standard) 150 – In December 2016, NHTSA (the National Highway Traffic Safety Administration) released an NPRM (Notice of Proposed Rulemaking) to require light vehicles (vehicles 10,000 lbs. or less) to be equipped in the future with communications devices that enable information to be:

      • shared between vehicles (V2V, or Vehicle-to-Vehicle) and, eventually,
      • the surrounding infrastructure (V2I or I2V – Vehicle-to-Infrastructure and vice versa); and, even,
      • pedestrians (V2P or P2V – Vehicle-to-Pedestrian and vice versa).

      Let me pause a moment to explain that all these “V2-somethings” are generically referred to as V2X – or Vehicle-to-whatever. With that as a backdrop, FMVSS 150 does not cover the broader V2X applications, just the V2V applications.

      V2X may be an integral part of automated/autonomous vehicle development in the future. The focus of the NPRM, and the primary reason for pushing forward with a regulation to equip vehicles with this technology, is to help improve safety on the roadways, while also trying to speed implementation (as opposed to simply letting OEMs offer the capability as an option). This makes sense: left as an option, V2V would take much longer to reach enough vehicles – you don’t have much connectivity if there are not a lot of connections available!

      What doesn’t make sense, at least from my point of view, is not including heavy trucks and buses in the requirement. The same argument holds – more vehicles with the equipment, more connections, more opportunity to help mitigate crash situations. The good news, however, is that there is an expectation that the inclusion of heavy trucks and buses will be addressed in a future – possibly 2017 – NPRM for connected heavy vehicles.

      The value of V2V from a safety standpoint ties to the ability of the system to warn drivers of potential collision situations that can’t always be handled by onboard sensors. For example, a radar or camera can’t see around corners. A connected vehicle, however, could receive a signal of a vehicle crossing in front of an oncoming car, alerting the driver so he or she can apply the brakes. In the future, the warning may become a brake application – on both cars and trucks. 

      My final point on this topic – I promise – is that implementation of the rulemaking is targeted to begin two years after the final rule is issued, and the NPRM suggests the final rule will likely come in 2019. If it does, the agency envisions a “3-year phase-in period to accommodate vehicle manufacturer’s product schedules” as noted in the NPRM. Under this plan, with a final rule in 2019, implementation would begin about two years later and be completed in 2023.5

    • Basic Safety Message (BSM) – “The data (exchanged between vehicles) is known as the ‘basic safety message’ (BSM)…and contains vehicle dynamics information such as heading, speed, and location. The BSM is updated and broadcast up to 10 times per second to surrounding vehicles. The information is received by other vehicles equipped with V2V devices and processed to determine collision threats. Based on that information, if required, a warning could be issued to drivers to take appropriate action to avoid an imminent crash.”6 The BSM is based on SAE J2735 Part 1, and the data elements include vehicle size, position, speed, heading, acceleration, and brake system status. (A minimal BSM sketch follows these connectivity terms.)

    • DSRC (Dedicated Short-Range Communications) – “DSRC is a two-way short-to-medium range wireless communications capability that permits very high data transmission critical in communications-based active safety applications. The Federal Communications Commission (FCC) allocated 75 MHz of spectrum in the 5.9 GHz band for use by Intelligent Transportation Systems (ITS) vehicle safety and mobile applications.”7 This is how communications between vehicles – and in V2X applications generally – happen. Take note, however, that there is major concern regarding the FCC allowing other applications – such as wireless internet – into the 5.9 GHz band. In fact, this is the source of opposition from a number of groups concerned with V2V. They feel overcrowding of the band could result in lost messages at critical times. We’ll watch to see how this plays out in the new administration.
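
    To make the BSM a little more tangible, here is a minimal sketch in Python. The fields loosely follow the SAE J2735 Part 1 elements listed above (size, position, speed, heading, acceleration, brake status), but the class, field names, and print-based “broadcast” are illustrative assumptions of mine – the real message is a compact binary encoding sent over the DSRC radio, not Python objects.

```python
import time
from dataclasses import dataclass, asdict

@dataclass
class BasicSafetyMessage:
    # Loosely modeled on the SAE J2735 Part 1 data elements;
    # names and types here are illustrative only.
    vehicle_length_m: float
    vehicle_width_m: float
    latitude_deg: float
    longitude_deg: float
    speed_mps: float
    heading_deg: float
    accel_mps2: float
    brakes_applied: bool

def broadcast(bsm: BasicSafetyMessage) -> None:
    """Stand-in for the DSRC radio layer (the 5.9 GHz band in real systems)."""
    print(asdict(bsm))

# The BSM is updated and rebroadcast up to 10 times per second (every 100 ms):
bsm = BasicSafetyMessage(18.0, 2.6, 41.8781, -87.6298, 26.8, 90.0, -1.2, True)
for _ in range(3):       # three cycles shown, for illustration
    broadcast(bsm)
    time.sleep(0.1)      # 10 Hz update rate
```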

  • Platooning: You’ve heard, and perhaps seen, much about platooning trucks. In a nutshell, platooning is controlled drafting to gain fuel economy and, some may say, a safety benefit. Peloton Technology is the major player in this space.

    • Phase 1: In the first phase of platooning (multiple phases are expected as we go down this path), you will have two trucks, with two drivers, following in close proximity using a system that will automatically bring them close together (about 12 m, or 40 feet, optimally) when conditions are right for platooning. The right conditions are typically a sunny day on a limited-access highway (freeway or interstate) with minimal traffic. The system will provide longitudinal control (braking and acceleration) through DSRC communications to the rear vehicle – enabling it to react almost instantly to braking or acceleration by the lead vehicle. Lateral, or steering, control in the following vehicle is maintained by the driver. The lead vehicle driver has control of all functions on the vehicle, and his or her changes in speed (not steering) are communicated to and duplicated almost immediately by the rear vehicle. (A toy gap-control sketch follows this list.)

    • Future phases: In future phases, the goals will likely focus on eliminating the need for the driver in the rear vehicle, as well as the potential for more than just two vehicles in the platoon – and, quite possibly, having mixed vehicles (cars, tractor-trailers, trucks, and buses) enabled to platoon together. These approaches, most likely, will require changes in regulations as well as infrastructure (e.g., dedicated platooning lanes) to make this happen. Keep in mind – these are possibilities that have already been tested and demonstrated.
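
    To show what “longitudinal control through DSRC” might look like in the Phase 1 scenario, here is a toy controller for the following truck. The control law, gains, and target gap are my own illustrative assumptions – not Peloton’s, Bendix’s, or anyone else’s actual algorithm.

```python
# Toy longitudinal controller for the rear truck in a two-truck platoon.
# Gains and structure are hypothetical; real systems are far more involved.

TARGET_GAP_M = 12.0   # roughly the 40-foot optimal platooning gap noted above
GAP_GAIN = 0.4        # proportional gain on gap error (assumed)
CLOSE_GAIN = 0.8      # damping gain on closing speed (assumed)

def follower_accel_cmd(lead_accel_mps2: float,
                       measured_gap_m: float,
                       closing_speed_mps: float) -> float:
    """Mirror the lead truck's acceleration (received over DSRC) and trim
    the command to hold the target gap. Steering stays with the driver."""
    gap_error = measured_gap_m - TARGET_GAP_M
    # Too far back (positive error) -> speed up; closing too fast -> ease off.
    return lead_accel_mps2 + GAP_GAIN * gap_error - CLOSE_GAIN * closing_speed_mps

# Lead truck brakes at -2.0 m/s^2 while we sit 14 m back, closing at 0.5 m/s:
print(follower_accel_cmd(-2.0, 14.0, 0.5))   # -1.6 m/s^2: brake, slightly softer
```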

  • Deep Learning (Neural Networks) – There’s much to discuss when it comes to deep learning – anywhere from a lengthy blog to a full book. In a nutshell, we understand deep learning as advancing machine learning to a computer model that mimics the structure of the human brain (see the small sketch below). This helps make the system smarter about the situations it encounters and enables the system to quickly learn from new inputs or new system experiences. For comparison, think of all the different situations you encounter as you pilot your personal vehicle on the road today, and then consider how much better a driver you’ve become than you were at 16 with a newly minted license in hand. All that learning occurred from your experience over time. Deep learning, in a sense, will help autonomous (is it autonomous or automated?) vehicles learn from various experiences and not repeat mistakes – something we human drivers sometimes fail to do!
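
    For the brain analogy, here is the smallest possible sketch: a single artificial “neuron,” the building block that deep learning stacks into many layers. It is a textbook toy in Python – the inputs and weights are made up for illustration, and no vehicle system’s actual model is implied.

```python
import math
import random

def neuron(inputs: list, weights: list, bias: float) -> float:
    """One artificial neuron: weighted sum of inputs squashed to 0..1."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))

# Hypothetical normalized inputs: gap to lead vehicle, closing speed, road grip.
inputs = [0.2, 0.9, 0.5]
weights = [random.uniform(-1.0, 1.0) for _ in inputs]  # tuned by training
print(neuron(inputs, weights, bias=0.0))               # a 0..1 "brake?" score

# Deep learning stacks many layers of such neurons and adjusts the weights
# with each new experience - the machine analog of the driver who has
# improved since age 16.
```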

Okay, so now we’ve covered some of the key vernacular that you’re likely to hear around the development of automated and autonomous vehicle technologies. This list, of course, is by no means complete, as there will be new terms, revisions, and updates as we move along the path. This list is the perfect definition of a work in progress, but it makes for a great starting point. Also, check out some of my earlier blogs for additional definitions, such as the SAE levels of autonomy.

Thanks to Andy Pilkington, Product Manager, Radar/Fusion at Bendix for reviewing this blog!

Sources:

1. Dictionary.com

2. Bosch Automotive Handbook, 8th Edition, pg. 1154

3. Ibid, pg. 1158

4. New York Times, “How Self-Driving Cars Work,” 12/14/16

5. See Docket Number NHTSA-2016-0126, the NPRM “Federal Motor Vehicle Safety Standards; V2V Communications,” for additional details

6. “NHTSA Issues Notice of Proposed Rulemaking and Research Report on Vehicle-to-Vehicle Communications,” U.S. Department of Transportation, Dec. 12, 2016, pg. 2

7. “Dedicated Short-Range Communication (DSRC), The Future of Safer Driving,” U.S. Department of Transportation, pg. 1
