The stuff of science fiction in decades past, self-driving cars are fast becoming a reality—at least to some extent. Features such as automatic emergency braking, blind spot warning, and adaptive cruise control are part of the spectrum of automation, even though cars equipped with these abilities today still need a human in the driver’s seat. Whether we’re close to seeing a car pass us on the road with no one behind the wheel is debatable, but automation continues to develop as car manufacturers explore the possibilities.
Automated, Autonomous, or Self-Driving?
These terms get tossed around and can be difficult to pin down, so let’s cut through the confusion. In some regions of the world (particularly the European Union), the term automated vehicle describes a car that can drive itself with little to no human oversight. Autonomous vehicle and self-driving car are essentially synonymous and are the terms of choice in the U.S.
What Are the Benefits of Self-Driving Cars?
While some are skeptical of self-driving vehicles on the road, proponents of the technology have presented a number of potential benefits. Here’s a closer look at a few.
Enhanced Safety

Perhaps the biggest benefit of self-driving cars is enhanced safety. According to a 2016 study by the National Highway Traffic Safety Administration (NHTSA), 94 to 96 percent of all car accidents result from human error. A shift toward autonomous vehicles could reduce crashes due to impaired or distracted driving, speeding, and other human factors.
Cost Savings

For companies across industries, switching to autonomous vehicles—whether on job sites or for cross-country shipping—could result in tremendous cost savings, since fewer personnel would be needed onsite to operate vehicles. Improved road safety could also lower insurance premiums.
Reduced Traffic Congestion
With fewer car crashes and other traffic jam-inducing incidents on the roadways, the impact on traffic congestion could be profound. Because autonomous vehicles maintain a consistent (and safe) distance between vehicles, there’s likely to be a reduction of stop-and-go traffic and an increase in smooth flow.
Increased Productivity

When workers don’t have to drive a vehicle, either as part of their job or to commute from one site to another, they can use that time to do other things. Drivers could (safely!) respond to emails or work on a project while en route to a destination, just as if they were passengers on a bus or train.
Greater Accessibility

Self-driving cars can also make it easier for people with disabilities to get around. Those who might not otherwise drive a car—for example, people with severe visual impairment—could attain a much greater level of freedom.
Environmental Benefits

The reduction in traffic jams will help cut CO2 emissions—and autonomous vehicles are likely to inspire more ride sharing and a wider interest in electric cars. Each of these changes will help reduce the environmental impact of the gasoline-powered cars on the road today.
Levels of Driving Automation
While the term “self-driving car” suggests a vehicle that’s completely autonomous, the reality is a lot more complicated. Between the two extremes of fully manual cars and those that drive themselves lies a wide spectrum of capabilities. To clarify the situation, SAE International has defined six levels of vehicle automation. Levels 0–2 include automated support features that the driver must constantly supervise (no, you can’t crawl in the backseat to make a sandwich). Levels 3–5 take autonomy a step further, with some features that take over completely even if a human is still behind the wheel. Here’s a quick breakdown of the features at each level:
Level 0: No Driving Automation

This is the baseline: cars with no automated features except plain old cruise control. Many cars on the road today fall into this category, especially those made before about 2015.
Level 1: Driver Assistance

Level 1 autonomy includes driver assistance features like adaptive cruise control (the kind that adjusts your speed automatically based on the speed of the car in front of you) and lane keep assist, which corrects your steering when you start to cross a lane line. Most of today’s new cars qualify as Level 1 autonomous, if not higher.
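The spacing logic behind adaptive cruise control can be sketched in a few lines of Python. This is a minimal illustration, not any manufacturer's implementation; the function name, the 1.5-second time gap, and the proportional slowdown rule are assumptions chosen for clarity:

```python
def acc_target_speed(set_speed, own_speed, lead_distance,
                     time_gap=1.5, min_gap=5.0):
    """Pick a target speed (m/s) given the distance (m) to the car ahead.

    Maintains a following distance of min_gap plus a time gap that grows
    with the car's own speed; all constants are hypothetical.
    """
    desired_gap = min_gap + own_speed * time_gap
    if lead_distance >= desired_gap:
        # Road ahead is clear enough: cruise at the driver's set speed.
        return set_speed
    # Inside the desired gap: scale speed down in proportion to the shortfall.
    return max(0.0, set_speed * lead_distance / desired_gap)
```

With the lead car far ahead, the function simply returns the driver's set speed; as the gap closes below the desired following distance, the target speed drops toward zero.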
Level 2: Partial Automation

At Level 2, we cross the line into partial driving automation. Advanced driver assistance systems (ADAS) fall under the Level 2 heading, granting the car a bit more autonomy: cars equipped with ADAS can steer, brake, and accelerate on their own in certain conditions, and can recognize traffic signs, detect and respond to unexpected obstacles (such as pedestrians), and more. A human must still sit behind the wheel, supervise continuously, and take over when needed.
Level 3: Conditional Automation

At Level 3, autonomous cars begin to feel more capable. They can perform all the ADAS tasks, with the addition of more environmental detection features and decision-making powers. For example, a Level 3 vehicle might detect a slow-moving tractor on a straight road and decide to pass it in the opposite lane if the way is clear. While this sounds sophisticated (and it is), Level 3 cars are not yet ready to make their own way in every situation. If driving conditions become complicated—the road is icy, for example—the human driver must immediately take back control.
Level 4: High Automation

Vehicles at this level go beyond simple tasks—they can evaluate a situation and make real-time decisions if something goes wrong, whether because of unexpected circumstances on the road or a system failure. In most situations, these cars do not need human oversight. They’re capable of driving themselves, but humans can still take the wheel at any time. While vehicles capable of Level 4 operation do exist right now, government restrictions often put stringent boundaries around their usage. For example, some countries may allow them to operate only in a city’s downtown area, where speed limits are low.
Level 5: Full Automation

Level 5 vehicles will be capable of driving themselves without any oversight from a human driver—and likely won’t even have hardware like steering wheels or brake pedals. While a few prototypes are being tested around the world, Level 5 cars are not yet available to consumers or businesses. It’s also worth noting that while cars will continue to become more autonomous, it’s questionable whether there will ever be a vehicle able to respond to every possible driving condition and traffic scenario—so even Level 5 vehicles might be restricted to certain areas and kept off the roads when weather conditions are dicey.
How Do Autonomous Cars Work?
Self-driving cars use several types of sensors to “see” the objects around them, perceive road conditions, make split-second decisions, and create route maps. They receive positioning data from GPS satellites and typically maintain a cellular internet connection to download map updates and share road information with other vehicles in proximity. Here’s a closer look at the primary technologies behind self-driving cars.
Radar

Radar has been around since the 1930s and works by sending out radio waves that reflect off objects and return to the receiver. Over the past few decades, cars have incorporated radar to enable features like automatic emergency braking and adaptive cruise control. Radar is a vital technology in autonomous cars because it can detect objects hundreds of yards away, but it can’t see details or identify types of objects, so its uses are limited.
LIDAR

Light Detection and Ranging (LIDAR) fills in some of the details that radar cannot. This technology sends out millions of pulses of laser light every second. The light bounces off surrounding objects and returns to the receiver, where the system measures the time the light took to travel there and back and uses that data to build a three-dimensional model. LIDAR can “see” enough detail to identify types of vehicles and obstacles, feeding that information to the car’s computer to inform predictions and decisions. While LIDAR holds great promise for autonomous cars, it doesn’t work in every situation—in foggy or dusty conditions, for example, its performance degrades sharply.
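Both radar and LIDAR rely on the same time-of-flight arithmetic: distance equals the signal's round-trip travel time multiplied by its speed, halved because the pulse travels out and back. A minimal sketch (the function name is illustrative):

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def time_of_flight_distance(elapsed_seconds):
    """Distance to an object computed from a pulse's round-trip time.

    The pulse travels to the target and back, so halve the total path.
    """
    return SPEED_OF_LIGHT * elapsed_seconds / 2.0
```

A pulse that returns after one microsecond indicates an object roughly 150 meters away; a LIDAR unit repeats this measurement millions of times per second to build its three-dimensional picture.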
GPS

Global positioning system (GPS) receivers rely on satellite signals to pinpoint the car’s physical location. GPS information feeds into the car’s internal navigation system, allowing it to place itself on a map and chart a path toward a destination.
Cameras

Cameras equipped with computer vision are another key element of autonomous cars. They can read signs and provide high-resolution images, helping the car’s computer system make sense of its surroundings. Like LIDAR, cameras cannot operate in every condition—when visibility is low, they won’t be able to see clearly.
Building an Internal Map
To comprehend its situation, an autonomous car must create a three-dimensional model of its surrounding area. LIDAR and cameras typically supply the needed data for this task. The car’s computer system combines and interprets information from the different sensors to create a map, which can then be used for planning a route.
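One common representation for such an internal map is an occupancy grid: the space around the car is divided into cells, and a cell is marked occupied if any sensor-detected obstacle point falls inside it. This is a bare-bones sketch of the idea; the cell size, grid dimensions, and function name are illustrative assumptions, not a production design:

```python
def build_occupancy_grid(points, cell_size=0.5, grid_dim=40):
    """Mark grid cells that contain at least one detected obstacle point.

    `points` are (x, y) offsets in meters from the car, which sits at
    the center of a grid_dim x grid_dim grid of cell_size-meter cells.
    """
    grid = [[0] * grid_dim for _ in range(grid_dim)]
    half = grid_dim // 2
    for x, y in points:
        col = int(x / cell_size) + half
        row = int(y / cell_size) + half
        if 0 <= row < grid_dim and 0 <= col < grid_dim:
            grid[row][col] = 1  # obstacle detected in this cell
    return grid
```

Each new batch of sensor readings refreshes the grid, giving the planning software a compact, up-to-date picture of which nearby space is free and which is blocked.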
Charting a Path
Once the car creates a model of its surroundings, it can incorporate GPS capabilities to identify the fastest and safest path to its destination. While charting the path starts off similar to plugging in an address on a smartphone’s Maps app, fully autonomous cars must consider more than just directions and navigation. They must also look at objects in the way—whether it’s a pedestrian in the road or a fire truck racing through a stop light—and make decisions about how to respond.
Just like a live driver, a self-driving car continually revises its navigation plans based on changing conditions. The car’s sensors and cameras are constantly looking out for both static and moving obstacles, and either alerting the driver (if the vehicle is a Level 1 or 2) or making a decision to respond on its own. Fully autonomous vehicles must make continuous predictions about where objects might move, what other cars might do, and when pedestrians could step into a crosswalk.
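Planning over the car's internal map can be sketched as a graph search: treat the map as a grid of free and blocked cells and search for the shortest route, re-running the search whenever a new obstacle is detected. Here is a minimal breadth-first-search sketch; real planners use far richer cost models, motion constraints, and predictions, so everything below is illustrative:

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search over free cells (0 = free, 1 = blocked).

    Returns the list of (row, col) cells from start to goal,
    or None if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = deque([start])
    came_from = {start: None}  # records each cell's predecessor
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:  # walk predecessors back to start
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                frontier.append((nr, nc))
    return None
```

In practice both the grid and the goal change continuously, so the search is re-run every planning cycle against the latest sensor map, which is how the vehicle routes around an obstacle that appears mid-trip.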
Emerging Use Cases for Autonomous Cars
While most of today’s new cars have some autonomous features, there aren’t any fully self-driving cars on the market at the moment. In the near future, autonomous vehicles are most likely to be used in situations where road conditions are fairly predictable, allowing their onboard sensors to operate at full efficiency. Here are a few possible use cases:
Shuttles and ride sharing
For short rides in predictable conditions—such as transporting people from one airport concourse to another, or across a downtown area—autonomous vehicles may be a good solution. Since it involves more variables, ride sharing may be a bit further off, but it’s possible that one day you’ll order a ride on Uber and a driverless car will pull up to take you home.
Local deliveries

Some organizations are looking into autonomous vehicles for local deliveries—whether it’s pizza, groceries, or packages. A few companies in California used their driverless vehicles to provide contactless grocery deliveries during the COVID-19 pandemic.
Long-haul shipping

While autonomous shipping may be further in the future, it’s quite possible that one day you’ll pass a driverless semi-truck on the highway. Many developed countries don’t have enough drivers to meet their logistics needs, and autonomous shipping could be a solution. Because cross-country truck drivers must operate vehicles in diverse weather and road conditions, though, for now autonomous features may serve mainly to support drivers—for example, allowing a driver to put the truck on “autopilot” for a while during good weather and traffic conditions, enabling more rest or longer hours on the road.
5 Best Examples of Self-Driving Cars
While there aren’t any fully autonomous vehicles on the consumer market yet, there are some cars out there with Level 2–3 autonomy on board. Let’s take a look at five of the best examples in 2020.
Tesla Model S
Perhaps the best-known brand in the realm of smart cars, Tesla has developed its own Autopilot system for navigation and automation. The software is continuously updated via over-the-air (OTA) connectivity and uses eight cameras positioned around the vehicle, along with radar and ultrasonic sensors. The Autopilot function can operate on most roads (although it must be overseen by a human driver at all times), and the system also includes assistive features like adaptive cruise control, automatic lane changing, and attentiveness monitoring.
Cadillac CT6

Another leader in self-driving vehicle innovation, GM introduced Super Cruise with the Cadillac CT6. Super Cruise draws on a variety of vision and positioning technologies and uses a camera inside the vehicle to track the driver’s eyes, ensuring focused attention. The Cadillac CT6 also includes driver assistance features such as adaptive cruise control, lane centering, and automated lane changes.
Hyundai Palisade

The Hyundai Palisade, along with a few other Hyundai models, includes a system of autonomous features such as adaptive cruise control, lane centering, attentiveness monitoring through sensors on the steering wheel, a blind view monitor for the side mirrors, and automated emergency braking with pedestrian detection.
Kia Telluride

Kia and Hyundai are jointly owned, so the two brands use the same autonomous feature platform. The Kia Telluride offers a similar array of driver assistance features, including forward-collision warning and automated emergency braking, lane departure warning and lane keep assist, and blind spot monitoring.
Audi

While Audi’s cars include Level 3 self-driving features in Europe, government regulations prevent them from becoming available in the U.S. for the time being. Available features include traffic jam assistance, adaptive cruise control, remote control parking, lane centering, and attentiveness monitoring.
Are Self-Driving Cars Vulnerable to Hackers?

IoT security is a perpetual concern, and autonomous vehicles are certainly at risk of hacking. In 2015, two hackers remotely took control of a Jeep Cherokee, using vulnerabilities in the car’s entertainment system to access its dashboard functions, initiating a series of unexpected disturbances and finally disabling the brakes. If hackers can take control of a car that wasn’t designed to be fully autonomous, what happens when they hack into a driverless vehicle? In the realm of autonomous vehicles, cybersecurity concerns directly affect the physical safety of those in and around the car. The U.S. National Highway Traffic Safety Administration (NHTSA) is researching cybersecurity for self-driving vehicles and working with the automotive industry to create a framework and standards for secure design.
Will All Cars Be Self-Driving in the Future?
Maybe, but it’s going to be a while. Vehicles with autonomous features will continue to come on the market, but they’ll provide automated assistance to drivers rather than being fully autonomous. Some experts believe driverless cars will be on the roads in the next decade, but their use will not be widespread. Issues such as aging road infrastructure, government restrictions, and weather variables still stand in the way of fully self-driving cars. Those sci-fi vehicles are likely coming someday—but may be several decades away.