Autonomous driving levels

Author: Phil Muncaster

Mordor Intelligence predicts that the market for autonomous vehicles will grow from $22.22 billion in 2021 to $75.95 billion in 2027, a compound annual growth rate (CAGR) of 22.75%. Consistent with that trajectory, the market has reached roughly $27 billion in global value this year, and Mordor's study suggests it may approach $62 billion by 2026. However, stakeholders in the autonomous vehicle industry will need to clear many technological and regulatory hurdles before fully autonomous cars are on the roads.
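Mordor's figures can be sanity-checked with the standard compound-growth formula. The short sketch below simply compounds the article's 2021 baseline forward at the stated CAGR; the variable and function names are illustrative, not from the report.

```python
# Sanity-check Mordor Intelligence's projection: $22.22B in 2021,
# compounding at a 22.75% CAGR through 2027.
BASE_2021 = 22.22  # market size in $B (from the article)
CAGR = 0.2275

def project(years_after_2021: int) -> float:
    """Compound the 2021 base forward by the given number of years."""
    return BASE_2021 * (1 + CAGR) ** years_after_2021

print(round(project(1), 2))  # 2022: ~27.3, matching the ~$27B figure
print(round(project(5), 2))  # 2026: ~61.9, matching "almost $62 billion"
print(round(project(6), 2))  # 2027: ~76.0, close to the stated $75.95B
```

The three checkpoints line up with the article's figures to within rounding, which suggests the "$27 billion this year" and "almost $62 billion by 2026" numbers come from the same 22.75% growth curve.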

But you may already have driven a vehicle with some autonomous capabilities. There are six autonomous driving levels, each with a different degree of automation.

What are the six levels of autonomous driving?

Back in 2014, SAE International published J3016, a classification system for autonomous vehicles. Its taxonomy of driving-automation levels is based on how much of the driving task the human behind the wheel must handle. It outlines each level as follows:

  • Level 0 - Manual Driving: No autonomy. The driver is fully responsible for all aspects of controlling the vehicle, including steering, braking and acceleration. Momentary or warning-only features, such as automatic emergency braking and blind-spot alerts, still count as Level 0 because they do not provide sustained control of the vehicle.
  • Level 1 - Driver Assistance: This is the type of driving automation most commonly found on today's roads. The vehicle provides sustained support for either steering or speed, but not both at once. Adaptive cruise control, where the driver steers while onboard systems control speed and following distance, is a Level 1 feature; so is park assist, where the driver controls the speed while Advanced Driver Assistance System (ADAS) computers do the steering.
  • Level 2 - Partial Automation: At this level, onboard systems can handle steering, acceleration and braking at the same time, but the driver must stay engaged and supervise continuously. Manufacturers warn that drivers must remain alert at all times in case the ADAS makes the wrong decision. Tesla Autopilot and Nissan ProPilot are both examples of Level 2 systems.
  • Level 3 - Conditional Automation: These are what the market calls "eyes off" systems: in certain scenarios, drivers can sit back safe in the knowledge that the car will handle all aspects of driving. Vehicles at this level have environmental detection capabilities that let them, for example, accelerate past slow-moving vehicles. However, the driver must remain ready to take over when the system requests it.
  • Level 4 - High Automation: Also known as "mind off" autonomy, this is where the car can completely replace the driver within defined conditions. While regulators and legislators catch up, the technology is restricted, or "geofenced," to specific locations, like 30 mph zones, or scenarios such as traffic jams. If the driver cannot take control in an emergency, a Level 4 car can bring itself autonomously to a safe location.
  • Level 5 - Full Driving Automation: This is the ultimate in autonomy: No driver interaction is required and, therefore, the car doesn't need pedals or a steering wheel. Vehicles will be free from geofencing and trusted to go anywhere and cope with any situation. Governments haven't yet approved any cars of this level for road use, although testing continues in various geographies.
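As an illustration only (the class and helper names below are our own, not part of the SAE standard), the taxonomy above can be sketched as a small Python enum, with a helper capturing the key dividing line: at Levels 0-2 the human must supervise continuously, while from Level 3 up the system drives within its operating conditions.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """The six SAE J3016 driving-automation levels described above."""
    MANUAL_DRIVING = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2
    CONDITIONAL_AUTOMATION = 3
    HIGH_AUTOMATION = 4
    FULL_AUTOMATION = 5

def driver_must_supervise(level: SAELevel) -> bool:
    # At Levels 0-2 the human drives or supervises continuously;
    # from Level 3 up, the system handles driving in its design domain
    # (with a Level 3 driver still on call to take over when asked).
    return level <= SAELevel.PARTIAL_AUTOMATION
```

For example, `driver_must_supervise(SAELevel.PARTIAL_AUTOMATION)` is `True`, while `driver_must_supervise(SAELevel.HIGH_AUTOMATION)` is `False`.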

What are the benefits of autonomous vehicles?

There's much to look forward to in the coming wave of autonomous vehicles set to hit the roads. They could help deliver:

  • A reduction in accidents and deaths otherwise caused by human error
  • A reduction in emissions, not only because many autonomous vehicles will be electric but also because smoother driving burns less fuel and causes fewer traffic jams, on top of the efficiencies gained from platooning techniques and Time to Green functionality
  • Improved traffic flow, if vehicles can communicate with each other and with sensor-equipped highway technology, such as traffic lights, to reroute around congestion
  • Productivity for commuters, by giving them more time to work on the way to the office
  • Cooperative driving, as data gathered and shared with highway technology and other drivers can help reduce travel time by avoiding congestion or the slowdowns caused by drivers looking at accidents
  • A reduction in traffic and an increase in free parking spaces in urban centers if autonomous ride-hailing vehicles take off
  • Reduced costs for consumers otherwise spent on insurance premiums, fuel and parking
  • Job creation and economic prosperity, given the right policy environment

What role does secure connectivity play?

At a basic level, in-car sensors, ultrasound, cameras, radar, and light detection and ranging (LiDAR) systems are only one piece of the puzzle. Supported by multi-access edge computing (MEC), vehicles must also be able to communicate with onboard units in other cars, as well as with sensors in roadside traffic lights, smart streetlights and other infrastructure, to gain rapid situational awareness of potential hazards ahead.

In short, huge volumes of data must be transmitted and processed in a split second to make autonomous driving safe. This is where 5G and MEC could play a vital role: combined, they bring the power of the cloud closer to each vehicle, lowering latency, significantly increasing bandwidth and enabling high-speed connectivity between cars and connected highway technology. Verizon and Honda are already trialing the technology, with drivers in the vehicles, in a controlled environment at Mcity, the University of Michigan's autonomous vehicle test facility, to:

  • Warn autonomous vehicles when obscured pedestrians are crossing
  • Alert drivers to fast-approaching emergency vehicles they can't see
  • Warn vehicles of drivers who have run red lights
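A hypothetical sketch of how such alerts might be filtered on the vehicle side is shown below. The message fields, hazard names and warning radius are invented for illustration and do not reflect Verizon's or Honda's actual implementation; the point is simply that an edge node broadcasts hazard messages and each vehicle decides locally whether to warn its driver.

```python
from dataclasses import dataclass

@dataclass
class HazardAlert:
    """A hazard message as a nearby MEC node might broadcast it (illustrative)."""
    kind: str          # e.g. "pedestrian", "emergency_vehicle", "red_light_runner"
    distance_m: float  # distance from the receiving vehicle, in meters

def should_warn_driver(alert: HazardAlert, warn_radius_m: float = 150.0) -> bool:
    """Warn only for nearby hazards of the kinds trialed at Mcity."""
    trialed_kinds = {"pedestrian", "emergency_vehicle", "red_light_runner"}
    return alert.kind in trialed_kinds and alert.distance_m <= warn_radius_m

# A pedestrian 40 m away triggers a warning; one 400 m away does not.
print(should_warn_driver(HazardAlert("pedestrian", 40.0)))
print(should_warn_driver(HazardAlert("pedestrian", 400.0)))
```

The low-latency requirement falls on the round trip between sensor, edge node and vehicle; the decision logic itself, as the sketch shows, can stay trivial.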

Advances in 5G networks and MEC technology may hold the key to moving up through the gears and putting Level 5 autonomous vehicles on the road.

Discover how 5G and MEC can speed up the development and adoption of autonomous vehicle technology.

The author of this content is a paid contributor for Verizon.