Today’s autonomous vehicles rely on a wide variety of sensors to provide the spatial awareness necessary to navigate without intervention from the driver, and innovative radar technology complements this plethora of sensors, forming the next evolutionary step in the drive to develop and deploy autonomous vehicles into our daily lives.
This guest blog is the first in a series from CW Member & CW International Conference sponsor: Plextek. It has been written by Clem Robertson, Radar Capability Manager.
Autonomous Vehicle Sensors: A Long Way from Home
Today’s autonomous vehicles rely on a wide variety of sensors to provide the spatial awareness necessary to navigate without intervention from the driver. Current sensor solutions rely heavily on visual-spectrum sensors to provide the 3D detail of the surrounding environment. These sensors suffer from limitations similar to those of human vision: limited range perception and reduced performance in low-visibility conditions (night, rain, snow, fog, dust, low sunlight, etc.).
Recent figures show that US drivers cover 3.5 trillion miles every year, with one fatality every 90 million miles. By contrast, autonomous vehicles have covered only 12 million miles in real-world testing, with one disengagement every 5,600 miles. The industry's sensors and artificial intelligence (AI) have a long road ahead.
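A back-of-envelope calculation using the figures above shows the scale of the gap (treating a disengagement as only a very rough proxy for a safety-relevant event):

```python
# Figures quoted above: US drivers see one fatality every 90 million miles;
# autonomous test vehicles see one disengagement every 5,600 miles.
human_miles_per_fatality = 90_000_000
av_miles_per_disengagement = 5_600

# Per mile driven, how many times more often an AV hands back control
# than a human driver is involved in a fatal crash:
gap = human_miles_per_fatality / av_miles_per_disengagement
print(f"~{gap:,.0f}x")  # ~16,071x
```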
Artificial Intelligence Cannot Solve the Problem on Its Own
The majority of vehicles on the market today still rely on the driver being in control, but provide advanced driver assistance system (ADAS) levels of autonomy, known as levels 1 and 2 on the autonomy scale. The automotive industry is striving for higher levels of autonomy; Tesla, for example, announced earlier this year its intention to roll out new self-driving features based on improvements to its navigation and Autopilot software.
But can advancements in artificial intelligence (AI) and machine learning alone address the needs of the fully autonomous vehicle?
AI and machine learning are already becoming part of everyday life, but AI cannot solve the problem on its own. The key to achieving the levels of spatial awareness required for level 3, 4 and 5 autonomous vehicles will rest on the fidelity of the data received from the vehicle's sensors. The higher the fidelity, the better the end result.
Autonomous vehicles of the future will require increased levels of spatial awareness and discrimination in all weather conditions. They will need to discriminate between objects close together so that each object's x, y and z location can be precisely mapped.
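To illustrate the x, y, z mapping described above: a ranging sensor such as radar typically reports each detection in polar form (range, azimuth, elevation), which is then converted to Cartesian coordinates in the vehicle frame. A minimal sketch (the function name and frame convention are illustrative, not from the article):

```python
import math

def to_xyz(range_m: float, azimuth_rad: float, elevation_rad: float):
    """Convert a polar detection (range, azimuth, elevation) to x, y, z."""
    x = range_m * math.cos(elevation_rad) * math.cos(azimuth_rad)  # forward
    y = range_m * math.cos(elevation_rad) * math.sin(azimuth_rad)  # left/right
    z = range_m * math.sin(elevation_rad)                          # up/down
    return x, y, z

# An object 10 m dead ahead at sensor height:
print(to_xyz(10.0, 0.0, 0.0))  # (10.0, 0.0, 0.0)
```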
The Importance of Sensors
We as humans primarily use vision, combined with sound, touch, smell and balance, to move safely through our daily lives. Just think for a moment how difficult it becomes when one or more of our senses is impeded or removed. Our brains have to work considerably harder to compensate for the reduction in fidelity, with a greater risk of misinterpreting the situation.
A good example is when we cannot see. To move about, we compensate by using balance, touch and sound, but our progress is much slower, with significant ambiguity and reduced certainty about our surroundings.
In short, the higher the fidelity of a sensor or group of sensors, the less the brain (or, in a vehicle, the AI) has to work to resolve the situation, and the lower the ambiguity; hence a higher level of confidence and safety.
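The fidelity argument can be made concrete with the standard inverse-variance weighting from estimation theory: when two sensors measure the same quantity, the fused estimate leans towards the higher-fidelity (lower-noise) sensor, and the combined uncertainty is lower than either alone. A minimal sketch, with illustrative values:

```python
def fuse(z1: float, var1: float, z2: float, var2: float):
    """Combine two noisy measurements, weighting each by inverse variance."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    estimate = (w1 * z1 + w2 * z2) / (w1 + w2)
    variance = 1.0 / (w1 + w2)
    return estimate, variance

# High-fidelity sensor (variance 0.1) vs low-fidelity sensor (variance 1.0):
est, var = fuse(10.0, 0.1, 12.0, 1.0)
print(round(est, 2), round(var, 3))  # 10.18 0.091
```

The fused variance (0.091) is smaller than that of either sensor alone, which is exactly the "lower ambiguity, higher confidence" point made above.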
Limitations of Sensors Today
Autonomous vehicles to date have principally focused on replicating many of the functions of a human, including the key human senses combined with senses from the animal kingdom. For example:
- Sight – stereo cameras and LIDAR for high-resolution 3D spatial awareness
- Sound – ultrasonic and radar sensors for the location, velocity and direction of objects
- Balance – inertial and gyroscopic sensors to sense movement
- Location – geolocation and heading using magnetic field sensing and GPS
When these senses are used together with AI in good weather conditions, today's autonomous vehicles are able to proceed with a sufficient level of confidence in known situations, but not with enough confidence for us humans to place complete trust in the technology.
Sensors of the Future
So what advancements are needed for the industry to obtain the confidence required to achieve the highest levels of autonomy (levels 4 and 5)? The big hype today is around AI advancing to provide responses comparable to, and one day better than, those of humans in everyday scenarios. To do this, AI will need sensor data that goes well beyond the replicated human senses.
To increase safety in all weather conditions, the sensors of the future will need to rely less on visual-spectrum sensors as the primary input, exploiting other cutting-edge technologies such as mm-wave radar.
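To give a feel for what mm-wave radar offers: for an FMCW radar, range resolution depends only on the swept bandwidth, delta_R = c / (2B). The sketch below uses typical 77 GHz automotive figures (a 4 GHz sweep), which are my assumptions rather than numbers from the article:

```python
C = 299_792_458.0  # speed of light, m/s

def range_resolution_m(bandwidth_hz: float) -> float:
    """Smallest separation at which an FMCW radar resolves two objects."""
    return C / (2.0 * bandwidth_hz)

# A 77 GHz automotive radar sweeping 4 GHz of bandwidth (assumed figures):
print(f"{range_resolution_m(4e9) * 100:.1f} cm")  # 3.7 cm
```

Centimetre-scale resolution of this kind, unaffected by rain, fog or darkness, is what allows closely spaced objects to be separated where a camera would fail.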
High-resolution, high-fidelity radars are evolving: there are radars today that are capable of creating detailed images of the world in all weather conditions. Seeing is believing.
This exciting and innovative radar technology is the next evolutionary step in the advance to develop and deploy autonomous vehicles into our daily lives.