
May 30, 2023

What Are Self-Driving Cars, and How Do They Work?

Automakers are already offering semi-autonomous driving on cars, but this is just preparation for when they will not need a driver at all.

The dream of creating the perfect self-driving car has been popular since the very early days of the automobile. After more than a century of innovation and technological breakthroughs, you’re closer than ever to having a car that can drive itself, with several companies already testing their projects on public roads.

But how do self-driving cars work? And how close are you to achieving your robo-chauffeur dreams?

As the name suggests, self-driving (also called autonomous) vehicles are cars that drive themselves. Most modern self-driving cars require a driver to be present to take over in emergencies. Outside of emergencies or situations when the car starts acting erratically, it is supposed to handle most of the driving without any form of driver intervention.

Self-driving cars use a combination of sensors and cameras to create a 3D image of the world around them. Advanced software is then used to detect cars, people, and obstacles on the road, enabling the vehicle to drive itself safely while following the rules of the road.

Many companies are working on this technology, and this means that there are several different approaches to making a self-driving car. There are also different levels assigned to self-driving cars with different features.

Booking and jumping into a Waymo self-driving taxi is one of the easiest ways to try a self-driving car for yourself, but you will need to be in Arizona for your first Waymo ride.

Most of the world’s self-driving cars aren’t fully self-driving models; instead, they fall under the six automation levels defined by SAE International, each offering more automation than the last.

The first three levels all require a human to control the vehicle while it drives, while the remaining three require limited or zero human interaction. Each level of vehicle automation is a milestone, but level five is the most exciting and is what many companies are working hard to achieve.
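The six levels mentioned above come from the SAE J3016 standard, which grades automation from 0 (no automation) to 5 (full automation). A minimal sketch of that scale, with an illustrative helper function:

```python
# SAE J3016 driving-automation levels, from no automation (0)
# to full automation (5). Names follow the published standard.
SAE_LEVELS = {
    0: "No Driving Automation",
    1: "Driver Assistance",
    2: "Partial Driving Automation",
    3: "Conditional Driving Automation",
    4: "High Driving Automation",
    5: "Full Driving Automation",
}

def driver_required(level: int) -> bool:
    """Levels 0-2 need a human controlling the vehicle at all times;
    levels 3-5 allow limited or zero human interaction."""
    if level not in SAE_LEVELS:
        raise ValueError(f"Unknown SAE level: {level}")
    return level <= 2

print(SAE_LEVELS[5])       # Full Driving Automation
print(driver_required(2))  # True
print(driver_required(4))  # False
```

Level 5, the milestone many companies are chasing, is the only one with no human role at all.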

Surprisingly, hardware limitations aren’t a major problem in the self-driving car space. In theory, the only sensors you need for a self-driving car to work are regular cameras, with software processing doing the heavy lifting. Of course, though, it’s much safer to use an array of different sensors to give the software as much data as possible.

Light detection and ranging, or LiDAR, sensors measure depth to produce an accurate 3D model of a self-driving vehicle’s surroundings. This is achieved by emitting millions of laser pulses each second and measuring the time it takes for each pulse to reflect. The longer the reflection time, the further an object is from the sensor.
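The distance calculation behind each LiDAR pulse is simple time-of-flight arithmetic: the pulse travels to the object and back at the speed of light, so the one-way distance is half the round trip. A minimal sketch:

```python
SPEED_OF_LIGHT = 299_792_458  # metres per second

def lidar_distance(round_trip_seconds: float) -> float:
    """Convert a laser pulse's round-trip time into a one-way
    distance in metres. The pulse covers the distance twice:
    out to the object and back to the sensor."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A reflection arriving after ~66.7 nanoseconds puts the object
# roughly 10 metres from the sensor.
print(round(lidar_distance(66.7e-9), 2))  # 10.0
```

Repeating this millions of times per second across a sweep of angles is what turns individual pulses into a 3D point cloud of the surroundings.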

This helps a self-driving car understand its environment and the surrounding objects. This includes buildings, people, and animals, as well as everything else the vehicle drives past. On a clear day, LiDAR is all a car needs to navigate in busy city environments. Its performance drops through rain or fog, though, and this is why self-driving cars can’t rely on LiDAR as their only sensor type.

Radar performs a similar role to LiDAR on automated vehicles. Rather than emitting lasers, though, it emits radio waves and measures the reflections from objects around you. The goal is still to understand the environment around the car, though.

LiDAR sensors have a resolution 10 times greater than radar, but radar isn’t affected by poor weather conditions. Radar sensors are also cheaper than LiDAR sensors.

Companies like Alphabet’s Waymo use a mix of LiDAR, radar, and regular cameras for their main sensor arrays. Tesla, on the other hand, has opted to invest fully in regular cameras and advanced software to navigate roads autonomously.

Facial recognition technology has been around for a long time, although it's mostly been used on smartphones and advanced security solutions. With self-driving cars, the aim is to take this to the next level, with machine learning-powered object recognition, detecting buildings, cars, people, and everything else around your vehicle.

Radar, LiDAR, and regular cameras are often the main sensors in a self-driving car, but some vehicles have more. Additional hardware, such as ultrasonic sensors for close-range detection and external microphones, gives the car an even greater understanding of its surroundings. Microphones make it possible for self-driving cars to respond to non-visual cues, like the sound of an ambulance’s siren.

Whether it's Tesla, Waymo, or any other self-driving car system, all of these vehicles need a central computer, or “brain”, to process the data provided by their sensors. Nvidia’s Drive AGX platform is a leading example of this, but some automakers are choosing to develop this type of technology in-house.

Building functional self-driving car software is one of the biggest challenges faced by manufacturers. It’s relatively easy to create a program that uses road markings and location data to follow modern roads. But what happens if another car cuts you off or an animal runs out into the road?

Roads are not predictable places. Self-driving car software has to be able to react to a huge array of different situations, many of which are impossible to pre-program.

AI sits at the core of the self-driving car industry. In essence, autonomous vehicles like this aim to mimic the human brain while driving, which means that they have to be able to make decisions based on a huge range of variables. This includes junctions and road signs that are part of the road, along with vehicles, people, and other obstacles that a regular driver would usually be aware of.

It would be far too time-consuming for humans to create databases and algorithms that perfectly recognize everything on the road. Instead, manufacturers like Tesla use machine learning to train their algorithms and improve them.

The machine learning algorithms found in self-driving cars have to start with some basic data, but a huge portion of their learning is done on the road. This is what makes it so crucial that companies can test their cars on real roads, but it also means that self-driving cars are only going to get better the more they drive.

A pedestrian stepping out into the road is a good test case for self-driving car machine learning. The car has several options in this scenario: it can attempt to swerve around the pedestrian, slam on the brakes and attempt to stop, or use the horn to alert the pedestrian. Most self-driving cars will take an active approach to obstructions like this, ruling out the last option.

From here, it has to decide whether it’s best to swerve or brake, factoring in speed, distance, weather conditions, and a variety of other environmental variables. If swerving would bring the car into the path of oncoming traffic, for example, it is likely to choose the brakes.
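As a toy illustration of the kind of rule described above (not any manufacturer’s actual logic, and with made-up numbers), the brake-versus-swerve decision might weigh a handful of the same factors:

```python
def choose_maneuver(speed_kmh: float,
                    distance_m: float,
                    road_wet: bool,
                    oncoming_traffic: bool) -> str:
    """Toy decision rule: brake when a swerve would cross oncoming
    traffic, or when there is room to stop; otherwise swerve.
    Stopping distance uses a rough rule-of-thumb estimate,
    lengthened on wet roads -- illustrative, not real dynamics."""
    stopping_distance = (speed_kmh / 10) ** 2  # rough dry-road metres
    if road_wet:
        stopping_distance *= 1.5
    if oncoming_traffic:
        return "brake"   # swerving would cross another lane
    if distance_m > stopping_distance:
        return "brake"   # enough room to stop safely
    return "swerve"      # too close to stop; steer around instead

print(choose_maneuver(50, 40, road_wet=False, oncoming_traffic=True))
# brake
```

A real system replaces hand-written rules like these with learned models, which is exactly why the on-road training data described above matters so much.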

Both failed and successful reactions help a self-driving car learn how to tackle similar problems in the future. Ideally, this data is shared between self-driving cars so that they can improve together.

Alongside AI, there is a lot of other software behind the scenes in a self-driving car. GPS mapping systems help the car navigate roads accurately, while driver monitoring systems ensure that the person behind the wheel is focused, even in self-driving mode.

Each self-driving car company takes a different approach to software, and this means that it’s rare for them to be open about how their tools work.

It’s fair to question the safety of modern self-driving cars, especially with the growing list of deaths and injuries associated with autonomous driving. As you can see from the prevalence of driver awareness monitoring systems in many self-driving cars, even their manufacturers know that they’re not perfect yet.

But that’s not the whole story. Self-driving cars still have a long way to go, which means that autonomous car fans need to wait just a little bit longer to get their hands on an AI-controlled vehicle that drives itself and may even be able to repossess itself.

Samuel is a UK-based technology writer with a passion for all things DIY. Having started businesses in the fields of web development and 3D printing, along with working as a writer for many years, Samuel offers a unique insight into the world of technology. Focusing mainly on DIY tech projects, he loves nothing more than sharing fun and exciting ideas that you can try at home. Outside of work, Samuel can usually be found cycling, playing PC video games, or desperately attempting to communicate with his pet crab.
