How Do Self-Driving Cars Work?
There are few IoT projects with a profile as high as self-driving cars. Everyone from Tesla and Waymo to Intel and Apple is working on an autonomous vehicle, yet a recent YouGov poll spanning 17 international markets found that nearly half of all consumers are worried about seeing driverless cars on the road.
Key to this anxiety is a lack of understanding of the engineering that goes into self-driving cars. The marriage of IoT, AI, and electrical engineering required for an autonomous vehicle (AV) to function is staggering, so let’s break down some of the key components.
What kinds of sensors are in self-driving cars?
There are a number of IoT devices located throughout any AV, but the most important environmental sensors are radar, ultrasound/sonar, and lidar.
- Radar – The AV emits radio waves and garners several insights from the waves reflected back to it, including the range, angle, and relative velocity of any object the waves strike. These sensors are integral to short-range safety features such as blind-spot monitoring, lane-keeping assistance, and parking aids, as well as longer-range functions like distance control and brake assistance.
- Ultrasound/Sonar – These sensors combine active and passive sensing: they listen for sounds made by nearby objects and emit imperceptible ultrasonic pulses that echo off obstacles. Because this echolocation is limited to the speed of sound, it is most effective at short range, adding a layer of redundancy when paired with lidar.
- Lidar – Light detection and ranging (lidar) is also known as 3D laser scanning. It works by pulsing thousands of infrared laser beams into the surroundings and timing how long each pulse takes to reflect back, building point clouds that map objects in 3D. Because the pulses travel at the speed of light, lidar produces dense, precise readings that are great for detecting small objects. Unfortunately, those readings can be compromised by fog, rain, and snow, so lidar works best in concert with the other sensors found in AVs (the time-of-flight ranging all three sensor types share is sketched below).
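All three of these sensors measure distance the same way: time a pulse’s round trip and multiply by its propagation speed. Here is a minimal Python sketch of that time-of-flight calculation; the propagation speeds are physical constants, but the echo times are illustrative examples rather than readings from any real sensor.

```python
# Time-of-flight ranging shared by radar, sonar, and lidar:
# distance = (propagation speed x round-trip time) / 2
SPEED_OF_LIGHT = 299_792_458.0  # m/s, used by radar and lidar pulses
SPEED_OF_SOUND = 343.0          # m/s in air at ~20 C, used by ultrasound

def range_from_echo(round_trip_s: float, wave_speed: float) -> float:
    """One-way distance to an object, given the round-trip echo time."""
    return wave_speed * round_trip_s / 2

# An object 10 m away returns echoes on very different timescales:
print(f"lidar echo:      {2 * 10 / SPEED_OF_LIGHT:.1e} s")  # ~6.7e-08 s
print(f"ultrasound echo: {2 * 10 / SPEED_OF_SOUND:.4f} s")  # ~0.0583 s
print(f"range from that sonar echo: {range_from_echo(0.0583, SPEED_OF_SOUND):.2f} m")
```

The huge gap between those timescales is why ultrasound is reserved for close-in work like parking, while radar and lidar handle objects farther out.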
How are cameras used in self-driving cars?
Self-driving cars are also heavily dependent on high-resolution digital cameras, which use visual data to interpret potential obstacles and roadways. Though cameras can likewise be impacted by inclement weather, they are able to identify common roadside hazards such as pedestrians, cyclists, and other vehicles. Two kinds of cameras are typically used in self-driving cars:
- Mono, or “one eye,” cameras process 2D images to recognize traffic signs, lane markings, and lighting conditions. These readings are somewhat limited, unfortunately, as a single lens cannot directly compute distance.
- Stereo cameras, on the other hand, use two offset lenses to build a 3D image that lets the system track distances and speeds (the underlying calculation is sketched below). These cameras already appear in some commercial vehicles, mostly as the basis of features like adaptive cruise control and emergency brake assist.
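That depth measurement comes from parallax: a nearby object shifts more between the left and right images than a distant one. Below is a minimal sketch of the standard depth-from-disparity formula; the focal length, lens baseline, and pixel disparity are illustrative values, not the specs of any production automotive camera.

```python
# Stereo depth recovery: depth = (focal length x baseline) / disparity
def depth_from_disparity(focal_len_px: float, baseline_m: float,
                         disparity_px: float) -> float:
    """Depth in meters of a point seen by both lenses of a calibrated pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_len_px * baseline_m / disparity_px

# Example: a 1400 px focal length, lenses 30 cm apart, and a feature that
# shifts 21 px between the two images put the object about 20 m away.
print(depth_from_disparity(1400, 0.30, 21))  # 20.0
```

Tracking how that depth changes frame to frame is what lets stereo systems estimate closing speed for features like emergency brake assist.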
How do self-driving cars process all of this information?
The system through which cars exchange data with the world around them, via high-bandwidth, low-latency, high-reliability links, is called “vehicle-to-everything,” or V2X. It can be broken down into several categories:
- Vehicle-to-vehicle (V2V) – Connectivity between cars. Can be used for everything from collision warnings to speed limits.
- Vehicle-to-infrastructure (V2I) – Cars communicating with roadside infrastructure such as traffic lights and parking meters.
- Vehicle-to-pedestrian (V2P) – Cars communicating with pedestrians via their personal devices, such as smartphones.
- Vehicle-to-network (V2N) – Communication with cellular networks.
For AVs, cellular V2X (C-V2X) appears to be the future. C-V2X pairs a low-latency direct link, used for active safety messages like road-hazard warnings, with a second module that rides on cellular networks to handle infotainment and longer-range traffic and hazard alerts.
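To make that two-path design concrete, here is a minimal sketch of how messages might be routed between the two links. The message classes and the routing rule are illustrative assumptions for this article, not the actual 3GPP C-V2X specification.

```python
from enum import Enum, auto

class MessageClass(Enum):
    COLLISION_WARNING = auto()  # V2V, safety-critical
    ROAD_HAZARD = auto()        # V2I, safety-critical
    TRAFFIC_UPDATE = auto()     # V2N, latency-tolerant
    INFOTAINMENT = auto()       # V2N, latency-tolerant

# Safety messages need the direct, low-latency link; everything else can
# ride the cellular network.
SAFETY_CRITICAL = {MessageClass.COLLISION_WARNING, MessageClass.ROAD_HAZARD}

def choose_link(msg: MessageClass) -> str:
    """Route a message to the link that matches its latency requirements."""
    return "direct low-latency link" if msg in SAFETY_CRITICAL else "cellular network"

for msg in MessageClass:
    print(f"{msg.name:18} -> {choose_link(msg)}")
```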
What is an Automated Driving System?
As defined by the Society of Automotive Engineers, there are six levels of driving automation, with levels 3-5 considered automated driving systems (ADS). Beginning at level 3, a driver does not need to be in control of the vehicle the entire time it is operating, though they must be ready to take the wheel within a 10-to-60-second window should the vehicle request it.
Level 4 offers the AV full autonomy within a restricted operational design domain (ODD). These domains can be very specific, constraining everything from the location of operation to the time of day to the atmospheric and weather conditions.
Level 5 has yet to be fully realized, but is used to denote a system that is fully automated and requires absolutely no human input to operate. This level is primarily intended for services like shipping.
“The leap from level 4 to level 5 is huge. It’s a quantum leap,” Calum Macrae, head of automotive R&A at GlobalData, told Verdict. “It’s not just so I could take in the microcosm of being autonomous. It’s about being autonomous anywhere.”
The current crop of AVs from brands like Tesla fits most comfortably into level 2: they offer partial automation of the driving experience, but a human operator must monitor the vehicle and be ready to take the wheel at all times.
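Condensed into code, the taxonomy looks like this. It is a minimal lookup-table sketch whose one-line summaries paraphrase the descriptions above rather than the formal SAE J3016 wording.

```python
# SAE driving-automation levels, paraphrased from the article above.
SAE_LEVELS = {
    0: ("No automation", "human drives; system may only warn"),
    1: ("Driver assistance", "a single assist feature, e.g. cruise control"),
    2: ("Partial automation", "combined assists; human supervises at all times"),
    3: ("Conditional automation", "ADS drives; human takes over on request"),
    4: ("High automation", "ADS drives fully, but only within its ODD"),
    5: ("Full automation", "ADS drives anywhere; no human input required"),
}

def is_ads(level: int) -> bool:
    """Levels 3-5 count as automated driving systems (ADS)."""
    return level >= 3

for level, (name, summary) in SAE_LEVELS.items():
    role = "ADS" if is_ads(level) else "driver support"
    print(f"Level {level} [{role}]: {name} - {summary}")
```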
What are the concerns about self-driving cars?
Accepting any new technology involves a level of trust, and though the current state of IoT and AI is capable of amazing things, the fact is that human drivers remain more reliable than AI.
One key challenge is common-sense reasoning. Drivers may find themselves in an infinite number of scenarios, and people navigate unfamiliar situations through deductive reasoning and intuition. Despite incremental progress over the years, common-sense reasoning in AI has yet to be achieved. Even on well-researched and oft-traveled roadways, it is impossible to anticipate every challenge an AV may encounter, so AVs not bound to ODDs may not be prepared for the open road.
The technology behind sensors and cameras has grown incredibly complex, and yet there are still issues to work through. Visual sensors can be tricked by something as simple as a piece of tape, and the results can occasionally be deadly. Radar faces similar spoofing concerns, and ultrasonic sensors are further constrained by the speed of sound, which is vastly slower than the light pulses lidar relies on. Considering that the average AV also boasts around 100 onboard computers, constant maintenance will be essential to avoiding malfunctions.
A positive outlook
Though there are kinks to work out, there is considerable optimism around self-driving cars. Major automotive companies like Audi and Mercedes-Benz are investing heavily in the autonomous market, and tech leaders like Intel and Google are getting in on the action as well. All this activity has made the AV market a consistent job creator, and that’s just the tip of the iceberg.
Research firm IDTechEx recently released a report predicting that AVs could match or exceed human safety levels by 2024. Depending on international regulations, the report even claims that autonomous vehicles could conceivably meet the world’s collective transportation needs by 2050 while dramatically reducing the number of collisions in the process.
Self-driving cars represent a marriage of IoT devices, advanced AI, and human ingenuity. With Statista suggesting that the connected car market could be valued at $166 billion by 2025, it is an exciting industry to watch.