What are self-driving cars? In simple terms, they are computer-controlled cars that drive themselves. These cars use a variety of sensors to perceive their surroundings (radar, lidar, sonar, GPS, odometry) and advanced control systems to interpret that sensory information, identify appropriate navigation paths, and recognise obstacles and relevant signage.
In early January 2019, a Tesla operating in self-driving mode hit a fully autonomous delivery robot on the streets of Las Vegas. The robot, which goes by the name “Promobot”, had strayed onto the road and was knocked over by the Tesla.
This prompted humorous reactions from news agencies (autonomous car ‘kills’ robot) and internet memers alike. One Twitter user quipped, ‘Instead of pondering whether robots will turn on us, we should have asked whether they will resort to murdering themselves’.
The media’s outlook on autonomous cars had not always been so light-hearted, however. Many automakers had touted self-driving cars as the future precisely because of their AI safety features. Yet in March 2018, a self-driving car being tested by Uber collided with a pedestrian who was crossing the road with her bicycle, killing her.
There was a backup driver in the vehicle for safety reasons; however, she was looking down moments before the impact and couldn’t react fast enough to swerve the vehicle out of the way. Although initial reviews of the dash-cam footage suggested the crash would have been hard for any driver to avoid, Uber promptly suspended all testing, and an investigation was launched by the US National Transportation Safety Board (NTSB). The NTSB later reported that the system could not classify an object that was away from the footpath as a pedestrian. In simpler terms, it failed to account for the fact that people sometimes ignore or bend the rules and jaywalk. Also noteworthy was the fact that the system’s built-in emergency braking had been disabled, leaving emergency braking entirely to the backup driver.
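The failure mode the NTSB described, using an object’s location as a proxy for its class, can be illustrated with a toy detection filter. This is an entirely hypothetical sketch, not Uber’s actual software; all names and thresholds here are invented for illustration:

```python
from dataclasses import dataclass

# Toy illustration (not any vendor's real pipeline): a classifier that
# only labels objects as pedestrians when they are near the footpath
# will silently mislabel a jaywalker in the middle of the road.

@dataclass
class DetectedObject:
    lateral_offset_m: float  # distance from the footpath edge, in metres
    moving: bool

FOOTPATH_ZONE_M = 2.0  # assumed "pedestrians live here" zone


def classify(obj: DetectedObject) -> str:
    if obj.lateral_offset_m <= FOOTPATH_ZONE_M:
        return "pedestrian" if obj.moving else "static object"
    # Anything far from the footpath is assumed to be a vehicle or debris,
    # so a jaywalking pedestrian mid-road gets the wrong label.
    return "vehicle" if obj.moving else "unknown"


print(classify(DetectedObject(lateral_offset_m=1.0, moving=True)))  # pedestrian
print(classify(DetectedObject(lateral_offset_m=5.0, moving=True)))  # vehicle (a jaywalker, missed)
```

The point of the sketch is that the bug is not in the sensors but in the assumption baked into the rule: location is treated as evidence of class, so the system is blind to anyone who breaks the expected pattern.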
The crash was the first recorded pedestrian fatality involving a self-driving car. It might not have dented the image of self-driving cars all that much, but people surely did take notice. Those who had been convinced that such systems were fail-proof suddenly realized there might be cases where even the most ingeniously designed algorithms and sensors can fail.
Until a couple of years ago, self-driving vehicles were viewed as not just a possibility, but an eventuality. Recreational driving does have its place in every petrolhead’s heart, but who really wants to spend an hour or two every day negotiating the increasingly horrendous traffic of the world’s cities on their daily commute? These artificial-intelligence-powered autonomous vehicles promised safety, efficiency and even a reduction in traffic snarls.
The computer system in control of the car will never suffer from fatigue, never drive under the influence of alcohol or drugs, never make dangerous maneuvers for an adrenaline rush, and will always follow traffic rules. Such systems are already touted to be safer than the average human driver, and they are steadily being made more dynamic and adaptive.
The earliest Automated Driving Systems (ADS) offered little more than adaptive lane centering and cruise control; the Society of Automotive Engineers (SAE) classifies these as Level 1. Level 2 is partial automation, which builds on the features of Level 1. Level 3 is conditional automation: the driver is still needed to monitor the environment and step in if necessary, but the vehicle can drive itself when certain environmental conditions (clearly marked lanes, good visibility, etc.) are satisfied.
At Level 4, high automation, the driver is not required to be ready to assume control as in the previous case, although the performance of the ADS may still degrade depending on environmental inputs. Level 5 is the zenith of ADS as envisioned now: completely automated driving irrespective of conditions. It’s as simple as getting into the car and letting your vehicle know where you want to go, and the car will drive you there with absolutely no other driver input.
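The ladder of SAE levels described above can be summarised as a small Python enum. This is a paraphrased sketch of the taxonomy, with names and the helper function invented here for illustration:

```python
from enum import IntEnum

# A compact summary of the SAE automation levels described above.
# Level names and descriptions are paraphrased, not official wording.

class SAELevel(IntEnum):
    DRIVER_ASSISTANCE = 1       # e.g. adaptive cruise control or lane centering
    PARTIAL_AUTOMATION = 2      # steering and speed together; driver supervises
    CONDITIONAL_AUTOMATION = 3  # self-drives in limited conditions; driver steps in if needed
    HIGH_AUTOMATION = 4         # no takeover needed, but performance may degrade by environment
    FULL_AUTOMATION = 5         # drives anywhere, with no driver input at all


def driver_must_monitor(level: SAELevel) -> bool:
    """Does a human still need to watch the road at this level?"""
    return level <= SAELevel.CONDITIONAL_AUTOMATION


print(driver_must_monitor(SAELevel.CONDITIONAL_AUTOMATION))  # True
print(driver_must_monitor(SAELevel.HIGH_AUTOMATION))         # False
```

The dividing line the article draws, and that the helper encodes, falls between Levels 3 and 4: up to Level 3 a human is part of the safety loop, and from Level 4 the system is expected to handle its operating domain alone.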
While Level 5 seems a bit futuristic now, it may not be that distant either. Companies like Tesla already ship features that can be considered to fall in the Level 3 bracket. Some of the big players have announced Level 4 cars for the early 2020s; Baidu and Volvo, for example, are teaming up to produce Level 4 cars by 2021.
There are also observers who are more cautious in their predictions. Steve Wozniak, co-founder of Apple, has said that it will take a long time for self-driving cars to be good enough to be deployed at scale.
Some believe it can never be implemented everywhere, and as an Indian who has had to navigate the pothole-ridden, crammed, undisciplined roads of Cochin and Mumbai, I believe them. Jaywalking is the norm here, and lane discipline is often a fantasy. ADS algorithms programmed to maintain a certain safe distance from the vehicles in front and behind will go into shell shock on busy Indian roads.
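The kind of fixed safe-distance rule mentioned above can be sketched as a simple time-headway check. This is a common textbook rule of thumb, not any vendor’s actual controller, and the function names and thresholds are assumptions made for illustration:

```python
# Toy time-headway rule: keep at least `headway_s` seconds of gap to the
# vehicle in front. On a dense, undisciplined road the required gap is
# violated almost constantly, so such a controller would brake nonstop.

def safe_gap_m(speed_mps: float, headway_s: float = 2.0, min_gap_m: float = 2.0) -> float:
    """Minimum gap (in metres) the rule tries to keep at a given speed."""
    return max(min_gap_m, speed_mps * headway_s)


def should_brake(gap_m: float, speed_mps: float) -> bool:
    """Brake whenever the actual gap is smaller than the required one."""
    return gap_m < safe_gap_m(speed_mps)


# At 50 km/h (~13.9 m/s) the two-second rule wants ~28 m of clear road
# ahead -- a luxury that rarely exists in bumper-to-bumper city traffic.
print(safe_gap_m(13.9))         # 27.8
print(should_brake(5.0, 13.9))  # True
```

At a gap of five metres in moving city traffic, a gap that is entirely normal on an Indian arterial road, the rule demands braking, which is exactly the “shell shock” behaviour described above.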
How these unique challenges will be overcome is something for the creators of these technologies to ponder. We can’t expect a system to take over a function when the environment it operates in is chaotic and largely without rules.
Self-driving cars might be an assured future; all major technology and automotive players have thrown their weight behind the idea. But how far away that future is, is a different question altogether.
Visit our website to learn more about Artificial Intelligence and understand the implications of AI in various technological advancements.