This is the point that Zach Aysan, a Canadian data scientist and cyber security expert, wants to drive home in a long-form piece titled, "Self-Crashing Cars." To him, while the prospect of self-driving cars can be the source of inspiration and a sign of great things to come, it's important to be mindful of certain negative consequences that could befall society as a whole as the technology behind them finally hits the mainstream.
To understand his overarching point, it helps to first pin down exactly what self-driving cars are. It has even been argued that self-driving cars, or automated cars, shouldn't be called cars at all – they should simply be called automobiles, or autos for short. The name is simple, direct, and underlines the fact that they aren't, and shouldn't be treated as, ordinary cars.
The fundamental difference is that self-driving cars don't need a driver – they drive themselves. No human has to occupy the conventional driver's seat, or even be inside the vehicle, for it to function. Carrying a commuter or other passengers is almost incidental: a self-driving car operates the same way whether or not anyone is aboard.
And how is this possible? Largely because self-driving cars are, in effect, highly advanced, internet-connected, GPS-equipped computers on four wheels. That, according to Aysan, is the crux of the problem: computers-as-cars are just as vulnerable to hackers as most other kinds of computers.
"It only takes a single entry point incorrectly secured to allow inadvertent public access," said Aysan. "Defending all entry points and permanently keeping them defended, despite changing organizational requirements, personnel, and a never ending stream of vulnerability updates to software libraries; is nigh on impossible." In his view, this is likely to become a major source of problems for self-driving cars.
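Aysan's point about entry points can be made concrete with a toy probability model (this illustration is ours, not from his essay): if each of N independently maintained entry points has even a small chance p of being misconfigured at any given time, the chance that at least one is exposed is 1 − (1 − p)^N, which climbs quickly as the system grows.

```python
# Toy model (illustrative only, not from Aysan's article): probability
# that at least one of n entry points is misconfigured, assuming each
# has an independent chance p of being misconfigured at a given moment.
def exposure_probability(n_entry_points: int, p_misconfigured: float) -> float:
    return 1.0 - (1.0 - p_misconfigured) ** n_entry_points

# Even a 1% per-endpoint failure rate compounds quickly as endpoints multiply.
for n in (1, 10, 50, 100):
    print(n, round(exposure_probability(n, 0.01), 3))
```

With a hypothetical 1% per-endpoint failure rate, exposure grows from about 1% at one entry point to over 60% at a hundred – a rough sense of why "defending all entry points" permanently is so hard.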
If a group with nefarious interests somehow manages to access and remotely control just one self-driving vehicle, it could easily and effectively turn it into a directed bomb, warns Aysan. "A fully charged Tesla traveling over 200 kilometers (125 miles) per hour that crashes into a chemical plant, electrical subsystem, oil line, or gas station will have an impact worse than bombs that terrorists set off in the Middle East," he said.
Now imagine if they could hack into and control a fleet of a hundred. Or a thousand.
Fortunately, there is a silver lining. Aysan says the industry's problem is largely confusion rather than any real fear or uncertainty. In other words, the door is still wide open for preventive measures that could protect against the dangers posed by hackers who begin to target autonomous vehicles.