July 21, 2016 — By now just about everybody knows about the Tesla incident that killed a driver. But it is more of a wake-up call than much of the non-technical media realizes.
Frankly, I have to slap Tesla on the wrist for that one. Yes, the driver was an idiot for thinking that self-driving cars have arrived, but Tesla should have known better. The company should not allow fully hands-off driving.
The cars may be closing in on full autonomy, but the infrastructure that has to support that is not in place. Hands-off parking is one thing. Flying down the highway at 75 mph while watching a movie (or sleeping through one) is just plain stupid. Without integration into the transportation infrastructure, it is suicide. And people who push the envelope are going to find that out. This will not be the only such incident if the edge isn’t rolled back a bit.
There is not going to be a fully functional self-driving ecosystem until the transportation arterials are plugged in. Translated, that means a ubiquitous smart transportation infrastructure, of which the wireless (and non-wireless) components are still a long way from being in place. Autonomous cars will require something called “deep learning.” Deep learning is an artificial-intelligence technique that uses neural networks, which are able to “learn as they go.” In practical terms, that means the system analyzes inputs and uses them, collectively, to “learn” about its environment.
For example, when the vehicle approaches an intersection, the software receives inputs describing the situation (how many cars are present and which direction they are headed, what the stop sign or light situation is, if applicable, etc.) and then makes the decision best suited to that information. If a decision turns out to be incorrect, the logic is updated so it performs better the next time a similar situation arises. As the logic improves, so does the intelligence. Eventually, there are millions of ways to process information through the network.
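The “learn as it goes” loop above can be sketched in miniature. This is a toy illustration, not any vendor’s actual system: a single perceptron (the simplest neural-network building block) learns a stop/go rule for an intersection, adjusting its weights only when a decision turns out to be wrong. The input features and the training data are invented for the example.

```python
# Toy sketch of "learn as you go": a perceptron adjusts its weights
# whenever its stop/go decision at an intersection turns out to be wrong.
# Features and data are invented for illustration only.

def perceptron_train(samples, epochs=20, lr=0.1):
    """samples: list of ((light_is_red, cross_traffic), should_stop)."""
    w = [0.0, 0.0]   # one weight per input feature
    b = 0.0          # bias term
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            err = target - pred  # nonzero only when the decision was wrong
            # The "logic is updated" step: nudge weights toward the
            # correct answer, only on mistakes.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

def decide(w, b, light_is_red, cross_traffic):
    """Return 'stop' or 'go' for the given situation."""
    score = w[0] * light_is_red + w[1] * cross_traffic + b
    return "stop" if score > 0 else "go"

# Invented data: stop on a red light or cross traffic, otherwise go.
data = [((1, 0), 1), ((0, 1), 1), ((1, 1), 1), ((0, 0), 0)]
w, b = perceptron_train(data)
print(decide(w, b, 1, 0))  # red light, no cross traffic -> "stop"
print(decide(w, b, 0, 0))  # green light, clear intersection -> "go"
```

A real system would use deep networks with millions of weights and far richer sensor inputs, but the principle is the same: wrong outcomes feed back into the model, and each correction improves future decisions.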
However, that puts tremendous stress on vehicle-only AI. If the intersection and streets are smart, the input from that segment, coupled with the inputs from vehicle sensors, presents a much more complete picture – complete enough that true hands-off capability becomes realistic.
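The value of smart infrastructure here can be sketched as a simple fusion step: the intersection’s broadcast fills in what the vehicle’s own sensors cannot see. The lane names and readings below are invented for illustration; real vehicle-to-infrastructure message formats (e.g., the SAE J2735 message set) are far richer.

```python
# Hedged sketch: a smart intersection's broadcast fills the gaps in a
# vehicle's own (partially occluded) sensor view. Field names are
# invented for illustration.

def fuse_views(vehicle_view, infrastructure_view):
    """Prefer the vehicle's own reading for each approach; fall back to
    the smart intersection's broadcast where the vehicle sees nothing."""
    fused = {}
    for lane in set(vehicle_view) | set(infrastructure_view):
        reading = vehicle_view.get(lane)
        if reading is None:  # occluded or out of sensor range
            reading = infrastructure_view.get(lane)
        fused[lane] = reading
    return fused

# The vehicle's camera sees two approaches; a building occludes the east.
vehicle_view = {"north": "clear", "south": "car_approaching", "east": None}
# The smart intersection sees all four approaches.
infrastructure_view = {"north": "clear", "south": "car_approaching",
                       "east": "cyclist", "west": "clear"}

print(fuse_views(vehicle_view, infrastructure_view))
```

The fused picture includes the cyclist the vehicle could not see – exactly the kind of completeness the argument above says hands-off driving would require.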
And that will also be critical for the “intuitive” type of decision – the kind a self-driving AI faces when a situation places multiple parties in harm’s way: does it direct the vehicle into the people in the crosswalk, or run the car over a cliff, killing the passenger(s)? We aren’t even close to that yet.
Therefore, Tesla is making a grave mistake by allowing the vehicle to run completely hands-off in life-and-death scenarios such as highway driving.
I am all in for autonomous vehicles. But not until all of the ecosystem components are in place.