Self-Driving Cars Should Not Excite You Just Yet


Self-driving cars are a reality. Companies such as Tesla and Google have been at the forefront pushing this technology, and we already have road-legal, consumer-ready autonomous cars, such as the Tesla Model 3 and Tesla Model X. The technology in these cars is so good that they can drive themselves on the highway, change lanes, read traffic signs and obey them (something many human drivers could stand to learn) and even park themselves.

The self-driving revolution is not yet at its peak, however; we're actually just starting the climb. Tesla cars still come with steering wheels, and the company insists that drivers keep their hands on the wheel at all times, even when the car is in self-driving mode. Uber has also been testing self-driving taxis, and they are already in use in some parts of the world.

As exciting as it would be to have a car that can drive itself, I think we as human beings are not yet ready for it. Would you trust a taxi that doesn't have a driver? As you think about that, here is my argument for why self-driving cars should not excite you just yet.

They foster laziness – as human beings we are already too lazy. We have the TV remote, escalators, elevators and smartphones, among other machines that already take care of most of the manual labour our forefathers did. Now imagine not even driving — a task that isn't all that hard in the first place. Instead of putting effort into doing something well, we are now expecting it all to be done for us.

It is tech, and tech fails – just the other day I was in a banking hall. These days, you take a ticket number for the kind of service you need and wait to be served in turn (your ticket number is called out and you are directed to a teller). Everything was going well until the ticketing system failed. It simply stopped calling out the next ticket numbers and hung.

So what do we do now? We weren't seated in any order, since we had relied on the ticket numbers; we didn't even know which teller to go to, for the same reason. Now imagine something similar happening to your car. You're cruising on the highway and suddenly the system "hangs". You have no steering wheel, so you cannot take over. You are entirely at the mercy of the car.

Self-driving cars have already been responsible for at least one death. In May 2016, a Tesla driver in Florida put his car in self-driving mode and started watching a Harry Potter movie. A tractor-trailer then turned across the road ahead of him. Unfortunately, the car could not distinguish the large white side of the 18-wheel truck from the clear blue sky behind it, and bam. That is how the owner of the Tesla lost his life.

Autonomous cars also rely on GPS to navigate. How many times have people driven into dead ends because they were following a GPS? Probably too many to count. There are simply too many variables that could go wrong.

Security threat – a self-driving car is one more thing in your life that can be hacked, since the cars will all need to be connected to a network to communicate with each other effectively (I can't imagine what happens if they decide to go on strike). Think tanks are already worried about what could happen if hackers took over cars while people were inside them. The best-case scenario is that the hackers take control safely but demand money to give it back (a ransomware attack). What if they managed to take over the system and bring entire cities, or even countries, to a halt?

Reports have also warned that terrorists could exploit this technology by filling cars with explosives and sending them to a destination with no one inside. Studies have likewise examined how criminals could become more dangerous once they can shoot at police during car chases without having to watch the road.

Lack of human instincts – yes, we could argue that human beings cause many accidents, but let us not forget the many times our instincts have saved us from trouble. Autonomous cars are programmed to follow the law absolutely, and that rigidity can be dangerous. What happens when the traffic lights (on which these cars heavily rely) malfunction, or when a police officer is directing traffic? Self-driving cars cannot understand human signals.

I am pro-technology, but I also think we should draw a line. I love driving; it gives me control and power over the car. There are a lot of "mad" drivers on the road, but self-driving cars are not the answer at the moment. Remember, these cars rely on AI to learn, so what if some cars decide to be "mad" as well?

For now, I will settle for assisted driving at most. The car can park itself, monitor blind spots and assist with braking. I love my cars with steering wheels, and 92 percent of other drivers agree with me.