Crash courses in how not to use Tesla's Autopilot feature

Denham Sadler

Self-driving cars are being held back by a single complicating factor: human stupidity.

All of the accidents involving Google’s autonomous cars have been the result of human error and a general reluctance to trust a robot. Now it’s Tesla’s turn to find out just how stupid human drivers can be.

Included in Tesla’s latest software update was an ‘Autopilot’ feature, which the company stressed is not the same as a self-driving car.

“While truly driverless cars are still a few years away, Tesla Autopilot functions like the systems that airplane pilots use when conditions are clear. The driver is still responsible for, and ultimately in control of, the car,” the statement said.

“Today’s Autopilot features are designed to provide a hands-on experience to give drivers more confidence behind the wheel, increase their safety on the road, and make highway driving more enjoyable.

“Drivers can’t abdicate responsibility; we expect the driver to be present and prepared to take over at any time.”

But it seems this newfound confidence has gone straight to the heads of many Tesla owners, with multiple people taking to the internet to describe how this new feature apparently tried to kill them.

One video shows a car in Autopilot mode swerving into oncoming traffic, another shows the car taking an unwanted detour down an off-ramp, and in a third the feature even earns its driver a speeding fine.

“After several seconds of successful hands-free driving, all hell broke loose,” the description of one of the videos says.

As The Next Web reports, the drivers are abusing the new feature, which is explicitly meant to be a “driving aid” rather than a “self-driving machine”.

Drivers are still meant to keep their hands on the steering wheel, with an alarm sounding if their hands have been off it for too long, but in many of the videos posted the drivers are too busy filming what’s happening to notice the warning.

Autopilot is also only meant to be used on freeways, where it uses a combination of radar and camera sensors to track the cars around it and follow them.

While this feature does involve the car taking on the lion’s share of the driving, it’s definitely not at the level where drivers can zone out and complete other tasks, such as filming themselves.

Elon Musk says we can expect that sort of fully autonomous vehicle by 2020.

“It will get more and more refined,” he tells Wired.

“Eventually, we want it to automatically have your car put itself to bed in your garage.”

But this new feature is another important stepping stone towards a fully autonomous commercial vehicle, as long as people learn to use it correctly.

“I think this is going to be quite a profound experience for people,” Musk says.

“I think it’s going to change people’s perceptions of the future, quite rightly.”
