Another Autopilot Crash Highlights the Limitations of Driver-Assist Systems

19:56, 16 May 2018 | Source: consumerreports.org

Tesla Admits Autopilot Was Active In Fatal Model X Crash

The driver, though using Autopilot, had visual and audible warnings prior to the crash that claimed his life. Last week, a Tesla Model X driver was killed when his vehicle struck a concrete highway divider whose crash attenuator had already been crushed in a prior incident. The impact was severe enough to rupture the vehicle's lithium-ion batteries, which caught fire once free of their casings. Many critics were quick to blame Autopilot, and Tesla responded by opening an investigation into the accident, including a scan of the wrecked Model X's black box.

The Autopilot system was not designed for fully automated driving and had no way of "seeing" the oncoming crash due to limitations in its sensor setup. Nor was the Tesla adequately engaging the driver with warnings about his inattention to the road or the task of driving.

The "operational limitations " of Tesla's Autopilot system played a "major role" in a 2016 crash , the National Transportation Safety Board said on Tuesday. The accident was the first known fatal crash involving a car using an automated driver assistance system , NTSB chairman Robert Sumwalt said.


Consumer Reports has no financial relationship with advertisers on this site.

As more vehicles debut driver-assistance features that automatically accelerate, brake, and steer, another crash has drawn attention to how these new technologies work—and in what situations they may fail.

Drivers should not view driver-assistance systems as “self-driving,” and automakers should take care not to brand them as such, according to Consumer Reports experts.

“It’s important that drivers realize the limitations of the new technology and not be overconfident,” says Kelly Funkhouser, program manager for vehicle interface at CR.

Tesla's new statement on fatal crash doubles down on driver error

This is similar to Tesla's response to other crashes involving its semi-autonomous driver aids. After the family of Walter Huang went on a California ABC affiliate to discuss the Tesla Model X crash that took his life, the company sent out a statement. As with past statements, Tesla touted the benefits of Autopilot while casting the blame on Huang for allegedly missing warnings and not paying attention while operating the system.

The crash of a Tesla while it was using Autopilot hints at the geographic limitations of autonomous vehicles. But a recent crash of a Tesla raises another possibility. What if the success of autonomous vehicles will rely not just on the power of their sensors and computer systems, but also on the…

The May 7, 2016, death of Ohio technology company owner Joshua Brown in a Tesla Motors Inc. (TSLA) Model S while the car's semi-automated Autopilot system was engaged highlighted the limitations of current automated driving systems.

The latest crash took place earlier this week in Utah, when a Tesla Model S driver who said she was using Tesla’s Autopilot crashed into the rear of a fire truck that was stopped for a red light.

Autopilot is a suite of features that can help steer and brake, and is not meant to be a replacement for an attentive driver.

According to a statement from the South Jordan, Utah, Police Department, the driver “admitted that she was looking at her phone prior to the collision,” and witnesses said she “did not brake or take any action to avoid the collision.” The Model S was traveling at 60 mph before the crash, police said.

The driver of the Model S, identified only as a 28-year-old woman from Lehi, Utah, had minor injuries. The driver of the fire truck was not injured.

NHTSA releases quick shopping guide for driver-assist systems

It's not perfect, but it's necessary, since not everyone knows what these systems do. The National Highway Traffic Safety Administration (NHTSA) has released a pocket shopper's guide for driver-assistance tech. It's short and straightforward: after a quick definition of driver-assist tech, it gives brief explanations of most systems and the benefits they confer.

Design limitations of the Tesla Model S's Autopilot played a major role in the first known fatal crash of a highway vehicle operating under automated control, the NTSB found. Federal guidelines encourage companies to put in place broad safety goals, such as making sure drivers are paying attention while using advanced assist systems.

In one example after another, it's clear too many people don't get, or ignore, the limitations of these systems. Automakers anticipated such problems and have tried to respond with features that keep drivers engaged. There's Tesla's Autopilot, Cadillac's Super Cruise, Audi's Traffic Jam Pilot, and Nissan's ProPilot Assist.

“This is a problem that’s not unique to Tesla,” Funkhouser says. “Many drivers may be surprised to find out that these systems aren’t designed to completely stop from 60 mph when facing a stationary or stopped car.” Indeed, the owner’s manuals for driver-assist systems such as Cadillac Super Cruise, Tesla Autopilot, and Volvo Pilot Assist II all state that the systems may not be able to avoid stationary vehicles or obstacles.

According to Funkhouser, driver monitoring may be able to reduce distraction while driver-assistance technology is in use.

“Monitoring driver behavior through eye tracking or other biometrics may be a partial solution to the issue,” she says.

That’s what Cadillac’s Super Cruise system does. Like Autopilot, it can steer and brake. But unlike Autopilot, Super Cruise uses sensors to track where the driver’s eyes are looking, and will give audible and visual alerts if it detects that the driver isn't paying attention.

Tesla Autopilot to get ‘full self-driving features’ in August

Tesla's Autopilot driver-assistance system will get full self-driving features following a software upgrade in August. Autopilot, a form of advanced cruise control, handles some driving tasks and warns those behind the wheel that they are always responsible for the vehicle's safe operation. But a spate of recent crashes has brought the system under regulatory scrutiny. "To date, Autopilot resources have rightly focused entirely on safety. With V9, we will begin to enable full self-driving features," Musk tweeted on Sunday, replying to a Twitter user.

The driver apparently told the fire department the car was in Autopilot mode at the time. The crash highlighted the shortcomings of the increasingly common semi-autonomous systems that let cars drive themselves in limited situations. Volvo's semi-autonomous system, Pilot Assist, has the same shortcoming.

On Twitter on Monday, Tesla CEO Elon Musk responded to a Wall Street Journal report that claimed Tesla chose not to include eye-tracking technology on its vehicles because it cost too much and would not be effective. “Eyetracking rejected for being ineffective, not for cost,” Musk tweeted.

Tesla’s Autopilot displays audio and visual warnings if it does not detect a driver’s hands on the wheel, and it will shut the system off after too many warnings. While the Model S owner’s manual says drivers should not use Autopilot on city streets, it does not prohibit them from doing so. By comparison, Cadillac’s Super Cruise uses map data to determine what kind of road the car is on and will work only on a divided, limited-access highway.

“These cars can’t drive themselves, but it’s too easy for consumers to think they can,” says William Wallace, senior policy analyst for Consumers Union, the advocacy division of Consumer Reports. “Drivers need to pay attention, and companies need to take responsibility for the fact that greater automation opens the door to dangerous, and foreseeable, distractions. From clearer names and sensible system limitations to effective driver monitoring, there’s a lot more automakers can and must do.”

Tesla Autopilot Update Warns Drivers Sooner to Keep Hands on Wheel

Tesla delivered an over-the-air update to its Autopilot driver-assist system over the weekend that warns drivers sooner, with visual and audible alerts, to put their hands back on the steering wheel.

The Tesla’s driver said he had the vehicle’s Autopilot driver-assist system engaged when it struck the firetruck. The firetruck was parked in an emergency lane at the side of the highway, attending to another accident. The NTSB has previously said Tesla’s Autopilot system was a contributing factor in a 2016 fatal crash in Florida.

Another Tesla crash, another round of finger-pointing. This time, a Tesla owner in Los Angeles drove into the back of a parked fire truck on a freeway this past Monday while the car’s Autopilot semi-autonomous drive system was, according to the driver, engaged.

There have been two reported fatal crashes when the driver of a Tesla vehicle was using Autopilot. Joshua Brown died in a May 2016 crash in Florida with the Autopilot activated in his 2015 Model S, which led the automaker to release software updates designed to keep drivers more engaged. In March, Wei Huang was using Autopilot when he was killed in a single-vehicle crash into a concrete barrier in California.

Consumer Reports is an independent, nonprofit organization that works side by side with consumers to create a fairer, safer, and healthier world. CR does not endorse products or services, and does not accept advertising. Copyright © 2018, Consumer Reports, Inc.

Two Teslas Perform the Worst in IIHS Automatic Braking Test

However, the Model S and Model 3 performed better than the competition with Autopilot turned on. In this particular test, the five cars (also including the 2017 BMW 5-Series, the 2017 Mercedes-Benz E-Class, and the 2018 Volvo S90) were driven toward a stationary object at 31 mph with their advanced driver-assistance systems, such as Tesla's Autopilot, turned off and automatic braking turned on. While the Teslas slowed down before hitting the object, they did not stop; the other cars did.
