Tesla semi-autonomous car crash in California raises safety questions

Posted by Sharon Bowles on Feb 25, 2018 10:07:40 PM


Federal agencies are investigating a recent crash of a Tesla Model S electric car in California, raising concerns over the safety of semi-autonomous and self-driving cars.

The Tesla Model S was traveling at 65 mph on Interstate 405 when it crashed into a fire truck parked on the side of the highway; the fire crew had stopped there to respond to an earlier accident.

The Tesla driver told the California Highway Patrol that he had activated Autopilot before the crash. According to a statement released by Tesla, the driver, whom the company has identified, said he was traveling behind a pickup truck with Autopilot engaged. Because of the pickup's size, the Tesla driver could not see what was ahead of it. One report states that the pickup swerved into the right lane after suddenly coming upon the parked fire truck, leaving the Tesla driver too little time to react and avoid the collision.

The National Transportation Safety Board (NTSB) and the National Highway Traffic Safety Administration (NHTSA) are investigating the performance of Autopilot, which keeps a semi-autonomous vehicle centered in its lane at a set distance from any vehicle ahead and can also change lanes and brake automatically.

The crash raises many questions about semi-autonomous vehicles. Did the Autopilot sensors fail to detect the fire truck ahead? Why was the Autopilot system unable to react in time at 65 mph? Why didn't the automatic braking system engage? Was the Tesla following the pickup too closely? Did the driver fail to adequately monitor the vehicle's performance?

The Model S Autopilot is a Level 2 system on a self-driving scale of 0 to 5. Level 5 vehicles can operate autonomously in almost any scenario. Level 2 systems are usually limited to use on interstate highways, and their drivers are supposed to continuously monitor the vehicle's performance and be ready to take control if necessary.

Previous accident involving Tesla Model S Autopilot

The NTSB determined that a fatal May 2016 crash in Florida involving a similar Tesla was caused by the Tesla driver's overreliance on the technology and by a truck driver who made a left turn in front of him; however, the board also found that the design limitations of Autopilot played a major role. It determined that the sedan's cameras and radar were not capable of detecting a vehicle turning across its path.

The NTSB re-issued previous recommendations that the government require all new cars and trucks to be equipped with advanced technology that wirelessly transmits the vehicles’ location, speed, heading and other information to other vehicles in order to prevent collisions.

NHTSA’s biggest warning to automakers is to not treat semi-autonomous cars as if they were fully self-driving.

Fortunately, the Tesla driver escaped with only minor cuts and bruises, and Tesla and other automakers are using this accident to take a closer look at how they can improve the safety of future semi-autonomous and self-driving vehicles.

Would you feel safe sharing the road with a self-driving or semi-autonomous vehicle? Would you feel safe driving one? I, for one, hope that auto manufacturers leave no stone unturned when it comes to safety in their rush to dominate the market.
