Tesla Autopilot and similar automated driving systems get ‘poor’ rating from prominent safety group

The Insurance Institute for Highway Safety, which rates cars and SUVs for safety, examined so-called advanced driver assistance systems such as Tesla Autopilot and found them wanting.

These systems combine different sensors and technologies to help a driver keep their vehicle in its lane and avoid hitting other vehicles in front and to the sides. Usually, these systems work only on highways. Some even allow drivers to take their hands off the steering wheel, but all require drivers to pay attention to the road and the vehicles around them at all times.

Of the 14 systems tested by the group, 11 earned a “poor” rating, including Tesla’s Autopilot and so-called Full Self Driving systems. (Full Self Driving is not actually fully self-driving but, unlike Autopilot and almost all other such systems, it is designed to work on city and suburban streets.)

The organization also rated hands-free highway driving systems from Ford and Nissan as “poor.” General Motors’ hands-free system, Super Cruise, was rated as “marginal.” Only Lexus’s Teammate with Advanced Drive system received a rating of “acceptable.” Even that rating, though, is still one step below the Insurance Institute’s highest possible rating of “good.”

“Some drivers may feel that partial automation makes long drives easier, but there is little evidence it makes driving safer,” Insurance Institute president David Harkey said in a statement. “As many high-profile crashes have illustrated, it can introduce new risks when systems lack appropriate safeguards.”

The federal government’s National Highway Traffic Safety Administration doesn’t currently regulate these sorts of systems. That’s one reason the IIHS instituted these ratings, Harkey said in an interview with CNN.

“We felt like this is an appropriate time for us to step into the space, fill this regulatory gap we’re talking about, help drive automakers to produce safer implementations within their vehicles and also help consumers understand what these systems actually are and what the differences are between some of these systems,” Harkey said.

Many of the systems received demerits for not doing enough to make sure drivers stayed attentive and undistracted as the car steered, braked and accelerated on its own. The Insurance Institute said none of the 14 systems it tested does a good job of monitoring driver attention.

In vehicles where an interior camera monitors driver attention, the Institute tested what happened if that camera was deliberately blocked or if the driver turned their gaze away from the road for too long. Some systems also – or only – monitor whether the driver is holding the steering wheel, so testers looked at what happened if a driver let go of the steering wheel for too long.

With systems such as GM’s Super Cruise and Nissan ProPilot Assist that allow drivers to remove their hands from the steering wheel for long periods, testers also held a block resembling a cell phone. This tested whether the system could detect that the driver’s hands were not ready to grab the steering wheel in an emergency.

“We are evaluating the results from the first-ever Partial Automation Safeguards test and will continue to work with IIHS in all matters related to customer safety,” Nissan said in a statement emailed to CNN.

The Institute also checked how these systems alerted an inattentive driver to pay attention. Seven of the systems, it found, didn’t provide two-method alerts – such as a blinking light and a sound or steering wheel vibrations and lights – within 15 seconds of the driver becoming inattentive.

Insurance Institute test drivers also looked at what would happen if the driver became incapacitated while the automated driving system was in use. This is a real concern because the car could keep driving at high speed with no one actually in control. If a driver remains inattentive for as long as 35 seconds after being warned to pay attention, the vehicle should begin an emergency slowdown and also contact emergency services on its own, according to the Insurance Institute. Of the systems tested, only GM’s Super Cruise handled that sort of situation properly, according to the Institute.

Some automakers actually market automated driving features the Institute considers unsafe. For instance, the Tesla and GM systems can perform lane changes entirely on their own without asking whether the driver wants to change lanes. They can be set to do this if another lane seems faster than the one the vehicle is currently in. That’s not safe, Harkey said, because it further invites drivers to disengage from driving. Drivers should at least be prompted to allow a lane change even if the car can handle the maneuver on its own, he said.

“As new vehicles increasingly come equipped with more advanced driver assistance systems (ADAS), efforts like the partial automation ratings report by the Insurance Institute for Highway Safety (IIHS) are important for more robust and unified safeguards,” GM said in a statement emailed to CNN. “We are judicious about safely expanding access to our industry-first hands-free system, Super Cruise, which is meant to serve as an enhancement to the driving experience.”

Because these driver assistance systems are not safety features, it shouldn’t be possible to use them when actual safety features are disabled or not in use, the Insurance Institute said. For instance, if seatbelts are unbuckled or if automatic emergency braking is turned off, the systems should be disabled. Only a few of the systems met those criteria, the group said.

In a statement, Ford noted that its vehicles have a system that reminds drivers to buckle their seatbelts and that, according to its own data, vehicles using BlueCruise are 10 times less likely to swerve out of their lane.

“We have been working closely with IIHS since BlueCruise was introduced in 2021,” Ford said in an email to CNN. “While we do not agree with IIHS’s findings, we will take their feedback into consideration as we continue to evaluate future updates.”

The Tesla Autopilot system was tested by the Insurance Institute before a December 2023 recall in which a software update improved driver attention alerts. The Insurance Institute will continue to monitor software updates and improvements and will periodically retest the systems, Harkey said.
