Consumer Reports says Tesla's 'Full Self-Driving' software lacks safeguards

By Arghyadeep on Jul 21, 2021 | 03:31 AM IST

Tesla Inc’s “Full Self-Driving” (FSD) software lacks safeguards, Consumer Reports (CR) research revealed on Tuesday, raising concerns about the safety of others on the road, citing reports from drivers.

The influential nonprofit consumer publication cited videos posted on social media by several drivers using FSD and raised concerns about issues including “vehicles missing turns, scraping against bushes, and heading toward parked cars.”

CR said it would independently test the prototype software known as FSD Beta 9 as soon as its Model Y receives the update.

“Videos of FSD Beta 9 in action don’t show a system that makes driving safer or even less stressful,” Jake Fisher, senior director of Consumer Reports’ Auto Test Center, said.

“Consumers are simply paying to be test engineers for developing technology without adequate safety protection,” said Fisher.

Bryan Reimer, a professor at MIT and founder of the Advanced Vehicle Technology (AVT) consortium, told CR that “while drivers may have some awareness of the increased risk that they are assuming, other road users—drivers, pedestrians, cyclists, etc.—are unaware that they are in the presence of a test vehicle and have not consented to take on this risk.”

Even Tesla CEO Elon Musk urged drivers to use caution with FSD Beta 9, tweeting, “there will be unknown issues, so please be paranoid.”

Most of the experts CR interviewed have called for Tesla to incorporate robust driver monitoring systems that work in real time and ensure that the person behind the wheel is ready to take control as soon as the car cannot handle a driving task.

Since May, Tesla has been using the camera above the rear-view mirror in the Model 3 and Model Y for driver monitoring.

Although some of the new Tesla vehicles come with real-time driver monitoring cameras enabled, CR experts have raised questions about their effectiveness.

In April, Consumer Reports said its engineers were able to defeat safeguards on Tesla’s Autopilot and get out of the driver’s seat while the system was engaged.

“Car technology is advancing really quickly, and automation has a lot of potential, but policymakers need to step up to get strong, sensible safety rules in place,” says William Wallace, manager of safety policy at CR. “Otherwise, some companies will just treat our public roads as if they were private proving grounds, with little holding them accountable for safety.”

Last month, the National Highway Traffic Safety Administration (NHTSA) said it had opened 30 investigations into Tesla crashes involving 10 deaths since 2016 in which the vehicles are suspected to have been using advanced driver assistance systems.

In at least three of those cases, Autopilot was engaged, according to the National Transportation Safety Board (NTSB).

The NTSB has criticized Tesla’s lack of system safeguards for Autopilot, which allows drivers to keep their hands off the wheel for extended periods.

Tesla said last week that eligible owners can subscribe to FSD for $99 or $199 a month. The company says FSD’s “currently enabled features do not make the vehicle autonomous. The currently enabled features require a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment.”

Jason Levine, executive director of the Center for Auto Safety, said that unless Tesla changes course, its approach to automation may backfire, noting that automation can save lives, but not if its limitations aren’t made clear to drivers.

Picture Credit: Teslarati
