A Tesla car has crashed into a parked police car in California.
The driver suffered minor injuries and told police he was using the car’s driver-assisting Autopilot mode.
The crash has similarities to other incidents, including a fatal crash in Florida where the driver’s “over-reliance on vehicle automation” was determined to be a probable cause.
Tesla has said customers are reminded they must “maintain control of the vehicle at all times”.
In a statement, it added: “When using Autopilot, drivers are continuously reminded of their responsibility to keep their hands on the wheel.”
The California crash is the latest example of a semi-autonomous vehicle struggling to detect a stationary object. A Tesla driving in Autopilot hit a stationary fire engine in Utah in May.
According to a police report obtained by the Associated Press, the Tesla accelerated before it hit the vehicle.
In Greece, a Tesla Model 3 crashed after Autopilot caused the car to suddenly veer right “without warning”.
The driver of the crashed Model 3, You You Xue, voiced his concerns about Autopilot on Facebook.
He wrote: “The vigilance required to use the software, such as keeping both hands on the wheel and constantly monitoring the system for malfunctions or abnormal behaviour, arguably requires significantly more attention than just driving the vehicle normally.”
Benedict Evans, partner at venture capital firm Andreessen Horowitz, tweeted: “There is a serious argument that the incremental, ‘level 2/3’ approach to autonomous cars followed by Tesla, where the human isn’t driving but might have to grab the wheel at any time, is actively dangerous and a technical dead end. Waymo decided not to do this at all.”
It is not the first time the Autopilot feature has led to dangerous driving.
In England, a driver was banned from driving after putting his Tesla in Autopilot on the M1 and sitting in the passenger seat.
The news comes after two US rights groups urged the Federal Trade Commission to investigate Tesla over its marketing of the assisted driving software.
The Center for Auto Safety and Consumer Watchdog said it was “reasonable” for Tesla owners to believe that their cars could drive themselves on Autopilot.
They called the naming of Autopilot “deceptive and misleading”.
The chief executive of Tesla, Elon Musk, has previously complained about media attention on Tesla crashes. He tweeted: “It’s super messed up that a Tesla crash resulting in a broken ankle is front page news and the ~40,000 people who died in US auto accidents alone in past year get almost no coverage.”
His comments received support from prominent academic and psychologist Steven Pinker, who has in the past voiced concerns about Tesla’s Autopilot.