Washington, DC CNN —
Tesla’s Autopilot was involved in a third fatal motorcycle crash this summer, raising questions about the driver-assist system’s ability to operate safely.
The National Highway Traffic Safety Administration has already launched investigations into the first two crashes and gathered information on the third. More details of the latest crash surfaced Monday.
The three fatal crashes occurred within a 51-day span this summer and followed a similar pattern: a person driving a Tesla in the early morning hours with Autopilot active strikes a motorcycle.
The crashes renew questions about whether users of the systems are kept sufficiently engaged and prepared to fully control the vehicle when needed. Research has shown that drivers glance away from the road more frequently while using Autopilot, and that many Autopilot users believe their cars drive themselves.
Tesla’s Autopilot system keeps the vehicle in its lane while traveling at a set speed, and drivers are instructed to keep their hands on the steering wheel at all times. The automaker says the system detects torque on the wheel and uses a camera near the rear-view mirror to gauge driver inattentiveness, issuing alerts to remind drivers to keep their eyes on the road.
Ingrid Eva Noon was riding her motorcycle in Palm Beach County, Florida, at 2:11 a.m. on Aug. 26 when an impaired driver using Tesla’s Autopilot struck the rear of Noon’s motorcycle, throwing her onto the Tesla’s windshield and killing her, according to the Palm Beach County Sheriff’s Office. Driver-assist crash data that automakers like Tesla must report to NHTSA was published Monday and revealed that Autopilot was engaged.
Utah resident Landon Embry was killed while riding his Harley-Davidson on July 24 at approximately 1:09 a.m. when a Tesla driver using Autopilot collided with the back of his motorcycle.
A Utah motorcycle rider was killed in July after being struck by a Tesla driver using Autopilot. (Utah Department of Public Safety)
A Tesla driver using Autopilot struck a motorcycle lying in the road on July 7 at 4:47 a.m. in Riverside, California. The motorcyclist, who had been thrown from the bike after hitting a dividing wall, was killed, according to the California Highway Patrol. The Tesla struck only the motorcycle, not the rider, the agency said.
The recent crashes suggest Tesla’s system is inadequate, motorcycle advocates say.
Motorcycle safety advocates say they’re concerned that the software fails to see motorcycles and lulls Tesla drivers into a sense of complacency and inattentiveness. The advocates say that the government’s vehicle safety regulations do not adequately protect motorcycle riders and that steps should be taken to better protect them, including testing driver-assist systems like Autopilot for motorcycle detection.
“Motorcyclists have long been told by crash-causing inattentive drivers, ‘Sorry, I didn’t see you.’ Now we are hearing, ‘Sorry, my car didn’t see you.’ This is unacceptable,” said Rob Dingman, president and CEO of the American Motorcyclist Association.
“If it can’t see a motorcycle, can it see a pedestrian? Can it see a small child? Can it see an animal?” said Eric Stine, treasurer of the Utah chapter of ABATE, which advocates for motorcycle riders.
NHTSA said in a statement Monday that no commercially available vehicles today can drive themselves and encouraged drivers to use assistance technologies appropriately.
“Certain advanced driving assistance features can promote safety by helping drivers avoid crashes and mitigate the severity of crashes that occur, but as with all technologies and equipment on motor vehicles, drivers must use them correctly and responsibly,” NHTSA said.
This summer Tesla fans rushed to defend the automaker after a prominent critic released a video showing one of its cars with driver-assist technology plowing into child-size mannequins.
Tesla did not respond to a request for comment for this story.
Tesla is not alone in facing apparent challenges identifying objects at night.
The Insurance Institute for Highway Safety found that 19 of 23 vehicles tested for pedestrian detection earned a “superior” or “advanced” rating during the daytime, but only four received a “superior” rating at night. More than half earned a basic score or no credit in the nighttime test.
Visibility is a challenge for humans and machines alike at night, when there is less light reflecting off objects on the road. Tesla cautions in its vehicle owners’ manuals that many factors can impact Autopilot’s performance, including poor visibility.
“Never depend on these [Autopilot] components to keep you safe,” Tesla says. “It is the driver’s responsibility to stay alert, drive safely, and be in control of the vehicle at all times.”
Tesla has said it relies exclusively on cameras to detect objects on the road and to determine whether a driver’s eyes are on the road. Competitors General Motors and Ford use infrared sensors, which can see some objects better when there’s less visible light, to better monitor a driver’s face and detect distracted driving in low-light conditions.
The American Motorcyclist Association says that driver-assist technology that reliably detects motorcycles can prevent crashes. For years it has urged the National Highway Traffic Safety Administration to test for motorcycle detection as it assesses the safety of new vehicles, including their driver-assist technologies. (NHTSA declined to comment on why it does not do so.) Europe’s vehicle safety programs test whether driver-assist systems identify motorcycles.
The American Motorcyclist Association has cautioned for years about the risks of emerging driving technologies not adequately detecting motorcyclists.
“If this issue is not addressed early in developing automated vehicles,” it wrote to NHTSA last year, “the consequences will prove disastrous for motorcyclists.”