Recent Florida Car Accidents Show the Risks of Over-Reliance on Self-Driving Technology

The promise of self-driving cars seems more within reach than ever, but early versions of these systems have shown that placing too much trust in them can lead to deadly consequences.


In Florida alone, there have been two fatal accidents involving the use of Tesla’s “Autopilot” driver-assist technology. The first, which occurred in 2016, was also the first of its kind in the nation: a Tesla Model S with Autopilot engaged drove underneath a tractor-trailer turning left across the direction of oncoming traffic. A very similar incident happened in 2019, when a Tesla driver in Delray Beach using Autopilot crossed underneath a tractor-trailer turning left across a highway.

In a third fatal crash, which occurred in Coral Gables in September 2021, two individuals were speeding through a neighborhood when their vehicle abruptly turned left into a tree. The crash is currently under investigation by the National Highway Traffic Safety Administration (NHTSA), and it has not been disclosed whether Autopilot was engaged at the time of the collision.

Regardless of the circumstances, drivers in Florida and everywhere else are put at risk when technology that has not been fully tested is placed in the hands of individuals without proper controls to prevent abuse. Videos of vehicle operators asleep at the wheel or sitting in the back seat should raise alarms that the technology has a long way to go before it can reliably prevent abuse and dangerous, even deadly, crashes.

Two Nearly Identical Fatal Florida Tesla Crashes Indicate a Critical Autopilot Flaw

The first crash in the U.S. in which Autopilot was reported to have been a factor occurred in May 2016 in Williston, Florida. A tractor-trailer hauling a shipment of blueberries made an improper left turn on Highway 27, cutting off the path of a black Tesla Model S traveling 74 mph in the opposite direction.

In a typical accident, the truck driver’s failure to properly observe oncoming traffic and their decision to turn in front of the Tesla would have been the only major factors considered. But in this crash, news outlets around the world noted two major points. First, the Tesla, operated by 40-year-old Joshua Brown of Ohio, made no effort to brake or evade the tractor-trailer. Second, the Tesla had its Autopilot system engaged at the time of the collision.

Autopilot is designed to handle many driving tasks for Tesla operators, including acceleration, braking, and steering. However, the system is intended to be used solely as a form of driver assistance. Drivers are supposed to remain alert and observant of the road.

Tesla also implements technology meant to require drivers to keep their hands on the wheel at all times, although it does not always work this way in practice. Most notably, a 2021 Consumer Reports investigation found that its testers could activate Autopilot on a Tesla Model Y and have the vehicle operate entirely on its own with no one in the driver’s seat. (Since that test, Tesla has added onboard cameras to monitor driver inattentiveness, although the effectiveness of this system has not been fully verified.)

The 2016 accident, being the first of its kind, alerted Tesla and the general public to the fact that onboard systems can critically fail to provide the intended level of safety. In Teslas, as in many other modern vehicles, onboard safety systems are supposed to engage automatic emergency braking when front-facing cameras detect an imminent collision. However, a public disclosure on Tesla’s website revealed that the safety system confused the white reflective surface of the trailer with the sky, which meant that the vehicle completely failed to notice the oncoming collision.

Tesla CEO Elon Musk went so far as to say that the onboard safety system is calibrated so that it “tunes out what looks like an overhead road sign to avoid false braking events.”

History all but repeated itself in March 2019, when a Tesla Model 3 using Autopilot crossed underneath a tractor-trailer making an improper left turn in Delray Beach. The 50-year-old driver, Jeremy Beren Banner, died at the scene after his windshield struck the lower trailer carriage, shearing off the entire roof of the vehicle. Again, accident scene evidence suggested that the Model 3 made no attempt to brake or avoid the collision.

The National Transportation Safety Board (NTSB) later issued a report concluding that “the design of the Autopilot system contributed to the crash because it allowed the Tesla driver to avoid paying attention.”

Jeremy Banner’s surviving family members later filed a wrongful death lawsuit against Tesla in response to the tragedy.

Tesla Model 3 Driver Using Autopilot Strikes Emergency Vehicles on Orlando Highway

A third notable Autopilot accident in Florida was, fortunately, non-fatal. The August 2021 collision occurred in the wake of an earlier crash that had left the wreckage of a Mercedes-Benz SUV obstructing Interstate 4 in downtown Orlando. The driver of a Tesla Model 3 failed to notice emergency vehicles stationed in the middle of the interstate with their red-and-blue emergency lights flashing. The Model 3 collided with a parked Florida Highway Patrol (FHP) car and then careened into the disabled Mercedes.

FHP Orlando (@FHPOrlando) tweeted on August 28, 2021: “Happening now: Orange County. Trooper stopped to help a disabled motorist on I-4. When Tesla driving on ‘auto’ mode struck the patrol car. Trooper was outside of car and extremely lucky to have not been struck. #moveover. WB lanes of I-4 remain block as scene is being cleared.”

A trooper who was nearby was nearly seriously injured, and could even have been killed, according to an FHP representative. “We’re very lucky he was able to get out of the road, because after the impact with a Tesla and the patrol car, the Tesla kind of swung around and hit the disabled vehicle,” she told Orlando news station WKMG (News 6).

One remarkable aspect of the case is that the vehicle operator alleges they were using Autopilot at the time of the crash, which FHP has yet to confirm. FHP has also declined to file charges, a step that would have allowed it to require the Model 3 owner to yield data confirming the vehicle’s active systems and the driver’s actions leading up to the collision. Notably, this information cannot be retrieved from the vehicle itself without Tesla’s cooperation or, in many cases, the consent of the vehicle owner. Withholding this data stymies crash investigations and can prevent a full understanding of how and when assisted driving systems fail.

In the wake of the crash, the NHTSA launched an investigation and formally requested “detailed information on how Tesla’s Autopilot system detects and reacts to emergency vehicles parked on highways.”

The NHTSA investigation is one of 12 involving Tesla Autopilot currently listed by the administration.

Manufacturer Liability for Advanced Driver Assistance Systems Remains in Legal Gray Area

Responses from Tesla and other manufacturers that offer advanced driver assistance technology have been contrite but also firm in their resolve that the technologies they provide are safe. The manufacturers are also quick to point out the vehicle operator’s responsibility, including the repeated warnings that drivers must maintain focus and be prepared to disable the systems and intervene.

Yet, according to the New York Times, “safety experts say Autopilot may encourage distraction by lulling people into thinking that their cars are more capable than they are.” The Times also adds that “the system does not include safeguards to make sure drivers are paying attention to the road and can retake control if something goes wrong.”

The issue is so pervasive that an article detailing a Tesla crash into a Palm Harbor home felt the need to clarify that Autopilot was not in use at the time.

In total, the NHTSA has “opened 33 individual investigations into Tesla crashes involving 11 deaths since 2016 in which use of advanced driver assistance systems was suspected,” according to Reuters.

Legal action against Tesla and other manufacturers has been scant but is growing. In addition to the wrongful death lawsuit mentioned above regarding the fatal 2019 Delray Beach collision, a Fort Pierce Model S owner using Autopilot collided with a disabled vehicle in 2019, leading to severe permanent injuries and a suit in the Ninth Judicial Circuit Court, according to the court filing.

A study conducted by MIT also confirms that, despite repeated warnings and attempts to explain the narrow intended use of advanced driver assistance features, users almost inevitably become inattentive once systems like Autopilot are engaged. Even more concerning, other manufacturers introduced similar features last year, including the country’s two biggest automakers, Ford (BlueCruise) and GM (Super Cruise).

Also of note, the Los Angeles County District Attorney’s office filed felony vehicular manslaughter charges against a Tesla driver who, in 2019, ran a red light while using Autopilot and crashed into another vehicle, killing its two occupants.

Work With an Advanced Driver Assistance System Accident Lawyer in Tampa

If you or someone in your family has been injured as a result of Autopilot, self-driving technology, or another advanced driver assistance system (ADAS), you may be able to hold negligent manufacturers and negligent drivers alike responsible for your damages.

Florida law (F.S. §316.85) states that the vehicle operator is responsible for the vehicle’s safe operation even when autonomous technology is engaged, and even if the operator is not physically present in the vehicle after engaging it. Another statute (F.S. §319.145) states that autonomous vehicles must “be capable of being operated in compliance with the applicable traffic and motor vehicle laws of this state,” which strongly implies that manufacturers must provide the means to prevent abuse and common accident scenarios to the extent possible.

Darrigo & Diaz Attorneys at Law is a highly experienced Tampa personal injury firm prepared to help you and your loved ones hold all responsible parties accountable. Reach out to our offices to review the details of your case and learn how we can assist you with litigation to recover the maximum compensation available for all of your damages.

Schedule a free, confidential, and no-risk consultation today when you call (813) 774-3341 or contact us online.
