Tesla's Autopilot feature has seen many growing pains since its introduction to market.
In the long run, self-driving cars are expected to be safer than human-driven cars, chiefly because the technology should reduce injuries and deaths. In the short run, however, autonomous self-driving cars are likely to be involved in many accidents resulting in injuries, deaths, lawsuits, and a general added danger to the public.
Living in an age of rapid technological advancement comes with many benefits; over the past few decades, high-tech innovations have made our lives easier and easier. Unfortunately, advances in technology can also create new dangers for everyday citizens. Brand-new technology, by definition, has not been widely used or tested in real-world conditions before it reaches the market. A product that is new to the market will therefore often suffer substantial setbacks, because the lack of real-world data and user feedback makes it difficult to ensure that all of its serious kinks have been ironed out. A prime example of a company adding danger by releasing new technology before the kinks were worked out is the automotive company Tesla and its decision to bring automobiles with an “Autopilot” feature to market.
Tesla has been aggressive and proactive in touting its cars' “Autopilot” feature. Tesla claims that Autopilot will take life-saving actions, such as evasive maneuvering and immediate braking, to prevent accidents. Despite Autopilot's supposed capacity to save lives, some claim it is having the opposite effect.
In August 2019, a Tesla Model S being driven in Russia, allegedly in Autopilot mode, struck a tow truck and burst into flames. In March 2019, a Tesla Model 3 collided with a tractor trailer in Florida while in Autopilot mode. In March 2018, a Tesla Model X in Autopilot mode crashed into a barrier in California, killing the driver. Two other deadly crashes occurred in 2016 while Tesla vehicles were in Autopilot mode.
The most recent lawsuit to arise from these crashes, Banner v. Tesla, was filed by the family of the deceased. The death occurred in Florida when the decedent, the driver of a Tesla Model 3, collided with a semi-trailer that had run a stop sign. The driver's family named both Tesla and the operator of the semi-trailer as defendants, claiming that the vehicle failed to engage in evasive maneuvers even though the car's “Autopilot” feature was enabled. A preliminary report from the National Transportation Safety Board (NTSB) confirmed that the Autopilot feature was indeed enabled and that it attempted no evasive maneuvers. The deceased attempted no evasive maneuvers himself; he is survived by his wife and three children.
A car accident involving a vehicle in Autopilot mode raises litigation issues not present in the average car accident. If the driver is injured, a suit may be brought against the car's manufacturer for the Autopilot feature's failure to work properly. The driver himself may be negligent for not remaining attentive to surrounding conditions while Autopilot is engaged. If a pedestrian or a person in another vehicle is injured by a car being driven on Autopilot, both the “driver” and the owner of the car may be negligent, as well as the car's manufacturer. A normal negligence case thus becomes a product liability case.
As we rely more on technology to operate vehicles, we must be aware that there are likely to be bumps in the road toward greater safety. Every car accident will need to be scrutinized to determine whether faulty software or technology played a role. Lawsuits against the manufacturers of faulty Autopilot systems will serve not only to compensate the injured, but also to deter manufacturers from putting dangerous Autopilot systems on the road and to encourage them to put safety over profit. Tesla has been chastised for overstating the capabilities of its Autopilot feature, and the government may have to take a stronger role in ensuring such features are safe.
Our firm recommends that citizens use extreme caution when purchasing and using items new to the marketplace, and even more so when purchasing items considered innovative in their field.
If you or a loved one has been injured through no fault of your own by a faulty or misadvertised product, you should contact a personal injury attorney immediately. Zalman, Schnurman & Miner P.C. are personal injury attorneys with over 30 years of experience serving the Metropolitan New York area. The first consultation is always free, and our firm can be reached by phone at 212-668-0059 or by email at email@example.com.