ALBUQUERQUE, New Mexico. While fully driverless cars are not
currently legal, some vehicles have autopilot modes. Now, officials are
reporting the first fatal accident involving a vehicle in autopilot mode. The
driver was in a Tesla operating in Autopilot. According to Slate, the degree to
which the vehicle's autopilot is at fault is unclear, though the National
Highway Traffic Safety Administration is investigating what happened.
According to Tesla, the Autopilot system was unable to
distinguish the white side of a tractor trailer from the brightly lit
sky. In autopilot mode, the driver is expected to remain focused on the road
and be ready to take the controls if an accident is about to occur. According
to Tesla, the driver also failed to notice the tractor trailer, and neither the
driver nor the vehicle engaged the brakes. The tractor trailer's driver claimed
the Tesla driver had been watching a Harry Potter movie at the time, but Tesla
says playing video is not possible on that model's screen, and the police
report showed no indication that the driver was distracted. The car passed
under the trailer. According to ABC News, the driver had recently gone on the
record praising the Autopilot system for preventing another accident.
Because fully driverless vehicles are not currently
legal, drivers using autopilot modes need to be ready to take over if the car
makes an error. In fact, Tesla's Autopilot requires drivers to keep their hands
on the wheel, and the car issues beeping warnings if a driver removes his or
her hands. Tesla's Autopilot is therefore not a fully driverless mode, and it
is nowhere near what Google hopes to sell in the future. Autopilot is a feature
meant to supplement the driver's attention, not replace it.
How companies market their technology could have an impact
on accident rates. While Tesla sells its cars with an autopilot mode, Toyota
sells the technology merely as a backup override for human error. Drivers are
still expected to drive as they normally would. However, Google claims that drivers will
become distracted if they feel that technology can do the work for them. For
this reason, Google wants to eliminate people from the equation altogether,
removing any confusion about the role of the human driver.
If the computer makes an error, drivers could still be held
responsible for their actions. As it stands, drivers, not computers, are held
accountable in accidents. If you've been in an accident in Albuquerque, New
Mexico, contact the Law Office of Brian K. Branch today to learn more about how
to protect your rights.
The problem with the Tesla vehicle may have involved edge
detection, a concern regulators want to address before driverless vehicles are
made legal. The computer may not have been able to detect the trailer's edge
because of the brightness of the surrounding landscape and sky. The reality is
that the technology is not yet sophisticated enough to work reliably in very
bright or overcast light.
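To illustrate the edge-detection concern, here is a minimal sketch in Python. It is not Tesla's actual vision system, and the pixel intensities and threshold are hypothetical; it only shows why a simple contrast-based detector can miss a white object against a bright sky.

```python
# Illustrative sketch only: a naive edge detector that flags an edge
# when two adjacent pixel intensities (0-255 scale) differ by at least
# a threshold. All values below are hypothetical.

def detect_edge(left_intensity, right_intensity, threshold=30):
    """Return True when the contrast between neighboring pixels
    is large enough to register as an edge."""
    return abs(left_intensity - right_intensity) >= threshold

# A dark trailer (60) against a bright sky (250): high contrast,
# so the edge is detected.
print(detect_edge(250, 60))    # True

# A white trailer (240) against a bright sky (250): the contrast
# falls below the threshold, so the detector sees one continuous
# bright region and misses the edge entirely.
print(detect_edge(250, 240))   # False
```

Under these assumed numbers, the detector's failure mirrors the reported scenario: when the object and the background are both near the top of the brightness range, the contrast signal that edge detection depends on simply is not there.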
Tesla's response has been to offer condolences while also
defending the safety of its cars. According to the company, autopilot mode is
statistically safer than a human driver, even taking the most recent death into
account. As it stands, the vast majority of accidents occur due to human error.
If you've been in a crash due to another person's negligence, you may be
entitled to compensation for your medical expenses, pain and suffering, and
lost wages. Visit http://www.bkblaw.net to
learn more.