Would you trust your car to drive itself? Image Credit: CC BY 2.0 Steve Jurvetson
Joshua Brown died when the Tesla Model S self-driving vehicle he was in collided with a tractor-trailer.
In what is believed to be the first-ever fatality involving a self-driving car, the 40-year-old was killed when the vehicle failed to spot a white 18-wheel truck crossing the road and drove into its side at full speed without attempting to brake.
The incident has cast further doubt on the safety of autonomous vehicles; however, Tesla has moved quickly to provide reassurance, noting that this is the first fatality in more than 130 million miles driven by owners of its vehicles.
The self-driving "autopilot" mode of the car in question enables a driver to cruise along with minimal interaction; however, it is strongly recommended that proper attention still be paid to the road.
In this case, Brown was believed to have been watching one of the Harry Potter movies on his DVD player when the accident happened, and so was in no position to react to the oncoming danger.
"Clearly this is a horrible thing, but in the big picture it doesn't affect the technology," said transportation systems analyst Richard Wallace. "But it may affect public perception of the technology, and obviously people have to buy these vehicles."
Apparently the technology has done the equivalent of 130 million miles before this fatality. The inaugural trip of Stephenson's Rocket killed a spectator who stood on the tracks (hope I have got my facts right). It didn't stop train development, and we still get deaths due to rail accidents. People died trying to get off the ground during early flight experiments, and people still die in air crashes...
I had a feeling something like this would happen. The problem is in the name: Auto Pilot should be called Driver Assisted Mode. The NY Times article has the YouTube video of the driver who died, from when he was previously driving in Auto Pilot mode: http://www.nytimes.com/2016/07/01/business/self-driving-tesla-fatal-crash-investigation.html
The car continued to drive for quite a while after the crash. The car traveled "hundreds of yards from the point of impact, through a fence into an open field, through another fence and then avoiding a bank of trees before being unable to swerve and miss a power pole that eventually stopped the car a few feet away." Another witness said that she was driving 85 MPH and was passed by the driver in the Model S. http://www.roboticstrends.com/article/aftermath_of_the_deadly_tesla_autopilot_crash
From the article (and news accounts): According to Tesla's account of the crash, the car's sensor system, against a bright spring sky, failed to distinguish a large white 18-wheel truck and trailer crossing the highway. My first thought after reading this was, "OMG, Tesla is relying solely on a visual system?" That is, they use optical cameras, and if the "obstacle" visually looks like the surrounding area, the car just plows through, whether it's a person, a car, or a big rig. So, if you are walking across the street in an outfit that matches your background... you will be run over? Something...
I hear what you're saying, but actually it "should" lead to improvement. Failures are crucial to responsibly understanding what needs to be done in the future. Unfortunately, in this case a death was involved.
The video even explains that the system is not designed to let you be completely oblivious to the actual driving. The guy was watching a movie and not paying attention. It's his fault only. I think I also saw a EULA when you turn on the autopilot. If that was the case, then he agreed to accept the dangers of it. Tesla should not be sued or held responsible for it.