Most of us are familiar with Uber, the American company that provides ride-sharing services in cities around the world. You can book an Uber to go almost anywhere in the country. But Uber doesn’t only offer vehicles with drivers — it has also been testing vehicles without them. Didn’t get my point? Yes guys, an Uber self-driving test vehicle is one in which you won’t be provided with a driver; instead, the car drives itself in autonomous mode, with a safety operator behind the wheel. I know it’s quite interesting, but did you know an Uber self-driving test vehicle can also be dangerous? How?
Let me tell you how. In 2018, an Uber vehicle in self-driving test mode struck and killed a woman, and investigators traced the crash in part to a flaw in the software. Recently, the US National Transportation Safety Board (NTSB) released documents after a 20-month investigation. According to the report, the system was not designed to recognize jaywalking pedestrians — the biggest flaw, and a serious matter of safety. Now let me tell you the whole story below.
A woman killed by an Uber self-driving test vehicle
The Uber self-driving test vehicle killed an Arizona woman, 49-year-old Elaine Herzberg, as she was walking a bicycle across a street at night. The crash happened in March 2018, but why the vehicle failed to recognize the woman was not disclosed at the time — or rather, I would say, wasn’t clear.
By the time the system reacted, it was too late to avoid the crash: the vehicle did not correctly identify the woman and her bicycle as an imminent collision until 1.2 seconds before impact.
After Herzberg’s death, Uber suspended all of its self-driving car operations until December of that year. During this break it added new restrictions and safeguards to the software.
Let’s have a look at the NTSB report
According to the US National Transportation Safety Board (NTSB), flaws were found in the software embedded in the Uber self-driving test vehicle. The agency also said the car failed to properly identify the woman as a pedestrian crossing the street. “The system design did not include a consideration for jaywalking pedestrians,” the NTSB said.
The NTSB also disclosed that Uber’s autonomous test vehicles were involved in 37 crashes between September 2016 and March 2018. The NTSB board was set to meet on November 19 to determine the probable cause of the March 2018 crash in Tempe, Arizona.
The NTSB said Uber ran simulations of the sensor data from the Arizona crash with its revised software and told the agency the new software would have been able to detect the pedestrian 88 meters (289 feet), or 4.5 seconds, before impact. The car’s system would have started braking 4 seconds before impact.
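As a back-of-the-envelope check of those simulation figures (not part of the NTSB report, just arithmetic on the numbers it quotes), detecting a pedestrian 88 meters ahead 4.5 seconds before impact implies the vehicle’s closing speed:

```python
# Sanity-check the quoted figures: 88 m of detection distance
# covered in 4.5 s implies the vehicle's approximate speed.
distance_m = 88.0   # detection distance from the report (289 ft)
time_s = 4.5        # seconds before impact from the report

speed_ms = distance_m / time_s        # meters per second
speed_kmh = speed_ms * 3.6            # convert to km/h
speed_mph = speed_ms * 2.23694        # convert to mph

print(f"{speed_ms:.1f} m/s = {speed_kmh:.0f} km/h = {speed_mph:.0f} mph")
# prints "19.6 m/s = 70 km/h = 44 mph"
```

That works out to roughly 70 km/h (about 44 mph), consistent with a vehicle traveling at ordinary arterial-road speed.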
What flaws were found in the software embedded in the Uber vehicle?
Uber was using the Volvo XC90 to test its self-driving technology. The flaw the NTSB found in the Uber self-driving vehicle was that it was unable to identify the person crossing the street as a pedestrian.
According to the preliminary report, Uber’s system perceived Herzberg as a vehicle, a bicycle, and an “unknown object” in the seconds before impact.
According to James Court, president of the nonprofit group Consumer Watchdog, “Difficulties recognizing pedestrians are not unique to Uber’s self-driving cars. Robot cars would do well driving in a world of robots, but not on roads and crosswalks where human beings have the right of way.”
Guys, on this final note I just want to tell you that Uber was found not criminally liable for the death, and the safety driver was reportedly watching TV at the time of the crash — which means the accident might have been avoided, though the software also failed to work properly. After the accident Uber said, “Our hearts go out to the victim’s family. We are fully cooperating with @TempePolice and local authorities as they investigate this incident.” Uber also suspended all of its autonomous cars and said it looked forward to reviewing the NTSB’s recommendations for its new software.
Get in touch with us for more updates. Keep reading and keep sharing!!!