Big Questions Emerge on the Future of Self-Driving Cars After Recent Fatality




On March 18, 2018, the history of the automated vehicle industry was forever altered when an automated vehicle caused its first reported pedestrian fatality in Tempe, Arizona. Elaine Herzberg, a 49-year-old woman, was struck and killed by the vehicle while it was operating in autonomous mode.

A video released by the Tempe Police is raising new questions about why the vehicle's systems failed to prevent the crash. The Volvo SUV, owned and operated by Uber, struck Herzberg as she crossed three lanes of traffic outside a crosswalk while walking a bicycle.

It is currently unclear what went wrong: a safety driver was present in the vehicle, yet the collision was not prevented. “I think the sensors on the vehicle should have seen the pedestrian well in advance,” says Steven Shladover, a UC Berkeley research engineer who has been studying automated systems for nearly a decade. “If she had been moving erratically, it would have been difficult for the systems to predict where this person was going,” Shladover continues, “this is one that should have been straightforward.”

Local police are weighing how to proceed with the investigation and whether criminal charges will be filed. Arizona is seen as an industry leader and testing ground for autonomous cars. The state updated its 2015 rules to cover fully autonomous vehicles through an executive order issued by the governor in early March 2018. Governor Doug Ducey said, “this executive order embraces new technologies by creating an environment that supports autonomous vehicle innovation and maintains a focus on public safety.”

Under the Society of Automotive Engineers (SAE) standard, the Volvo was operating at Level 4 or 5, meaning the system itself monitors the driving environment with high to full automation and requires little to no human input to operate the vehicle. This incident suggests that the regulatory framework has run ahead of the technology and is potentially putting the public at risk, turning people into guinea pigs for companies testing their expensive new technologies.
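For context, the SAE automation levels run from 0 to 5. The snippet below is a rough paraphrase of those levels (not the standard's exact wording), just to show where Level 4 and Level 5 sit on the scale:

```python
# Rough paraphrase of the SAE J3016 driving-automation levels, to show where
# "Level 4 or 5" sits on the scale. Descriptions are summaries, not the
# standard's own wording.
SAE_LEVELS = {
    0: "No automation - the human driver does everything",
    1: "Driver assistance - e.g. adaptive cruise control or lane keeping, one at a time",
    2: "Partial automation - combined steering and speed control, driver still monitors",
    3: "Conditional automation - system monitors the road, driver must take over on request",
    4: "High automation - no driver attention needed within a defined operating domain",
    5: "Full automation - no human driver needed under any conditions",
}

for level in (4, 5):
    print(f"Level {level}: {SAE_LEVELS[level]}")
```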

Uber issued a statement: “The video is disturbing and heartbreaking to watch, and our thoughts continue to be with Elaine’s loved ones.”

Uber CEO Dara Khosrowshahi tweeted, “Some incredibly sad news out of Arizona. We’re thinking of the victim’s family as we work with local law enforcement to understand what happened.” In response to the incident, Uber voluntarily grounded its entire autonomous fleet, suspending operations in San Francisco, Tempe, Pittsburgh, and Toronto. But is that an adequate response to the critical question: are self-driving cars currently safe enough to be tested on public streets?

It is too soon to say. In 2016, American human drivers were involved in 1.18 fatalities for every 100 million miles driven. With roughly 3.2 trillion miles driven that year, human driving still accounts for the vast majority of vehicle miles on the road.

In contrast, Uber has driven about 2 million autonomous miles, and Waymo, the industry leader, has reported reaching 4 million. The autonomous mileage is therefore far too small for a statistically sound comparison with human driving. Even so, at just 2 million miles Uber already has a fatality, which suggests that autonomous cars are not yet as safe as the companies would like them to be. The question arises: is it safe to roll out self-driving cars in multiple cities, or should regulations be tightened to ensure public safety?
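To make the scale of that comparison concrete, here is a back-of-the-envelope calculation using the figures quoted above. It is illustrative arithmetic only, not a statistical test; with a single fatality in the autonomous data, the uncertainty dwarfs the point estimate:

```python
# Back-of-the-envelope comparison of fatality rates per 100 million miles,
# using the figures quoted in the article. Illustrative arithmetic only.
HUMAN_RATE_PER_100M = 1.18    # US fatalities per 100 million miles, 2016
HUMAN_MILES_2016 = 3.2e12     # total US vehicle miles travelled in 2016

UBER_AV_MILES = 2e6           # Uber autonomous miles to date
UBER_AV_FATALITIES = 1        # the Tempe crash

# Total human-driving fatalities implied by the rate (roughly 38,000,
# matching the scale of official US figures for 2016).
implied_human_fatalities = HUMAN_RATE_PER_100M * HUMAN_MILES_2016 / 100e6

# Naive Uber rate on the same per-100-million-mile scale.
uber_rate_per_100m = UBER_AV_FATALITIES / UBER_AV_MILES * 100e6

# How many fatalities would human drivers be expected to cause in 2 million miles?
expected_human_fatalities = HUMAN_RATE_PER_100M * UBER_AV_MILES / 100e6

print(f"Implied human-driving fatalities in 2016: {implied_human_fatalities:,.0f}")
print(f"Human drivers: {HUMAN_RATE_PER_100M} fatalities per 100M miles")
print(f"Uber AVs (naive): {uber_rate_per_100m:.0f} fatalities per 100M miles")
print(f"Expected human fatalities over 2M miles: {expected_human_fatalities:.3f}")
```

On that naive scale, one fatality in 2 million miles works out to roughly 50 per 100 million miles against 1.18 for human drivers; but with a single event the confidence interval is enormous, which is exactly why the comparison should be treated with caution.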

Challenges still exist on the software side of autonomous driving. A McKinsey report published in May 2017 identifies three types of challenges:

1) Object analysis, or how the software interprets the flood of data about objects the car encounters on a journey. For example, is an object stationary or moving? Is it moving towards the vehicle, away from it, across its path, or alongside it? (A minimal sketch of this kind of classification follows the list.)

2) Decision-making systems that can mimic human driving behaviour.

3) A fail-safe mechanism that allows the car's software to fail without endangering its passengers or the people around it.
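As a rough illustration of the first challenge, the following sketch classifies a tracked object as stationary or moving, and describes its direction relative to the car, given two position estimates in a ground-fixed frame. The function name, coordinate convention, and 0.3 m/s threshold are assumptions chosen for this example; real perception stacks fuse lidar, radar, and camera data over many frames and are far more involved.

```python
import math

def classify_motion(p_prev, p_now, dt, speed_threshold=0.3):
    """Classify an object's motion relative to the car.

    p_prev, p_now: (x, y) positions in metres in a ground-fixed frame, with
    the car near the origin, x pointing along the car's heading and y to its
    left. dt is the time between the two estimates in seconds.
    """
    vx = (p_now[0] - p_prev[0]) / dt
    vy = (p_now[1] - p_prev[1]) / dt
    speed = math.hypot(vx, vy)

    if speed < speed_threshold:          # below ~0.3 m/s, treat as stationary
        return "stationary"

    if abs(vy) > abs(vx):                # mostly lateral motion
        return "crossing the vehicle's path"
    return "moving towards the vehicle" if vx < 0 else "moving away from the vehicle"

# Example: a pedestrian 30 m ahead, drifting left across the lane at ~1.4 m/s.
print(classify_motion((30.0, -2.0), (30.0, -0.6), dt=1.0))
```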

Needless to say, the auto industry is gearing up for the eventual, and perhaps inevitable, industry-wide emergence of autonomous vehicles over the coming decade. Lawmakers, however, need to make public safety a priority and ensure that proper regulation and extensive, validated testing are in place before autonomous vehicles are allowed on the streets.



