Comments:Uber suspends self-driving car program after pedestrian death in Arizona, United States


This page is for commentary on the news. If you wish to point out a problem in the article (e.g. factual error, etc), please use its regular collaboration page instead. Comments on this page do not need to adhere to the Neutral Point of View policy. Please remain on topic and avoid offensive or inflammatory comments where possible. Try thought-provoking, insightful, or controversial. Civil discussion and polite sparring make our comments pages a fun and friendly place. Please think of this when posting.





Contents

Thread title — Replies — Last modified
The Talk of the Tape — 2 — 02:27, 22 March 2018
Speeding, human driver also responsible — 9 — 17:14, 21 March 2018

The Talk of the Tape

The video of Sunday's collision has been released: WARNING IT IS A BIT GRAPHIC.

The video clearly shows the vehicle was in the rightmost lane, which means she crossed two other lanes to reach the point of impact. The bike was tossed with her onto the sidewalk. She would have been taken to Tempe's St. Luke's Hospital, the nearest to the accident. Between the night vision, motion sensors, heat signatures, and laser guidance, a system with that much information should have performed better than the human driver. Also, she was approximately 25 ft from a light post, with a car passing her moments before — she was visible. She did not jump out from behind a bush. The vehicle's emergency stop was never activated until the manual override happened. The driver, who served years in prison for attempted armed robbery, was looking down at what could have been a computer screen or a cell phone. The NTSB is going to have a field day with this. Intel, another company testing these vehicles, grounded its fleet in southeast Phoenix. My guess is Uber is thankful they hit a homeless woman. No family for a multi-million-dollar civil lawsuit. Sad but true.

AZOperator (talk)01:42, 22 March 2018

I predict nobody is going to suggest that driverless vehicles are a bad idea.

Pi zero (talk)01:51, 22 March 2018

No, but the IEEE (I'm a member) is looking at its ethics policy, and the group responsible for standards is mobilizing. These cars are far safer than a drunk driver. There are other issues with privacy and possible government overreach. This subject is far more complicated than just this wreck.

AZOperator (talk)02:27, 22 March 2018
 
 

Speeding, human driver also responsible

I would say the car was speeding. It needed to move slower. Speed limits are maximums, but drivers need to make conscious decisions and drive to conditions. If it was dark, the car needed to go way slower (in SI units, its speed was about 61 km/h).

The above materials are mandatory reading for all Sydney drivers before obtaining a non-learner driving permit.

I would personally suggest holding the driver responsible for this, not only the self-driving car technology.

Gryllida (talk)02:35, 21 March 2018

Yes, I wondered about that, but while specifically not excluding the option of filing charges against the operator in the future, the chief of police went on to warn pedestrians against crossing where there is no crosswalk. I'm aware that cities in the US vary in their tolerance of that; in some places it's quite common for pedestrians crossing against a traffic light or in mid-block to be ticketed, whereas at least one state, California, requires vehicles to stop when a pedestrian is crossing the street regardless of where or when. Also, tolerance of slight speeding varies from one jurisdiction to another; in some places drivers can be ticketed for driving at less than the prevailing speed while that speed may well be a traditional 5 mph above the posted limit. However, a crucial element here is that the car was in autonomous mode. I presume that means the computer chose the speed. For it to be traveling at higher than the speed limit, after dark, raises a red flag in my mind concerning its programming; quite apart from the failure to program in an appropriate response to a pedestrian darting into the road. Next time it could be a little kid chasing a ball in broad daylight.

Yngvadottir (talk)02:46, 21 March 2018

AFAIK in Sydney, cars are required to stop regardless of where and when a pedestrian is crossing, but pedestrians are discouraged, by law, from crossing within 20 meters of a crosswalk (even if it does not have zebra markings).

Personally, I had been jaywalking excessively for years, until I read a blog post in which the term was given a ridiculing tone (which is where I learned what the word meant), and then started learning to drive. That allowed me to fully realize that the movement of cars is governed by humans and is error-prone. That was also when I was taught the importance of eye contact with other drivers. An interesting experience.

I'd perhaps suggest that self-driving cars choose speeds below the speed limit (obviously) and reduce them by at least 10 km/h at night (in high pedestrian activity areas, perhaps by far more than that). I'd also encourage holding drivers responsible, because if the computer erred in choosing the desired speed, adjusting it needed to be within the human driver's control.

Hopefully the programmers of the self-driving car logged enough information to at least explain why it was traveling above the nominated speed limit.

Gryllida (talk)03:08, 21 March 2018

I am pretty sure most jurisdictions have laws requiring drivers to stop if a pedestrian is in the roadway. The driver cannot simply laugh, "Ha ha, you are not in the crosswalk. Now I've got you!" and hit them at full speed. The difference in law from place to place is how much of the blame is placed on the driver.

SVTCobra16:20, 21 March 2018
 

The police chief said the collision would have been hard to avoid in any mode, i.e., whether the car was being operated by the computer or by a human driver. I figure that means any case in which such an accident was avoided would be a statistical outlier — and statistical outliers like that are going to happen with a human operator. Technological "AI" isn't going to produce outliers like that (at least, not in a positive direction).

Pi zero (talk)03:09, 21 March 2018

I think the officer's statement could perhaps be interpreted as 'at this speed, the death would have been unavoidable, no matter who or what was driving'. I doubt he meant 'if a human driver decided to travel at a slow speed, then the death would also be inevitable'. Statistical outlier means that, statistically, the chances of a pedestrian being there are so low that a majority of drivers would not decide to slow down, perhaps? I could argue that is a signal to improve driver education.

Gryllida (talk)03:19, 21 March 2018

(To have a conclusive discussion, it could be interesting to obtain the footage from the camera that was present on the car.)

Gryllida (talk)03:20, 21 March 2018

Apparently they've been concerned about rising numbers of pedestrian deaths in Arizona and are focusing on pedestrian education in response: one of AZOperator's sources for their article on the death led me here.

Yngvadottir (talk)15:32, 21 March 2018
 

You're not likely to get the footage; AZ usually does not release that kind of information in fatal wrecks. If they were to release anything, all you would see is the vehicle crossing the bridge up to the point where she is visible in the median.

I'm not convinced by the "would have hit her anyway" defense. The vehicle has damage to the front bumper, but not enough to jettison the bike onto the sidewalk. Remember, this is a three-lane road with two turn lanes — that is a long distance. It is also very well lit. It will be interesting to see.

AZOperator (talk)17:14, 21 March 2018