#41 - 20 March 2018, 04:55 PM
iskinner (Sacramento, CA)

Looks like whoever designed the roadway may have some responsibility. "Let's build a pedestrian path in the middle of the road that would be unsafe for pedestrians to use" <- Urban Planner/Designer responsible for this.

#42 - 20 March 2018, 05:24 PM
GenYus234 (Mesa, AZ)

Quote:
Originally Posted by Richard W
(Addendum to my post above) I see people are already phrasing this in terms of "stupid humans" getting in the way of the robots, in fact.
Yes, sometimes stupid humans will get in the way of robots, just like stupid humans get in the way of stupid and not-stupid humans. The robot should/will be better at avoiding the stupidity of humans, but no system can completely avoid human stupidity. Developers and the public need to avoid letting the perfect be the enemy of the good.

Tempe Chief of Police Sylvia Moir has viewed the dash cam video from the car and said, "it's very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway."

#43 - 20 March 2018, 06:53 PM
Alarm (Nepean, ON)

Humans are good at not choosing the safer, obvious path.

At a mall I go to, there is a ramp leading into a tiered parking lot that runs between the parking lot and the mall. There is a path marked with pedestrian crossing signs and big yellow cones, but people still walk on the ramp to get to the doors, because they would rather walk a shorter distance than walk safely. I've even seen people pushing a baby carriage on the ramp itself.

There is not much anyone can do when someone walks onto a roadway from the shadows or from behind an obscuring object. In this case, would radar have helped? I believe the car uses a visual sensor, similar to photo radar, to identify oncoming objects.
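For illustration, a rough sketch of how sensor fusion might address that question: a radar or lidar return does not depend on ambient light, so a planner that brakes whenever any sensor reports an object inside its planned corridor would not be blinded by shadow the way a camera can be. The sensor names, detection format, and thresholds below are assumptions for the sketch, not Uber's actual stack.

Code:
# Hypothetical illustration only, not Uber's system. Assumes each sensor
# reports detections as positions in meters relative to the front bumper.
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    sensor: str   # "camera", "radar", or "lidar" in this toy example
    x: float      # meters ahead of the front bumper
    y: float      # meters left (-) or right (+) of the car's centerline

def should_brake(detections: List[Detection],
                 lane_half_width: float = 1.5,
                 horizon: float = 40.0) -> bool:
    """Brake if ANY sensor -- not just the camera -- sees an object
    inside the planned corridor, so darkness alone cannot hide it."""
    return any(abs(d.y) <= lane_half_width and 0.0 < d.x <= horizon
               for d in detections)

# A pedestrian the camera misses in the dark but lidar still returns:
frame = [Detection("lidar", x=30.0, y=0.4)]
print(should_brake(frame))  # True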

#44 - 20 March 2018, 07:16 PM
erwins (Portland, OR)

People not using the path they are supposed to use can also mean that the design is faulty. If I design the safest X in the world, but most humans will not actually use it, then I have not made a safer design.

It would be nice to know what sensing tech the car was using. If autonomous cars are touted as being safer than human drivers, then why would it be satisfactory to say, "even a human driver could not likely have avoided this accident"? Why would it necessarily be an excuse that the pedestrian stepped from the shadows? To understand why this happened, we need to know what the car is supposed to be doing, and how it is supposed to be doing it. So what if a human driver might have hit the pedestrian? The question is whether the Uber car did because of a failure of something that should have prevented it, or a limitation of a system that can be improved, etc.

#45 - 20 March 2018, 07:38 PM
BoKu (Douglas Flat, CA)

Quote:
Originally Posted by GenYus234
...Tempe Chief of Police Sylvia Moir has viewed the dash cam video from the car and said, "it's very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway."
Interesting tidbit from that report:

Quote:
Traveling at 38 mph in a 35 mph zone...
Not that this is any kind of smoking gun, or even a proximate cause; it's just interesting that the car was not adhering to the strict letter of the posted speed limit. Are they programmed to add a couple of mph when it is likely they'll get away with it? Was there some sort of mismatch between the actual and perceived limit? Or are they just not that concerned with a couple of mph either way?
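One hedged guess at how a mismatch could produce exactly this: if the planner simply clamps its cruise speed to whatever limit its on-board map reports, then a stale or wrong map value lets the car run a few mph over the real posted limit while "believing" it is well under it. Purely illustrative logic with made-up numbers, not anything published by Uber.

Code:
# Illustrative only: a planner that trusts its map's speed limit.
MPH_TOLERANCE = 0.0  # >0 only if the operator deliberately allows a margin

def target_speed(desired_cruise_mph: float, map_limit_mph: float) -> float:
    """Clamp the cruise speed to the limit the map reports, plus any
    explicitly configured tolerance."""
    return min(desired_cruise_mph, map_limit_mph + MPH_TOLERANCE)

# If the map still says 45 mph on a stretch actually posted at 35 mph,
# cruising at 38 mph looks perfectly legal to the planner:
print(target_speed(desired_cruise_mph=38.0, map_limit_mph=45.0))  # 38.0
print(target_speed(desired_cruise_mph=38.0, map_limit_mph=35.0))  # 35.0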

#46 - 20 March 2018, 07:45 PM
iskinner (Sacramento, CA)

The reporting of the speed limit is unclear.

Quote:
The vehicle was traveling 38 mph, though it is unclear whether that was above or below the speed limit. (Police said the speed limit was 35 mph, but a Google Street View shot of the roadway taken last July shows a speed limit of 45 mph along that stretch of road.)
The Verge article

#47 - 20 March 2018, 09:02 PM
Beachlife! (Lansing, MI)

If autonomous vehicles will be traveling quite a bit below the speed limit, I wonder if this will lead to more accidents while they are mixing with human drivers.

#48 - 20 March 2018, 09:34 PM
GenYus234 (Mesa, AZ)

Quote:
Originally Posted by erwins
If autonomous cars are touted as being safer than human drivers, then why would it be satisfactory to say, "even a human driver could not likely have avoided this accident"?
The answer to that question is half of the formula for determining if/when self-driving cars are generally safer than human drivers, which (hopefully then and not before) will be the point when they can be sold on the general market. The other half being, are there accidents that human drivers cause that self-driving cars would not?

Quote:
To understand why this happened, we need to know what the car is supposed to be doing, and how it is supposed to be doing it. So what if a human driver might have hit the pedestrian? The question is whether the Uber car did because of a failure of something that should have prevented it, or a limitation of a system that can be improved, etc.
Hopefully that is exactly what the engineers and programmers for the system are doing. For some of these incidents, there will be more that the computer can do to avoid the accident. For some, there won't be anything that could be done. A lot of the time that will be because of something the other person did or mechanical failures in the vehicle.

ETA: tl;dr: Whether a human driver wouldn't have gotten into the accident shouldn't be part of the development process, but it should be part of the implementation process.

Last edited by GenYus234; 20 March 2018 at 09:42 PM.

#49 - 20 March 2018, 09:46 PM
BoKu (Douglas Flat, CA)

Quote:
Originally Posted by iskinner
The reporting of the speed limit is unclear...
Ah, thanks for that information. Standing by to see how it shakes out.

#50 - 20 March 2018, 09:47 PM
Darth Credence (Salt Lake City, UT)

As a note, the current claim for self-driving cars on roadways is around 10 million miles traveled among all autonomous vehicles. This is the first death.
The average for pedestrian deaths is one per 480 million miles traveled. For all deaths related to autos, it is one per 88 million miles traveled. Now, there is clearly not even close to enough data yet to make a comparison - we need billions of miles before we can do that. But that goes both ways - we can't say they are more dangerous than humans (they likely aren't), but we also can't say they are safer than humans.
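To put rough numbers on "not even close to enough data": with one death in about 10 million autonomous miles, an exact Poisson confidence interval on the fatality rate is so wide that it easily contains the quoted human rate of one death per 88 million miles, so the data cannot yet distinguish the two. A quick back-of-the-envelope check using only the figures quoted above:

Code:
# Back-of-the-envelope check of the "not enough data" point, using the
# figures quoted above (1 death in ~10 million autonomous miles vs. a
# human rate of ~1 auto-related death per 88 million miles).
from scipy.stats import chi2

deaths, av_miles = 1, 10e6
human_rate = 1 / 88e6  # deaths per mile

# Exact (Garwood) 95% confidence interval for a Poisson count of 1:
lo = chi2.ppf(0.025, 2 * deaths) / 2
hi = chi2.ppf(0.975, 2 * (deaths + 1)) / 2

print(f"AV rate 95% CI: {lo/av_miles:.2e} to {hi/av_miles:.2e} deaths/mile")
print(f"Human rate:     {human_rate:.2e} deaths/mile")
# The interval spans roughly 2.5e-9 to 5.6e-7 deaths/mile, which brackets
# the human rate of ~1.1e-8 -- the data so far cannot tell the two apart.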

#51 - 20 March 2018, 09:52 PM
erwins (Portland, OR)

Part of my point is that it should not be part of the fault process. Police seemed to be saying a human driver might not have been able to avoid it, so Uber was likely not at fault. But if Uber designed the car to detect pedestrians in the dark, but negligently failed to switch that system on (or some similar issue), then Uber could be at fault, and it has nothing to do with human drivers' capabilities. Or what if it had such a system, but the bike and shopping bags prevented it from recognizing her as a pedestrian? Is that reasonable? Should it depend on whether a human could have recognized her?

Also, this needs to be a transparent process, not something that we merely hope is happening behind the scenes. I'm very pleased that the NTSB is involved.

Last edited by erwins; 20 March 2018 at 09:58 PM.

#52 - 20 March 2018, 09:57 PM
Errata (Santa Barbara, CA)

Quote:
Originally Posted by GenYus234
The other half being, are there accidents that human drivers cause that self-driving cars would not?
And human drivers cause a lot of accidents that other, outwardly similar drivers wouldn't cause. Every human is different, and some are way worse than others. Even the most competent ones have off days and get distracted. Even if a self-driving car were only up to the imperfect standard of a reasonably competent human driver, it would be that way consistently day after day, which makes it better than a large segment of the population that we're sharing our roads with and getting killed by many times per hour.

Saying that even a competent human driver wouldn't have been able to do anything, and holding it to that standard, is holding it to a higher standard than the average pool of human drivers, competent and incompetent alike.

Software designed to a high safety-critical standard (like aviation software, not consumer-grade Windows applications) can be expected to behave deterministically and reliably. Note that I am skeptical that some of the earlier-to-market companies have sufficiently high standards, and there should be more government oversight. The hardware sensors the software relies on are far less reliable, but they can diagnose faults and flag themselves for service if needed, and the sensor technologies have been improving quite rapidly, which will only accelerate as they are deployed at larger scales.
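A toy sketch of the kind of self-diagnosis meant here: simple plausibility checks that flag a degraded sensor for service rather than silently trusting it. The sensor names, status fields, and thresholds are made up for illustration and are not any vendor's real checks.

Code:
# Toy plausibility checks of the "flag itself for service" kind.
# Sensor names, fields and thresholds are invented for illustration.
def degraded_sensors(status: dict) -> list:
    """Return the sensors whose latest readings fail a basic sanity check."""
    flagged = []
    if status["lidar_points_per_sweep"] < 20_000:   # sweep unusually sparse
        flagged.append("lidar")
    if status["camera_frame_age_ms"] > 200:         # frames going stale
        flagged.append("camera")
    if not status["radar_self_test_ok"]:            # built-in test failed
        flagged.append("radar")
    return flagged

print(degraded_sensors({
    "lidar_points_per_sweep": 4_500,  # e.g. a fouled or failing unit
    "camera_frame_age_ms": 33,
    "radar_self_test_ok": True,
}))  # ['lidar']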

Crucially, once the software gets better, it tends to stay better, and all of that model of driving software around the world gets better, improving more or less monotonically. Humans very much do not share this property. Every time a new person learns to drive we're starting over back at square one, making a lot of the same mistakes as previous drivers and learning very little from their example. Even a single human driver can get worse than they used to be, with age or impairment. We can't ratchet up human skill level over time, like we can with software. We're stuck at about as good as we'll get at driving, and statistically that's not great. Software will keep improving, and every time it gets better every other self-driving car gets better and stays better. If the starting point by the time it's widely commercially available is already moderately better than humans, that's a good trajectory.

#53 - 22 March 2018, 09:21 AM
ganzfeld (Kyoto, Japan)

Quote:
Originally Posted by Errata
Even the most competent ones have off days and get distracted. Even if a self-driving car were only up to the imperfect standard of a reasonably competent human driver, it would be that way consistently day after day, [...]
On what basis do you make the claim that intelligent systems don't get distracted or are not subject to "off days"? What do you mean by "consistently"?

#54 - 22 March 2018, 02:34 PM
thorny locust (Upstate NY)

Quote:
Originally Posted by iskinner
Looks like whoever designed the roadway may have some responsibility. "Let's build a pedestrian path in the middle of the road that would be unsafe for pedestrians to use"
Didn't she come into the road from the median, not from a pedestrian path?

A median generally isn't meant to be used as a pedestrian walkway. It's usually a grassed or otherwise landscaped strip that helps keep traffic going in opposite directions separate, and which may absorb excess water and/or make the area look better.

-- if the road in that area was designed in such a way that, in order to get around, pedestrians were very likely to use the median, however, I'll agree with you that the road design may have been part of the problem.


If the woman who was hit actually stepped out in front of the car with no warning, this may have been an accident that neither a human nor a computer could have been expected to avoid. It does need thorough investigation, though, to decide whether a good human driver could have avoided it, and if so whether the self-driving cars can be taught to do the same thing. I will say that, presuming the pedestrian was visible while on the median, a good human driver should have used extra caution, because a human next to a highway in a location where they shouldn't be may do something else that they shouldn't, such as move unexpectedly into the highway; but that might not have been enough to avoid a collision in this particular case. (I once avoided running over a bicyclist because I noted a) that the bicyclist looked a bit wobbly and b) that we were on a bit of road with multiple bars on it, and the bicyclist might well have just come out of one and so might be wobbly because drunk. I slowed down considerably, and was therefore able not to hit the person when they fell over right in front of my car. If I'd been going the speed limit I'd almost certainly have gone right over them.)

Self-driving cars will have to be at least as good as a good human driver who's in good shape and paying attention at the moment. If they're only as good as a bad human driver, or as an otherwise good one who's distracted or tired, that really isn't going to be good enough.

#55 - 22 March 2018, 02:53 PM
iskinner (Sacramento, CA)

Quote:
Originally Posted by thorny locust
Didn't she come into the road from the median, not from a pedestrian path?
Yes, the reports are that she came from the median. But if the Google Maps images are correct, there are "decorative" pathways on that particular median with "Don't use" signs posted adjacent to them. That is why whoever designed that piece of roadway should share some of the blame.

All your points are reasonable, but the apparent tone that this isn't happening or won't happen seems odd to me. I am reasonably sure that many good engineering minds are going over all the data recorded by the car, figuring out exactly what it sensed and how it reacted, and how that can be improved on for future iterations of self-driving vehicles.

That is a huge advantage of such a system: once a lesson is learned and improvements are implemented, they can be applied to all future vehicles, whereas with humans, the lessons have to be learned over and over again.

#56 - 22 March 2018, 03:28 PM
Beachlife! (Lansing, MI)

Here's a video of the actual accident. I'm shocked at how invisible she is until she steps out of the shadow. https://www.cnn.com/videos/cnnmoney/...-orig.cnnmoney
I'm also wondering why the pedestrian appears to be entirely oblivious to the car.

#57 - 22 March 2018, 03:45 PM
iskinner (Sacramento, CA)

I'm guessing the pedestrian thought she was visible, not realizing that she was actually outside the effective range of the nearby streetlight and did not become visible until she was in range of the vehicle's headlights. The dark clothes with no reflective aids did not help either.

I would like to know what type of sensors the car uses and how they worked in this situation. I would hope they were augmented with sensors that use wavelengths not restricted to the narrow range of visible light.

On the other hand, it is apparent that the human safety driver was not looking forward down the street until a moment before the collision. That does not look good, though it is unfortunately very human.

#58 - 22 March 2018, 04:10 PM
thorny locust (Upstate NY)

Quote:
Originally Posted by iskinner
Yes, the reports are that she came from the median. But if the Google Maps images are correct, there are "decorative" pathways on that particular median with "Don't use" signs posted adjacent to them.
Ah. I didn't realize that. Yes, making pathways and then expecting people not to use them, signs or no signs, is not a bright idea.

Quote:
Originally Posted by iskinner
All your points are reasonable, but the apparent tone that this isn't happening or won't happen seems odd to me.
Sorry if I sounded like that -- I realize that they are investigating, and didn't mean to give the impression that I thought they won't do so.

Quote:
Originally Posted by iskinner
I'm guessing the pedestrian thought she was visible, not realizing that she was actually outside the effective range of the nearby streetlight and did not become visible until she was in range of the vehicle's headlights. The dark clothes with no reflective aids did not help either.
I haven't watched the video, which I gather includes the actual impact; but the still taken from it that illustrated the article I saw about its release does make it look as if the bicycle itself showed up well in the headlights -- but also as if the car's headlights weren't illuminating much of anything to the left side of the car, which is the direction the pedestrian appears to have been coming from. If that's accurate, and not just a false impression left by that one still, better headlights would have been a good idea. And I wonder what area the sensors were designed to cover? Did they also leave a blind spot to the left side of the car?

Many people who don't drive have a really poor understanding of how little the driver of a car can see in the dark. There are lots of things readily visible to a pedestrian in low-light conditions that a driver is highly unlikely to see. But since it's impractical to make everyone who doesn't drive spend enough time in a simulator to understand this, design both of roadways and of these cars needs to take it into account. I expect the designers will probably now do so. I don't know whether they were trying to do so before.
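Some rough arithmetic makes the "how little a driver can see" point concrete: at 38 mph a car covers about 17 meters every second, and with a commonly assumed perception-reaction time and hard braking on dry pavement, stopping takes roughly 46 meters, which is on the order of what ordinary low beams illuminate. The reaction time, deceleration, and low-beam figure below are generic assumptions, not measurements from this crash.

Code:
# Rough stopping-distance arithmetic. The reaction time, deceleration and
# low-beam reach are generic assumptions, not figures from this crash.
speed_mph = 38.0
speed_ms = speed_mph * 0.44704   # ~17.0 m/s
reaction_time_s = 1.5            # commonly assumed perception-reaction time
decel_ms2 = 7.0                  # hard braking on dry pavement
low_beam_reach_m = 50.0          # assumed usable low-beam range

stopping_m = speed_ms * reaction_time_s + speed_ms**2 / (2 * decel_ms2)
print(f"~{stopping_m:.0f} m to stop from {speed_mph} mph "
      f"vs ~{low_beam_reach_m:.0f} m of assumed low-beam reach")
# ~46 m to stop vs ~50 m of low-beam reach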

#59 - 22 March 2018, 04:14 PM
RichardM (Las Cruces, NM)

Headlights in the US and other countries where vehicles travel on the right side of the road are deliberately designed not to illuminate far to the left, so as to avoid glare in the eyes of oncoming traffic.

#60 - 22 March 2018, 04:19 PM
erwins (Portland, OR)

We also tend to take video as evidence of "reality" when it may not be. What is the evidence that the video accurately shows how visible she was? If, for example, the video tends to get washed out by oncoming headlights, it could be set to a "darker" setting.

And, again, all of the talk of visibility may not mean much, depending on what sensors the car was supposed to be relying on.

I hope they are pulling the cell phone records of the human driver. It sure looks like they were looking down at a phone to me. Part of what these companies say they are doing to make these real world experiments safe is to have human drivers there to back up the tech. Uber already got in trouble in Colorado for hiring felons. (In trouble in CO because it was illegal there for them to hire felons. So they moved to a state where it wasn't illegal. This safety driver was a felon.) Hiring people who can't be bothered to actually watch the road ought to get the whole thing shut down until they can show that they take safety seriously.

ETA: Thorny locust, it does not show the moment of impact in the above link, but shows the view to the front of the vehicle from a few seconds before the person becomes visible in the video until a split second before the impact, I would say. It also shows video from a camera facing the safety driver, and you can see they are looking down, and then they look up, look surprised, and move to take control of the car.

Last edited by erwins; 22 March 2018 at 04:32 PM.