snopes.com > Non-UL Chat > Business Bytes
#1
26 March 2017, 03:50 PM
Psihala
Join Date: 28 February 2001
Location: Denver, CO
Posts: 7,387

Uber Suspends Self-Driving Car Program After Arizona Crash

Uber Technologies Inc suspended its pilot program for driverless cars on Saturday after a vehicle equipped with the nascent technology crashed on an Arizona roadway, the ride-hailing company and local police said.

http://www.nbcnews.com/tech/tech-new...-crash-n738591
#2
27 March 2017, 11:26 AM
ChasFink
Join Date: 09 December 2015
Location: Mineola, NY
Posts: 349

Are we really so obsessed with making real life like Star Trek meets The Jetsons that we need self-driving cars like tomorrow? Maybe if we weren't in such a rush they wouldn't crash so much.
#3
27 March 2017, 02:38 PM
GenYus234
Join Date: 02 August 2005
Location: Mesa, AZ
Posts: 24,242

Are they crashing so much, or is it just that they get national news when they do? And, in this case, it was the human driver in the other vehicle who was responsible for the crash.

ETA: I know of three recent accidents where a self-driving car was involved: this one, which seems to be the fault of the human in the other car; the Tesla crash, which was the self-driving system's fault; and the bus-car accident in California, which would seem to be the bus driver's fault (the self-driving car had moved to the far right side of the lane to make a right turn and the bus was attempting to pass in the same lane).

Last edited by GenYus234; 27 March 2017 at 02:45 PM.
#4
27 March 2017, 03:13 PM
Psihala
Join Date: 28 February 2001
Location: Denver, CO
Posts: 7,387

I'm curious: is there any outside marking that distinguishes an automated car from a non-automated car?

Unless there are going to be dedicated lanes for automated cars, then for the foreseeable future, while automated and human-controlled vehicles co-exist, the automation is going to have to be robust enough to handle the unpredictable situations that arise when human-controlled cars are around.

~Psihala
#5
27 March 2017, 03:36 PM
GenYus234
Join Date: 02 August 2005
Location: Mesa, AZ
Posts: 24,242

Most of the ones I've seen have a large sensor package on the roof. Google's say "Self Driving Car" on the side as well. They are programmed to attempt to avoid crashes caused by other drivers, but, as with human drivers, there is only so much that can be done to avoid a crash.

Also, such avoidance behavior can cause its own set of problems. IMS, the first versions of the cars had issues with four-way stop intersections because they kept waiting for the other car to stop completely. Since many drivers creep, the self-driving car wouldn't go through the intersection.
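The four-way-stop stalemate described above can be sketched in a few lines. This is a toy model with made-up names and thresholds, not any vendor's actual control logic: a rule that requires the other car's speed to be exactly zero never fires against a creeping driver, while a small creep tolerance breaks the deadlock.

```python
# Toy model of the four-way-stop problem (hypothetical, purely illustrative).

CREEP_THRESHOLD_MPH = 1.0  # speeds below this are treated as "stopped"

def may_proceed_strict(other_speed_mph: float) -> bool:
    """Early behavior: only go once the other car has *completely* stopped."""
    return other_speed_mph == 0.0

def may_proceed_with_creep_tolerance(other_speed_mph: float) -> bool:
    """Revised behavior: treat slow creeping as an effective stop."""
    return other_speed_mph < CREEP_THRESHOLD_MPH

# A human driver creeping forward at 0.5 mph:
creeping = 0.5
print(may_proceed_strict(creeping))                # strict rule never proceeds
print(may_proceed_with_creep_tolerance(creeping))  # tolerant rule proceeds
```

The point of the sketch is that the bug isn't in perception at all; it's in an over-literal model of what "stopped" means.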
#6
27 March 2017, 05:00 PM
WildaBeast
Join Date: 18 July 2002
Location: Folsom, CA
Posts: 14,568

It seems like one thing human drivers do a lot better than computers is anticipating other drivers breaking the rules. The article lacks details, but it sounds as if the self-driving car had the right of way, so it proceeded and didn't anticipate that the other driver might not yield. Technically it was in the right, but a human driver might have been able to judge, based on the other driver's behavior, that it looked like he wasn't going to yield, and react accordingly.
#7
27 March 2017, 05:45 PM
erwins
Join Date: 04 April 2006
Location: Portland, OR
Posts: 11,356

Quote:
Originally Posted by GenYus234 View Post
Are they crashing so much, or is it just that they get national news when they do? And, in this case, it was the human driver in the other vehicle who was responsible for the crash.

ETA: I know of three recent accidents where a self-driving car was involved: this one, which seems to be the fault of the human in the other car; the Tesla crash, which was the self-driving system's fault; and the bus-car accident in California, which would seem to be the bus driver's fault (the self-driving car had moved to the far right side of the lane to make a right turn and the bus was attempting to pass in the same lane).
The Tesla crash was not, or not entirely, the Tesla's fault, since it is not a system where the human driver is supposed to take their hands off the wheel, let alone stop paying attention to a semi pulling across the road.

The Google car accident with the bus was admitted by Google, last I read, to be the fault of the self-driving car. The car had recently "learned" that it could drive in the far right of the lane to make a right turn when other traffic ahead had stopped. But, there was a sandbag in the road toward the curb, and the car couldn't get by. When traffic started moving again, it tried to merge back into the lane, and it apparently "expected" the bus in that lane to give way. The Google car literally (at very slow speed) drove into the side of the bus.

Last edited by erwins; 27 March 2017 at 05:53 PM.
#8
27 March 2017, 06:07 PM
GenYus234
Join Date: 02 August 2005
Location: Mesa, AZ
Posts: 24,242

For the Tesla, "fault" is too strong, because you are correct that it isn't supposed to be used as a fully autonomous driving system. Tesla has said that there is a technical issue that contributed to the accident.

Google said that their system had partial fault because it assumed the bus would yield. You are correct that the Google car moved over to make a right-hand turn. But the Google car was being passed by the bus that was in the same lane. IOW, the bus was going to move itself into the same lane space that another vehicle was already in. I would say that meant that the bus was the vehicle with the majority of responsibility to avoid a crash with the vehicle already in the lane.
#9
27 March 2017, 06:12 PM
Beachlife!
Join Date: 22 June 2001
Location: Lansing, MI
Posts: 27,981

Quote:
Originally Posted by WildaBeast View Post
...Technically it was in the right, but a human driver might have been able to judge, based on the other driver's behavior, that it looked like he wasn't going to yield, and react accordingly.
It's hard to say, though. Technically it was in the right, but is it possible it paused longer than a human driver would, implying that it was giving up its right of way the same way human drivers seem to do at four-way stops all the time?

It doesn't sound like an accident that would have happened if both drivers were human.
#10
27 March 2017, 06:48 PM
erwins
Join Date: 04 April 2006
Location: Portland, OR
Posts: 11,356

Quote:
Originally Posted by GenYus234 View Post
For the Tesla, "fault" is too strong, because you are correct that it isn't supposed to be used as a fully autonomous driving system. Tesla has said that there is a technical issue that contributed to the accident.

Google said that their system had partial fault because it assumed the bus would yield. You are correct that the Google car moved over to make a right-hand turn. But the Google car was being passed by the bus that was in the same lane. IOW, the bus was going to move itself into the same lane space that another vehicle was already in. I would say that meant that the bus was the vehicle with the majority of responsibility to avoid a crash with the vehicle already in the lane.
Google said that it is the norm in that spot for right-turning cars to pull to the right, and for cars proceeding forward to drive past them. The lane is two car-widths across. The Google car also came to a complete stop while pulled to the curb. I would be surprised if most drivers would expect that they were required to yield to the Google car in those circumstances. I'm not sure about the legality, but it certainly happens constantly. Many drivers would let the car back over, but not all, and I'm not sure they would be required to.

The legal aspect might depend on how far over the Google car was, and how much it had already nosed back into the center of the lane.

If a car has pulled to the curb and stopped, it may be regarded, at times, as an obstruction to be driven past or around, and not a car occupying a "lane space" in the normal sense. If the car then decides to pull back into traffic, it may be required to merge rather than expect other cars to yield.

Google said this as part of its report:

Quote:
This is the social norm because a turning vehicle often has to pause and wait for pedestrians; hugging the curb allows other drivers to continue on their way by passing on the left. It's vital for us to develop advanced skills that respect not just the letter of the traffic code but the spirit of the road.

On February 14, our vehicle was driving autonomously and had pulled toward the right-hand curb to prepare for a right turn. It then detected sandbags near a storm drain blocking its path, so it needed to come to a stop. After waiting for some other vehicles to pass, our vehicle, still in autonomous mode, began angling back toward the center of the lane at around 2 mph and made contact with the side of a passing bus traveling at 15 mph. Our car had detected the approaching bus, but predicted that it would yield to us because we were ahead of it.
#11
27 March 2017, 06:50 PM
GenYus234
Join Date: 02 August 2005
Location: Mesa, AZ
Posts: 24,242

Quote:
Originally Posted by Beachlife! View Post
It's hard to say, though. Technically it was in the right, but is it possible it paused longer than a human driver would, implying that it was giving up its right of way the same way human drivers seem to do at four-way stops all the time?
If you are referring to the Uber accident, it sounds like the Uber car was driving down the road, not at an intersection, and that the other driver turned in front of it. I can't find detailed descriptions, but the accident happened at speed, which wouldn't be the case if the Uber car had stopped at an intersection. Having a human driver in the other car probably wouldn't have made a difference.
#12
27 March 2017, 07:04 PM
Beachlife!
Join Date: 22 June 2001
Location: Lansing, MI
Posts: 27,981

It was at an intersection. The other car made a left turn into the Uber vehicle. If they hit it hard enough to put it on its side, it's almost as if they didn't see it at all. Unless I'm missing something, the other driver (non-Uber) was clearly at fault.
#13
27 March 2017, 09:42 PM
Skeptic
Join Date: 16 July 2005
Location: Logan, Queensland, Australia
Posts: 1,739

Quote:
Originally Posted by WildaBeast View Post
It seems like one thing human drivers do a lot better than computers is anticipating other drivers breaking the rules. The article lacks details, but it sounds as if the self-driving car had the right of way, so it proceeded and didn't anticipate that the other driver might not yield. Technically it was in the right, but a human driver might have been able to judge, based on the other driver's behavior, that it looked like he wasn't going to yield, and react accordingly.
If I'm in a situation where I'm dependent on the other person to yield, I only go if I can see the driver looking at me. I'm not sure a self-driving car can check another driver's level of attentiveness.
#14
27 March 2017, 09:55 PM
erwins
Join Date: 04 April 2006
Location: Portland, OR
Posts: 11,356

It's very hard to tell from the vague reports, but it seemed to me that the Uber car was proceeding through an intersection that was not controlled in its direction. So the car that hit the Uber car was on a side street, and the Uber car had no reason to stop. If I'm right, it is a kind of accident that some human drivers would avoid, based on seeing the other car approaching and making a correct prediction that it was not going to stop (or seeing it begin its turn, or whatever clues there might have been), but not all humans would predict that level of inattention on the part of the other driver.

Sometimes we all have to rely on other drivers to follow the rules--if we didn't, we would never get anywhere.
#15
05 April 2017, 11:43 PM
DrRocket
Join Date: 03 February 2006
Location: Rosemount, MN
Posts: 2,108

Quote:
Originally Posted by ChasFink View Post
Are we really so obsessed with making real life like Star Trek meets The Jetsons that we need self-driving cars like tomorrow? Maybe if we weren't in such a rush they wouldn't crash so much.
I think you can attribute much of the rush to driverless cars to competition between the various companies involved as much as anything else. They all want to be the first to be able to declare, "Eureka, we've done it!"
#16
06 April 2017, 12:11 AM
ganzfeld
Join Date: 05 September 2005
Location: Kyoto, Japan
Posts: 22,951

Quote:
Originally Posted by GenYus234 View Post
Are they crashing so much, or is it just that they get national news when they do?
We don't know, because none of them have released any detailed information about how much they've been driving and (the next parts are extremely important) with what kind of operators, in what kind of traffic, in what weather, on what road conditions, etc. Without that kind of detailed info, all comparisons are bogus.*

For the most part, the comparisons that have been made are definitely skewed in favour of them having a low accident rate, because of conditions they will practically never see in real life: dry weather, extremely well-known roads, engineers who know in detail how the cars are supposed to work, etc.

Also, what about near-miss data? We hear about accidents. I've taught a few people how to drive. If I waited for them to have accidents, it would be too late. What they taught me when I learned to drive is to notice when something goes even a little wrong. Ten near misses is very different from two. We have no idea what the real stats are, because every company is holding its cards way too close for an issue that affects everyone.

* It doesn't help that some of these groups have been using some of those bogus stats in PR. For example, claiming that an intelligent cruise control is safer than a human driver because it has fewer accidents on average fails to recognise that cruise control is only used in very specific situations. My gosh, even an order of magnitude fewer accidents on average wouldn't make it a cruise control system that is safe for the general road!
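The exposure problem described above can be shown with a few made-up numbers (everything below is hypothetical, invented purely to illustrate why headline rates mislead): a fleet that logs only dry-highway miles can post a better overall accidents-per-mile figure than human drivers, while offering no evidence at all about the conditions it never drove in.

```python
# Hypothetical accident logs: condition -> (miles driven, accidents).
fleet = {
    "dry highway": (1_000_000, 2),
    "rain, city":  (0, 0),   # the fleet never drives these miles
}
humans = {
    "dry highway": (1_000_000, 5),
    "rain, city":  (1_000_000, 20),
}

def overall_rate(log):
    """Headline accidents-per-mile figure, ignoring conditions entirely."""
    miles = sum(m for m, _ in log.values())
    accidents = sum(a for _, a in log.values())
    return accidents / miles

print(f"fleet headline rate: {overall_rate(fleet):.2e} accidents/mile")
print(f"human headline rate: {overall_rate(humans):.2e} accidents/mile")

# Condition-by-condition, the comparison looks very different:
for cond in fleet:
    fleet_miles, fleet_acc = fleet[cond]
    human_miles, human_acc = humans[cond]
    if fleet_miles == 0:
        print(f"{cond}: no fleet exposure at all -- no comparison possible")
    else:
        print(f"{cond}: fleet {fleet_acc/fleet_miles:.2e}/mile "
              f"vs human {human_acc/human_miles:.2e}/mile")
```

The headline rate makes the fleet look several times safer, but all of that gap comes from which miles were driven, not how well they were driven; with zero exposure in the hard stratum, nothing can be concluded about it.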
#17
06 April 2017, 01:49 AM
ganzfeld
Join Date: 05 September 2005
Location: Kyoto, Japan
Posts: 22,951

Just to be clear, with apologies for double posting: I do think that under good conditions, and compared to humans under the same conditions, they will look OK, in many if not most cases as good as or better than humans. But there are certain conditions under which they cannot yet drive at all, let alone like a human. If we knew all that, I think more states would be willing to draw up rules for how and when they can be used, and we would have some place to start. Just be completely honest and open; most people want this. Instead, these companies have opted to keep pounding the media with what amounts to nothing more than a campaign of promises of a better world. So whenever the slightest thing goes wrong, people reject those slogans and bogus stats. That is a reasonable reaction when they have nothing but articles written from press releases that are full of ad copy rather than actual data.

Let's have some kind of standard for keeping records of self-driving and intelligent assisted driving and agreements to publish that data. Then independent researchers can make more or less valid comparisons. I think there will be some big surprises. I don't think things will be as far along as many claim. Yet I think people will see that it's a good idea to start making rules and laws to make it easier to safely develop and operate these vehicles.