#201 | 09 May 2018, 05:26 AM | GenYus234 (Mesa, AZ)

Stop in place, then turn control over to a waiting human drone pilot who can remotely pilot the taxi to safety.
#202 | 09 May 2018, 01:28 PM | Alarm (Nepean, ON)

Quote:
Originally Posted by GenYus234
Stop in place, then turn control over to a waiting human drone pilot who can remotely pilot the taxi to safety.
I guess "Stop, Drop and roll" is out of the question if there's a fire?

#203 | 09 May 2018, 02:17 PM | GenYus234 (Mesa, AZ)

Probably marginally more useful than "duck and cover".
#204 | 10 May 2018, 08:00 PM | erwins (Portland, OR)

I have been thinking about some of our original debates about AI vehicles, and about the case some have made that real-world testing is simply a necessary step, one we have to accept along with the fact that there will be AI-caused crashes.

In the abstract, I agree that, if we are going to have fully AI vehicles, there will have to be real-world, not closed-environment, development. And there will be crashes. I've also consistently said that we have to be prepared for the fact that AI crashes will look very different from human driver crashes.*

But I think this crash and the Tesla crashes show strongly that this technology is not at the real-world testing stage yet. It is, to me, an incomprehensible failure of imagination that controlled-environment testing did not include these scenarios. How could the tuning of which sensor data can be ignored be allowed to cover something on the scale of 2 m²? If the car is ignoring that, it is also not "aware" of what's behind it. And it would have been tuned this way to avoid unnecessary slowing or stopping. If you can't avoid a pedestrian and a bike without the car slowing and stopping so much that it is a problem, then that vehicle is not safe to be on public streets.

The Tesla crashes are similarly disturbing (regardless of the fact that a human is supposed to be paying attention). You have a system that steers the car by following the lane markings on the road, and you have obviously not thoroughly tested that system in construction-zone scenarios? There is no reason that could not be done on closed courses until you are satisfied that commonly occurring scenarios of confusing or multiple lane markings can either be handled or result in a safe failure mode like stopping, and do not result in a car that drives straight into a concrete barrier because it is following a lane line that leads there.
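For what a safe failure mode could look like, here's a rough sketch of the idea (invented names and thresholds, nothing to do with Tesla's actual code): a lane-follower that treats low-confidence or conflicting markings as a failure, and slows or hands back control rather than committing to one line.

Code:
# Hypothetical sketch only -- not Tesla's Autopilot code.
# The point: ambiguous lane markings should trigger a safe failure
# (alert, slow, stop) rather than a guess at which line to follow.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class LaneLine:
    offset_m: float    # lateral offset of the detected line, in metres
    confidence: float  # detector confidence, 0..1

MIN_CONFIDENCE = 0.8      # below this, a detected line is untrustworthy
MAX_DISAGREEMENT_M = 0.5  # old and new paint shouldn't diverge more than this

def steering_target(lines: List[LaneLine]) -> Optional[float]:
    """Return a lateral target, or None to signal the safe failure mode."""
    confident = [l for l in lines if l.confidence >= MIN_CONFIDENCE]
    if not confident:
        return None  # nothing trustworthy to follow
    offsets = [l.offset_m for l in confident]
    if max(offsets) - min(offsets) > MAX_DISAGREEMENT_M:
        return None  # conflicting markings, e.g. a repainted work zone
    return sum(offsets) / len(offsets)

def control_step(lines: List[LaneLine]) -> str:
    target = steering_target(lines)
    if target is None:
        return "alert driver, slow down, stop if nobody takes over"
    return f"steer toward {target:+.2f} m"

# A work zone where faded old paint and fresh new paint disagree:
print(control_step([LaneLine(-0.2, 0.90), LaneLine(0.6, 0.85)]))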

All autonomous vehicles that are on the roads now ought to have been tested in those conditions until they can safely negotiate them one way or another. If they can't, they should be in closed course testing until they can.

*This crash is a prime example -- If the released information is accurate, the vehicle sensors detected the pedestrian and her bike, but the software had been tuned to ignore certain sensor data. (As it would have to, in order to not stop every time a leaf or plastic bag blew across the road). It was mis-tuned, so that it ignored the detection data from a person/bike/plastic bags combo. Because of that, the vehicle took no action, and drove "through" the pedestrian. A human would generally only act in that way if they never saw the person (like the "safety" driver, who was looking away from the road).
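As a purely hypothetical sketch of how that tuning can go wrong (invented names and thresholds, not Uber's actual software), consider a filter that discards "ignorable" detections by size and track stability; set the cut-offs carelessly and a pedestrian pushing a bike gets discarded along with the plastic bags:

Code:
# Hypothetical illustration -- not Uber's software.
from dataclasses import dataclass

@dataclass
class Detection:
    area_m2: float       # apparent size of the tracked object
    frames_tracked: int  # how long the track has been stable

# Thresholds tuned so the car doesn't brake for every leaf or bag.
IGNORE_BELOW_M2 = 2.0    # anything "smaller" than this is treated as debris
MIN_STABLE_FRAMES = 10   # short-lived tracks are treated as sensor noise

def should_ignore(d: Detection) -> bool:
    """True means the planner takes no action for this object."""
    return d.area_m2 < IGNORE_BELOW_M2 or d.frames_tracked < MIN_STABLE_FRAMES

# A pedestrian pushing a bike has an irregular, shifting outline, so a
# weak classifier may report a small area and keep restarting the track:
pedestrian_with_bike = Detection(area_m2=1.8, frames_tracked=4)
print(should_ignore(pedestrian_with_bike))  # True -- no braking, no swerve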

#205 | 10 May 2018, 10:54 PM | Richard W (High Wycombe, UK)

Quote:
Originally Posted by erwins
All autonomous vehicles that are on the roads now ought to have been tested in those conditions until they can safely negotiate them one way or another. If they can't, they should be in closed course testing until they can.
Absolutely - you've put it more clearly than I've managed to, or have seen it put anywhere else.

It's the same error that pseudo-scientists make when testing their "theories". It seems they've only been testing for success, not failure. Their test environments have been set up to be as easy as possible for the cars, and they've gone straight from that to "real world" without going through the hard bits in between, which might have shown up problems.

Put like this it sounds so obvious that it's hard to believe it's true, but nothing I've read has contradicted it - and all the descriptions I've seen of the automated testing, and even the real-world testing (always in the easiest locations) have seemed to deliberately emphasise it, as though it's a good thing!
#206 | 10 May 2018, 11:15 PM | RichardM (Las Cruces, NM)

Waymo has apparently been testing in a closed environment for some time now. Not sure what testing others have done. I do know that early DARPA contests for autonomous vehicles were done in closed environments.

http://www.govtech.com/fs/transporta...Test-Site.html
#207 | 10 May 2018, 11:19 PM | Richard W (High Wycombe, UK)

Yes, there must be some people doing it properly. The reason Uber and Tesla dominate the conversation is that they're the ones pushing it all out onto the roads much too quickly: deliberately and blatantly so in Uber's case, and, in Tesla's, by pretending that's not what they're doing at all. Uber is obviously worse, but I haven't a lot of respect for Tesla's attempts to market the benefits while simultaneously claiming they're not trying to do any such thing, either.
#208 | 10 May 2018, 11:37 PM | ganzfeld (Kyoto, Japan)

Quote:
Originally Posted by Richard W
Yes, there must be some people doing it properly.
We don't know because they aren't giving nearly enough details about what they're doing. The statistical evidence is poor, sparse, and inconclusive.
#209 | 11 May 2018, 12:24 AM | erwins (Portland, OR)

I think they've all done some amount of closed course testing, or are building on stuff that did. But that misses the point. The DARPA competition HAD to be a closed course because it needed to be made easy enough to be an achievable goal.

Closed course testing has been used to make things easier. So having done closed course testing at some point is not a virtue, and doesn't address my point. My point is that the cars need to stay in closed course testing until they can handle, at the very least, the kind of challenges that crop up all the time in the "real world." They should be throwing those challenges and others at the vehicles until they work out the solutions.

I suppose it is possible that a company could have a robust enough safety driver program that "real world" testing is made safe enough, and then closed course challenges could be set up to work on the issues that are observed. I don't think Waymo has that mindset, since they've announced plans to deploy cars without safety drivers soon. (There would be remote monitoring and the ability to take control remotely, and they won't say exactly how much the cars will be remotely controlled.)

And even Waymo shows the "we'll solve that problem later" mentality. They are testing in extremely fair-weather, unchallenging locations. Can these cars operate safely in precipitation? What happens if a remotely monitored vehicle encounters an unexpected rain storm?
#210 | 11 May 2018, 12:36 AM | BoKu (Douglas Flat, CA)

Quote:
Originally Posted by erwins
I have been thinking about some of our original debates about AI vehicles, and about the case some have made that real-world testing is simply a necessary step, one we have to accept along with the fact that there will be AI-caused crashes...
{rest snipped}

That sounds like a very effective expression of the situation.

Quote:
Originally Posted by Richard W
...Their test environments have been set up to be as easy as possible for the cars, and they've gone straight from that to "real world" without going through the hard bits in between, which might have shown up problems...
"The amateur practices until they get it right. The professional practices until they can't get it wrong."

--Bob K.
#211 | 24 May 2018, 02:19 PM | Psihala (Denver, CO)

The NTSB released the preliminary report this morning:

https://www.ntsb.gov/investigations/...010-prelim.pdf

~Psihala
#212 | 24 May 2018, 02:21 PM | Psihala (Denver, CO)

Aaaand, on the same day:

Uber self-drives robo-cars out of Arizona after fatal crash

Quote:
Uber is pulling its self-driving cars out of Arizona. The ride-sharing company's reversal was triggered by the recent death of a woman who was run over by one of its robotic vehicles while crossing a darkened street in a Phoenix suburb.
https://www.cbsnews.com/news/uber-en...aine-herzberg/

~Psihala
#213 | 22 June 2018, 06:38 PM | Psihala (Denver, CO)
Uber car's 'safety' driver streamed TV show before fatal crash: police

The safety driver behind the wheel of a self-driving Uber car in Tempe, Arizona, was streaming a television show on her phone until about the time of a fatal crash, according to a police report that deemed the March 18 incident “entirely avoidable.”

https://www.reuters.com/article/us-u...-idUSKBN1JI0LB