snopes.com > Non-UL Chat > Business Bytes
#21
13 June 2018, 08:54 AM
Hans Off
Join Date: 14 May 2004
Location: West Sussex, UK
Posts: 4,600

Quote:
Originally Posted by GenYus234 View Post
That's not the case currently. Drivers do have to intervene for safety or other reasons, even when the system is otherwise fully functional. The numbers vary tremendously, partially because there is no standard for reporting. Waymo cars need driver safety intervention about every 5,600 miles and Cruise every 1,250.
I wonder what the threshold needs to be before they are objectively considered “Safe” from a risk perspective? (or when they become safer than human drivers)

At the rate of AI development, I speculate that we cannot be that far off; probably much less than a decade.

Selling the concept of them being “safe” to a subjective public/legislative/political viewpoint is another thing entirely!
#22
13 June 2018, 09:29 AM
Richard W
Join Date: 19 February 2000
Location: High Wycombe, UK
Posts: 26,294

Quote:
Originally Posted by GenYus234 View Post
Waymo cars need driver safety intervention about every 5,600 miles and Cruise every 1,250.
Didn't it turn out that Waymo had been defining "need" in that sentence not to mean when the driver actually intervened, but only when the driver intervened and their own simulation said that they were right to do so? In other words they ignored the times when the driver intervened but their later simulation said that the car wouldn't have hit anything anyway... which seems to be begging the question a little.
#23
13 June 2018, 01:29 PM
ChasFink
Join Date: 09 December 2015
Location: Mineola, NY
Posts: 893

Quote:
Originally Posted by GenYus234 View Post
Also, military drones crash. According to the Smithsonian, there were 418 serious crashes of drones between 2001 and 2014, about 1/4 of those in training or testing. That's out of a fleet of about 10,000 drones; for comparison, there are about 263 million registered vehicles in the US.
Also, that's war we're talking about. You know, that thing where killing people and taking potentially fatal risks is considered normal. (Actually less so these days, but you get my drift.)
#24
13 June 2018, 03:42 PM
GenYus234
Join Date: 02 August 2005
Location: Mesa, AZ
Posts: 26,270

Quote:
Originally Posted by Hans Off View Post
I wonder what the threshold needs to be before they are objectively considered “Safe” from a risk perspective? (or when they become safer than human drivers)
I assume it would be based on accidents, injuries, and/or fatalities (both occupants and pedestrians) for every x number of miles driven. Ideally it would be based on similar locations and times, not just generalized. If a vehicle could be driven or set to self-drive, I would imagine it would be used more on long stretches of freeway where the driving is pretty boring. That type of driving would typically be safer for human drivers, so it might skew the numbers. On the flip side, numbers of DUIs and their attendant crashes would hopefully go down, which would improve the human-driver statistics and so could increase the time before auto-driving cars come out safer.

Quote:
Originally Posted by Richard W View Post
Didn't it turn out that Waymo had been defining "need" in that sentence not to mean when the driver actually intervened, but only when the driver intervened and their own simulation said that they were right to do so? In other words they ignored the times when the driver intervened but their later simulation said that the car wouldn't have hit anything anyway... which seems to be begging the question a little.
Probably. This report from Waymo to the California DMV (pdf) includes this section about disengagements:

Quote:
To help evaluate the safety significance of driver disengagements, we employ a powerful simulator program - developed in-house by our engineers - that allows the team to "replay" each incident and predict the behavior of the self-driving car (had the driver not taken control of it) as well as the behavior and positions of other road users in the vicinity (such as pedestrians, cyclists, and other vehicles). The simulator can also create thousands of variations on that core event so we can evaluate what would have happened under slightly different circumstances, such as our vehicle and other road users moving at different times, speeds, and angles. Through this process we can determine the events that have safety significance and should receive prompt and thorough attention from our engineers in resolving them. The rate of this category of disengagements declined from 0.16 disengagements per thousand miles to 0.13, representing a 19% reduction.
The California DMV's definition of what is reportable is:

Quote:
when a failure of the autonomous technology is detected, or
when the safe operation of the vehicle requires that the autonomous vehicle test driver disengage the autonomous mode and take immediate manual control of the vehicle.
If their software simulation said the disengagement wasn't necessary, then, according to the California DMV, it isn't reportable.
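(Quick sanity check on those rates, if I'm reading them right: 0.13 disengagements per thousand miles works out to roughly one every 7,700 miles and 0.16 to about one every 6,250 miles, so the "about every 5,600 miles" figure quoted earlier is in the same ballpark - presumably a different reporting period or counting rule. And (0.16 - 0.13) / 0.16 is about 0.19, which is where the 19% reduction comes from.)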
#25
14 June 2018, 10:16 AM
Richard W
Join Date: 19 February 2000
Location: High Wycombe, UK
Posts: 26,294

Here's a video of the way Tesla's "autopilot" behaves on a test track in a particular situation that I assume must be easily reproducible (because they've set it up to happen, and aren't surprised by it):

https://www.bbc.co.uk/news/av/busine...t-happens-next

Hardly an unfeasible situation, and it's happening in an ideal scenario of the kind that the autopilot is supposed to be designed for (highway driving in lanes on a level, open road with good visibility). Of course if that happened in reality, Tesla would blame the driver for not paying attention...
#26
14 June 2018, 12:09 PM
Don Enrico
Join Date: 05 October 2004
Location: Hamburg, Germany
Posts: 7,636

The video makes a very good point: If I'm expected to monitor the road and keep control at all times, it's not a self-driving car - it's a car with driving assistance systems.

Tesla shouldn't call the feature "autopilot". It isn't an autopilot.
#27
14 June 2018, 12:48 PM
Richard W
Join Date: 19 February 2000
Location: High Wycombe, UK
Posts: 26,294

It makes me wonder what it is supposed to be for, if not a device for letting the driver pay less attention to the road.

(eta) I agree that the Autopilot is misleadingly named, as people have been arguing for a while, but to be fair Tesla do distinguish between their Autopilot, Enhanced Autopilot and the "Full Self-Driving Capability" that they're already offering on their website as an optional extra for £2800. I wonder if you're allowed to assume that the "Full Self-Driving Capability" means you don't have to pay attention to the road? Or if people will assume that having paid for it, they will actually have it, and miss the part where it doesn't yet exist?

https://www.tesla.com/en_GB/models/design

Last edited by Richard W; 14 June 2018 at 12:55 PM.
#28
14 June 2018, 03:30 PM
erwins
Join Date: 04 April 2006
Location: Portland, OR
Posts: 12,194

Quote:
Originally Posted by GenYus234 View Post
I assume it would be based on accidents, injuries, and/or fatalities (both occupants and pedestrians) for every x number of miles driven. Ideally it would be based on similar locations and times, not just generalized. If a vehicle could be driven or set to self-drive, I would imagine it would be used more on long stretches of freeway where the driving is pretty boring. That type of driving would typically be safer for human drivers, so it might skew the numbers. On the flip side, numbers of DUIs and their attendant crashes would hopefully go down, which would improve the human-driver statistics and so could increase the time before auto-driving cars come out safer.
There is no fair comparison now, because the AV numbers are based on driving the cars in the best-case scenario. They don't drive in bad weather, they drive in geofenced, well-mapped areas, probably with more freeway driving (which humans have much better numbers on as well), and they probably give up control or are not driven in other difficult circumstances, like construction zones, detours, etc.
Humans, of course, drive in all kinds of conditions, and those difficult conditions are often a factor in human crashes.

Also, should we be comparing AVs to current human driving, or to humans driving cars with advanced safety features, like lane keeping, blind spot detection, and collision avoidance systems like emergency braking? Isn't that the alternative to future fully automated cars? Those systems are already available and will also be improving alongside AV development.
#29
14 June 2018, 03:32 PM
GenYus234
Join Date: 02 August 2005
Location: Mesa, AZ
Posts: 26,270

Quote:
Originally Posted by Richard W View Post
It makes me wonder what it is supposed to be for, if not a device for letting the driver pay less attention to the road.
It could (and should) be a system to "fill in the gaps" when a human driver has a lapse of concentration or is simply unable to respond because of human limitations. The failures are kind of similar to (but more severe than) what supposedly happened when anti-lock brakes were introduced. Drivers, used to pumping the brakes to prevent their tires from locking up, kept pumping the brakes on the anti-lock cars, preventing them from working most effectively. In some cases, the resulting stopping distance was greater than it would have been without anti-lock brakes.

Also agreed, the name "Autopilot" is extremely misleading.

ETA:
Quote:
Originally Posted by erwins View Post
Also, should we be comparing AVs to current human driving, or to humans driving cars with advanced safety features, like lane keeping, blind spot detection, and collision avoidance systems like emergency braking? Isn't that the alternative to future fully automated cars? Those systems are already available and will also be improving alongside AV development.
It is hard to say where the comparison should be made. Comparing to humans driving advanced safety cars might not be that useful because (as you said) the technologies that would cause self-driving cars to be safer would be incorporated into the advanced human-driven cars as well. While those features would lag behind by several years, we would want several years of data anyway for a reliable comparison. Effectively you'd be comparing a self-driving car to a car that has nearly all of the advantages of a self-driving car. I guess it depends on what the overall goal is. It seems that most manufacturers are pushing self-driving cars as a convenience or as a new model of car usage. The safety comparison is to justify why this should be allowed. So of course they would want to compare to a human in a plain car. If the goal is overall safety (which seems to be a hard sell for most manufacturers) then the comparison would be made to the best alternative, a human driver with full assist.

And of course, it will become murkier in the future as driving assists draw closer and closer to essentially auto driving.

Last edited by GenYus234; 14 June 2018 at 03:43 PM.
#30
14 June 2018, 03:50 PM
erwins
Join Date: 04 April 2006
Location: Portland, OR
Posts: 12,194

Autopilot may also just be incompatible with how human brains work. I mentioned this in more detail in the other recent thread, but our brains, for good reason, tend to stop paying attention to things that don't seem to require our attention. We can, in a sense "override" that, and deliberately pay attention to something that seemingly does not require it, but it is difficult, error prone, and we can't do it for very long without attention lapses.

Another thing about autopilot: Musk recently said, I think, that it was intended for use only on highways. But it is not limited to being turned on only on highways, even though the tech to do that is readily available.
#31
14 June 2018, 04:32 PM
GenYus234
Join Date: 02 August 2005
Location: Mesa, AZ
Posts: 26,270

Both fatal accidents where the Autopilot was in control were on highways*, so using such technology wouldn't have prevented either of them.

ETA: So, ironically, the Notopilot** *** on the Teslas might be safer if it was "detuned" so that it didn't keep to the lane so well, making drivers' brains pay more attention?

* Classic definition. If he meant freeway instead, then 50% of the fatal accidents were on the type of roadway it was designed for.
** Trademark applied for
*** Not-o-pilot? Naughopilot? Naught-opilot?
#32
14 June 2018, 04:37 PM
thorny locust
Join Date: 27 April 2007
Location: Upstate NY
Posts: 9,523

Quote:
Originally Posted by GenYus234 View Post
It seems that most manufacturers are pushing self-driving cars as a convenience or as a new model of car usage. The safety comparison is to justify why this should be allowed.
It seems to me that there are two potential large advantages, if they can actually get fully self-driving to work:

-- 1) if the self-driving systems become actually safer than human drivers, they would save lives. (Doesn't sound to me like they're there yet.)

-- 2) if they could become fully self-driving on all types of roads and in varying weather conditions, this would allow transportation for quite a lot of people who can't or shouldn't be driving: children; adults with medical conditions and/or age-related conditions that make driving unsafe; exhausted people; drunk people . . . (They're not there yet, either. I hope they get there before I have to stop driving.)

Quite a lot of people in the USA, and to varying extents elsewhere, have no access to good public transportation that will take them when and where they need to go.

Quote:
Originally Posted by erwins View Post
Autopilot may also just be incompatible with how human brains work.
Yes. I think there may be a sort of equivalent to the 'uncanny valley' problem, and the technology's in it at the moment. A certain amount of assistance makes the car safer -- an alarm that tells you you're about to back up over somebody, antilock brakes, etc. An automatic system that could actually entirely drive the car could potentially also be safer. But there's an in-between ground that seems to be making the car more dangerous, in that it requires continuous human alertness combined with generally nothing to be alert for. Humans are really bad at that.

And I think that the general societal tendency for short-term competitive business thinking, and also to Adopt Every New Thing Quick And Deal With The Problems Later, may be causing the companies involved to try to release technology that's in that dangerous valley, instead of sorting out and only releasing the tech that's on one side of the valley, and spending the additional years needed to develop technology that's out on the other side before it's released onto the public roads.
#33
14 June 2018, 04:45 PM
erwins
Join Date: 04 April 2006
Location: Portland, OR
Posts: 12,194

Quote:
Originally Posted by GenYus234 View Post
Both fatal accidents where the Autopilot was in control were on highways*, so using such technology wouldn't have prevented either of them.

ETA: So, ironically, the Notopilot** *** on the Teslas might be safer if it was "detuned" so that it didn't keep to the lane so well, making drivers' brains pay more attention?

* Classic definition. If he meant freeway instead, then 50% of the fatal accidents were on the type of roadway it was designed for.
** Trademark applied for
*** Not-o-pilot? Naughopilot? Naught-opilot?
But there have been multiple (many? I haven't looked for numbers) crashes of Teslas with autopilot engaged where the Tesla hit the back of a stationary car on surface streets. Most recently in the news was a police car and also recently a fire engine. That was what prompted the statement about highway only use. This was the statement from Tesla: "Autopilot is designed for use on highways that have a center divider and clear lane markings[.]" I was remembering Musk saying essentially the same thing.

ETA: (responding to your ETA) No, detuning is not what I would think would be safer. Many manufacturers have decided that the commercially available technology needs to skip the phase that AVs being tested now are in -- once you go beyond driver-assist technology to where the car is driving most of the time, then the car has to be responsible for handling emergencies. It could still have a failure mode in non-emergencies where it comes to a safe stop and hands over control, but you can't expect to hand over control in an emergency to a person who was not previously in control of the car.

Last edited by erwins; 14 June 2018 at 04:54 PM.
#34
14 June 2018, 05:13 PM
Richard W
Join Date: 19 February 2000
Location: High Wycombe, UK
Posts: 26,294

Quote:
Originally Posted by thorny locust View Post
And I think that the general societal tendency for short-term competitive business thinking, and also to Adopt Every New Thing Quick And Deal With The Problems Later, may be causing the companies involved to try to release technology that's in that dangerous valley...
That seems quite clear from the fact that Tesla's already selling "Full Self-Driving Capability" which means "All you will need to do is get in and tell your car where to go." (See their website below)

https://www.tesla.com/en_GB/models/design

Annoyingly you can't cut and paste the text which makes it hard to quote, but under the disclaimers it says "It is not possible to know exactly when each element of the functionality described above will be available, as this is highly dependent on local regulatory approval".

Several things strike me from that (apart from that they're selling something that doesn't exist yet):

1. They're admitting that "the functionality" isn't a single thing, and it's composed of many elements, but they're still presenting it as a unit called "Full Self-Driving Capability".

2. They're putting the unpredictability and lack of availability entirely onto the regulatory process, and not onto the readiness of the technology.

3. It's possible in future that this "Full Self-Driving Capability" is going to mean different things in different places, even for customers who think they've bought the same thing.

Point 3 makes me think of Tesla making similar arguments about crashes as they do now, but refining them: "Of course this customer using Full Self-Driving Capability in the UK should have been paying attention to the road. Our Full Self-Driving Capability was never intended to be used in the UK, even though we're selling it there. Full Self-Driving Capability works only in California, Nevada and other jurisdictions that have legally allowed it"... or something like that...

They also haven't left themselves many places to go with the name, if Full Self-Driving Capability turns out not to mean that either. Will the next version be "No, We Really Mean It This Time, This One Really Can Drive Itself Capability"?

(eta)

Quote:
Originally Posted by erwins View Post
That was what prompted the statement about highway only use. This was the statement from Tesla: "Autopilot is designed for use on highways that have a center divider and clear lane markings[.]" I was remembering Musk saying essentially the same thing.
Did you watch the video I linked above? It shows a Tesla on Autopilot driving straight into the back of another "car" (it's actually a test dummy car that collapses when hit), under what should be ideal highway conditions, apart from the stationary car. And presumably it's easily reproducible, because it's deliberately set up as a test scenario in the video. (Specifically, Autopilot is behind another, easily visible black car, which pulls out to drive around a stationary white car in the same lane. Autopilot apparently is unable to see the white car and drives straight into it without slowing.)

Last edited by Richard W; 14 June 2018 at 05:24 PM.
#35
14 June 2018, 05:40 PM
thorny locust
Join Date: 27 April 2007
Location: Upstate NY
Posts: 9,523

Quote:
Originally Posted by Richard W View Post
That seems quite clear from the fact that Tesla's already selling "Full Self-Driving Capability" which means "All you will need to do is get in and tell your car where to go." (See their website below)

https://www.tesla.com/en_GB/models/design

Annoyingly you can't cut and paste the text which makes it hard to quote
Also, they seem either to believe themselves, or to want to imply to the customers, that getting there from where they are is only a matter of adding more cameras.
#36
14 June 2018, 06:55 PM
erwins
Join Date: 04 April 2006
Location: Portland, OR
Posts: 12,194

Richard, I haven't had the chance to watch the video yet, but part of Tesla's response to these accidents (I'll get a link to where I grabbed the quote from) has been exactly what you pointed out -- they have put this exact scenario in the car's manual as something Tesla Autopilot can't do reliably. (I guess that should say, reliably can't do, though that's not exactly what the manual says.)

The manual says
Quote:
Traffic-Aware Cruise Control cannot detect all objects and may not brake/decelerate for stationary vehicles, especially in situations when you are driving over 50 mph (80 km/h) and a vehicle you are following moves out of your driving path and a stationary vehicle or object is in front of you instead. Always pay attention to the road ahead and stay prepared to take immediate corrective action. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death.
From this article (first I found that had it) https://www.google.com/amp/s/mashabl...ry-crashes.amp

Also as you point out, would a driver engaging "Autopilot" know that "Traffic Aware Cruise Control" is a component of Autopilot? I'm not sure what the interface is, and whether there is a single Autopilot control to engage.

Last edited by erwins; 14 June 2018 at 07:00 PM.
#37
14 June 2018, 09:32 PM
GenYus234
Join Date: 02 August 2005
Location: Mesa, AZ
Posts: 26,270

So basically, Autopilot is designed for the freeway but only the idealized parts of it?
#38
14 June 2018, 10:02 PM
ganzfeld
Join Date: 05 September 2005
Location: Kyoto, Japan
Posts: 23,627

They have a good idea what their cars can and can't do. They seem to have far less understanding about what ordinary drivers can and can't understand and do.
#39
14 June 2018, 11:20 PM
BoKu
Join Date: 20 February 2000
Location: Douglas Flat, CA
Posts: 3,885

Quote:
Originally Posted by GenYus234 View Post
So basically, Autopilot is designed for the freeway but only the idealized parts of it?
In other words, is it libertarian?
#40
14 June 2018, 11:53 PM
Richard W
Join Date: 19 February 2000
Location: High Wycombe, UK
Posts: 26,294

That reminds me of a developer I used to know. "It's not a bug, it's clearly documented!"