#1
Uber Technologies Inc suspended its pilot program for driverless cars on Saturday after a vehicle equipped with the nascent technology crashed on an Arizona roadway, the ride-hailing company and local police said.
http://www.nbcnews.com/tech/tech-new...-crash-n738591
#2
Are we really so obsessed with making real life like Star Trek meets The Jetsons that we need self-driving cars like tomorrow? Maybe if we weren't in such a rush they wouldn't crash so much.
#3
Are they crashing so much, or is it just that they get national news when they do? And, in this case, it was the human driver in the other vehicle who was responsible for the crash.
ETA: I know of three recent accidents where a self-driving car was involved: this one, which seems to be the fault of the human in the other car; the Tesla crash, which was the self-driving system's fault; and the bus-car accident in California, which would seem to be the bus driver's fault (the self-driving car had moved to the far right side of the lane to make a right turn, and the bus was attempting to pass in the same lane).
#4
I'm curious: is there any external marking that identifies an automated car as opposed to a non-automated one?
Unless there are going to be dedicated lanes for automated cars, then for the foreseeable future, while automated and human-controlled vehicles co-exist, the automation is going to have to be robust enough to handle the unpredictable situations that arise when human-controlled cars are around. ~Psihala
#5
Most of the ones I've seen have a large sensor package on the roof. Google's says "Self Driving Car" on the side as well. They are programmed to attempt to avoid crashes caused by other drivers, but, as with human drivers, there is only so much that can be done to avoid a crash.
Also, such avoidance behavior can cause its own set of problems. IMS, the first version of the cars had issues with four-way-stop intersections because they kept waiting for the other car to stop completely. Since many drivers creep, the self-driving car wouldn't go through the intersection.
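As a purely illustrative sketch of that creep deadlock (hypothetical logic and numbers, not anything Google has published): a rule that only counts the other car as stopped at exactly zero speed never fires if the driver keeps creeping, while a small tolerance avoids the stall.

    # Illustrative only: why "wait until the other car has fully stopped" can deadlock.
    def other_car_stopped_strict(speed_mps: float) -> bool:
        # Naive rule: the other car counts as stopped only at exactly 0 m/s.
        # A driver creeping at 0.2 m/s never satisfies this, so we never go.
        return speed_mps == 0.0

    def other_car_stopped_tolerant(speed_mps: float, threshold_mps: float = 0.5) -> bool:
        # More forgiving rule: anything below a small threshold counts as stopped.
        return speed_mps < threshold_mps

    creeping_speed = 0.3  # m/s, a driver inching forward at the stop line
    print(other_car_stopped_strict(creeping_speed))    # False -> the car waits indefinitely
    print(other_car_stopped_tolerant(creeping_speed))  # True  -> the car takes its turn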
#6
It seems like one thing human drivers do a lot better than computers is anticipating other drivers breaking the rules. The article lacks details, but it sounds as if the self-driving car had the right of way, so it proceeded and didn't anticipate that the other driver might not yield. Technically it was in the right, but a human driver might have been able to judge, based on the other driver's behavior, that he wasn't going to yield, and react accordingly.
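To make that anticipation point concrete, here is a minimal, hypothetical heuristic (the speeds, distance, and the assumed 4 m/s^2 deceleration are made up, and this is not Uber's actual logic): instead of assuming the other driver will yield just because we have the right of way, estimate whether they could still stop before the conflict point at their current speed and distance.

    # Hypothetical "will the other car yield?" check based on simple physics.
    def can_stop_in_time(speed_mps: float, distance_m: float,
                         comfortable_decel_mps2: float = 4.0) -> bool:
        # Distance needed to stop from speed v at deceleration a: v**2 / (2 * a).
        stopping_distance = speed_mps ** 2 / (2 * comfortable_decel_mps2)
        return stopping_distance <= distance_m

    # Other car approaching the intersection at 15 m/s, 20 m from the conflict point.
    if not can_stop_in_time(speed_mps=15.0, distance_m=20.0):
        print("Other driver probably can't or won't yield; slow down despite having right of way.")
    else:
        print("Other driver can still yield; proceed with caution.")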
#7
The Google car accident with the bus was admitted by Google, last I read, to be the fault of the self-driving car. The car had recently "learned" that it could drive along the far right of the lane to make a right turn when other traffic ahead had stopped. But there was a sandbag in the road toward the curb, and the car couldn't get by. When traffic started moving again, it tried to merge back into the lane, and it apparently "expected" the bus in that lane to give way. The Google car literally (at very slow speed) drove into the side of the bus.
#8
For the Tesla, "fault" is too strong because, as you say, it was not supposed to be used as a fully autonomous driving system. Tesla has said that there is a technical issue that contributed to the accident.
Google said that their system was partially at fault because it assumed the bus would yield. You are correct that the Google car moved over to make a right-hand turn. But the Google car was being passed by the bus in the same lane. IOW, the bus was going to move itself into the same lane space that another vehicle was already in. I would say that means the bus was the vehicle with the majority of the responsibility to avoid a crash with the vehicle already in the lane.
#9
It doesn't sound like an accident that would have happened if both drivers were human.
#10
The legal aspect might depend on how far over the Google car was, and how much it had already nosed back into the center of the lane. If a car has pulled to the curb and stopped, it may be regarded, at times, as an obstruction to be driven past or around, and not a car occupying a "lane space" in the normal sense. If the car then decides to pull back into traffic, it may be required to merge rather than expect other cars to yield. Google said this as part of its report:
#11
If you are referring to the Uber accident, it sounds like the Uber car was driving down the road, not at an intersection, and that the other driver turned in front of it. I can't find detailed descriptions, but the accident happened at speed, which wouldn't be the case if the Uber car had stopped at an intersection. Having a human driver in the other car probably wouldn't have made a difference.
#12
It was at an intersection. The other car made a left turn into the Uber vehicle. If they hit it hard enough to put it on its side, it's almost as if they didn't see it at all. Unless I'm missing something, the other driver (non-Uber) was clearly at fault.
#13
#14
It's very hard to tell from the vague reports, but it seemed to me that the Uber car was proceeding through an intersection, but one that was not controlled in its direction of travel. So the car that hit the Uber car was on a side street, and the Uber car had no reason to stop. If I'm right, it is the kind of accident that some human drivers would avoid, based on seeing the other car approaching and correctly predicting that it was not going to stop (or seeing it begin its turn, or whatever clues there might have been), but not all humans would predict that level of inattention on the part of the other driver.
Sometimes we all have to rely on other drivers to follow the rules; if we didn't, we would never get anywhere.
#15
I think you can attribute much of the rush to driverless cars to competition between the various companies involved as much as anything else. They all want to be the first to be able to declare, "Eureka, we've done it!!"
#16
* It doesn't help that some of these groups have been using some of those bogus stats in PR. For example, claiming that an intelligent cruise control is safer than a human driver because it has fewer accidents on average, while failing to recognise that cruise control is only used in very specific situations. My gosh, even if it has an order of magnitude fewer accidents than the average for the general road, that doesn't make it a safe cruise control system!
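As a toy illustration of that exposure mismatch (all numbers invented), a per-mile rate computed only over easy highway cruising can look several times better than the human average over all driving while saying nothing about safety on the general road:

    # Invented numbers, purely to illustrate the exposure mismatch.
    human_accidents = 2000
    human_miles = 1_000_000_000        # all road types, all weather, all conditions

    cruise_accidents = 50
    cruise_miles = 100_000_000         # almost entirely easy highway cruising

    human_rate = human_accidents / human_miles * 1_000_000    # per million miles
    cruise_rate = cruise_accidents / cruise_miles * 1_000_000

    print(f"Human rate:  {human_rate:.1f} accidents per million miles (all conditions)")
    print(f"Cruise rate: {cruise_rate:.1f} accidents per million miles (highway only)")
    # The cruise rate looks 4x better, yet it says nothing about city streets,
    # bad weather, or intersections, where the system was never engaged.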
#17
Just to be clear, with apologies for double posting: I do think that under good conditions, and compared to the same conditions for humans, they will look OK, in many if not most cases as good as or better than humans. But there are certain conditions under which they cannot yet drive at all, let alone like a human. If we knew all that, I think more states would be willing to draw up rules for how and when they can be used, and we would have some place to start. Just be completely honest and open; most people want this. Instead, these companies have opted to keep pounding the media with what amounts to nothing more than a campaign of promises of a better world. So whenever the slightest thing goes wrong, people reject those slogans and bogus stats. That is a reasonable reaction when they have nothing but articles written from press releases that are full of ad copy rather than actual data.
Let's have some kind of standard for keeping records of self-driving and intelligent-assisted driving, and agreements to publish that data. Then independent researchers can make more or less valid comparisons. I think there will be some big surprises. I don't think things will be as far along as many claim. Yet I think people will see that it's a good idea to start making rules and laws to make it easier to safely develop and operate these vehicles.
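One possible shape for such records, sketched only to show the idea (the field names are assumptions, not any existing reporting standard):

    # Hypothetical record format for publishable self-driving / driver-assist data.
    from dataclasses import dataclass

    @dataclass
    class DrivingRecord:
        vehicle_id: str
        mode: str            # "autonomous", "assisted", or "manual"
        miles: float
        road_type: str       # "highway", "urban", "rural"
        weather: str         # "clear", "rain", "snow", ...
        disengagements: int  # times a human had to take over
        accidents: int

    records = [
        DrivingRecord("AV-001", "autonomous", 120.5, "urban", "clear", 2, 0),
        DrivingRecord("AV-001", "assisted", 300.0, "highway", "rain", 0, 0),
    ]

    # Recording conditions alongside mileage lets researchers compare like with like
    # instead of relying on headline averages.
    urban_av_miles = sum(r.miles for r in records
                         if r.mode == "autonomous" and r.road_type == "urban")
    print(urban_av_miles)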
#18
https://www.theverge.com/2018/3/19/1...-tempe-arizona
Uber halts self-driving tests after pedestrian killed in Arizona [There seems to be some confusion about whether she was a pedestrian or a bicycle rider.]
#19
I've always figured that the biggest challenge to driverless cars is the liability issue, not any technical challenge. Car makers really don't want the liability for crashes. Even if the driverless car is not at fault 99.9% of the time, the liability for that 0.1% is huge because of the car maker's deep pockets.
A related issue is insurance. Will insurers cover self-driving (full-time or part-time) cars? What will the rates be? Where are the accident statistics the insurers need to set the rates?
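A back-of-the-envelope illustration of why that 0.1% still scares manufacturers (all figures invented):

    # Invented figures, just to show the scale of the exposure.
    crashes_involving_fleet_per_year = 500_000
    share_where_automation_at_fault = 0.001     # the 0.1%
    average_settlement_usd = 2_000_000          # deep-pockets defendants pay more per claim

    expected_liability = (crashes_involving_fleet_per_year
                          * share_where_automation_at_fault
                          * average_settlement_usd)
    print(f"Expected annual liability: ${expected_liability:,.0f}")
    # 500 at-fault crashes x $2M each = $1,000,000,000 a year,
    # even though the cars were "not at fault" 99.9% of the time.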
#20
I don't know that it would be that different from existing liability law. I believe most states hold that the parent of a 16- or 17-year-old is still ultimately financially responsible for damages even though the child is deemed to have sufficient decision-making capability with regard to driving. The owner of an animal is responsible for any damage it does, even though an animal has a great deal of autonomy and decision-making capability. The manufacturer would probably be held liable only if the car did not meet whatever ends up being the industry standard for self-driving cars, or if there was a defect in the car.
For example, drum brakes have long been accepted on the rear wheels of a vehicle despite the fact that disc brakes usually provide better stopping power. A car manufacturer that uses drum brakes is generally not going to be held liable even if an accident could have been avoided had the car had disc brakes. A similar standard would probably hold true for autonomous cars. If the car meets the standard for avoidance of objects approaching at oblique angles between x and x′, then an accident involving an object approaching at x′ + 20° will probably not be considered the manufacturer's fault.
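In code, that kind of standard would amount to little more than a range check (hypothetical thresholds, using the x to x′ range above as the set of approach angles the standard covers):

    # Hypothetical liability test against an industry avoidance standard.
    def angle_within_standard(approach_angle_deg: float,
                              covered_min_deg: float = 0.0,
                              covered_max_deg: float = 60.0) -> bool:
        # The standard only obliges the car to avoid objects approaching
        # within this range of oblique angles.
        return covered_min_deg <= approach_angle_deg <= covered_max_deg

    print(angle_within_standard(45.0))  # True: inside the covered range, so the maker could be on the hook
    print(angle_within_standard(80.0))  # False: 20 degrees past the range, probably not the maker's fault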