Discussion in 'Science and Technology' started by Into Darkness, Sep 9, 2013.
The theory is they could compensate for any lack of anticipation with a faster reaction time.
Strange coincidence, this story is running today:
Nevermind, any examples I give would be complete nonsense to you as well.
For those who can't afford a car or just don't like driving, what's needed is better, more flexible public transportation. Like "people mover" rail systems that use small, driverless, computer-directed pods instead of trams or trains.
Personally, I find the idea of a car that drives itself about as appealing as swallowing food pills instead of eating actual food.
So, when do I get my robot butler?
I'll take that as you conceding the point.
A hijacking. A toxic spill. Any situation where the best thing to do is put the pedal to the metal and get out of there. Actually it's not that the computer couldn't respond, but I suspect it would respond wrong.
Everyone's been saying that in an emergency, the car would just pull over to the side and brake. Is it going to understand that there are times when that's MORE dangerous?
I don't think anyone is proposing a system that can't be overridden.
Yeah, if a driver slams on the gas pedal, the system should take that as a "get the hell out of here" command.
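In practice that could be as simple as a dwell-time check on the pedal, so a brief mis-tap doesn't hand over control. A rough sketch, with made-up signal names and thresholds (nothing from any real car's software):

```python
# Toy sketch of a driver-override check, not any vendor's actual logic.
# Signal names and thresholds are invented for illustration.

FLOOR_IT_THRESHOLD = 0.95   # fraction of full pedal travel
OVERRIDE_HOLD_TIME = 0.5    # seconds the pedal must stay floored

class DriverOverrideMonitor:
    def __init__(self):
        self.floored_since = None

    def should_yield_control(self, throttle_position, now):
        """Return True once the driver has held the pedal floored long enough
        that the automation should hand control back."""
        if throttle_position >= FLOOR_IT_THRESHOLD:
            if self.floored_since is None:
                self.floored_since = now
            return (now - self.floored_since) >= OVERRIDE_HOLD_TIME
        self.floored_since = None
        return False
```

Whether half a second is the right hold time is exactly the kind of thing that would need real-world testing; the point is only that "pedal to the floor" can be treated as an explicit override signal.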
A computer's not going to be able to handle every emergency, but it's going to do a better job of handling most emergencies than a human ever could. The emergencies where it's likely to fail are ones that humans are at least as bad at avoiding, if not worse. Poor traction situations are probably the best example of just how bad humans are at coping with dangerous conditions.
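To make the traction point concrete: an anti-lock braking loop re-evaluates wheel slip many times per second, which no driver pumping the pedal can match. A deliberately simplified sketch (the numbers are illustrative, not from any production system):

```python
# Simplified ABS-style brake modulation, for illustration only.
# Real systems use per-wheel hydraulics and far more sophisticated control.

TARGET_SLIP = 0.15  # rough slip ratio where a tire grips best

def wheel_slip(vehicle_speed, wheel_speed):
    """Braking slip ratio: 0 = rolling freely, 1 = fully locked."""
    if vehicle_speed <= 0:
        return 0.0
    return (vehicle_speed - wheel_speed) / vehicle_speed

def modulate_brake(requested_pressure, vehicle_speed, wheel_speed):
    """Back off brake pressure when the wheel starts to lock up."""
    if wheel_slip(vehicle_speed, wheel_speed) > TARGET_SLIP:
        return requested_pressure * 0.7  # release, let the wheel spin back up
    return requested_pressure
```

Run that loop a couple of hundred times a second and the car keeps braking near the grip limit while a panicked human is still standing on a locked pedal.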
Do you think that driverless car technology would be able to do a better job in situations such as icy roads than a human can? I suppose that's when the variables [and programming the car for such situations] become very complicated.
For example, sometimes when driving in icy conditions and the car starts to slip in one direction, the solution is to counter-steer rather than steer in the direction the vehicle is moving.
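For what it's worth, that "steer against the slide" idea is roughly what electronic stability control already approximates: it compares how fast the car is actually rotating with how fast the driver's steering input says it should be rotating, and works against the difference. A toy version of that comparison (hypothetical sensor names, made-up gain):

```python
# Toy yaw-rate correction in the spirit of stability control; illustrative only.
# Convention: positive yaw rate / steering angle = rotation to the left.

YAW_GAIN = 0.5  # made-up proportional gain

def counter_steer_correction(intended_yaw_rate, measured_yaw_rate):
    """If the car rotates faster than the driver asked for (a slide starting),
    return a steering correction in the opposite direction."""
    yaw_error = measured_yaw_rate - intended_yaw_rate
    return -YAW_GAIN * yaw_error
```

A real system brakes individual wheels rather than steering for you, but the underlying comparison is the same, and it runs whether or not the driver remembers which way to counter-steer.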
No argument with either point. But it's hard to know where to draw the line between where a machine should be responsible and where a human must be. I'd hate for that line to be drawn by the lawyers.
Certainly not, and see Silvercrest's post pointing out that you're contradicting yourself.
That's another point. Who is responsible when a car on autopilot runs over a kid who suddenly appeared between two parked cars?
Despite Maxwell's nonsense talk and Lindley's reaction time argument, there are many situations that cannot be avoided solely by faster reaction times. The driver needs foresight and anticipation, and an AI that advanced is far away. And the idea that the computer should interpret a slam on the gas pedal as a "get the hell out of here" command is nonsense as well, since there have been many cases where drivers hit the wrong pedal in a panic. So the AI also has to properly interpret what the driver meant in a dangerous situation? Not feasible. You cannot remove the human element, and you cannot make sure that you catch all human errors. You can enhance safety with driver assistance systems, but that's all. You will never be able to remove the driver's responsibility.
Then perhaps we shouldn't trust humans to operate kitchen appliances that can reach temperatures of 500+ degrees and potentially cause serious burns or start fires. Or maybe people shouldn't be allowed to use power tools that can possibly injure, cripple and maim.
Henry Ford let the genie out of the bottle more than 100 years ago. It ain't going back in.
There was a NY Times article about that. If cars were invented today, and you told people how many are killed in traffic accidents, how much pollution they cause, and all the other side effects, they would tell you to fuck off.
Cars are a perfect example of the boiling-frog effect. We slowly got used to TENS OF THOUSANDS of deaths per year in each country from car accidents. We are absolutely okay with that, because it developed slowly over 100 years.
You aren't impressed with the technology that you can get in a 2014 Mercedes S-Class? The car - for all intents and purposes - drives itself. And that tech is available today, provided you have ~$92,000 to lay down on a new car.
Oh, I am impressed. But it doesn't drive itself. In a slow moving traffic jam situation, yeah. That's it. And a few assistance systems for parking, active cruise control, and pedestrian warning, and stuff. But those are not autonomous, you certainly can't let the car handle it all on its own.
Oh boy, I hope nobody buys these cars and just sits back and relaxes. Like those idiots who blindly follow their navigation system and try to drive over a bridge that doesn't exist.
I'm actually going to agree with this. Any automated system should be treated like an airplane's autopilot: available as a tool, but not in any way alleviating the pilot's responsibility for the safe completion of the flight.
It may come about that someday we can take things farther than that, but we have a ways to go before we get there. In the meantime, let's just get as much driver-assist technology in place as we can.
One negative side effect of driver assistance systems is that people basically unlearn how to drive. Rely on your parking assistance too often, for example, and you lose the ability to park the car properly if the system ever fails.
Also, relying on your assistance systems makes you pay less attention and increases your reaction time when something goes wrong; that's been shown as well.
That ship sailed a long time ago: automatic transmissions, power steering, anti-lock brakes, cruise control, traction/stability control, etc. There's so much that separates the driver from the actual mechanical behavior of the car at this point that it's almost a joke to still call it "driving." The more automated cars get, the more absurd the idea of "driving" yourself becomes.
My original sentence should have ended with "that a human could respond to better." Does that settle the contradiction for you and JarodRussell?
A loss of vehicle electrical power. That happens, a lot, and then you've got a 2,000 pound ballistic object traveling at 70 mph, just as if the driver got shot in the head by a sniper.
A system crash. Suppose the blue screen of death really did often end in death?
A blown tire or hydroplaning, ending in a big ugly wreck. The automobile company would then be liable for not anticipating and reacting correctly.
Hacking. With a driverless car, a car bomb is automatically upgraded to a smart bomb.
Modern power steering and power brakes give the driver plenty of feedback as to what the car is doing. The power assist just takes some of the physical work out of driving.
I've never owned a car with cruise control, and never will. And I drive a stick shift.