• Welcome! The TrekBBS is the number one place to chat about Star Trek with like-minded fans.
    If you are not already a member then please register an account and join in the discussion!

Google Cars Drive Themselves, in Traffic

Snaploud

Admiral
Google Cars Drive Themselves, in Traffic


Dmitri Dolgov, a Google engineer, in a self-driving car parked in Silicon Valley after a road test.
By JOHN MARKOFF

Published: October 9, 2010

MOUNTAIN VIEW, Calif. — Anyone driving the twists of Highway 1 between San Francisco and Los Angeles recently may have glimpsed a Toyota Prius with a curious funnel-like cylinder on the roof. Harder to notice was that the person at the wheel was not actually driving.


The car is a project of Google, which has been working in secret but in plain view on vehicles that can drive themselves, using artificial-intelligence software that can sense anything near the car and mimic the decisions made by a human driver.



With someone behind the wheel to take control if something goes awry and a technician in the passenger seat to monitor the navigation system, seven test cars have driven 1,000 miles without human intervention and more than 140,000 miles with only occasional human control. One even drove itself down Lombard Street in San Francisco, one of the steepest and curviest streets in the nation. The only accident, engineers said, was when one Google car was rear-ended while stopped at a traffic light.



http://www.nytimes.com/2010/10/10/science/10google.html?_r=2&hp

Interesting article. Here's where we're going to run into some trouble, though:

But the advent of autonomous vehicles poses thorny legal issues, the Google researchers acknowledged. Under current law, a human must be in control of a car at all times, but what does that mean if the human is not really paying attention as the car crosses through, say, a school zone, figuring that the robot is driving more safely than he would?



And in the event of an accident, who would be liable — the person behind the wheel or the maker of the software?
 
Just saw this article on slashdot. Pretty neat how this tech has progressed to the point where they're actually testing it out on the roads in traffic.

I wonder if they've come across a situation where they had to make way for an emergency vehicle and how the car reacted in that case. It's been my experience that most real drivers are utterly clueless about what to do.
 
It's always the lawyers that spoil the pot. Think how much more efficient it would be if it was all computer controlled.
 
It's fine to "blame lawyers", but who's going to pay for the damages caused by a malfunctioning automated driving system? The insurance company? The car company? The driver who was reading his newspaper when his car careened through a red light or a stop sign? Will we still expect people behind the wheel to be sober? Why? If they're paying attention like they should be, there would be no need for automated drivers, would there be?

The more complex a system is, the more things can go wrong. Toyota can't even get its cars to maintain a constant speed, but we want freaking robots to take the job over completely? Not while I can grasp a steering wheel on my own, thanks very much. People need to buck up and take responsibility for their actions. People are too stupid to drive responsibly and put the damned phones away or slow down when it's wet, and instead of forcing them to be better drivers, we'll add thousands of dollars in hardware and software to every car. If you want someone else to drive you, take public transportation or a taxi.
 
And in the event of an accident, who would be liable — the person behind the wheel or the maker of the software?

That was the first thing I thought when reading this. I think Google's just setting itself up for liability more than anything else. Unless they can make sure this software is better than people, it's always going to be scary because of the chance that it malfunctions.
 
People are too stupid to drive responsibly and put the damned phones away or slow down when it's wet and instead of forcing them to be better drivers, we'll add thousands of dollars in hardware and software to every car. If you want someone else to drive you, take public transportation or a taxi.
Amen and hallelujah to that!
 
It's fine to "blame lawyers", but who's going to pay for the damages caused by a malfunctioning automated driving system? The insurance company? The car company? The driver who was reading his newspaper when his car careened through a red light or a stop sign? Will we still expect people behind the wheel to be sober? Why? If they're paying attention like they should be, there would be no need for automated drivers, would there be?

The more complex a system is, the more things can go wrong. Toyota can't even get its cars to maintain a constant speed, but we want freaking robots to take the job over completely? Not while I can grasp a steering wheel on my own, thanks very much. People need to buck up and take responsibility for their actions. People are too stupid to drive responsibly and put the damned phones away or slow down when it's wet, and instead of forcing them to be better drivers, we'll add thousands of dollars in hardware and software to every car. If you want someone else to drive you, take public transportation or a taxi.

Agree to disagree, I guess. That's the reason why I'd rather go all automated at the stroke of midnight, tonight. Driving safely isn't enough when it's the other guy who ruins your day... or life. People aren't responsible and machines are. You could argue machines deal with chaos less gracefully, but have you seen the way people drive? A converging lane is to 90% of drivers what a trigonometric substitution integration problem is to a grammar school student.

Maybe I'm too cynical. Maybe I'm too untrusting. Maybe I'm just too against the idea of giving everyone two ton death machines based only on a simple test that never has to be repeated and takes less time than a dentist appointment. Who knows.
 
The point is people have the possibility of reacting; a malfunctioning machine cannot. In the end, it's still your responsibility to drive safely, not your car's. Would you agree that, if the car that drove itself suddenly stopped receiving signals from other things on the road, it would be your responsibility to take the wheel or apply the brake? If not, would Google be responsible for the machine malfunctioning? Should they be responsible for a car accident 500 miles away because they sold somebody something they thought would work, but it didn't?

I'm fine either way, but somebody would have to take responsibility, no? If not, should it simply be the innocent other party who gets hit who is responsible? You talk about the other guy ruining your day or your life. Today, he's at least responsible. If all cars become automated, who is? If his automated car broadsides you and puts you in the hospital, who pays your bills?
 
Next James Bond villain should be the head of Google. :wtf:
Evil world domination plan. Know everything about everyone and control almost every part of everybody's life, including driving cars. ;)


A converging lane is to 90% of drivers what a trigonometric substitution integration problem is to a grammar school student.

You'd be surprised how complicated it is for a machine to handle a converging lane. There are problems humans can solve in an instant that take a computer ages to calculate. So far, humans remain vastly superior to computers when it comes to controlling vehicles of any sort. The amount of visual, acoustic and haptic information a human processes in a second could never be processed by current computers in time. I'd trust a DRUNK driver more than I'd trust a machine because of that.
 
They're on the west coast. Have they tried these things in a New England winter? What if the sensors get covered in snow? Does the car rely on GPS, and what happens if it enters a GPS-denied area?

I'm glad they're making so much progress on these things, but there are still lots of potential pitfalls to address.
 
It's fine to "blame lawyers", but who's going to pay for the damages caused by a malfunctioning automated driving system? The insurance company? The car company? The driver who was reading his newspaper when his car careened through a red light or a stop sign? Will we still expect people behind the wheel to be sober? Why? If they're paying attention like they should be, there would be no need for automated drivers, would there be?

The more complex a system is, the more things can go wrong. Toyota can't even get its cars to maintain a constant speed, but we want freaking robots to take the job over completely? Not while I can grasp a steering wheel on my own, thanks very much. People need to buck up and take responsibility for their actions. People are too stupid to drive responsibly and put the damned phones away or slow down when it's wet, and instead of forcing them to be better drivers, we'll add thousands of dollars in hardware and software to every car. If you want someone else to drive you, take public transportation or a taxi.

Agree to disagree, I guess. That's the reason why I'd rather go all automated at the stroke of midnight, tonight. Driving safely isn't enough when it's the other guy who ruins your day... or life. People aren't responsible and machines are. You could argue machines deal with chaos less gracefully, but have you seen the way people drive? A converging lane is to 90% of drivers what a trigonometric substitution integration problem is to a grammar school student.

Maybe I'm too cynical. Maybe I'm too untrusting. Maybe I'm just too against the idea of giving everyone two ton death machines based only on a simple test that never has to be repeated and takes less time than a dentist appointment. Who knows.

I'm an industrial controls engineer. I work with many systems that have fewer inputs and outputs than something as complex as a self-driving car. Our hardware and software costs hundreds of thousands or even millions of dollars, and it requires constant adjustment and maintenance to operate within tolerances.

People who think you can put millions of self-driving cars on the road without accidents are fooling themselves. Again, when those accidents occur, who's at fault?

I reiterate: in order for the system to work, the driver still needs to pay attention to the road. If they're paying attention to the road, why do they need an automated system?

Continuing to improve safety features like stability control, lane guidance, and emergency brake assist, along with driver education, is what will make the roads safer, not relying on technology to do 100% of the driving.
 
What worries me about this is eventually the reliability will become such that people won't be allowed to drive their cars without using automation.

Personally, I think technology should be used to assist the driver when it's reasonable to do so, not replace him...
 
So you're worried about things getting too safe?

Where do you get the idea that taking away control from humans results in safety? There are so many situations, especially in traffic, where a piece of software can't make the right decision.
 
All it would take is one tiny glitch, and you could be dead in the middle of a 10-car pile-up. I do not trust technology well enough yet for an automated car to be a good idea.
 
 
I think you people have far too much faith in the average driver, and human intelligence in general.

But that's just what I think. I'll be first in line to never bore myself to death whilst driving via marvelous automation.
 
It's always the lawyers that spoil the pot. Think how much more efficient it would be if it was all computer controlled.

Agreed.

As much as I love my car, I would love to not need to drive. Whether it be driverless cars or Personal Rapid Transit, I don't really care...driving is overrated. People are too attached to their damned cars. They think they want power and control, but forget that driving is a means, not an end.
 
I think you people have far too much faith in the average driver, and human intelligence in general.
No, I just have less faith in technology.

As much as I love my car, I would love to not need to drive. Whether it be driverless cars or Personal Rapid Transit, I don't really care...driving is overrated. People are too attached to their damned cars. They think they want power and control, but forget that driving is a means, not an end.

I just think driving is fun, and I would hate to give it up.
 