
California DMV Told Google Cars Still Need Steering Wheels

samzenpus posted about three weeks ago | from the keep-your-hands-on-the-wheel dept.

Transportation

cartechboy writes Google showed us what it believes is the car of the future. It drives itself, it doesn't have a gas or brake pedal, and there's no steering wheel. But that last one might be an issue. Back in May, California's Department of Motor Vehicles published safety guidelines aimed at manufacturers of self-driving vehicles. After seeing Google's self-driving car vision, the California DMV has told the company it needs to add all those things back to their traditional locations so that occupants can take "immediate physical control" of the vehicle if necessary. Don't for a second think this is a major setback for Google, as the prototypes unveiled weren't even close to production ready. While the DMV may loosen some of these restrictions in the future as we all become more comfortable with the idea of self-driving vehicles, there's no question when it comes down to the safety of those on the road.


Not surprising (5, Interesting)

gurps_npc (621217) | about three weeks ago | (#47758135)

California is playing it safe. It will take a while for us to trust the software enough to remove the steering wheel.

In fact, it would not surprise me at all if the brake itself is NEVER removed. I can easily foresee situations where these vehicles are used to transport unwilling people, or simply suffer a malfunction, and the occupant will always want the ability to stop the device.

But I can see the steering wheel and accelerator going away completely - you don't want untrained people having the ability to make things worse.

Re:Not surprising (-1)

The New Guy 2.0 (3497907) | about three weeks ago | (#47758209)

Most "driverless car" situations involve a human with controls sitting on the other side of a radio signal connection... if the human in the car and controller in the "studio" can't match, then they're going to end up fighting over the wheel, and it'll be impossible to determine the proper angle.

Re:Not surprising (-1)

Anonymous Coward | about three weeks ago | (#47758403)

Most "driverless car" situations involve a human with controls sitting on the other side of a radio signal connection

WTF?

Re:Not surprising (1)

beelsebob (529313) | about three weeks ago | (#47758675)

Uhhhhh... no. That's not how driverless cars work.

Re:Not surprising (4, Informative)

Wootery (1087023) | about three weeks ago | (#47758747)

Most "driverless car" situations involve a human with controls sitting on the other side of a radio signal connection...

Uh... no, no they most certainly do not [wikipedia.org] . Where are you getting this?

Re:Not surprising (4, Interesting)

TWX (665546) | about three weeks ago | (#47758225)

It's going to depend on who's allowed to use a self-driving car and under what conditions, and even so far as what seats are allowed to be occupied.

I can see a tiered system where licensed drivers with a normal operator's permit are always allowed to occupy the driver's seat in a vehicle capable of full manual control. I could see a special class of license for those who once held normal operator's permits but voluntarily gave them up (elderly, poor vision, etc.) so that they could basically pull the vehicle over in a crisis. There could also be a special class of license for learner's-permit holders that allows them to occupy that seat. Everyone else would be required to occupy any other seat unless all seats are occupied, in which case there would have to be conditions under which that seat could be occupied with the controls disabled.

Re:Not surprising (5, Interesting)

ShanghaiBill (739463) | about three weeks ago | (#47758795)

It's going to depend on who's allowed to use a self-driving car and under what conditions, and even so far as what seats are allowed to be occupied.

Under California law, a licensed driver must be seated in the "driver's seat", and must be paying attention (no yakking on the cell phone). These requirements won't be permanent, but at least for the first few years of SDCs, that is how it will be. Once a safety record is established and the public is more comfortable with the technology, the restrictions will be relaxed. In a decade or so, cars will likely be able to drive with no people on board, or even transport children with no adult in the car.

Re:Not surprising (4, Interesting)

jellomizer (103300) | about three weeks ago | (#47758279)

I agree, having a manual break should be required as a bare minimum.
Even if the software is perfect, if there is an unexpected power outage you will need a manual break to stop a car that may just be aimlessly coasting.

Re:Not surprising (4, Insightful)

TWX (665546) | about three weeks ago | (#47758331)

As Toyota demonstrated to us, that manual brake needs to be damn-near hardware-level too, or at least allow for an emergency override that interrupts the computer entirely if the main 'stop the car now' brake fails to work properly.

It would be terrifying to be in a self-driving runaway car without any controls whatsoever.

Re:Not surprising (2, Interesting)

Xoltri (1052470) | about three weeks ago | (#47758593)

The cause of the Toyota problem was people hitting the accelerator instead of the brake. http://www.caranddriver.com/fe... [caranddriver.com] So if you take away the accelerator pedal I think we're good to go.

Re:Not surprising (4, Informative)

TWX (665546) | about three weeks ago | (#47758749)

No, it wasn't [sddt.com], at least not in all cases. There were definite computer control problems that led to the computer getting stuck in a mode where it had the throttle applied and ignored the brake, shut-off, and gear-selector inputs, on the logic that since the throttle was applied, those other inputs must be erroneous.
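
To make that failure mode concrete, here is a minimal Python sketch of the kind of brake-override priority rule being described: safety inputs win over the throttle instead of being discarded as "erroneous". The signal names are invented for illustration; this is not any manufacturer's actual firmware.

# Illustrative sketch only: brake, shut-off, or gear-selector requests always
# take priority over the throttle signal, rather than being ignored while the
# throttle happens to be applied.

def resolve_throttle(throttle_request, brake_pressed, shutoff_requested,
                     neutral_requested):
    """Return the throttle command (0.0-1.0) after applying override rules."""
    # Safety-critical inputs win unconditionally.
    if brake_pressed or shutoff_requested or neutral_requested:
        return 0.0
    # Clamp a possibly faulty sensor reading to a sane range.
    return max(0.0, min(1.0, throttle_request))

# A stuck throttle reading of 0.9 is ignored once the brake is pressed.
assert resolve_throttle(0.9, brake_pressed=True, shutoff_requested=False,
                        neutral_requested=False) == 0.0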

Re:Not surprising (0)

Anonymous Coward | about three weeks ago | (#47758691)

I completely agree with the big red button or the big red lever to stop the car. But the restriction that occupants be able to take "immediate physical control" of the vehicle if necessary implies that they will require a licensed driver to be present in the vehicle and be fully capable of operating the vehicle. That is what is so troubling about this requirement.

Re:Not surprising (5, Interesting)

rossdee (243626) | about three weeks ago | (#47758499)

"I agree, having a manual break should be required as a bare minimum."

A manual brake would be even more useful, along with a kill switch for the engine.

Re:Not surprising (1)

tooslickvan (1061814) | about three weeks ago | (#47758617)

"I agree, having a manual break should be required as a bare minimum."

A manual brake would be even more useful, along with a kill switch for the engine.

No, a manual break, such as sledgehammer to the wheel or knife to the tire, will work just fine.

Re:Not surprising (0)

Anonymous Coward | about three weeks ago | (#47758377)

"It'll take awhile for us to trust the software"

That's a weird way to phrase it, wouldn't it rather be:
"It'll take awhile for the software to be sufficiently robust"

or

"It'll take awhile for the software to be debugged"

or any number of other things; I don't really think trust enters into it from a regulatory standpoint. The software/hardware haven't been fully tested, and as such, it's just poor design to think your failsafe mode is nothing more than an off button.

Re:Not surprising (2, Insightful)

Anonymous Coward | about three weeks ago | (#47758395)

Depends on what you call "Safe"

This is merely politically safe.

Realistically, these things are going to be packed to the gills with dozens of sensors covering thousands of metrics, and they will be logged every second. I'd be willing to bet that said data will show that the vast majority of accidents happen just after the driver takes control, and are a direct result of driver actions.

What I don't get is why bother with a traditional seating arrangement once you no longer have to drive? Fuck being upright, cramped, and crammed into the front of a car. I want to lounge back in comfort, read the news, catch up on email, etc.

Re:Not surprising (1)

machineghost (622031) | about three weeks ago | (#47758525)

I want to lounge back in comfort, read the news, catch up on email, etc.

And we'll be able to, eventually. These are just the very first set of rules for the very first automated cars; you can't go from Simpsons to Jetsons overnight.

Re:Not surprising (1)

poetmatt (793785) | about three weeks ago | (#47758719)

I get your idea, but we're going from simpsons to flintstones. So requiring a steering wheel is not exactly a step in the right direction.

Driverless (1)

DrYak (748999) | about three weeks ago | (#47758405)

In fact, it would not surprise me at all if the brake itself is NEVER removed.

That's the current situation with driverless trains/subs:
No cockpit in the front of the wagon, but you still have "emergency brake" levers everywhere.

Re:Not surprising (1)

bigpat (158134) | about three weeks ago | (#47758503)

I think California is playing it wrong and unsafe. I agree there needs to be a big red button on cars which brings the vehicle to a safe stop much like there is on passenger trains, but this move by California seems more like something pushed for by entrenched vested interests and not driven by safety considerations. Lives will be saved when we allow cars to go pick up people that can't drive, don't have licenses or don't want to drive themselves. The implication of this move is that a human driver is going to be responsible for the operation of the vehicle at all times. Rather it should be the manufacturer of the vehicle which is liable for any defects of the autonomous system when it is driving autonomously. And it should be an option moving forward, even a safety feature, to allow cars without manual driving options except for the big red button.

Re:Not surprising (4, Insightful)

pavon (30274) | about three weeks ago | (#47758513)

They may never be removed. Everyone is focused on the split-second decision scenario when talking about this issue, and on that I agree that humans will cause more problems than they solve. But there are many more situations where a manual override is needed and beneficial. What happens when the car runs out of gas/charge and you need to push it to the side of the road out of traffic? Or the computer is malfunctioning somehow (software bug, squirrel chewed halfway through a wire, dead battery/alternator)? Or when I need to move the car somewhere off-road that the AI refuses to recognize as a valid driving path? There are plenty of not-so-time-critical scenarios where some sort of manual override is needed, and those aren't going to go away even when we trust the software to do all the driving. Once we admit that the controls don't have to be intuitive for split-second reactions, then they don't have to retain the traditional layout, nor be designed for comfortable everyday use, but some sort of steering, brake control, and neutral gear select will always be needed.

Re:Not surprising (1)

disposable60 (735022) | about three weeks ago | (#47758611)

Like I'm going to even be looking out the windshield. If I and my partner are in an autonomous vehicle, odds are pretty good neither one of us will be paying attention to anything but each other, if you catch my drift.

Re:Not surprising (2)

gnasher719 (869701) | about three weeks ago | (#47758783)

What happens when the car runs out of gas/charge and you need to push it to the side of the road out of traffic?

What about the car driving to the side of the road out of traffic with the last bit of kinetic energy available? People might be stupid enough to drive until the tank is absolutely empty and be stuck, a driverless car wouldn't. And then there are driverless Diesel cars which most definitely won't run until the tank is empty, because that kind of thing is _expensive_.

Re:Not surprising (0)

Anonymous Coward | about three weeks ago | (#47758555)

Also, there are certain situations (like heavy rain) where the lasers and all can't get an accurate read on their surroundings. In production, people are going to want to have a manual override for those types of situations, at least for the near term.
Now, if the lasers can't see, then probably the human driver can't either, but the human can weigh the risks and take the chance if they so choose. Imagine you're on your own property, 1/2 a mile drive to go; you know there's no traffic, you know the road, you're going to creep on home. The software, rightly, is going to sit tight and let you make the call.

Re:Not surprising (0)

Anonymous Coward | about three weeks ago | (#47758563)

I am OK with this. My guess is it will be short-term, and it could actually help adoption by providing a comfort factor for buyers.

Re:Not surprising (1)

tverbeek (457094) | about three weeks ago | (#47758619)

One of the things that bugs me about so many high-tech devices is the lack of an "off" switch (and in the case of a vehicle, substitute "stop"). On ye olde personal computers, IBM put a big red paddle-switch that summarily deprived the electronics of electricity. Flip that, and it was OFF. (Even the clock.) These days, it's a button (and pretty soon just a contact-sensitive control spot) that asks the system to... not shut off, exactly, but to put itself into a low-power state in which it looks as if it were off. And I've had a few situations where the OS or firmware was so borked up that the only way to restart a device was to physically pull the plug. So for a computer-controlled device that has the physical ability to act as a lethal weapon, I don't think it's unreasonable to insist on a manual "stop" override.

Re:Not surprising (2)

mrchaotica (681592) | about three weeks ago | (#47758643)

I'd like to see self-driving cars have some kind of fail-safe mechanical brake (where power is required to hold the friction surfaces apart) and a big red button that cuts all power.

Re:Not surprising (1)

apraetor (248989) | about three weeks ago | (#47758655)

We need to start pushing for formal regulations with regard to what the cars will do when a collision between vehicles is inevitable. Should your car drive off a bridge, killing you, if it means saving a school bus full of kids? Probably. But I'd like to know how such failure modes are defined.

Re:Not surprising (1)

Rich0 (548339) | about three weeks ago | (#47758687)

In fact, it would not surprise at all if the brake itself is NEVER removed.

How is having a brake a safety feature when there is interleaved traffic crossing an intersection? If a car were to rapidly stop, it would get broadsided by another car expecting it to not stop.

I could see the brake sticking around until it is illegal to manually drive a car, however.

Re:Not surprising (1)

kheldan (1460303) | about three weeks ago | (#47758729)

I side with the DMV on this one; I don't ever want to see vehicles on public roads that have no manual controls whatsoever, it's just a bad idea.

Re:Not surprising (1)

NotDrWho (3543773) | about three weeks ago | (#47758771)

the occupant will always want the ability to stop the device.

I'm afraid I can't let you do that, Dave.

Flight controls instead? (1)

spiritgreywolf (683532) | about three weeks ago | (#47758153)

Personally I'd like to get a car that has controls similar to a jet fighter - or even more basic if it's all drive-by-wire anyway. Gimme a throttle lever in one hand, and a twist stick for proportional steering in the other - or combine them. More display room and less clutter of a wheel.

Re:Flight controls instead? (2)

TWX (665546) | about three weeks ago | (#47758353)

Chrysler experimented with a stick approach in the sixties; it really didn't work very well. A steering wheel that is capable of multiple revolutions allows for fine-grained control over steering, and the same goes for long-travel analog brake and throttle pedals.

Think back to playing with cheap radio-controlled cars: it was difficult to navigate tight courses because the cars couldn't steer accurately enough, and if they were really cheaply made, open-wheel types, breaking a control arm at a front wheel was a real possibility.

Re:Flight controls instead? (1)

gstoddart (321705) | about three weeks ago | (#47758531)

Saab did that once. It was universally panned as a terrible idea.

I'm betting there are just some things you wouldn't be able to do with that joystick, like controlling a skid in the snow.

Me, I'll stick with the old fashioned steering wheel. I know it works.

Your jet fighter controls? Not so much.

Re:Flight controls instead? (2)

istartedi (132515) | about three weeks ago | (#47758585)

Ejection seat, LOL.

Re:Flight controls instead? (0)

Anonymous Coward | about three weeks ago | (#47758697)

It's obvious that you have never flown an aircraft equipped with a stick. For steering a car it would be terrible. There's a reason you use the pedals to control direction when taxiing an aircraft as opposed to the stick...

It's preposterous (0)

Anonymous Coward | about three weeks ago | (#47758159)

Is the human driver going to be "responsible" for failing to take action if his autonomous car goes haywire and causes an accident? I mean, if the car, besides the AI, is going to have a steering wheel, gas pedal, brake pedal, and clutch pedal, what's the advantage for the human passenger? You'll end up babysitting the AI for the whole drive. No thanks, just use a normal car.

Re:It's preposterous (0)

The New Guy 2.0 (3497907) | about three weeks ago | (#47758237)

Liability law needs to be rewritten... whoever inputs the bad command should be responsible for the accident if there is one. Therefore, there must be "black box" logging telling where the command came from. If the analysts say the car control people sent the command, then they're the ones who have to pay.
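
A minimal sketch of that "black box" idea, assuming a simple append-only log that records which party issued each control command so analysts can later assign responsibility. The record fields and file name are invented for illustration, not any real standard.

import json
import time
from dataclasses import dataclass, asdict

@dataclass
class CommandRecord:
    timestamp: float   # seconds since the epoch
    source: str        # "autonomy", "remote_operator", or "occupant"
    command: str       # e.g. "steer", "brake", "accelerate"
    value: float       # magnitude (steering angle, fraction of max braking, ...)

def log_command(record, logfile="blackbox.jsonl"):
    """Append one command record as a line of JSON to the black-box log."""
    with open(logfile, "a") as f:
        f.write(json.dumps(asdict(record)) + "\n")

# Example: the occupant hits the emergency brake; the log shows who did it.
log_command(CommandRecord(time.time(), "occupant", "brake", 1.0))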

No question, really? (1)

Anonymous Coward | about three weeks ago | (#47758173)

What if I happen to be a much worse driver than the car's software and in "wresting control" away from the car I inadvertently cause an accident that the software could have avoided (or was in the process of doing)?

Re:No question, really? (0)

Anonymous Coward | about three weeks ago | (#47758455)

You're bound to be a worse driver. The American driving test is a joke.

Of course (4, Interesting)

Meneth (872868) | about three weeks ago | (#47758213)

Have they not seen "I, Robot" (2004)? Of course you need a manual override.

Horseless cars must accept horse harness (1)

Garabito (720521) | about three weeks ago | (#47758223)

.. So that real horses can take "immediate physical traction" of the vehicle if necessary.

Re:Horseless cars must accept horse harness (4, Interesting)

HornWumpus (783565) | about three weeks ago | (#47758267)

Early cars were required to have a harness attachment point. Which was actually sane at the time. So is this.

Re:Horseless cars must accept horse harness (1)

westlake (615356) | about three weeks ago | (#47758565)

So that real horses can take "immediate physical traction" of the vehicle if necessary.

You have no idea how punishing the roads were in the early days of the automobile, how often cars broke down or became hopelessly mired in mud or snow. In rural states, the horse was still in the towing business as late as 1940.

In 1919 a cross country drive was dangerous ... (2)

perpenso (1613749) | about three weeks ago | (#47758711)

So that real horses can take "immediate physical traction" of the vehicle if necessary.

You have no idea how punishing the roads were in the early days of the automobile, how often cars broke down or became hopelessly mired in mud or snow. In rural states, the horse was still in the towing business as late as 1940.

In 1919 Lt Col Eisenhower, yes the later Supreme Allied Commander of WW2 and the 1950s President of the US, led a convoy of 24 vehicles from the east coast to the west coast. 9 vehicles were lost, 21 men were injured and unable to continue.

Re:Horseless cars must accept horse harness (1)

perpenso (1613749) | about three weeks ago | (#47758639)

.. So that real horses can take "immediate physical traction" of the vehicle if necessary.

Joking aside, early cars broke down frequently, and the horse was a very common towing option. In those early days people didn't necessarily drive themselves; many paid their mechanic to act as their driver. If a person drove themselves, they were probably a hobbyist mechanic.

Backward-thinking by the DMV (5, Insightful)

brunes69 (86786) | about three weeks ago | (#47758229)

Any car that allows the driver to take "immediate physical control" makes the roads less safe for all. The safest roads will be when ALL cars are autonomous. Having humans in the mix will just ruin all the gains that autonomous cars provide. Can a human wirelessly communicate with a car 5 miles ahead to know of a road condition and adjust its speed in tandem with all the other cars in between to mitigate any and all danger in advance? Can a human react in sub-millisecond time to avoid obstacles thrown in their way? No and no.

Re:Backward-thinking by the DMV (4, Insightful)

TWX (665546) | about three weeks ago | (#47758385)

Autonomous cars need to prove that they're capable of being safer than operator-driven cars. Right now they haven't done so, and until there's data, there will be a need for autonomous cars to be manually operable.

I expect to drive myself around for the next 30 years or more; I doubt self-driving cars in a price range I can justify paying will come out any time soon.

Re:Backward-thinking by the DMV (2)

jklovanc (1603149) | about three weeks ago | (#47758435)

The safest roads will be when ALL cars are autonomous.

Agreed, but having only autonomous cars on the road will not happen for decades to come. First there will need to be a viable autonomous car, which has not happened yet and may not for up to 20 years. Then there will need to be at least ten years of testing. Then all manual cars will need to age off the road, which will not happen for decades, as people will want to keep classic cars on the road. Notice that there are cars built in the '30s that are still on the road. So your utopia of all-autonomous cars will not happen for many decades to come. Also, autonomous motorcycles are not even on the drawing board.

Re:Backward-thinking by the DMV (0)

Anonymous Coward | about three weeks ago | (#47758741)

False. Though not intended for human transport, at least one autonomous motorcycle has been designed and tested. A team (ghostrider?) entered one of the earlier DARPA Grand Challenges with an autonomous motorcycle. They said the narrow single-track layout made avoiding obstacles easier.

Re:Backward-thinking by the DMV (1)

Cabriel (803429) | about three weeks ago | (#47758439)

And what about a situation the car isn't programmed to deal with? Such as narrowly avoiding an accident that takes up the road in front of it? How does a driverless car deal with that? Just sit there, helping to block traffic? What about when an officer on the road is directing traffic? What about when something else is blocking the lane of traffic, like road construction where the workers direct traffic into the lane travelling in the opposite direction?

Yes, human error is most likely to cause accidents, but that doesn't mean there's no need at all for a steering wheel for the just-in-case moments that Google didn't think of ahead of time or just can't deal with in software.

Re:Backward-thinking by the DMV (1)

CastrTroy (595695) | about three weeks ago | (#47758571)

Not only that, but an autonomous car that isn't good enough to drive itself without the person having controls probably isn't good enough to be on the road at all.

The car should either have controls for a human and expect the human to be operating them, or it should have no human controls and do all the driving itself. Having the car do all the driving for weeks or months on end, lulling the person into a false sense of security, and then one day expecting the driver to take over the controls at some random time is just asking for problems. The driver will most likely not be paying attention to the road if they haven't had to do anything with the controls for the past three months. The person will be reading, playing video games, watching a movie, sleeping, doing their makeup, or any number of other things, which means they aren't watching the road and don't have their hands on the controls ready to take over. Sure, the car could enforce that you have your hands on the wheel, ready to take over. But what's the point of paying all that extra for a self-driving car if you basically have to act like you are driving it anyway?

Hostile environment. (1)

DrYak (748999) | about three weeks ago | (#47758649)

Can a human wirelessly communicate with a car 5 miles ahead to know of a road condition and adjust its speed in tandem with all the other cars in between to mitigate any and all danger in advance?

Do not assume that the source of wireless coordination is always 100% trustworthy.
The wireless coordination information might be of hostile origin, e.g. some idiot with a hacked emitter that systematically asks all the other cars to slow down and move aside to let him through. In theory such a function has practical uses (ambulances, for example); in practice such a function WILL GET abused (an idiot wanting to arrive faster, or a criminal trying to run away through heavy traffic).

Can a human react in sub-millisecond time to avoid obstacles thrown in their way?

Yup, that's what I consider the main reason why we should have robotic driving.
Except for the occasional false positive, the collision-avoidance systems that are already street-legal nowadays, and already travelling in some of the cars around us, are much better than humans at reacting in an emergency.

The only drawback currently is false positives.

But even in that situation, most false positives are safe:
they just cause the car to slow down or stop when that should not be needed.

---

Examples from our car:
- Auto-cruise control that chooses the wrong target: a large truck that is almost as wide as the lane can be mis-targeted, and our car slows down to yield, even if the truck is actually in a different lane and we're not actually on a collision course with it if we stay in the middle of our current lane.
- Mis-identified target: the current logic inside the car is "if there's an object in the way and the car is on a trajectory intersecting it, then hit the brakes (unless overridden by the driver)". The car has no concept of *what* the object is, and might brake on useless occasions. Nearby automatic RFID-based toll booths are non-stop drive-throughs: you don't need to stop, just drive through at a slow, steady pace. The RFID transponder beeps in advance to tell you that the transaction with the booth was successful and that the barrier will open shortly; you know the barrier is opening and you don't need to brake. But the car only sees an object that is still inside your lane (it can't tell that the object is moving vertically and will be safely out of the way by the time you reach it) and will auto-brake unless you keep your foot on the gas pedal.
- Very simplified hit-box: the car's hitbox is exactly that, a box. The car will panic and hit the brakes if you try to park under a low-hanging balcony. You can see that there's enough room under the balcony for the car's engine compartment, but the car will react as if it were a solid wall and brake if your foot is on the brake instead of the gas pedal (which will be the case during slow manoeuvres).
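
As an illustration of the decision rule quoted in the second example, here is a toy Python sketch, assuming the controller only knows "an object intersects my trajectory" plus a time-to-collision estimate. The function name, threshold, and driver-override signal are invented; real systems are far more elaborate.

def should_auto_brake(object_in_path, time_to_collision_s,
                      driver_on_accelerator, threshold_s=2.0):
    """Brake if an object intersects the trajectory soon, unless overridden."""
    if driver_on_accelerator:
        # Driver override, as described above: keeping a foot on the gas
        # suppresses the automatic braking.
        return False
    return object_in_path and time_to_collision_s < threshold_s

# The rising toll-booth barrier: still "in path" as far as the sensors can
# tell, so this rule brakes even though a human knows it will clear in time.
print(should_auto_brake(object_in_path=True, time_to_collision_s=1.5,
                        driver_on_accelerator=False))   # True: a false positive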

weakest link (0)

Anonymous Coward | about three weeks ago | (#47758233)

Eventually it will become clear that human drivers are the weakest link.

Re:weakest link (5, Insightful)

HornWumpus (783565) | about three weeks ago | (#47758289)

Eventually is a nice word. You can be completely wrong today but adding that one word...

Re:weakest link (1)

Anonymous Coward | about three weeks ago | (#47758301)

Eventually it will become clear that human drivers are the weakest link.

Right, I propose to field autonomous cars without any humans on board and to send any non-autonomous cars to the junkyard immediately. Now that should make the roads safe.
Next: get rid of pedestrians; those pesky humans walking on the sidewalk are just asking for trouble.
Conclusion: exterminate the human race 'cause they're nothing but problems, and give the world over to the 'bots.

Re:weakest link (0)

Anonymous Coward | about three weeks ago | (#47758733)

I wish to subscribe to your newsletter.

- The Internet

CA is mind bottling (1)

hsmith (818216) | about three weeks ago | (#47758239)

Consider how many people die on the roads every year in the United States alone; the biggest factor is humans.

Then consider how many people die every year due to firearms in the United States alone, and look at how CA reacts and tries to limit access to firearms through laws and technology. Basically, "remove the problem."

Shouldn't CA be pushing hard for driverless vehicles? Removing human error from the equation would save countless lives.

Think of the children.

Re:CA is mind bottling (1)

hondo77 (324058) | about three weeks ago | (#47758329)

Shouldn't CA be pushing hard for driverless vehicles? Removing human error from the equation would save countless lives.

Consider all the bug-free software that is written, especially for first-generation devices. Oh, wait...

Re:CA is mind bottling (0)

Anonymous Coward | about three weeks ago | (#47758397)

The voters of CA must first have a mind before you can bottle it.

Re:CA is mind bottling (1)

TWX (665546) | about three weeks ago | (#47758399)

The purpose of a firearm is to shoot. The purpose of a car is to convey people or contents over distance, not to crash or to run over someone. That they happen to crash or run over people is something to be solved, and this is heading in that direction.

Someone needs to point out the issue to California (1)

Anonymous Coward | about three weeks ago | (#47758273)

Which is: two out of three of those are already physically disconnected in a modern automobile, and they're working really hard to do the same with the third.

If California's DMV wants to bitch about Google removing all those 'unnecessary' features, then it should look really hard at what it's allowing in its HUMAN OPERATED vehicles.

Allowing a computer to adjust throttle or braking response is one thing. But physically disconnecting the pedals from the devices they're meant to operate means in the event of an electronics failure there's no guarantee you'll have control of braking, throttle, or even steering anyway.

Food for thought!

Well... (1)

Agares (1890982) | about three weeks ago | (#47758277)

I think it would be nice to still have a steering wheel. Not that I doubt the safety and precision of the vehicles, but it is always good to have manual control just in case some sort of freak accident occurs. For example, planes can fly themselves as well, but we still have manual controls for those just in case, even though the situations in which they may be needed are probably extremely rare.

A big EMO button on the dashboard (1)

DavidMZ (3411229) | about three weeks ago | (#47758297)

That's basically all that's needed for an autonomous vehicle.

Re:A big EMO button on the dashboard (1)

TWX (665546) | about three weeks ago | (#47758431)

So the car can cut itself or something?

I'd love a lawnmower-equivalent of a Roomba, if it would handle things like curves and not running over sprinklerheads that didn't retract after the last watering.

Re:A big EMO button on the dashboard (1)

Wycliffe (116160) | about three weeks ago | (#47758533)

I'd agree. Halfway automation is a disaster waiting to happen. You could possibly have two buttons though:
      1) a "try to stop safely" button which would attempt to pull over to the side of the road and stop (similar to a computer's shutdown command)
      2) a "full stop" button which immediately powers down and comes to a complete stop (similar to holding down the power button or pulling the plug)
A third option of ejecting the passenger would be nice too, if there were a way to do it safely. This could possibly be done automatically when a collision is unavoidable as well.
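
A rough sketch of that two-button idea, assuming a hypothetical vehicle control interface (plan_pull_over, cut_propulsion_power, etc. are invented names, not a real API): the "soft stop" behaves like an OS shutdown, the "hard stop" like pulling the plug.

from enum import Enum, auto

class StopMode(Enum):
    SOFT = auto()   # try to stop safely: pull over to the shoulder, then stop
    HARD = auto()   # full stop: cut power and brake to a halt immediately

def handle_stop_button(mode, vehicle):
    """Dispatch one of the two emergency-stop buttons to the vehicle."""
    if mode is StopMode.SOFT:
        vehicle.plan_pull_over()        # find a safe spot at the roadside
        vehicle.stop_when_parked()
    else:
        vehicle.cut_propulsion_power()  # equivalent of pulling the plug
        vehicle.apply_max_braking()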

Steering wheels are nice, but... (1)

fustakrakich (1673220) | about three weeks ago | (#47758317)

They still leave the operator open to liability, and he/she has to pay attention. Until that changes, the vehicle is not truly autonomous. Babysitting the damn thing is not part of the deal.

Re:Steering wheels are nice, but... (1)

mark-t (151149) | about three weeks ago | (#47758723)

Yes it is part of the deal... Until they have demonstrated reliability at being safer than human drivers with several years of data over many millions of vehicles.

It's all about liability (0)

Anonymous Coward | about three weeks ago | (#47758319)

If you keep the brake, steering wheel and other human usable control interfaces, then you can still hold the human liable. I would love to have a self-driving car but I'm not going to get one if I am liable for accidents as I won't be driving, so how could it possibly be my fault?

That means I'll just keep driving to the best of my abilities and hopefully they can keep adding in small corrective things to assist in the driving. I love my cruise control and I hear that some of the newer and fancier cars let you toggle a setting that basically will pace the car in front of you. How awesome is that?

Big Red Button (0)

Anonymous Coward | about three weeks ago | (#47758335)

All they really need is a Big Red Button to just power off the car and brake as fast as possible - auto ABS, maybe?

As far as steering out of a spin....you got me there. Then again, how many folks know how to do that anyway?

No Steering Wheel In Time (3, Insightful)

Jason Levine (196982) | about three weeks ago | (#47758351)

I agree that an automated car will need a steering wheel in the immediate future. Once their track record has been proven and people are comfortable with them, however, cars will gradually lose manual controls. We'll likely be telling our grandkids stories of hundreds of non-automated cars screaming down the highway, piloted by fallible humans. Of course, they'll just roll their eyes at us, make an "uphill both ways in the snow" comment, and tell their RobotCar to take them to the mall.

Re:No Steering Wheel In Time (1)

bigpat (158134) | about three weeks ago | (#47758383)

Compared with the track record for human drivers which is proven to be completely unsafe?

Re:No Steering Wheel In Time (1)

idontgno (624372) | about three weeks ago | (#47758511)

What, 100% of driver-operated cars are guaranteed to crash?

As enamored as you are of the technology, dial back the hyperbole. It doesn't do the cause any good.

It's called "paying your dues". No one gets away without it. You prove, by extended experience over a long period of time, that the new technology is superior to the old. After a couple of generations (of people, not technology), it's accepted and the shackles of the old can safely go away.

Of course a DMV... (0)

Anonymous Coward | about three weeks ago | (#47758357)

They still want permits and license plates to be a thing, so they don't want you to be able to just hop in a car and be driven around as you please.
I would sooner trust my life to a self-driving car than to one with a meatbag at the wheel.

Star Trek (4, Funny)

Barlo_Mung_42 (411228) | about three weeks ago | (#47758367)

If there's one lesson I learned from Star Trek it's that you always, ALWAYS, include a manual override.

Never gonna work ... (4, Insightful)

gstoddart (321705) | about three weeks ago | (#47758371)

California DMV has told the company it needs to add all those things back to their traditional locations so that occupants can take "immediate physical control" of the vehicle if necessary

The transition time from the computer giving up to the user having to take control is always going to mean this is impossible.

If you're reading the newspaper, you are not going to be able to transition to operating the vehicle in the event the computer gives up and says it's all up to you.

I've been saying for a while that a driverless car needs to be 100% hands-off for the people in the car, or it serves no value at all other than as a gimmick.

I will believe driverless cars are ready for prime time when I can stumble out of a pub, crawl into the back seat and tell the car to take me home. Anything less than that is a giant failure of automation waiting to happen, and a convenient way of dodging liability by pretending that users are expected to be in control of the car even while the AI is driving.

As long as there is a pretense of handing back to the driver in the event of an emergency, this is a glorified cruise control, and I'll bloody well drive myself.

If I'm ultimately responsible for the vehicle, I'll stay in control of the vehicle. Because if there's a 10 second lag between when the computer throws up its hands and says "I have no idea" and when the user is actually aware enough and in control, that is the window where Really Bad Things will happen.

Re:Never gonna work ... (1)

Nkwe (604125) | about three weeks ago | (#47758669)

As long as there is a pretense of handing back to the driver in the event of an emergency, this is a glorified cruise control, and I'll bloody well drive myself.

If I'm ultimately responsible for the vehicle, I'll stay in control of the vehicle. Because if there's a 10 second lag between when the computer throws up its hands and says "I have no idea" and when the user is actually aware enough and in control, that is the window where Really Bad Things will happen.

I would agree if the human is expected to be able to take over at any time. But what if automation were to the point that, if the computer found conditions too complicated, it would pull over and stop the vehicle? Once stopped by the computer, manual controls would be used to drive in those "complicated" situations. You could have the option to interrupt the "safe stop" process and assume control if you felt comfortable doing so, but if the logic included an unattended safe stop, would that be good enough? (I am not saying that we have the ability to build a system that could always achieve an unattended safe stop, but if we could - or at least build a system that achieves an unattended safe stop with a provably better chance than humans achieve an attended stop - would it be good enough?)
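
A minimal sketch of that interruptible safe-stop logic, assuming three coarse outcomes; the predicate names are invented and nothing here reflects a real autonomy stack.

def decide_next_action(conditions_too_complex, human_requests_control):
    """Pick the vehicle's next high-level action under the safe-stop policy."""
    if human_requests_control:
        # A confident driver may interrupt the safe stop and take over.
        return "hand control to the human driver"
    if conditions_too_complex:
        # No intervention: pull over and park without human help.
        return "execute unattended safe stop"
    return "continue autonomous driving"

# Example: the computer is confused and nobody intervenes.
print(decide_next_action(conditions_too_complex=True,
                         human_requests_control=False))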

Never gonna work ... (1)

sehlat (180760) | about three weeks ago | (#47758739)

"... stumble out of a pub..."

Like the inebriated gentleman in San Francisco of many years ago? He stumbled out of a pub, crawled into the back seat of a waiting automobile, assuming it was a taxi, and demanded "Take me to the corner of Washington and Clay!" Given that Washington and Clay run parallel to each other, that would confuse the hell out of the computer.

In this case, however, the officers driving the vehicle escorted their new passenger to the lockup so he could sleep it off.

Re:Never gonna work ... (1)

tverbeek (457094) | about three weeks ago | (#47758751)

"If you're reading the newspaper, you are not going to be able to transition to operating the vehicle in the event the computer gives up and says it's all up to you."

I don't think you understand the topic of conversation here. We're not talking about situations in which the computer says, "Excuse me, Dave, but I'm not sure what to do here. Could you please drive for me?" We're talking about situations in which Dave says, "WTF! You're heading for a cliff!" and chooses to take control. Maybe it takes him some seconds to notice the problem before he takes action, but once he does notice, there would be significant delay before he puts his foot down on the brake and his hands on the wheel.

Re:Never gonna work ... (1)

tverbeek (457094) | about three weeks ago | (#47758761)

* "there would be no significant delay"

Re:Never gonna work ... (1)

TheSync (5291) | about three weeks ago | (#47758753)

The transition time from the computer giving up to the user having to take control is always going to mean this is impossible.

I can think of several recent airplane crashes that occurred because pilots tried to take back control from the auto-pilot or auto-landing system without full situational awareness.

Fail-safe (1)

DrYak (748999) | about three weeks ago | (#47758755)

If I'm ultimately responsible for the vehicle, I'll stay in control of the vehicle. Because if there's a 10 second lag between when the computer throws up its hands and says "I have no idea" and when the user is actually aware enough and in control, that is the window where Really Bad Things will happen.

Have a look at how the collision-avoidance systems that are on the streets today work:
- the car will sound an alarm signalling a probable impending collision and asking the driver to intervene;
- the car will also autonomously start to slow down, and eventually brake and stop, nevertheless.

The system is designed in such a way that, although human override is possible, the car will also try autonomously to follow the best course of action unless overridden. You could take control and do something, or you could let the car follow its normal program (in traffic jams, typically).

The same should apply to fully autonomous cars one day:
in an "I have no idea" situation, the user should be able to take over control, but lacking any intervention, the car should also react in a sane way ("I have no idea what to do, so instead I'm going to park on the side of the road and wait safely there until further instructions").

Adverse weather conditions? (0)

Anonymous Coward | about three weeks ago | (#47758447)

So just how do these driverless cars deal with heavy fog, rain, sleet, snow, icy roads, etc.? I'm all for them and can't wait for them to reach full penetration so that they're all we use, but I've yet to see anything about how they do in bad weather. Until that is proven, not only should the steering wheel still be there, along with the brake pedal, gas pedal, etc., but a real, licensed driver, still paying attention to the road, should be required to sit in the driver's seat.

why? (2)

troll -1 (956834) | about three weeks ago | (#47758449)

Is this requirement based on science or an irrational fear of computers?

Re:why? (0)

Anonymous Coward | about three weeks ago | (#47758567)

Legislators just want to make sure nobody hacks their vehicles and drives them all off a cliff.

Re:why? (1)

gstoddart (321705) | about three weeks ago | (#47758735)

I would say it's based on a rational fear of computers and automation, and a reasoned understanding that they have failure modes and won't be perfect in all situations.

The problem is that the transition time from being essentially cargo to the one operating the vehicle is going to be where most of the failures occur.

So, it's all fine and lovely to say "tag, you're it", but the human reaction time to re-engage with what the vehicle is doing, what is happening around you, and what needs to be done about it is going to be the critical window in which lives are lost, or accidents become inevitable.

And if the driver has to be engaged enough to do all of those things, they might as well still be driving ... because humans are pretty terrible at making the context switch from not paying attention to taking decisive action that has to happen Right Now.

So, as I said elsewhere, either I have a vehicle with no inputs from me and Google (or whomever) takes all liability, or I'll simply decide this is a gimmick and not really ready for anything other than a technology demo.

In my opinion, suddenly handing control back to the driver is the point at which things will go terribly wrong. And by the time that is happening, it's probably already too late.

California already anticipating hacked vehicles (0)

Anonymous Coward | about three weeks ago | (#47758505)

I'm glad they will include a manual override.

Important feature to have (0)

Anonymous Coward | about three weeks ago | (#47758509)

Yes, driverless cars are likely to be safer than human-driven cars. However, what will they do when an emergency occurs and the road is diverted in new and exciting ways that are not only typically illegal but also mildly dangerous? Simple: they'll shut down. No steering wheel? Now you've just caused a traffic jam on a major artery.

The busiest highway in North America has been diverted to the point of people driving the "wrong way" before (of course with plenty of officers present to help traffic move). Did you know that a driverless car that shuts down when it gets to that "Uhhh... I don't understand hand signals? You want me to drive over the median to the wrong side of the road???" point would be fined an enormous amount of money per hour?

Why? (0)

Anonymous Coward | about three weeks ago | (#47758515)

As long as there is some ultimate fail-safe button that brings the vehicle to a controlled stop, why would you require a steering wheel, brake, or gas pedal? Maybe for the sake of simplicity put it where the brake pedal would usually be, and make it pressure-sensitive so the vehicle knows how to respond (slow down or slam on the brakes). Under normal operation a person isn't going to be paying enough attention to take immediate steering control, and a gas pedal isn't really necessary for any real-world road-safety situation.

stopping... (0)

Anonymous Coward | about three weeks ago | (#47758517)

I wonder if a human could withstand the deceleration caused by a boat anchor and cable. In case of emergency, break glass by tossing the boat anchor out the nearest window while keeping clear of the cable.

Too soon? (1)

Falos (2905315) | about three weeks ago | (#47758543)

If this is an infancy thing, fine, but I don't think we need to reach the Ultimate Endgame for these things to be rideable (NOT "operable") by a child, or even unmanned for deliveries. They'll be the better choice long before we reach super-duper polished and perfected.

Mind, I still expect them to come with some tucked-away form of control access, even a clunky digital-only one. There's an endless number of possible edge cases that can't be scripted.

I assume the code will mostly fail safe into "stop the car" for unexpected encounters, so that someone (including remotely, i.e. a support center or the owner) can help manually guide the derping sensors back to the clean, plainly marked road and resume automated driving, or summon a tow for really locked-up/physical-damage situations.

Clearly there IS a question (1)

khb (266593) | about three weeks ago | (#47758569)

90% of accidents (or more, depending on the study) are due to human error. So the DMV's insistence on putting the humans back into the driver's seat is actually counterproductive. "There's no question when it comes down to the safety of those on the road"... the question is: are the other humans on the road more or less safe with the Google vehicle operators able to override the computer?

While I'm not interested in being an early adopter of this or most automotive technologies, there are lots of questions when it comes to safety. It is a pity that government hardly ever uses science or logic in its decision making.

find the fallacy (0)

Anonymous Coward | about three weeks ago | (#47758653)

There is an error in your sampling. Given that 100% of accidents involve human drivers, it's not a huge surprise that a large number of causes point to human drivers, which are the most complex piece of the system.
Substitute a different piece, an automated driving program, and that becomes the most complex piece of the car; maybe not as complex as a human, but it's where we're likely to see the most failures when we analyze crashes in the future.

You can take the steering wheels out of cars (0)

Anonymous Coward | about three weeks ago | (#47758613)

After you've operated them safely for 20 years. I'm all for progress, but let's not skip over the part where there is a trial period while society adapts to changes.

has it dawned on anyone (0)

Anonymous Coward | about three weeks ago | (#47758651)

that this may well CAUSE more accidents than it prevents? Also, that delaying the roll-out of self-driving cars is almost certainly a net negative from an accident/injury/fatality standpoint?

false dichotomy - my personal favorite of all the fallacies...

The requirement for these controls is badly placed (0)

Anonymous Coward | about three weeks ago | (#47758679)

Let us say that in an ideal world, the requirement for a steering wheel, brake and accelerator were *not* in place.

Then...
I can get driven home drunk, as I am not driving (or ever expected to).
A 14-year-old *could* get driven to after-school practice, and then driven home (no need for a license).
The self-driving car is like a subway or a bus. As a rider, my expectations will be in line with public transit, yet I will have more options as to where and when I go.

The requirement for all those other controls still means it is not self-driving.
Of course I understand all the reasons being cited as to why they are needed. I just disagree; I believe that consumers will be able to decide whether a real self-driving car is safe and reliable or not. We still have lawyers, and that threat is the single most compelling reason that any manufacturer will make them viable.

When cars first came out, some cities made stupid laws, such as requiring a flag man to walk in front of the car (safety, etc.).

Move a broken down vehicle? (4, Interesting)

HockeyPuck (141947) | about three weeks ago | (#47758695)

If a driverless car has no manual means of steering, and if it broke down and you had to push it, how could you control it?

Re:Move a broken down vehicle? (1, Funny)

Ksevio (865461) | about three weeks ago | (#47758773)

Tow truck.

Does not matter (1)

ZombieBraintrust (1685608) | about three weeks ago | (#47758707)

The laws will be rewritten once this gets closer to being a real thing. Google can continue to do what it wants on its test tracks.

Non-Issue (1)

fivepan (572611) | about three weeks ago | (#47758779)

This is a non-issue because no one actually expected to make a car like Google's prototypes. Google isn't even trying to get into the automobile business with this technology; that's not what they do. They sell tech and software services. They're just trying to show their prospective customers what their tech is truly capable of. The real automakers like GM, Ford, and Toyota will use Google's tech in the cars they produce, and I don't think anyone ever expected them to make a car without a steering wheel. I'm sure they'll eventually include "automated driving assist" as an option in some of their vehicles, but they're still going to sell cars drivable by humans. Don't get me wrong, it did need to be clarified in the law... but no one was actually planning to build a real car without pedals and a steering wheel.