by Mike –
Contributor Wallace Wyss, hard at work on a book on autonomous cars, has volunteered to publish five different scenarios on My Car Quest, one every few days, to see what readers think about what will inevitably happen once the robots take the wheel.
“I am not against autonomous cars,” he says, “but I don’t think the public has thought out what could happen as their use spreads without these questions being asked and answered in a public airing.”
He hopes readers will post comments in any area: engineering, design philosophy, and even ethics.
Here’s the first scenario:
You are driving your autonomous car on a two-lane road, one lane each way, following a garbage truck carrying a heavily loaded dumpster on its back. You are doing 45 mph, keeping pace, and see a tunnel about 100 yards ahead.
Suddenly the dumpster breaks loose, slamming down on the road. Your autonomous warning system chirps–obstacle! You squeeze the brakes but there’s traffic hard on your tail. You consider zooming around the truck but there’s intermittent traffic coming the other way. Too chancy.
You’re sliding toward it still doing 35 mph. It’s obvious you’re gonna hit the dumpster and hit it hard. You cringe and brace yourself for impact.
But Mr. Autonomous takes over. In a microsecond, he has noted the situation and weighed the various options. In his scan to the left, his robot eyes see a sidewalk where a mother walks with three toddlers, the kids holding hands, trailing behind her. To the right, on that sidewalk, the robot sees an aged lady walking with a cane.
His reptilian brain is programmed to follow Isaac Asimov’s Three Laws of Robotics (see below).
What should the autonomous car do? – any opinions?
Isaac Asimov’s Three Laws of Robotics
The laws are written in order of priority for the robot.
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
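For readers who like to think in code, here is a purely illustrative Python sketch of what “laws in order of priority” could look like as a decision filter. The fields and functions are invented for this example; they come neither from Asimov nor from any real vehicle software.

```python
# Purely illustrative: evaluating candidate actions against a
# priority-ordered rule set in the spirit of Asimov's Three Laws.
# Nothing here reflects how any real autonomous vehicle is programmed.

def violates_first_law(action):
    # Law 1: the action would injure a human, or allow one to be injured.
    return action["expected_human_harm"] > 0

def violates_second_law(action):
    # Law 2: the action disobeys an order given by a human.
    return not action["obeys_human_order"]

def violates_third_law(action):
    # Law 3: the action sacrifices the robot itself.
    return action["destroys_robot"]

def choose_action(candidates):
    """Filter candidate actions by each law in strict priority order."""
    for violates in (violates_first_law, violates_second_law, violates_third_law):
        allowed = [a for a in candidates if not violates(a)]
        if allowed:
            candidates = allowed
        # If every remaining option violates this law, the law cannot be
        # satisfied and the conflict is passed down unresolved.
    return candidates[0] if candidates else None
```

Note the weak spot: once every available option harms a human, the First Law filters nothing out and the hierarchy gives no answer, which is exactly the corner the scenario above paints the robot into.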
Let us know what you think in the Comments.
The scenario (or something similar; I believe it involved a loss of traction in front of a zebra crossing, with the same lady and kids crossing the street) appeared in some documents from Google’s experiments.
As far as I understand, and I might endorse such a decision, the system is built to adopt the “minor injury paradigm.” In your scenario, the autonomous brain may decide to take the hit against the dumpster, reducing speed as much as possible, retracting the steering column, shutting off the fuel pump and fuel delivery, and pre-tensioning the seat belts before the hit in order to minimize injuries. Hypothetically, a 25-30 mph (about 40-50 km/h) hit can be survived by the vehicle’s passengers, thanks to passive and active safety systems (ABS, ASR, “intelligent” seat belts, airbags, and so on). That is the “acceptable hit” answer to your question.
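As a rough sketch of the “minor injury paradigm” described above, the fragment below picks the manoeuvre with the lowest estimated injury and then runs pre-impact mitigation when a collision is unavoidable. The injury scores, option names, and mitigation list are assumptions for illustration only, not any manufacturer’s actual logic.

```python
# Illustrative sketch of the "minor injury paradigm" described above.
# All numbers, option names, and mitigation steps are assumptions made
# for the sake of the example, not any manufacturer's actual logic.

from dataclasses import dataclass

@dataclass
class Option:
    name: str
    estimated_injury: float      # arbitrary injury score (lower is better)
    collision_unavoidable: bool

def pick_least_harm(options):
    """Choose the manoeuvre with the lowest estimated injury score."""
    return min(options, key=lambda o: o.estimated_injury)

def pre_impact_mitigation():
    """Actions taken in the fraction of a second before an unavoidable hit."""
    return [
        "apply maximum braking (ABS) to scrub off as much speed as possible",
        "pre-tension the seat belts",
        "retract the steering column",
        "cut the fuel pump to reduce fire risk",
        "arm the airbags for the predicted impact angle",
    ]

options = [
    Option("brake hard, hit the dumpster at ~25-30 mph", 2.0, True),
    Option("swerve left toward the sidewalk", 9.0, True),
    Option("swerve right toward the sidewalk", 7.0, True),
]

choice = pick_least_harm(options)
print("Chosen:", choice.name)
if choice.collision_unavoidable:
    for step in pre_impact_mitigation():
        print(" -", step)
```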
Hitting the pedestrians (whether old or young) would be much more drastic and critical, also from a legal point of view.
The big question (and current dilemma) for insurers is liability: is the car owner/driver liable for the “autonomous system”? What if the autonomous system’s choice is not what the owner/driver would have made? What if the owner/driver has to pay for damages stemming from a decision made by a computer, a decision he or she does not agree with?
Another liability issue concerns the “service provider”: the autonomous car will probably change the way we think about transportation. It’s likely that car ownership will eventually disappear and the person inside the car will be only a user of a “liquid” service (Transportation On Request, TOR); see Jeremy Rifkin’s theories.
I am not sure future “TOR” providers will accept full liability for third-party damages. It may be easier to include the “acceptable hit and damage to passengers” answer in their Service Level Agreements…
All I keep thinking of is the car employing vertical thrusters, allowing a near-instant stop 15 feet above the road, but I understand that’s a cop-out.
That’s not a cop-out. The posted scenario is faulty in assuming it covers all of the possible answers and that we should limit our response to the answers already thought of. At the same time it asks us to think of the autonomous car of the future (which by definition assumes new technology). Why not actually try to come up with a real solution rather than deciding to accept only the bad solutions of the present mindset?
The dumpster is still moving and will continue to. Your car will slow and stop if it needs to. Chances are the following cars will be autonomous as well and will slow to a stop. If a following car is human-driven, normal rules apply, and that car is at fault if it hits you. The A cars will have GPS and know there is a tunnel. The A car will know it has passengers and will simply hit the dumpster at a reduced speed after braking.
The car will stop as quickly as it can. Yes it may crash into the dumpster. Accidents will never go completely away.
It crashes:
The following cars will also stop, as every vehicle will be talking to the others. Only one car is damaged.
Cars coming up to the incident will reroute around it or stop. Oncoming traffic also stopped when the incident happened.
Close to the crash, each car will have recorded the incident as well. Authorities will have been called before anyone gets out of their car. The involved cars will assess whether an ambulance is required. If any doctors are in the area, they will be alerted in their vehicles as well.
First responders arrive on scene with no one slowing the process, because every car will let them pass, having stopped well before they come by.
Oncoming traffic will resume through the caution area at a speed determined by the traffic director.
No stop-and-go or lookie-loos clogging up the mess.
After so many cars pass, oncoming cars will stop and the cars in the affected lane will let 10 cars go around the scene.
Furthermore, we have a damaged vehicle unable to continue and passengers who are unhurt. The tow truck arrives and swaps out the damaged car for a loaner so they can all continue on, go home, or go wherever.
The garbage truck knew instantly what happened and stopped too. The driver was cited and had the dumpster re-secured and the trash cleaned up within minutes. Police directed him to a secondary location for evaluation and a safety check. There will be a lengthy investigation to ensure it doesn’t happen again; maybe it will be determined that dumpsters can never be transported this way again.
All reasonable answers (except for the jet thrusters), though most avoid direct comment on the value judgement the machine (the robot piloting the car behind the dump truck) might have to make between the toddlers on the left and the elderly lady on the right. I don’t think it will choose to hit the fallen dumpster or the truck. I think it could come down to the car/robot’s core mission, which is to protect its passenger and itself. It will take the route that results in the least amount of damage to itself. It may visually weigh the children, say 40 lbs. each and averaging 2.5 ft. tall, and decide that collectively they will do less damage than the elderly lady who, though weighing 100 lbs., is 5 ft. tall and could crack the windshield. What makes this even more soulless is that the car will have already calculated its own accident damage and ordered replacement parts sent to the dealer before it even comes to a stop. The injured pedestrians? Well, that’s the responsibility no one wants to talk about yet. I think there will come a case like this, on an accident that hasn’t happened yet, that will receive world attention and determine just how fast we go pedal to the metal on autonomous…
I’d like to point out that the scenario is limited by traditional driver thinking, non-algorithmic processing, and road conditions that will largely be things of the past.
The assumption that our autonomous cars will operate as individual units simply won’t hold. ALL cars with computerized drive mechanisms will be linked to each other, just as communications devices are. So the autonomous car is going to be assessing the probability of the dumpster dropping long before you get too close. The dumpster-carrying truck will also be sensing the weight shifting and will have sent warnings to cars in the immediate path of danger.
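A toy sketch of the kind of car-to-car warning this comment imagines is below. The message fields, the truck ID, and the crude distance check are all invented for illustration; real V2X systems define their own message formats.

```python
# Hypothetical vehicle-to-vehicle hazard warning, as the comment imagines it.
# The message layout and ranges are invented for illustration only.

import json
import time

def make_hazard_warning(sender_id, hazard, lat, lon, radius_m):
    """Build a hazard message the truck could broadcast to nearby cars."""
    return {
        "sender": sender_id,
        "hazard": hazard,                 # e.g. "load shifting / possible dumpster drop"
        "position": {"lat": lat, "lon": lon},
        "radius_m": radius_m,             # cars within this radius should react
        "timestamp": time.time(),
    }

def should_react(message, my_lat, my_lon):
    """Crude check: react if we are roughly inside the warning radius."""
    # ~111,000 m per degree of latitude; good enough for an illustration.
    approx_m = ((message["position"]["lat"] - my_lat) ** 2 +
                (message["position"]["lon"] - my_lon) ** 2) ** 0.5 * 111_000
    return approx_m <= message["radius_m"]

msg = make_hazard_warning("truck-42", "load shifting / possible dumpster drop",
                          34.0522, -118.2437, radius_m=300)
print(json.dumps(msg, indent=2))
print("Following car should react:", should_react(msg, 34.0524, -118.2437))
```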
Autonomous cars are not being programmed to drive like humans. What a mess that would be! Instead, they are given literally tons of road data, condition variables, and simulations that extrapolate decision making in microseconds. The computational analysis will be happening all the time. In fact, the autonomous car might even decide, when you get in it on a snowy morning, NOT to engage at all, having assessed that the road conditions are too risky for your safe arrival.
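Along the same lines, here is a minimal “go / no-go” check of the sort the comment describes for the snowy morning. Every variable, weight, and threshold is an invented illustration, not a real risk model.

```python
# Toy pre-trip "go / no-go" check: the car refuses to engage when assessed
# conditions exceed a risk budget. All weights and thresholds are invented.

def trip_risk(road_friction, visibility_m, snowfall_cm_per_hr):
    """Combine a few condition variables into a single risk score."""
    risk = 0.0
    risk += max(0.0, 0.7 - road_friction)          # icy / snowy surface
    risk += max(0.0, (200 - visibility_m) / 200)   # poor visibility
    risk += min(1.0, snowfall_cm_per_hr / 5.0)     # active snowfall
    return risk

def willing_to_drive(road_friction, visibility_m, snowfall_cm_per_hr,
                     risk_budget=0.5):
    return trip_risk(road_friction, visibility_m, snowfall_cm_per_hr) <= risk_budget

# Snowy morning: low grip, short sight lines, steady snowfall -> stay home.
print(willing_to_drive(road_friction=0.3, visibility_m=80, snowfall_cm_per_hr=2))
```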
Asimov speculated the third law would apply as part of AI advancement in empathy and that robots would have this capacity as well as “need” it to prevent anti-robot sentiment that would naturally enter the world as they populated the landscape. Self-preservation is an interesting directive when part of hierarchical thinking. It involves a series of computational variables that shift as more experience is known about the conditions around the entity.
Which is where we return to the story of the car and the dumpster. In the not-too-distant future, the autonomous car will be nowhere near the dumpster-laden truck. The truck itself would also not be allowed to enter traffic with the risk of losing its load. The only place this scenario applies is today, when all the errors are human and the resultant tragedy is accidental. Yes, there are computational “accidents,” but with hardware and software progressing the way they are, the job of finding those flaws will be built into the software itself. Just like the car that refuses to leave home because of too much snow, a flawed computer will know it is damaged and shut itself down while it awaits repair.
Autonomous cars are coming, and future cars will likely be electric-powered. Engineer slots into our roads that supply electricity to the car. These slots can act as safety guides and as a safety base to keep the following cars at a safe distance. These slots can provide instant braking and guidance to avoid hitting the dumpster. Even today dumpsters have a safety double hitch to prevent runaways. Submitted by SAVE THE BABY AND SENIORS ASSOCIATION.
If programmed properly, all vehicles will follow at the correct distance and only the first will incur any damage, and that only if the dumpster rolls into the first car.
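For a sense of what “the correct distance” means in numbers, here is a back-of-the-envelope stopping-distance calculation at the scenario’s 45 mph. The 0.8 g deceleration and 0.2 s machine reaction time are textbook-style assumptions, not measured values for any vehicle.

```python
# Back-of-the-envelope following-distance check. The deceleration (~0.8 g on
# dry pavement) and the 0.2 s machine reaction time are assumed for illustration.

G = 9.81          # m/s^2

def stopping_distance_m(speed_mph, reaction_s=0.2, decel_g=0.8):
    v = speed_mph * 0.44704                 # mph -> m/s
    reaction = v * reaction_s               # distance covered before braking starts
    braking = v ** 2 / (2 * decel_g * G)    # v^2 / (2a)
    return reaction + braking

d = stopping_distance_m(45)
print(f"45 mph needs roughly {d:.0f} m ({d * 3.28:.0f} ft) to stop")
```

On those assumptions the car needs roughly 30 m (about 100 ft) just to stop from 45 mph, so “the correct distance” is at least that, plus a margin.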
The future truck will also have numerous fail-safes to prevent the load from detaching and falling off in the first place. Unless made in China.