News, November 20, 2014


By JUSTIN PRITCHARD ~ Associated Press
A Google self-driving car goes on a test drive in May near the Computer History Museum in Mountain View, California. (Eric Risberg ~ Associated Press)

LOS ANGELES -- A large truck speeding in the opposite direction suddenly veers into your lane.

Jerk the wheel left and smash into a bicyclist? Swerve right toward a family on foot? Slam the brakes and brace for an impact?

Drivers make split-second decisions based on instinct and a limited view of the dangers around them. The cars of the future -- those that can drive themselves thanks to an array of sensors and computing power -- will have near-perfect perception and react based on preprogrammed logic.

While cars that do most or even all of the driving may be much safer, accidents happen.

It's relatively easy to write computer code that directs the car how to respond to a dilemma. The hard part is deciding what that response should be.

"The problem is, who's determining what we want?" asked Jeffrey Miller, a University of Southern California professor who develops driverless vehicle software. "You're not going to have 100 percent buy-in that says, 'Hit the guy on the right."'

Companies testing driverless cars are not focusing on these moral questions.

The company most aggressively developing self-driving cars isn't a carmaker. Google has invested heavily in the technology, driving hundreds of thousands of miles on roads and highways in tricked-out Priuses and Lexus SUVs. Leaders at the Silicon Valley giant have said they want to get the technology to the public by 2017.

For now, Google is focused on mastering the most common driving scenarios, programming the cars to drive defensively in hopes of avoiding the rare instances when an accident is truly unavoidable.

"People are philosophizing about it, but the question about real-world capability and real-world events that can affect us, we really haven't studied that issue," said Ron Medford, the director of safety for Google's self-driving car project.

One of those philosophers is Patrick Lin, a professor at Cal Poly, San Luis Obispo.


"This is one of the most profoundly serious decisions we can make. Program a machine that can foreseeably lead to someone's death," Lin said. "When we make programming decisions, we expect those to be as right as we can be."

Lin said he has discussed the ethics of driverless cars with Google as well as automakers including Tesla, Nissan and BMW.

As far as he knows, only BMW has formed an internal group to study the issue.

Many automakers remain skeptical that cars will operate completely without drivers, at least in the next five or 10 years.

Uwe Higgen, head of BMW's group technology office in Silicon Valley, said the automaker has brought together specialists in technology, ethics, social impact, and the law to discuss a range of issues related to cars that do ever more driving instead of people.

"This is a constant process going forward," Higgen said.

To some, the fundamental moral question isn't about rare and catastrophic accidents, but rather how to balance caution about introducing the technology against its potential to save lives. After all, more than 30,000 people die in traffic accidents each year in the United States.

"No one has a good answer for how safe is safe enough," said Bryant Walker Smith, a law professor who has written extensively on self-driving cars. The cars "are going to crash, and that is something that the companies need to accept and the public needs to accept."

And what about government regulators -- how will they react to crashes, especially those that are particularly gruesome or the result of a decision that a person would be unlikely to make? Just four states have passed any rules governing self-driving cars on public roads, and the federal government appears to be in no hurry to regulate them.

In California, the Department of Motor Vehicles is discussing ethical questions with companies but isn't writing rules.

"That's a natural question that would come up and it does come up," said Bernard Soriano, the department's point man on driverless cars, of how cars should decide between a series of bad choices. "There will have to be some sort of explanation."

Connect with the Southeast Missourian Newsroom:
