The truly weird ethics around driverless cars

Amy Bairstow

We’ve seen self-driving cars on screen for years, from Batman to Knight Rider, and now they’re a thrilling reality. Self-driving cars are already roaming the roads in several US cities, and are expected to produce $300-400 billion in revenue by 2035. But along with the excitement and awe about this brilliant technology come some truly weird dilemmas that haven’t been resolved yet.

Here are some of the biggest quandaries.

Who should the car prioritise in a split-second decision?

In theory, driverless cars should be safer than human drivers because they can reduce or even eliminate human error. After all, robots can’t drive drunk or sneeze. And in a critical moment, driverless cars can act rapidly to avoid hitting a pedestrian. But what if there’s an urgent choice to be made between saving one pedestrian and another?

Is it right to prioritise saving three lives, rather than one?

Should the car prioritise an 8-year-old’s life over that of an 80-year-old?

What if the choice is between saving someone who’s pregnant, and someone who isn’t?

For the first time, cars have the processing power and programming to make such impossible decisions. As it turns out, the trolley problem is no longer just a thought experiment for self-driving cars - which makes designing them a deeply philosophical process.
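
To make that trade-off a little more concrete, here’s a tiny, purely illustrative Python sketch of the kind of decision a collision-avoidance planner would have to encode. Nothing here is real manufacturer code: the Outcome class, the harm_score function and the occupant_weight parameter are all invented for this article.

```python
# Purely illustrative sketch - no real self-driving stack publishes its
# decision logic in this form. The names and weights are made up.
from dataclasses import dataclass


@dataclass
class Outcome:
    description: str
    pedestrians_harmed: int
    occupants_harmed: int


def harm_score(outcome: Outcome, occupant_weight: float = 1.0) -> float:
    """Lower is 'better'. occupant_weight is the ethical dial the article
    describes: is an occupant worth more, less, or the same as a pedestrian?"""
    return outcome.pedestrians_harmed + occupant_weight * outcome.occupants_harmed


def choose(outcomes: list[Outcome], occupant_weight: float = 1.0) -> Outcome:
    # The car "decides" by minimising a number that a human had to define.
    return min(outcomes, key=lambda o: harm_score(o, occupant_weight))


if __name__ == "__main__":
    options = [
        Outcome("swerve into the barrier", pedestrians_harmed=0, occupants_harmed=1),
        Outcome("brake straight ahead", pedestrians_harmed=1, occupants_harmed=0),
    ]
    # Change occupant_weight and the "right" answer changes with it.
    print(choose(options, occupant_weight=0.5).description)  # favours pedestrians: "swerve into the barrier"
    print(choose(options, occupant_weight=2.0).description)  # favours occupants: "brake straight ahead"
```

The uncomfortable part isn’t the arithmetic - it’s that someone has to pick the weight, and whatever number they choose is an ethical stance baked into software.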

Is a driver worth more than a pedestrian?

The nature of driverless cars also forces a choice between protecting the people inside the vehicle and those outside it. When surveyed, most people say a driverless car should prioritise pedestrians over the car’s occupants if a crash is unavoidable. Yet those same people prefer to ride in a driverless car that will protect its occupants at all costs. After all - who wants to buy a car that might sacrifice you?

If this decision is left up to manufacturers, it’s conceivable that the car’s occupants might be prioritised simply for the sake of profit. This throws up all kinds of curly questions to do with class and privilege. For example: could car occupants have a better life expectancy because they’re better protected? And could life insurance premiums cost more if a person walks or rides a bike to work?

Who’s responsible for any accidents?

There are also some major ethical implications if things do go wrong with self-driving cars. If someone is hurt in a crash, who’s actually responsible - the car manufacturer, the person who programmed the software, or the car itself?

Even now, driverless cars in San Francisco have escaped traffic tickets for moving violations while the law catches up. And in cities where driverless cars are common, there have been plenty of complaints about chaos on the roads, including vehicles blocking emergency responders. When crashes do happen, there’s ambiguity about who’s actually accountable.

Who gets to decide these answers?

So far, the ethics of self-driving cars have largely been left up to car engineers and programmers. But there’s growing talk of broader governance over these decisions. Germany was the first country to implement a national framework for autonomous vehicles, and countries like France and Japan have followed with rules of their own. Here in Australia we’re expected to have an Automated Vehicle Safety Law by 2026. Liability laws everywhere may also need to be updated to handle complex cases where cars and drivers share operational responsibility.

And while we’re on the topic of complexity…

Who controls the software and data?

It might seem like a sci-fi kind of problem, but another driverless car dilemma centres around data and control. If hackers can gain access to car software, could they potentially change cars’ behaviour - or even frame a car’s passenger for an accident?

Driverless cars currently rely on plenty of cameras, and the ethics of putting such widespread surveillance technology on public roads have also been called into question. In San Francisco there have already been law enforcement requests for vehicle footage, which privacy experts argue could disproportionately harm marginalised groups. Could we turn into a surveillance state through the cars on our roads?

These ideas might seem far-fetched - but as the technology rapidly evolves, these types of problems could arise sooner than expected.

So - what’s the solution?

There’s no clear solution to these autonomous car dilemmas, and laws and regulations are struggling to keep pace with rapidly evolving tech. But one thing’s clear - manufacturers and lawmakers can’t just cruise on autopilot while these issues go unaddressed.

Until we are all safely cruising hands-free, let’s just enjoy the drive with easy-to-love cars. If you happen to be looking for pre-owned cars for sale including second-hand EVs, Carma takes the stress out of buying. Every car is thoroughly checked, comes with a 7-day return window, and is available for free delivery. Plus, as a bonus, your new car probably won’t place you in an episode of Black Mirror - not yet, anyway.
