Protyre News


Autonomous Cars - will machines decide the outcome of an accident?

23 Aug
As cars become more autonomous, one of the largest dilemmas facing manufacturers and lawmakers is an ethical one.
 
In a situation where a self-driving car must decide who to protect in the event of an accident (the driver, passengers or pedestrians), some might take issue with a machine making moral decisions on behalf of a human.
 
Inspired by an ongoing experiment at MIT, the simulation linked below, created by Confused.com, is designed to collect human responses to such dilemmas to better understand how we expect self-driving cars to behave. The responses gathered from this experiment will help form an industry report on autonomous cars due to be published in September.
 
The ‘Conscious Car’ simulator allows you to programme an autonomous car’s ethical decisions by choosing who to protect in the event of a collision.
 
You are faced with a random selection of tough decisions, such as:

  • Do you drive into a wall, or steer into three dogs?

  • Do you protect two elderly people, or one child?

  • Does your opinion change when the driver has passengers?

Try the Conscious Car Simulator here

Frivolity aside, these are the decisions autonomous cars will have to make, yet around a third (35%) of UK drivers wouldn’t want a driverless vehicle to make decisions on their behalf in the event of a collision.
 
The simulation anonymously collects all answers and the data will be used as part of Confused.com’s industry report on autonomous cars.


About the author

By Adam White
Adam looks to create engaging and informative content across the website that provides consumers with expert advice on MOTs, servicing, vehicle maintenance and tyre care. As a motorsport enthusiast, Adam enjoys documenting the Protyre Motorsport team’s involvement in major motorsport events across the UK.