12 November 2018

Who should die?

This is the question being discussed by developers of self-driving cars.

These vehicles will be everywhere within a few years, but their release is being delayed partly due to the ethical “decisions” they need to be programmed to make in the event of an unavoidable accident. The dilemma is: What should the car do when someone or something has to die? If in that split second it is presented with an unavoidable choice, who, or what, should it kill?

For example, if a self-driving vehicle full of people is travelling at 70mph on a motorway and an animal steps out in front of it, the car has to make an instant, pre-programmed moral judgment about whether to hit and kill the animal or swerve to avoid it and possibly crash, killing some of the passengers. Should the decision differ depending on who is in the car or the type of animal?

When the scenario involves choosing between humans, it gets a whole lot more complicated. What if the choice were between killing three passengers in the car and killing three people crossing the road? Whose life is more valuable? What about the perceived value to society of the people who might be killed, or their age, or their responsibility for causing the accident? Should these factors be considered?
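It may help to see how blunt such a pre-programmed rule would be in practice. The snippet below is a minimal, purely hypothetical sketch in Python, not anything a manufacturer actually uses: the categories and numeric weights are invented for illustration, and deciding what those weights should be is exactly the moral judgment in question.

```python
# A purely hypothetical sketch of a pre-programmed collision rule.
# The categories and weights are invented for illustration only;
# choosing them is precisely the moral judgment discussed above.

def choose_outcome(stay_course, swerve):
    """Return whichever option a (hypothetical) rule scores as less costly.

    Each option is a dict of who or what would be harmed,
    e.g. {"passengers": 3} or {"pedestrians": 1, "animals": 0}.
    """
    weights = {"passengers": 1.0, "pedestrians": 1.0, "animals": 0.1}  # arbitrary values
    cost = lambda outcome: sum(weights[kind] * count for kind, count in outcome.items())
    return "stay on course" if cost(stay_course) <= cost(swerve) else "swerve"

# The motorway example: hit the animal, or swerve and risk the passengers?
print(choose_outcome(stay_course={"animals": 1}, swerve={"passengers": 3}))
```

However the weights are set, some life is being priced against another; the code only makes that trade-off explicit.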

Students at the Massachusetts Institute of Technology in the US have been gauging the public’s view on these dilemmas for the past few years by asking people to judge various scenarios involving a self-driving vehicle on their website, Moral Machine. The results so far have differed based on the nationality of the respondents: while some chose to save pedestrians over animals or over passengers, others took the opposite view. Is it therefore impossible to build a car with “morals” that are universally acceptable? Will manufacturers have to tailor a vehicle’s choices to the local culture in which it is being used? Who decides what is right and wrong?

Playing God

Ultimately, only God can make these decisions; he sets the standards; he is the ultimate authority. But he is also the perfect judge; unlike us, he knows all the facts, the whole context and all the options. Trying to program a morally responsible robot to make these choices should make us realise that we are not God. We are morally flawed and cannot aspire to be God, for we have all sinned and fall short of his glory (Romans 3:23). This taints all of our work. So even though self-driving vehicles are an example of humanity’s amazing ingenuity, they will nevertheless be flawed because they are made by imperfect people – and this will be reflected in the moral decisions these cars will be expected to deliver.

But the Bible tells us that every human life matters and that we should always “swerve” to avoid taking the life of another human being. The value of our lives cannot be judged by a computer program in a car or by a morally flawed society; the Bible tells us the only judgment that really matters is God’s. His son Jesus Christ will be the decision-maker for our eternal future (2 Corinthians 5:10). The decisions of a self-driving car will be based on programs written by humans with limited knowledge and differing moral views. But God has no such limitations; he knows every possible scenario and every person involved – everything about them, both good and bad. He knows the secrets of every heart, every desire and motivation.

Who can stand against this oncoming vehicle?

That is why we are so grateful for Jesus. He lived a perfect life; every decision he made was right; every desire and motivation of his heart was pure. He came to take our place, to live the life we never could, so that when we meet our Maker, God will see his perfection in place of our morally flawed life. On this basis alone will he judge us worthy to be saved.

Tom Warburton is a member of Christ Church Haywards Heath
