
Can Machines Learn To Be Moral?

Ian Pyle (JH 47-52), Emeritus Professor at the University of Aberystwyth, tackles some of the important questions raised by recent advances in AI and the explosion of social media.
During the last half century, technology has transformed human existence. As it becomes ever more ingrained in our lives – through social media, online banking, internet shopping – the moral and social consequences of our dependence on it become increasingly apparent.

The next technological revolution, artificial intelligence (AI), has brought with it further moral dilemmas, and the promise of even greater societal change. Industry leaders like Elon Musk have warned that AI may constitute ‘a fundamental risk to the existence of human civilisation.’ The late Stephen Hawking contended that ‘the development of full artificial intelligence could spell the end of the human race.’

The rise of these technologies – from driverless cars to autonomous weapons – has been met with calls for the industry to imbue AI technologies with a sense of ethics.

Now that some machines can do what was once the exclusive domain of humans and other intelligent life – learn on their own (so-called ‘machine learning’) – can machines be taught right from wrong?

For this to become a reality, several critical issues must be resolved:

1. Who decides what’s ‘right’ and ‘wrong’?

The behaviour of a machine is entirely determined by the algorithms written for it by programmers. In other words, a machine cannot be taught what is ‘right’ unless the programmer creating it has a precise conception of what ‘right’ constitutes.
An ethical conundrum that has been much debated in the field of morality and computing is how driverless cars should be programmed to respond in the event of an unavoidable collision. How do we value one human life over another, and who decides this?
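To make this concrete, here is a deliberately toy sketch in Python. It is entirely hypothetical – the outcomes, weights and function names are invented for illustration, not taken from any real vehicle's software – but it shows where the programmer's conception of ‘right’ would have to live: in numbers that a human has chosen.

# Hypothetical illustration only: the weights below encode a moral judgement
# that a programmer must make explicitly before the machine can "decide".
from dataclasses import dataclass

@dataclass
class Outcome:
    description: str
    occupants_harmed: int    # people inside the vehicle likely to be hurt
    pedestrians_harmed: int  # people outside the vehicle likely to be hurt

def harm_score(o: Outcome) -> float:
    # The programmer's values live in these two constants.
    # Changing them (say, to 0.5 and 2.0) changes whose safety is favoured.
    OCCUPANT_WEIGHT = 1.0
    PEDESTRIAN_WEIGHT = 1.0
    return OCCUPANT_WEIGHT * o.occupants_harmed + PEDESTRIAN_WEIGHT * o.pedestrians_harmed

def choose(options: list[Outcome]) -> Outcome:
    # The car simply picks whichever outcome the chosen weights score as "least bad".
    return min(options, key=harm_score)

if __name__ == "__main__":
    swerve = Outcome("swerve into barrier", occupants_harmed=1, pedestrians_harmed=0)
    brake = Outcome("brake in lane", occupants_harmed=0, pedestrians_harmed=2)
    print("Chosen action:", choose([swerve, brake]).description)

Change the two weights and the ‘right’ answer changes: the machine has no moral view of its own, only the one written into it.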

2. Who should be accountable?

If a robot doesn’t have moral responsibility for its own actions, who does? Is it the writer of the software, who fundamentally defines its behaviour? Perhaps the company developing it, which determined its purpose, requirements and constraints? Another view (thanks to my daughter-in-law, Kirsten) is that the manufacturer carries a presumptive responsibility under the Sale of Goods Act, with the vendor acting as the manufacturer’s agent: the robot was bought for a particular stated purpose, must have been properly authorised for sale for that purpose, and the manufacturer is answerable if it does not do what it was bought to do.

Perhaps, ultimately, the responsibility lies with the owner, who stands to benefit from its successes? Robots will probably have to be registered to identify who is responsible for bad behaviour, and (inevitably!) to be taxed!

3. How do we balance the drive for cost saving against morality?

The use of computers in offices and factories has already made enormous changes to patterns of employment. Many traditional ‘skilled’ jobs have been replaced through automation, and AI puts even more of these professions at risk. During the Industrial Revolution, the Luddites destroyed the new power-driven looms that were replacing traditional weavers. Were their actions immoral, or is striving for cost-saving immoral? 

Actions have consequences, not always intended. Perhaps thinking about robots and morality should make us wonder how humans themselves tell right from wrong.

This note was edited by Katerina Ward, for which I am most grateful. A longer version is available below:
Professor Ian Pyle: Computers and Morality
 
