
Robot Ethics Home – Old

Audio – Video – Written Material – Organisations

The fear that mankind will become the victim of its own success in developing science and technology has always been with us. All revolutionary technological advances, from fire to fusion, carry the potential for immense danger as well as immense benefit. Robots and artificial intelligence are just the latest manifestations. So far, we have managed to avoid the worst nightmares of annihilation, but there is no shortage of great minds, from Isaac Asimov to Stephen Hawking and Elon Musk, warning of the dangers of Artificial Intelligence (AI) and robotics. Nick Bostrom’s Future of Humanity Institute regards AI as humankind’s greatest existential threat.

Robotethics.co.uk gives policy makers, developers, researchers, and designers of AI and robotic systems access to publicly available material, events, and discussions about robots and artificial intelligence, to facilitate the consideration of ethical issues at all stages of development.


Video clips, UK Parliament, Artificial Intelligence Committee, parliamentlive.tv, 12 December 2017, under 2 minutes

Witnesses to the Committee on AI emphasise the importance of teaching the ethics of technology in schools

Digital Minister, Matt Hancock, says the UK can lead the world in understanding the ethical implications of AI

Matt Hancock describes the role of the Centre for Data Ethics and Innovation

See the fuller deliberations of the AI Committee amongst the videos on the pages below.

At present, the www.robotethics.co.uk website is structured mainly by media type: Audio, Video, Written Material, and Organisations.

The site is open access. If you are aware of publicly available materials about robot or artificial intelligence ethics that might be added, or want to comment or otherwise make contact, then please email: info@robotethics.co.uk


(The IEEE consultation on artificial intelligence is now closed, but see the response at: robot ethics response.)