EU publishes ethical development guidelines for AI

By Ryan Jones / 09 Apr 2019

The EU has laid out a list of principles for creating “trustworthy AI”. The point at which many of the guidelines would apply is still technically some way off, but the EU is looking to get ahead of development and steer it away from more dangerous potential outcomes.

The guidelines have no enforcement mechanism, yet they represent a significant step towards more comprehensive national and international policy. The EU has continued to lead in regulating Big Tech and now has a first-mover advantage in laying the groundwork for the ethical development of AI.


So what?

AI has become increasingly prominent in every aspect of our lives, integrating ever more deeply as we rely on it to manage our society. Currently, the technology and its applications are vastly outpacing the ethics and regulation of their evolution. There have been calls across industries and disciplines to address this gap and get ahead of AI's development, since there may be no second chance once a certain threshold of intelligence is reached. Amy Webb of NYU, founder of the Future Today Institute, notes that “everyone wants to dictate what’s happening”, but as of today the governments of the countries leading AI development, mainly the U.S. and China, have yet to put forward any supporting policy. Further legislation is expected in the coming months from the OECD, while the US and Europe are deliberating over different means of overseeing Big Tech.




What might the implications of this be? What related signals of change have you seen?