Humanoid robots to be classified as “electronic persons”?

24/06/2016

On May 31, 2016, the Committee on Legal Affairs of the European Parliament presented a draft motion proposing that robots be considered “electronic persons”. The text concerns robots capable of “making autonomous decisions in an intelligent manner” or of “interacting independently with third parties”.

For Luxembourgish MEP Mady Delvaux, rapporteur for the text: “Humanity is at the dawn of an era in which robots, intelligent algorithms, androids, and other ever more sophisticated forms of artificial intelligence seem poised to set off a new industrial revolution.”

From a legal point of view, everyone is accountable and responsible for their own actions as long as they are legally competent. But in the case of a robot, who would be responsible for its acts? Would it be the manufacturer, the designer of the algorithms, the seller, or the owner? And if a robot’s artificial intelligence exceeds human capacities, can responsibility for its acts still be imputed to its manufacturers?

According to the European report, such autonomous robots would bear partial responsibility. The Delvaux report notes: “The more autonomous a robot is, the less it can be considered a simple tool controlled by another actor. (…) It is time to adopt new rules allowing an action or inaction to be imputed, wholly or partially, to the machine.”

The European Parliament could therefore propose “the creation of a specific legal status for robots, so that at least the most sophisticated autonomous robots could be established as having the status of electronic persons with specific rights and obligations, including that of making good any damage caused to a third party.”

Under the proposal, the most advanced robots would have to be covered by liability insurance policies taken out with insurance companies, in order to frame their social interactions and cover any damage they might cause.

These robots would also have to be labeled so that their owner can be easily identified. “The robot must be recognizable: it must have a registration number, a name, and funds to cover its liability, somewhat like a legal person. Because if the robot causes damage, there has to be a way to seek recourse against it,” explains Alain Bensoussan, a lawyer at the Paris Court of Appeal and a specialist in technology law, particularly information technology.

The draft resolution also recommends establishing a “European Agency for Robotics and Artificial Intelligence” to oversee all of this.
