17.06.2015

Expert: Researchers are debating whether to sew a conscience into robots

Photo: PAP 2015 / Tomasz Gzell

Robots are becoming more common and are gaining more autonomy, which raises some concerns. That is why researchers are debating whether autonomous robots should be equipped with a conscience. Dr. Piotr Szynkarczyk, an expert in the field, spoke about this in an interview with PAP.

"In the future, robots will be commonplace. They will give us opportunities, which we currently do not have" - said in an interview with PAP an intelligent safety systems expert, Dr. Piotr Szynkarczyk, deputy director of the Industrial Research Institute for Automation and Measurements in Warsaw. He explained that now robots are most often used in professional applications and they "perform the dull, dirty and dangerous work". They are given repetitive, precise tasks in factories, they are sent to places inaccessible to humans (for example, they clean sewage systems and explore outer space), or assist in the work on explosives (pyrotechnics robots). But we encounter more and more robots in our daily life - they are used not only in agriculture and medicine, but also in households, where they vacuum houses, mow lawns, or provide entertainment.

ESCAPE FROM THE UNCANNY VALLEY

The increasingly common presence of robots is, however, associated with concerns. They arise, among other things, because robots are beginning to resemble people. Interestingly, even a physical resemblance of machines to humans can cause anxiety. "If a robot looks like WALL-E or R2-D2, it seems cute and everyone loves it. But if a machine is disturbingly similar in appearance to a human, we begin to fear it. This is written into our psyche" - said the expert. He explained that this phenomenon is called the uncanny valley. It was first described by researchers from Japan who specialise in designing robots that look like humans. It turns out that if a robot looks too much like a human (but is not identical), it causes discomfort in the observer - unease or even fear. The institute's deputy director added that if we want a person to feel comfortable around a robot, the machine should look like a machine. It should not pretend to be a human.

THE MAN HIDING BEHIND A MACHINE

Another cause for concern associated with the development of robotics is that robots can pose a real threat to humans. Szynkarczyk mentioned the problem of armed drones used by the US army - including to kill people. "These are not autonomous robots, but remote-controlled machines. The button to open fire is always pushed by a human" - said the institute's deputy director. He added that a person sitting safely in a military container in California, thousands of kilometres from the battlefield, is involved in giving the "Kill!" order quite differently than someone who is on site and pulls the trigger himself. Szynkarczyk explained that the moral issues arising from the presence of robots are the subject of a new field of science - roboethics. Its tasks include developing the principles that should guide people who use robots for military or medical purposes.

He gave the example of a situation in which a robot performing surgery fails, for example because of a software error, and causes damage to someone's health. "Who is responsible in this situation: the surgeon, the robot's creator, the person who bought the machine, or perhaps the person who services the device? These are still open questions. We need to devote a lot of attention to these matters" - commented Szynkarczyk.

ROBOT WITH A CLEAR, STILL UNUSED CONSCIENCE

The development of robotics also means that robots are becoming more autonomous, and some of their tasks and decisions no longer require human intervention. "The autonomy of robots is increasing, but at least today we do not have to fear the rise of the machines. There are those who doubt that it will ever happen" - said the expert. He admitted, however, that there are studies that address this issue. "Recently I was at a conference at which it was proposed to implant a conscience in robots" - he emphasised. He recalled a scene from the film "2001: A Space Odyssey", in which the autonomous computer HAL 9000, in order to complete the mission, refused to open the door and rescue an astronaut, saying "I'm sorry, Dave. I'm afraid I can't do that". "At some point a robot will be able to use these words, but to someone who issues an immoral order. And this is because morality will be sewn into the robot's algorithms" - said Szynkarczyk.

WHO DO WE SEND AGAINST A ROBOT ARMY?

The expert from the Industrial Research Institute for Automation and Measurements admitted that he does not fear that the Polish robots produced at the institute (there are 10 types of such robots) will be used for unethical purposes. "Our robots used by the police and the army replace humans in hazardous areas, where there are harmful chemicals, radioactivity, fire, or a risk of explosion..." - he explained.

"But you can imagine using robots in war - machines can be armed and sent against the enemy army" - he said. He stressed that in Poland such activities are closely supervised and regulated. "But we can not exclude the possibility of armed robots appearing somewhere in the world. And then the question arises: do we sent our people or our robots against a foreign army of robots..." - he added.

PAP - Science in Poland, Ludwika Tomala

lt/ agt/ mrt/

tr. RL

