KILLER robots? Artificial intelligence ‘will be used to do EVIL THINGS’ warns expert

Martin Ford, the author behind New York Times bestseller ‘Rise of the Robots’, said there are many risks people must be aware of when it comes to the role of artificial intelligence in society.

He said: “Artificial intelligence, robots and smart software are going to be used by criminals to hack into systems and do evil things.”

When questioned on the threat of blackmailing hackers, the robotics expert gave an insight into what may be to come.

He told the Daily Star Online: “There will also be the other side: there will be good guys using artificial intelligence to protect systems.

“But, in general, as systems become more autonomous and people are not in the loop, they become more vulnerable to hacking.

“Think about self-driving trucks in the future that deliver all our food.

“We wouldn’t want someone to hack into those trucks and bring them all to a stop – that would be a huge problem.”

Mr Ford’s warning comes as many scientists, including Stephen Hawking, have argued it may only be a matter of time before artificial intelligence destroys mankind like something out of science fiction.

Professor Hawking said: “The development of full artificial intelligence could spell the end of the human race.”

Robots could soon be weaponised, and some have advocated the development, arguing it could save lives.

But a report by Human Rights Watch and the Harvard Law School International Human Rights Clinic has called for humans to remain in control of weapons at a time of rapid technological advancement.

Bonnie Docherty, senior arms division researcher at Human Rights Watch, said: “Machines have long served as instruments of war, but historically humans have directed how they are used.

“Now there is a real threat that humans would relinquish their control and delegate life-and-death decisions to machines.”

Professor Hawking, Elon Musk and more than 1,000 robotics experts warned that such weapons could be developed within years, not decades.