World War 3 horror: Scientists warn ‘Terminator war’ could break out if AI controls nukes

Artificial intelligence could cause an all-out “Terminator-style” war if it were to gain control of the nuclear arsenals of China, Russia and the US. Both nuclear scientists and defence experts believe that the threat of AI gaining control of nukes far exceeds the danger of AI turning on humanity.

According to a report in the Bulletin of the Atomic Scientists, Russia has already started to incorporate AI into its new Poseidon nuclear torpedo.

The weapon was revealed by Russia’s president, Vladimir Putin, earlier this year and is believed to have the capacity to wipe out any city within the EU.

Russia isn’t alone in reportedly supercharging its weapons with AI – both the US and China are allegedly considering bringing more AI into their nuclear arsenals to improve precision and potential devastation.

There is concern that AI control of nuclear weapons systems could become the norm as the technology rapidly advances.

The US is considered a leader in the field of AI weaponry, with Russia and China scrambling to catch up.

The report, by AI experts at Cornell University, warns that an increasing “automation bias” could allow machines to “slip out of control”.

It says military outfits may be lulled into believing AI is inherently the safest route, but the technology could in reality bring “insidious risks that do not manifest until an accident occurs”.


The report points to the 1983 incident in which Soviet officer Lt Col Stanislav Petrov judged satellite warnings of an incoming US missile strike to be a false alarm. Lt Col Petrov was able to defy “automation bias” and correctly identify that the warnings were false, potentially saving the US and Russia from nuclear war.

The report concedes: “Some forms of automation could increase reliability and surety in nuclear operations, strengthening stability.”

AI technology can help gather comprehensive data and provide analysis that would otherwise be unattainable.

Yet, the paper adds: “Other forms could increase accident risk or create perverse incentives, undermining stability.

“When modernising nuclear arsenals, policymakers should aim to use automation to decrease the risk of accidents and false alarms and increase human control over nuclear operations.”

The AI could also fall victim to hacking from external sources, for example an enemy state or agent.

Experts say they are more concerned about China using AI as it plays catch-up, as it may want to make a statement to the US to prevent the superpower from striking first.

Some scientists go so far as to say AI should not be integrated into weaponry at all, citing risks that outweigh the benefits tenfold.

source: express.co.uk