Libyan Fighters Attacked by a Potentially Unaided Drone, U.N. Says

A military drone that attacked soldiers during a battle in Libya’s civil war last year may have done so without human control, according to a recent report commissioned by the United Nations.

The drone, which the report described as a “lethal autonomous weapons system,” was powered by artificial intelligence and used by forces backed by the government based in Tripoli, the capital, against enemy militia fighters as they fled from rocket attacks.

The fighters “were hunted down and remotely engaged by the unmanned combat aerial vehicles or the lethal autonomous weapons systems,” according to the report, which did not say whether there were any deaths or injuries.

The weapons systems, it said, “were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect a true ‘fire, forget and find’ capability.”

The United Nations declined to comment on the report, which was written by a panel of independent experts. The report has been sent to a U.N. sanctions committee for review, according to the organization.

The drone, a Kargu-2, was used as soldiers tried to flee, the report said.

“Once in retreat, they were subject to continual harassment from the unmanned combat aerial vehicles and lethal autonomous weapons systems,” according to the report, which was written by the U.N. Panel of Experts on Libya and released in March.

The findings about the drone attack, described briefly in the 548-page document, were reported last month by New Scientist magazine and by the Bulletin of the Atomic Scientists, a nonprofit organization.

Human-operated drones have been used in military strikes for over a decade. President Barack Obama for years embraced drone strikes as a counterterrorism strategy, and President Donald Trump expanded the use of drones in Africa.

Nations like China, Russia and Israel also operate drone fleets, and drones were used in the war between Azerbaijan and Armenia last year.

Experts were divided about the importance of the findings in the U.N. report on Libya, with some saying it underscored how murky “autonomy” can be.

Zachary Kallenborn, a research affiliate who studies drone warfare, terrorism and weapons of mass destruction at the University of Maryland, said the report suggested that for the first time, a weapons system with artificial intelligence capability operated autonomously to find and attack humans.

“What’s clear is this drone was used in the conflict,” said Kallenborn, who wrote about the report in the Bulletin of the Atomic Scientists. “What’s not clear is whether the drone was allowed to select its target autonomously and whether the drone, while acting autonomously, harmed anyone. The U.N. report heavily implies, but does not state, that it did.”

But Ulrike Franke, a senior policy fellow at the European Council on Foreign Relations, said that the report does not say how independently the drone acted, how much human oversight or control there was over it, and what specific impact it had in the conflict.

“Should we talk more about autonomy in weapon systems? Definitely,” Franke said in an email. “Does this instance in Libya appear to be a groundbreaking, novel moment in this discussion? Not really.”

She noted that the report said the Kargu-2 and “other loitering munitions” attacked convoys and retreating fighters. Loitering munitions, simpler autonomous weapons designed to hover on their own in an area before crashing into a target, have been used in several other conflicts, Franke said.

“What is not new is the presence of loitering munition,” she said. “What is also not new is the observation that these systems are quite autonomous. How autonomous is difficult to ascertain — and autonomy is ill-defined anyway — but we know that several manufacturers of loitering munition claim that their systems can act autonomously.”

The report indicates that the “race to regulate these weapons” is being lost, a potentially “catastrophic” development, said James Dawes, a professor at Macalester College in St. Paul, Minnesota, who has written about autonomous weapons.

“The heavy investment militaries around the globe are making in autonomous weapons systems made this inevitable,” he said in an email.

So far, the AI capabilities of drones remain far below those of humans, Kallenborn said. The machines can easily make errors, such as mistaking a farmer holding a rake for an enemy soldier holding a gun, he said.

Human rights organizations are “particularly concerned, among other things, about the fragility or brittleness of the artificial intelligence system,” he said.

Dawes said countries may begin to compete aggressively with each other to create more autonomous weapons.

“The concern that these weapons might misidentify targets is the least of our worries,” he said. “More significant is the threat of an AWS [autonomous weapons systems] arms race and proliferation crisis.”

The report said the attack happened in a clash between fighters for the Tripoli-based government, which is supported by Turkey and officially recognized by the United States and other Western powers, and militia forces led by Khalifa Hifter, who has received backing from Russia, Egypt, the United Arab Emirates, Saudi Arabia and, at times, France.

In October, the two warring factions agreed to a cease-fire, raising hopes for an end to years of shifting conflict.

The Kargu-2 was built by STM, a defense company based in Turkey that describes the weapon as “a rotary wing attack drone” that can be used autonomously or manually.

The company did not respond to a request for comment.

Turkey, which supports the government in Tripoli, supplied it with many weapons and defense systems, according to the U.N. report.

“Loitering munitions show how human control and judgment in life-and-death decisions is eroding, potentially to an unacceptable point,” Mary Wareham, the arms advocacy director at Human Rights Watch, wrote in an email.

She is a founding coordinator of the Campaign to Stop Killer Robots, which is working to ban fully autonomous weapons.

Wareham said countries “must act in the interest of humanity by negotiating a new international treaty to ban fully autonomous weapons and retain meaningful human control over the use of force.”

