Sci-fi short, but it's our future of warfare

CharKuayTeow


Source: https://www.independent.co.uk/life-...e-fully-automated-military-kill-b1856815.html

Autonomous military drones may have attacked humans, UN says

Vishwam Sankaran · 8 hours ago
A military drone may have autonomously attacked humans for the first time without being instructed to do so, according to a recent report by the UN Security Council.

The report, published in March, claimed that the AI drone, a Kargu-2 quadcopter produced by Turkish military tech company STM, attacked retreating soldiers loyal to Libyan General Khalifa Haftar.

The 548-page report by the UN Security Council’s Panel of Experts on Libya does not say whether anyone was killed in the incident, but it raises questions about whether global efforts to ban killer autonomous robots before they are built may be futile.

Over the course of the year, the UN-recognized Government of National Accord pushed the Haftar Affiliated Forces (HAF) back from the Libyan capital Tripoli, and the drone may have been operational since January 2020, the experts noted.

“Logistics convoys and retreating HAF were subsequently hunted down and remotely engaged by the unmanned combat aerial vehicles or the lethal autonomous weapons systems such as the STM Kargu-2,” the UN report noted.

Kargu is a “loitering” drone that uses machine learning-based object classification to select and engage targets, according to STM, and also has swarming capabilities to allow 20 drones to work together.
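STM has not published how the Kargu-2's targeting software actually works, but the general pattern the article describes (an onboard vision model classifies objects, and detections that clear a confidence threshold are treated as targets) can be sketched in a few lines. The class names and threshold below are hypothetical and purely illustrative; this is not the drone's real logic.

```python
# Illustrative sketch only: a toy "classify then decide" loop, NOT STM's
# actual Kargu-2 software (which is not public). It shows the general shape
# of ML-based object classification driving an engagement decision.
from dataclasses import dataclass


@dataclass
class Detection:
    label: str         # class predicted by an upstream vision model
    confidence: float  # model confidence in [0, 1]


# Hypothetical policy parameters, assumptions for illustration only.
TARGET_CLASSES = {"military_vehicle"}
CONFIDENCE_THRESHOLD = 0.9


def select_targets(detections: list[Detection]) -> list[Detection]:
    """Return the detections this toy policy would treat as engageable."""
    return [
        d for d in detections
        if d.label in TARGET_CLASSES and d.confidence >= CONFIDENCE_THRESHOLD
    ]


if __name__ == "__main__":
    frame = [
        Detection("civilian_car", 0.95),
        Detection("military_vehicle", 0.93),
        Detection("military_vehicle", 0.62),  # below threshold, ignored
    ]
    for target in select_targets(frame):
        print("would engage:", target)
```

Everything downstream of the classifier hinges on those two assumptions: the label being right and the confidence score meaning what the designer thinks it means, which is exactly what the experts quoted below question.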

“The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability,” the experts wrote in the report.

Many robotics and AI researchers, along with other prominent figures such as Elon Musk, Stephen Hawking and Noam Chomsky, have called for a ban on "offensive autonomous weapons", such as those with the potential to search for and kill specific people based on their programming.

Experts have cautioned that the datasets used to train these autonomous killer robots to classify and identify objects such as buses, cars and civilians may not be sufficiently complex or robust, and that the artificial intelligence (AI) system may learn wrong lessons.

They have also warned of the “black box” problem in machine learning, in which the decision-making process of an AI system is often opaque, posing a real risk that fully autonomous military drones could strike the wrong targets for reasons that remain difficult to unravel.
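Both concerns are easy to reproduce on toy data: a model trained on a narrow dataset will often return a confident label for an input unlike anything it has seen, and the caller gets back only a score, not a reason. The sketch below, a scikit-learn logistic regression on made-up 2D "features", is an illustration of that failure mode under those assumptions, not a claim about any real targeting system; a deep network's decision boundary would be far harder to inspect than this toy's.

```python
# Toy illustration (not any real weapons system) of the two failure modes
# above: a model trained on a narrow dataset, queried on an input unlike
# anything it saw in training, still returns a confident label, and the
# caller sees only a score, not a reason.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Narrow training set: two tight clusters standing in for
# "civilian vehicle" (class 0) vs "military vehicle" (class 1) features.
X_train = np.vstack([
    rng.normal(loc=[0.0, 0.0], scale=0.1, size=(100, 2)),  # class 0
    rng.normal(loc=[1.0, 1.0], scale=0.1, size=(100, 2)),  # class 1
])
y_train = np.array([0] * 100 + [1] * 100)

model = LogisticRegression().fit(X_train, y_train)

# Out-of-distribution input: nothing like either training cluster.
x_odd = np.array([[5.0, 5.0]])
proba = model.predict_proba(x_odd)[0]
print("predicted class:", model.predict(x_odd)[0])
print("confidence:", proba.max())  # near 1.0 despite the input being novel
```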

Zachary Kallenborn, a national security consultant specialising in unmanned aerial vehicles, believes there is a greater risk of something going wrong when several such autonomous drones communicate and coordinate their actions, such as in a drone swarm.

“Communication creates risks of cascading error in which an error by one unit is shared with another,” Kallenborn wrote in The Bulletin.

“If anyone was killed in an autonomous attack, it would likely represent an historic first known case of artificial intelligence-based autonomous weapons being used to kill,” he added.
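Kallenborn's cascading-error point is also easy to illustrate. In the hypothetical sketch below, ten drones independently classify the same object; nine get it right, one is confidently wrong, and a naive "trust the most confident report" fusion rule spreads that single error to the whole swarm. The fusion rule and numbers are assumptions chosen to make the failure visible, not a description of any fielded system.

```python
# Minimal, hypothetical simulation of cascading error in a swarm: when
# units share assessments, one confident mistake can propagate to all.
from dataclasses import dataclass


@dataclass
class Report:
    drone_id: int
    label: str
    confidence: float


def fuse_by_highest_confidence(reports: list[Report]) -> str:
    """Naive fusion rule: every drone adopts the most confident report."""
    return max(reports, key=lambda r: r.confidence).label


if __name__ == "__main__":
    # Nine drones classify the object correctly; one gets it wrong
    # but is (over)confident about it.
    reports = [Report(i, "civilian_truck", 0.80) for i in range(9)]
    reports.append(Report(9, "military_vehicle", 0.99))

    shared_label = fuse_by_highest_confidence(reports)
    print("label adopted by every drone after sharing:", shared_label)
    # -> "military_vehicle": one unit's error has cascaded across the swarm.
```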
 

mahjongking

this is just going to escalate conflicts and kill more people. whatever cruel device you invent, the enemy will eventually have it too.

why are those idiots in the audience clapping? they think they won? in actual fact nobody wins
 