'Killer Robots': 10 Reasons Why Experts Want Ban

Pulse


A group of scientists and tech experts wants the UN to back moves to ban weapons over which humans have no "meaningful control".

01:56 Monday 07 September 2015

[Image: ED 209 killed a human in the original RoboCop movie. Pic: Orion Pictures]

There have been calls for the United Nations to support a ban on autonomous weapons that select and engage targets without human intervention - also known as "killer robots".

Thousands of scientists and technology experts, including entrepreneur Elon Musk and physicist Stephen Hawking, signed a letter in July saying such weapons should be outlawed.

The letter, issued by the Future of Life Institute, argued that if any major military power pushed ahead with artificial intelligence weapons development, a global arms race would be virtually inevitable.

Autonomous weapons would become the "Kalashnikovs of tomorrow", it said, as they would be cheap to mass-produce and would end up in the hands of terrorists, dictators and warlords.

The International Committee for Robot Arms Control acknowledges that robots can make human lives better by doing mundane and dangerous tasks, increasing productivity and helping after natural disasters.

But it has given 10 reasons why Lethal Autonomous Weapons Systems (LAWS), over which humans have no "meaningful control", could "perilously impact global security".

1. Proliferation


Without a ban on the development, testing and production of LAWS, a proliferation of these weapons and of counter-weapons is likely.

2. Lowered threshold for armed conflicts


With LAWS, fewer human troops would be needed on the ground in conflict zones; with less at stake for their own forces, states could enter more armed conflicts, and civilian populations may suffer.

3. Continuous global battlefield


LAWS could be left behind - like landmines - to patrol post-conflict zones, creating a continuous global battlefield.

4. Unpredictability of interaction

LAWS and their software programmes will inevitably interact with competing hostile devices controlled by unknown software, with results that are impossible to predict.

5. Accelerating the pace of battle

New prototypes of unmanned systems are increasingly being tested at supersonic and hypersonic speeds, which will mean even faster weapons and even less chance for humans to retain control.

6. Accidental conflict


Defence systems of one state could interact with equally fast LAWS from another state, triggering unintended armed conflict before humans were able to react.

7. Militarisation of the civilian world

With autonomous targeting technology, police and private security forces could violate human and civil rights with little possibility of accountability.

8. Automated oppression

While human soldiers can refuse to turn their weapons on their own people, LAWS will kill mercilessly, simply following their coded instructions.

9. Non-state actors

Crude copies of autonomous weapons could get into the hands of non-state armed organisations.

10. Cyber vulnerability


LAWS will be inherently insecure, given the risks of software coding errors and malfunctions, so humans need to remain in control of weapons systems.


 