Killer robots and drones
autonomous lethal drone swarms
autonomous lethal weapons systems

 

 

Last edited 19/12 - 2021

Image: Demonstrators standing beside the "Broken Chair" statue in Geneva, a symbol of the damage done by landmines and cluster munitions, called on countries to ban autonomous weapons. Credit: Clare Conboy/Campaign to Stop Killer Robots

Image: "Drones: Making enemies faster than we can kill them" by Ben Schumin - click here.

Drones can kill people on their own!

 

Dear all,

Remember that we have only avoided nuclear war because humans have intervened against the technology.
On several occasions the technology has told both American and Russian military personnel that a nuclear attack was under way. Had humans not stepped in and declared it a system error, the Earth would have been made uninhabitable for humans by a nuclear war.

It is very unsafe to leave decisions to technology (see below), which is why it should not be permitted to build autonomous systems.

From Helen Caldicott's book "Hvis er størst?" on the arms race & nuclear war. On pages 33-37 Caldicott describes, among other things, that over an 18-month period from January 1979 to June 1980 there were 3,703 alarms, most of which were assessed and dismissed as routine, but 152 of them were serious enough that the possibility of an attack was present. In all but 3 of the 152 alarms, the error came from radar or satellites. At one point a computer game had mistakenly been loaded into the system, so the game showed an attack under way. The US has been at the highest alert level, with everything ready to take off / be launched.

A Russian submarine captain refused to believe the alarm he had received, so fortunately he refrained from launching his nuclear missiles.

It may be possible to hack into the servers from which the missiles' software and the like are controlled. If so, that too is a vulnerability of the technology.

In Ukraine there are rich men who have their own armies; there are roughly 14 different armies in western Ukraine. This creates serious democratic problems. The problem would be even greater if these rich men possessed autonomous killer drones.

Democracy is in danger if we switch to autonomous killer drones / robots. Soldiers can refuse to shoot when things become too insane; robots cannot.

It takes only one person to control autonomous killer robots; it takes many to make an army function.

Imagine if Hitler or Pinochet had had the knowledge that today's computer systems hold about every single citizen, along with the ability to command a large army of autonomous killer robots.

Once drones become part of everyday traffic, it will become harder to protect ourselves against combat drones unless these are banned outright.

So if we are to protect life and democracy, there must be a ban on the production of killer robots.

With best wishes for peace,

Poul

Top of page

Here you can see a Facebook post with a video that shows a little of how far robot development has come. A development that can be used for good things and for terrible things. Help create legislation that bans autonomous killer robots.

Click here

 

---------------------------------------------------------

New technologies are completely changing combat

see, for example, this

 

Convention on Certain Conventional Weapons – Sixth Review Conference, 13-17 December 2021.

UNODA


Link to this.

Here you can hear what the CAMPAIGN TO STOP KILLER ROBOTS concluded at the close of the conference.

See the last meeting, the last speaker. Click here.


 

Orientering, 17/12 - 2021

27:48 into the broadcast.

Orientering, 17/12, on killer robots:
The campaign Stop Killer Robots tells Orientering that these weapons are not science fiction, and is taking part in Geneva in the 6th UN conference on these weapons systems.

Interview with Iben Yde, lecturer in military technology at the Danish Defence Academy (Forsvarsakademiet).

She explains that we are seeing close-in defences against missiles that are tightly programmed, but that no autonomous weapons have been developed which, once activated, are able to select targets and attack on their own.

Autonomy can be defined by two traits: freedom and discretion.
No weapons system today has freedom in the choice of targets and in the way the attack is carried out.

Ed.: This is contradicted by the lethal autonomous drone swarms used by Israel in May 2021.

Link


 

Crunch Time on Killer Robots
Why New Law Is Needed and How It Can Be Achieved

Human Rights Watch d. 1/12 - 2021

The Sixth Review Conference of the Convention on Conventional Weapons (CCW), scheduled to be held at the United Nations in Geneva from December 13-17, 2021, is a major juncture for international talks on lethal autonomous weapons systems. After holding informal and formal discussions on the matter since 2013, states parties now face the pivotal decision of whether to approve a mandate to open negotiations of a protocol on the systems or to leave the CCW to initiate negotiations elsewhere.

Judging by an assessment of the two CCW meetings held so far this year, much debate at the Review Conference will center on views of the adequacy of existing international humanitarian law regarding autonomy in weapons systems. At the most recent CCW Group of Governmental Experts (GGE) meeting on lethal autonomous weapons systems in September-October, most CCW states parties called for a new legally binding instrument on the topic, while a minority countered that existing international humanitarian law is sufficient to address any problems raised by autonomous weapons systems. Other states have yet to express a clear position on the question.

This briefing paper explains why a new treaty on autonomous weapons systems is needed to clarify and strengthen existing international humanitarian law. Such an instrument would address the legal, ethical, accountability, and security concerns such systems pose by including the following elements:

A broad scope that covers all weapons systems that select and engage targets on the basis of sensor inputs—that is, systems in which the object to be attacked is determined by sensor processing, not by humans;
A general obligation to retain meaningful human control over the use of force;
A prohibition on the development, production, and use of weapons systems that by their nature select and engage targets without meaningful human control;
A prohibition on the development, production, and use of autonomous weapons systems that target people; and
Positive obligations to ensure other autonomous weapons systems cannot be used without meaningful human control.
This briefing paper provides an overview of states’ positions on the adequacy of existing international humanitarian law, highlighting the widespread support for new law and noting that any divergence of views reinforces the need to clarify existing law. It examines justifications for a new instrument on grounds of international humanitarian law, ethics, international human rights law, accountability, and security. The paper then discusses the way forward, identifying potential forums for negotiating a new treaty outside of the CCW, including an independent stand-alone process and the United Nations General Assembly.

Recommendations

The emergence of autonomous weapons systems and the prospect of losing meaningful human control over the use of force are grave threats that demand urgent action. Human Rights Watch and Harvard Law School’s International Human Rights Clinic (IHRC) call on CCW states parties to move beyond diplomatic discussions and:

Express their support for a new legally binding instrument on autonomous weapons systems at the next GGE meeting, scheduled for December 2-8, and at the Sixth Review Conference that follows;
Call for a legally binding instrument that includes prohibitions and regulations to preserve meaningful human control over the use of force and bans autonomous weapons systems that target people; and
Agree at the Sixth Review Conference to a mandate to negotiate a new CCW protocol, or, if that fails, commit to initiate as soon as possible negotiations of a legally binding instrument on autonomous weapons systems elsewhere...

Read the whole article, which is very thorough



Views and recommendations of the ICRC for the Sixth Review Conference of the Convention on Certain Conventional Weapons

International Committee of the Red Cross (ICRC), 8/11 - 2021

The Sixth Review Conference of the Convention on Certain Conventional Weapons (CCW), to be held from 13 to 17 December 2021 in Geneva, is a key moment for High Contracting Parties to take stock of, and build on, the important role the CCW has played in minimizing suffering in armed conflict, in order to ensure that the CCW remains fit for purpose as warfare evolves...

This working paper outlines the views and recommendations of the ICRC on issues of humanitarian concern relevant to the CCW, specifically: adherence to the CCW and national implementation; mines other than anti-personnel mines; incendiary weapons and weapons with incendiary effects; blinding laser weapons and other laser systems; explosive remnants of war; explosive weapons in populated areas; autonomous weapon systems; and review of developments in science and technology, and legal review of new weapons, means and methods of warfare.

Read the whole article


 

Military drones

YouTube, 11/10 - 2021

Link to video


 

A Preview of Political Leadership - Vienna Conference

Stop Killer Robots

On 15-16 September 2021, the Austrian Ministry of Foreign Affairs hosted an online conference on Safeguarding Human Control over Autonomous Weapons. Taking place over two afternoon sessions, it kicked off with a High Level Panel that provided a striking illustration of political leadership embracing the need for action on this issue.

Austria’s Foreign Minister, Alexander Schallenberg and New Zealand’s Minister for Disarmament, Phil Twyford, presented a strong call for action towards new international law that would establish prohibitions and regulations on autonomy in weapons systems. Their comments recognised the implications of this issue in conflict and also for our wider society, and they also evoked both countries’ past leadership in processes to develop treaty law.

Read the whole article


 

GREJEN MED MÖRDARROBOTAR (The Thing About Killer Robots)

Published by the Women's International League for Peace and Freedom, Sweden (IKFF).
This publication was produced with support from the Campaign to Stop Killer Robots, a coalition of more than 170 civil-society organisations in 65 countries. The goal of the campaign is to negotiate a so-called pre-emptive ban, aimed at stopping the development of killer robots. Read more at: www.stopkillerrobots.org
Stockholm 2021
Author: Gabriella Irsten. Responsible publisher: Malin Nilsson. Layout: Pernilla Lundmark

Excerpt from the publication (translated with Google):

Computers are getting smarter, with their own "brains" that can learn things and make their own decisions without our involvement, with the help of so-called artificial intelligence (AI).

This is a great advantage for us when it comes to tackling climate change or improving our health services.

We have all experienced computers making errors on their own, forcing us to restart. But what happens when we are talking about weapons?

Artificial intelligence can make its own decisions in order to reach its goal in the most efficient way. AI has the ability to learn and adapt based on experience, which entails extreme risks, since such systems can change their behaviour and do things they were not originally intended to do, without the manufacturer or the user understanding how or why. The technology therefore has serious limitations, because we cannot know how these systems will behave. Just as with your computer, technical faults can also occur. If a killer robot starts shooting uncontrollably because of a fault, the consequences would be catastrophic. How great a risk is acceptable? We also know that no system is one hundred percent watertight. Hacker attacks and system errors can have fatal consequences when it comes to weapons.

Link to the publication


 

BENEFITS & RISKS OF ARTIFICIAL INTELLIGENCE

WHAT IS AI?
From SIRI to self-driving cars, artificial intelligence (AI) is progressing rapidly. While science fiction often portrays AI as robots with human-like characteristics, AI can encompass anything from Google’s search algorithms to IBM’s Watson to autonomous weapons.

Artificial intelligence today is properly known as narrow AI (or weak AI), in that it is designed to perform a narrow task (e.g. only facial recognition or only internet searches or only driving a car). However, the long-term goal of many researchers is to create general AI (AGI or strong AI). While narrow AI may outperform humans at whatever its specific task is, like playing chess or solving equations, AGI would outperform humans at nearly every cognitive task.

Link to article


During 44 days we saw how the world was changed. How the very nature of war was changed.

Politiken, 1/8 - 2021, by Michael Jarlner

The war in Nagorno-Karabakh was an eye-opener for where the wars of the future are heading. And for a debate that is becoming more and more pressing here at home as well. With the purchase of F-35 aircraft we are in fact moving in a direction whose implications few have probably fully grasped.

When the weapons had once more fallen silent after 44 days of a war most people may already have forgotten, the experts knew that something big had just happened. That this particular war will leave lasting traces. And that those traces make it all the more pressing to answer questions whose implications most of us cannot begin to grasp.

"For many it was a wake-up call," says Ulrike Franke.

The German researcher works at the think tank European Council on Foreign Relations in London. She is also one of Europe's most prominent security experts and an adviser to, among others, the EU, not least on questions of drones and future warfare. On the latter, though, she says: "The future is already here".


We are facing a military-technological revolution
Iben Yde, researcher at the Danish Defence Academy
So what follows is not science fiction. It is rather the story of how a war of just 44 days has written itself in as a kind of opening chapter of a military upheaval that is probably considerably more advanced than many imagine. Indeed, the Danish international-law and security researcher Iben Yde, currently in the spotlight as lead author of the brand-new book 'Smart krig – Militær anvendelse af kunstig intelligens', does not hesitate to speak of historic change:

"We are facing a military-technological revolution that is more sweeping, and more intrusive in how military tasks are carried out, than the invention of motorised vehicles, the atomic bomb and precision weapons were in their day," she believes.

Read the whole article



The Future Of Artificial General Intelligence

Forbes, 16/7 - 2021

...Artificial intelligence can be broadly categorized into three main types: artificial narrow intelligence (ANI), artificial general intelligence (AGI) and artificial superintelligence (ASI). Amongst these, AGI positions artificial intelligence at par with human capabilities. As a result, AGI systems can think, comprehend, learn and apply their intelligence to solve problems much like humans would for a given situation...

...The next decade will play a crucial role in accelerating the development of AGI. In fact, experts believe that there is a 25% chance of achieving human-like AI by 2030. Furthermore, advancements in robotic approaches and machine algorithms, paired with the recent data explosion and computing advancements, will serve as a fertile basis for human-level AI platforms...

Read the whole article


 

THE FIRST AI WAR

In apparent world first, IDF deployed drone swarms in Gaza fighting

The Times of Israel, by JUDAH ARI GROSS, 10/7 - 2021

Military used ‘flocks’ of small aircraft, all communicating with each other, to locate targets, direct airstrikes, highlighting advances in artificial intelligence-driven combat


...“After a year of preparation and exercises, the situation came and the aerial detection system is able to find the enemy and destroy it and bring the operational achievement we are looking for,” ...

...However, Israeli drone expert Tal Inbar said it was not clear if these were truly the first attacks by a drone swarm in the world, as has been claimed in media reports in recent days, but this was nevertheless a significant milestone in the use of the technology...

...According to Inbar, there are a number of different methods for deploying drone swarms, which can range in size from just a handful of vessels to several thousand. In some cases, all of the aircraft work as equals, while in others certain drones have greater computer processing capabilities and act as commanders for the rest...

...Israel’s use of drone swarms during May’s conflict garnered international coverage in large part because it indicates the speed at which this technology is developing and being deployed in the real world.

For now, the use of large swarms requires a high level of artificial intelligence and machine learning technology, meaning it is mostly in the domain of larger nation-states that have the necessary technical capabilities. But that is starting to change.

“Artificial intelligence is not something that’s just for superpowers anymore,” Inbar said.

Read the whole article


 

Israel used first-ever AI-guided combat drone swarm in Gaza attacks

InceptiveMind d. 6/7 - 2021

Swarms of flying robots could change the way wars are fought. In recent years, its use has seen a surge in warzones.

Recently, the Israel Defense Forces (IDF) used a swarm of small combat drones to locate, identify, and attack Hamas militants within the Gaza strip. The mission was carried out during the Israel-Gaza conflict in mid-May, marking the first time that an artificial intelligence-guided drone swarm has been used in combat.

Since their introduction into armed forces around the world, drones have typically been controlled individually by remote operators. But a drone swarm, like the one just used against Palestinian Hamas militants in the Gaza Strip, is a single networked entity that flies itself using artificial intelligence...

...However, the increasing use of AI-guided drones is a concern for many, including the UN Security Council and Human Rights Watch, which runs the Campaign to Stop Killer Robots, calling for a pre-emptive ban on fully autonomous weapons.

Read the whole article


An expert's point of view on a current event.
Killer Flying Robots Are Here. What Do We Do Now?

Foreign Policy (FP), 5/7 - 2021. By Vivek Wadhwa, a columnist at Foreign Policy and a fellow at Harvard Law School's Labor and Worklife Program, and Alex Salkever, a technology writer and futurist.

A new generation of AI-enabled drones could be used to terrible ends by rogue states, criminal groups, and psychopaths.

In the popular Terminator movies, a relentless super-robot played by Arnold Schwarzenegger tracks and attempts to kill human targets. It was pure science fiction in the 1980s. Today, killer robots hunting down targets have not only become reality, but are sold and deployed on the field of battle. These robots aren’t cyborgs, like in the movies, but autonomously operating killer drones. The new Turkish-made Kargu-2 quadcopter drone can allegedly autonomously track and kill human targets on the basis of facial recognition and artificial intelligence—a big technological leap from the drone fleets requiring remote control by human operators. A United Nations Security Council report claims the Kargu-2 was used in Libya to mount autonomous attacks on human targets. According to the report, the Kargu-2 hunted down retreating logistics and military convoys, “attack[ing] targets without requiring data connectivity between the operator and the munition.”...

...The companies producing the new wave of autonomous flying weapons are heavily marketing their wares. Meanwhile, the United States and China have thus far refused to back calls for a ban on the development and production of fully autonomous weapons. Washington and Beijing are thereby providing a cover of tacit legitimacy for weapons makers and governments deploying the new killer drones in the field.

Read the whole article


 

SEEK AND DESTROY Israel uses first-ever AI drone swarm in battle to hunt down and blitz Hamas terrorists with NO human input

The US Sun, 5/7 - 2021

...Arthur Holland of the United Nations Institute for Disarmament Research said that “if confirmed, they are certainly a notch up in the incremental growth of autonomy and machine-to-machine collaboration in warfare".

Drones have previously been directed by a single operator who 'flies' the aircraft from a remote base.

But in recent years, militaries have been working on developing Artificial Intelligence that allows the drones to work together without the need for an operator....


Read the whole article


 

First "AI WAR": Israel Used World's First AI-Guided Swarm Of Combat Drones In Gaza Attacks.

YouTube, 3/7 - 2021

See the video on YouTube


 

THE FIRST AI WAR

In apparent world first, IDF deployed drone swarms in Gaza fighting

The Times of Israel, by JUDAH ARI GROSS, 10/7 - 2021

Military used ‘flocks’ of small aircraft, all communicating with each other, to locate targets, direct airstrikes, highlighting advances in artificial intelligence-driven combat.

Mass swarms of dozens or hundreds of drones guided by artificial intelligence are widely considered to be among the more worrying weapons making their way onto the modern battlefield, with the potential to be far cheaper and thus more available to non-state actors than other advanced munitions.

Read the whole article


 

Israel drone swarm technology: Israel AI drone swarm in Gaza attack on Hamas

BCFocus, 2/7 - 2021

...Swarm drone technology is no less dangerous
When multiple drones perform a mission together, the system is referred to as a drone swarm, or swarm drone technology. There is a mother drone, from which emerge many small drones capable of attacking different targets. Because of their sheer number, enemy anti-aircraft guns and missiles also prove ineffective against them. This new technology has the potential to change the whole battle scene in the future. It will prove very important in contactless war, that is, war fought without any human contact. These drones have the power to carry out a suicide attack on strategic enemy targets. They are also known as ALFA-S (Air Launched Flexible Asset or Swarm). The main, or parent, drone can be launched into the air from any fighter jet, after which many small drones emerge from the mother drone and can destroy enemy bases.

Swarming drones can cause massive destruction
Swarm drones are capable of causing massive destruction in enemy territory. Inexpensive, lightweight and equipped with high-tech artificial intelligence, these drones can destroy any target. They are also capable of deceiving the air defense systems and radars operating in enemy territory; due to their small size, radars are mostly unable to detect them. Swarm drones can wreak havoc by penetrating 50 kilometers inside enemy territory. They can carry guns or bombs, which can be fired or detonated. In addition, swarm drones can deliver logistics and military equipment to troops in difficult situations: due to its small size, each drone can airlift a small amount of cargo and drop it off at a designated place.

Read the whole article


 

Israel used world's first AI-guided combat drone swarm in Gaza attacks

NewScientist d. 30/6 - 2021

During operations in Gaza in mid-May, the Israel Defense Forces (IDF) used a swarm of small drones to locate, identify and attack Hamas militants. This is thought to be the first time a drone swarm has been used in combat.

Drones are usually controlled individually by remote operators, but a swarm is a single networked entity that flies itself using artificial intelligence. It can cover a wide area and keep operating even if it loses many units, and only requires a single…

Read more.

 

How badly can things go with killer robots?

videnskab.dk, 29/6 - 2021

A UN report concludes that killer drones have been used in Libya. It may be the start of a deadly slippery slope with historical parallels.

A cluster of drones hangs menacingly in the air above a North African desert landscape. Suddenly the buzzing swarm dives towards a platoon of human soldiers.

They flee in wild confusion, but the drones chase them and fire at them with built-in machine guns.

No human is involved in the drones' decision-making; they autonomously choose their targets, fire at them and forget them again.

The technology is based on artificial intelligence, including facial recognition.

Does that sound like science fiction?

Not any more.

A UN report from March 2021 concludes that drones of the Kargu-2 type were used in Libya in 2020, where they pursued and attacked fleeing rebel troops.

This is a historic event, and perhaps we can also learn from history how badly it can end...

...An arms race has been set in motion

The Turkish-produced drone is not the endpoint of autonomous weapons systems.

It seems reasonable to assume that an arms race to develop far more sophisticated killer robots is already under way, and that the most important decisions will be taken in Washington and Beijing.

That kind of thing can move fast. We went from dropping hand grenades to dropping atomic bombs in 34 years. And the pace of technological development has hardly slowed since.

Military development is also driven by the global race to lead in artificial intelligence, a race headed by China and the US.

The less these actors care about complying with the laws of war, the easier it is to develop the technology.

If all one wants is an instrument that can single out people and kill them regardless of the principles of the laws of war, the incident in Libya has shown that this can be done effectively in terms of achieving military objectives.

Moreover, the laws of war as we know them are probably powerless to prevent the negative consequences of this new technology, since those consequences involve a high degree of uncertainty.

Our legal and ethical concepts are simply not built to handle this new situation...

Read the whole article


 

CONVENTION ON PROHIBITIONS OR RESTRICTIONS ON THE USE OF CERTAIN CONVENTIONAL WEAPONS WHICH MAY BE DEEMED TO BE EXCESSIVELY INJURIOUS OR TO HAVE INDISCRIMINATE EFFECTS

Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems Geneva, 21-25 September and 2-6 November 2020

Read more here

 

Webinar series on the technological, military and legal aspects of lethal autonomous weapon systems

UN, October 2020

Summary of the webinars

It can be downloaded here.


 

Legality of drone warfare

The Bureau of Investigative Journalism

...There is just one law that underpins the whole basis for the US being at war with al Qaeda and its cronies that has been even remotely scrutinised by Congress. The Authorisation for the Use of Military Force Act was drafted by the Bush White House in the week after the 9/11 attacks.

At its heart is a sixty-word sentence that gives the US president the power to “use all necessary and appropriate force against those nations, organisations, or persons” that he or she determines was behind or helped the people who carried out the attacks. It was passed into law by Congress on September 16 with only one dissenting vote.

The AUMF’s scope gave the president a free hand. It has no time or geographical limits; it technically allows the president to fight a perpetual global war. It also empowers the president to go after individuals as well as nation states.

Within weeks of it becoming law the US and its allies had invaded Afghanistan, going after the Taliban and al Qaeda. A year later the US carried out its first drone strike beyond active battlefields, killing six al Qaeda fighters in Yemen.

By 2004 the US was striking al Qaeda, the Taliban, and various other armed groups in Pakistan.

AUMF is so broad that it allows the President to target new enemies without the usual authorisation from Congress. The scope has grown from just the Taliban and al Qaeda - AUMF is now being used to justify strikes against groups that did not exist when al Qaeda attacked the World Trade Centre and Pentagon...

Read the whole article


 

Berlin has no obligation to ensure US doesn’t commit war crimes with drone strikes from Germany, court rules

RT d. 27/11 - 2020

Berlin will no longer be required to ensure that US drone strikes coordinated through an air base in Germany are in line with international law, a top court has ruled, in a “severe blow” to a case brought by human rights groups.
The Federal Administrative Court in Leipzig concluded that the government has no obligation to guarantee US strikes are in line with humanitarian law beyond basic assurances from US authorities, overturning a ruling from last year that made Berlin partially responsible for such operations...

Read the whole article


Amazon gets US approval for drone delivery

Finans, 3/9 - 2020

With the first necessary permit in hand from the US aviation authorities, Amazon can now get properly started on its long-awaited drone project, Amazon Prime Air.

Read the whole article


 

Autonomous killer drones set to be used by Turkey in Syria

NewScientist d. 20/6 - 2020

Turkey is to become the first nation to use drones able to find, track and kill people without human intervention.

The country recently started producing armed, human-operated drones and is reported to have used them hundreds of times in north-west Syria. Now, Turkish defence company STM has announced that the nation’s army will start using its Kargu drones early next year.

These 7-kilogram quadcopters are intended to be used as part of a cooperative swarm. A video posted on YouTube (link to YouTube).

Link to the article - requires a subscription

 



SIPRI Reflection: How to ensure human control over autonomous weapons

27/3 - 2020

See the video on YouTube


 

Analyse: Dronerne kommer – og de angriber også i sværme

USA’s droneangreb i fredags peger mod en ny tidsalder, hvor droner vil være centrale våben. Men den enlige drone får følge af hele dronesværme.

Politiken, analyse af Michael Jarner d. 10/1 - 2020

Det var et angreb, der pegede ind i en ny tidsalder, da en amerikansk Reaper-drone for en uge siden angreb og dræbte en af Irans mest magtfulde mænd, general Qassem Suleimani, fra et sted i himlen over den irakiske hovedstad, Bagdad...

...For en organisation som Campaign to Stop Killer Robots peger udviklingen mod en dystopisk fremtid, hvor sværme af dræberrobotter med våben og kunstig intelligens pludselig kan angribe stater og statsledere. Derfor kræver kampagnen en international våbenaftale om dem.

Andre mener, at vi er langt fra den fremtid og peger samtidig på, at mange lande allerede er begyndt at udvikle og investere i forsvarssystemer mod den slags angreb.

Men, som føromtalte Paul Scharre har påpeget: Hvad tænkte folk i 1912, da de første gang så en flyvemaskine? Så de mon alle perspektiverne – på godt og på ondt?

Læs hele artiklen

Sidens top

Droneangreb peger mod en tid med dræber-robotter og hævntogter, der kan få dig til at gyse


Politiken d. 6/1 - 2020

af Michael Jarner, international redaktør.

...Hvad vi ser er imidlertid kun en første fase af droner, der - efter alt at dømme - med tiden vil blive udstyret med stadig mere avanceret kunstig intelligens og ansigtsgenkendelse, så de selv kan udpege og tilintetgøre mål.

Det betyder så også, at fredagens drab er en anledning til at stoppe op og tænke sig om: Hvad er det egentlig for en udvikling, der er i gang?

Hvis du hurtigt og overskueligt vil se perspektivet, vil jeg anbefale dig at gå ind på Campaign to Stop Killer Robots' hjemmeside og trykke her. Det er både skræmmende og oplysende...

Læs hele artiklen


The United States should drop its opposition to a killer robot treaty

Bulletin of the Atomic Scientists d. 7/11 - 2019
by Lisa A. Bergstrom

...To address the humanitarian risks of autonomous weapons, about 100 countries have been discussing the possibility of negotiating a new treaty within the Convention on Certain Conventional Weapons (CCW), a little-appreciated, United Nations-affiliated forum for regulating inhumane weapons. Since 2014, the slow-moving CCW has agreed to renew talks on the issue without being able to reach the consensus the convention requires to actually start negotiating a treaty...

...Given this history of success, it is tempting to conclude that a strong, standalone treaty is the best way to deal with the threat posed by autonomous weapons, despite the fact that countries like the United States and Russia would almost certainly refuse to join. Autonomous weapons, however, are not landmines or cluster munitions. Landmines and cluster munitions were used around the world for decades in conflicts large and small, in many cases causing great civilian harm. Treaties banning these weapons have value even when the United States, Russia, China, and other major military powers do not participate. In contrast, autonomous weapons are a developing technology likely to be used by only the most advanced militaries for some time. A treaty that excludes almost all the countries with the interest and ability to deploy autonomous weapons would have comparatively little value either as arms control or as a humanitarian norm builder...

Læs hele artiklen

Sidens top

 

Nu leverer droner pakker i USA

TV2 d. 20/10 - 2019

af Steffen Neupert
Både Google-selskabet Alphabet, Amazon og det amerikanske postvæsen er ved at udvikle ubemandet pakkelevering.
Selskabet Alphabet, der blandt andet ejer Google, tilbyder nu som de første i USA at levere pakker med ubemandede droner til folks baghaver...

Læs hele artiklen

Sidens top

 

Kargu - The Kamikaze Drones Getting Ready For The Swarm Operation

STM, Youtube d. 17/7 - 2019

Drone swarms are a popular trend in high tech industries and rightly so, they attract a special focus from the drone makers. As a leading manufacturer of loitering munition platforms and surveillance drones, STM also maintains its research on swarm drones, especially with regards to military applications. Recent tests demonstrate STM’s latest achievement with respect to the concept of drone swarms, where 20+ KARGU Smart Munition Platforms perform a joint strike on a target provided by a single GCS.

Link til video

 

US, Russia Lead Efforts to Prevent Global Ban on Killer Robots

Antiwar.com d. 29/3 - 2019

With some activists warning that the advent of armies of vicious killer robots could be just 3-4 years away, a large number of nations are trying to get out in front of that, with an eye toward a global ban on such robots.

The EU and UN both heavily support such a ban, and Germany is seen as a major proponent. Yet several nations seem keen to resist the idea, as they envision powerful armies of metal men crushing their enemies.

The US and Russia, unsurprisingly, are leading the opposition to the global ban, calling it premature. In practice this really just means they oppose any limitations that would prevent them from building killer robots. Britain, Israel, and Australia are also opposed...

Læs hele artiklen

Sidens top

 

Pioneer in Field Warns Dangers Posed by Artificial Intelligence 'Very Real'

Common Dreams d. 5/4 2019

"What is most concerning is not happening in broad daylight"
by Andrea Germanos, staff writer

...Unfortunately, "what is most concerning is not happening in broad daylight" but "in military labs, in security organizations, in private companies providing services to governments or the police," he said.

In particular, Bengio said, he is concerned with so-called killer drones—lethal autonomous weapons—and surveillance, which can be abused by authoritarian governments.

AI "can be used by those in power to keep that power, and to increase it," said Bengio, and be used "to worsen gender or racial discrimination."

Some sort of government or international regulatory framework needs to be in place to put a check on AI, added Bengio: "Self-regulation is not going to work."

It's not the first time Bengio expressed concerns about possible abuse of AI. This week he joined over two dozen other AI researchers in calling on Amazon to stop selling its facial-recognition software Rekognition to police departments...

..."Generally speaking," he wrote at The Conversation highlighting the need for the declaration, "scientists tend to avoid getting too involved in politics. But when there are issues that concern them and that will have a major impact on society, they must assume their responsibility and become part of the debate."

"And in this debate, I have come to realize that society has given me a voice—that governments and the media were interested in what I had to say on these topics because of my role as a pioneer in the scientific development of AI."

"So, for me, it is now more than a responsibility," he said. "It is my duty. I have no choice."...

Link til hele artiklen

Sidens top

 

Why We Should Ban Lethal Autonomous Weapons

Future of Life Institute, Youtube, d. 26/3 - 2019

Top AI researchers -- including deep-learning co-inventor Yoshua Bengio and AAAI President-elect Bart Selman -- explain what you need to know about lethal autonomous weapons. Thanks to Joseph Gordon-Levitt for the narration. Take action at https://autonomousweapons.org

Læs hele artiklen

Sidens top

 

Human security in the age of AI:
Securing and empowering individuals

ICT for peace foundation

Heidelberg, 18 December 2018.
Digital Human Security 2020

Foreword

We at the ICT4Peace Foundation have been working on Information Communication Technologies (ICTs) and peace and security issues for the past 14 years. Much has changed in terms of the potential for international coordination, the speed of information, the rise of social media and the advancements made by AI. Over the past 18 months ICT4Peace has focussed particularly on the impact of Artificial Intelligence and Cybersecurity on society and individuals and resulting peace-time threats. How can we secure individuals’ rights, data and privacy online, using traditional national security approaches when the challenges we face are inherently both local citizen-based, and international? One way forward could be to develop policies that consider the individual as the epicenter of the security challenge instead of only traditional territorial sovereignty. Human beings need to be the core focus of the IT and security agenda going forward.

The ICT4Peace Foundation, in cooperation with the Zurich Hub for Ethics and Technology (ZHET), held a series of informal workshops in 2018 with leading thinkers on the impact of AI. As the events of the past months have signalled, it is clear we are at a turning point about how we want to manage and shape the future of the “Data Age”.

Link til hele rapporten (12 sider)

 

U.N.'s Guterres urges ban on autonomous weapons

World News, Reuters d. 5/11 - 2018

LISBON (Reuters) - It would be “morally repugnant” if the world fails to ban autonomous machines from being able to kill people without human involvement, U.N. Secretary-General Antonio Guterres said on Monday...

...Speaking to Reuters during the Web Summit technology conference which opened in Lisbon on Monday, Guterres praised the “enormous benefits” of new technology.

But he said it was crucial that the world works to avoid “autonomous machines with the power and the capacity to take human lives on their own without human control”.

“This is the kind of thing that in my opinion is not only politically unacceptable, it is morally repugnant and I believe it should be banned by international law,” he said...

Læs hele artiklen

Sidens top

 

'Killer Robots' ban blocked by US and Russia at UN meeting

Independent d. 3/9 - 2018

USA, Rusland, Sydkorea, Israel og Australien blokerer for forhandlinger om et forbud mod autonome dræberrobotter.
Man skal også blive enige om, hvad man forstår ved "lethal autonomous weapons system".

Læs hele artiklen

Sidens top

 

Er tryghed stadig vigtig?

AF: POUL ECK SØRENSEN ESBJERG FREDSBEVÆGELSE, WILLEMOESGADE 29, ESBJERG
Jyske Vestkysten d. 18/9 2018 kl. 13:56

Læserbreve.

Er tryghed stadig vigtig? Hvis ja, bliver vi nødt til at forholde os til autonome dræberrobotter/droner. I FN forhandler en række lande og NGO'er, om der skal være et forbud mod autonome dræberrobotter, men det går trægt. Danmark deltager slet ikke i forhandlingerne. Hvis vi ikke får stoppet produktionen af de autonome dræberrobotter/droner, vil det blive alt for let at slå et andet menneske eller en gruppe af mennesker ihjel, uden man kan finde gerningsmanden/gerningsmændene.

Droner, der kan slå et menneske ihjel, behøver ikke at være større end en lille fugl. Disse droner kan flyve i store sværme og kommunikere med hinanden - så det kan være meget vanskeligt at undgå dem. Autonome robotter/droner er udstyret med kunstig intelligens (AI), der tænker langt hurtigere end mennesker. Kunstig intelligens kan selv udvikle sin intelligens. Således har robotter kunnet udvikle et andet sprog til kommunikation, da menneskets sprog var for langsomt.

I krig er hurtige beslutninger noget, man kan vinde en krig med. Derfor er det fristende at overlade krigens beslutninger til superhurtige dræberrobotter, men det vil betyde, at en atomkrig lettere vil kunne starte ved en fejl. Derfor kom og hør vort gadeteater om autonome dræberrobotter på FN's fredsdag den 21. september klokken 17.00 og klokken 17.30 i Heerups Have ved Hotel Britannia i Esbjerg.

Link til artiklen

Sidens top

 

Google Employees Resign in Protest Against Pentagon Contract

GIZMODO d. 14/5/18 af Kate Conger

It’s been nearly three months since many Google employees—and the public—learned about the company’s decision to provide artificial intelligence to a controversial military pilot program known as Project Maven, which aims to speed up analysis of drone footage by automatically classifying images of objects and people. Now, about a dozen Google employees are resigning in protest over the company’s continued involvement in Maven.

The resigning employees’ frustrations range from particular ethical concerns over the use of artificial intelligence in drone warfare to broader worries about Google’s political decisions—and the erosion of user trust that could result from these actions. Many of them have written accounts of their decisions to leave the company, and their stories have been gathered and shared in an internal document, the contents of which multiple sources have described to Gizmodo.

The employees who are resigning in protest, several of whom discussed their decision to leave with Gizmodo, say that executives have become less transparent with their workforce about controversial business decisions and seem less interested in listening to workers’ objections than they once did. In the case of Maven, Google is helping the Defense Department implement machine learning to classify images gathered by drones. But some employees believe humans, not algorithms, should be responsible for this sensitive and potentially lethal work—and that Google shouldn’t be involved in military work at all.

Historically, Google has promoted an open culture that encourages employees to challenge and debate product decisions. But some employees feel that their leadership is no longer as attentive to their concerns, leaving them to face the fallout. “Over the last couple of months, I’ve been less and less impressed with the response and the way people’s concerns are being treated and listened to,” one employee who resigned said.
“I realized if I can’t recommend people join here, then why am I still here?”

There’s precedent for employee pushback resulting in product changes—in 2015, employees and users successfully challenged Google’s ban on sexually explicit content posted to Blogger. But these are the first known mass resignations at Google in protest against one of the company’s business decisions, and they speak to the strongly felt ethical concerns of the employees who are departing.

In addition to the resignations, nearly 4,000 Google employees have voiced their opposition to Project Maven in an internal petition that asks Google to immediately cancel the contract and institute a policy against taking on future military work.

Læs hele artiklen

 


Bliv medlem af den internationale kampagne
Stop Killer Robots


Statement on the "Emergence of Convergence" to the CCW informal meeting on lethal autonomous weapons systems

Human Rights Watch by Bonnie Docherty, Senior Researcher d. 30/6 - 2021

...Many states parties have recommended prohibiting weapons systems that cannot operate with meaningful human control. Some have also recommended prohibiting systems that use sensor inputs to target humans, or at least have expressed concern about delegating life-and-death decisions to machines...

We therefore recommend states parties:
Continue to articulate their positions on the details of such a response,
Agree at the Sixth Review Conference to a mandate to negotiate a legally binding instrument on autonomous weapons systems, and
Be prepared to look to other forums if the CCW Review Conference fails to take the requisite action.

Læs hele artiklen

 

Klip fra FNs mødekalender:

Meeting 2019

25 - 29 March Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, Group of Governmental Experts on the High Contracting Parties to the CCW on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Third session.

20 - 21 August Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, Group of Governmental Experts on the High Contracting Parties to the CCW on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Fourth session Geneva

11 November Conference of the High Contracting Parties to Protocol V to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, Thirteenth Conference of the High Contracting Parties Geneva

12 November Annual Conference of the High Contracting Parties to Amended Protocol II to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, Twenty-first annual conference Geneva

13 - 15 November Annual meeting of States parties to the Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, Meeting of the High Contracting Parties Geneva

Vores foldere om droner og robotter

8-siders folder ved Thomas

4-siders folder ved Oluf - opdateret i 2017

Vores oplæg i Degerfos d. 20/5 - 2018

Vores oplæg som en lydfil

FNs side om traktaten

Convention on Certain Conventional Weapons. It is also known as the Inhumane Weapons Convention d. 6/16 - 1997 Klik her.

D. 13/4 - 2018: 26 lande er nu for et forbud mod autonome dræberrobotter.

Aug. 27-31 (Geneva): CCW meeting of the Group of Governmental Experts on lethal autonomous weapons systems. se kalender fra Stop Killer Robot

Foto: Rick Marshall, Beware the Robot Apocalypse! A remote-controlled, machine-gun robot was wheeling around Times Square today. Humanity is *soooo* screwed. Link til billede

Ban Lethal Autonomous Weapons

Se deres hjemmeside

 

FN om autonome dræberrobotter:

Background - Lethal Autonomous Weapons Systems

 

2021 CCW Group of Governmental Experts on lethal autonomous weapon systems

The meeting of the Group of Governmental Experts (GGE) on lethal autonomous weapon systems was scheduled to meet for 20 days in 2021:

28 June - 5 July 2021 (postponed due to continued COVID-19 restrictions, but an informal exchange is taking place online in the same period)
3 August - 13 August 2021
27 September to 1 October 2021

Læs mere her

 

Møde i FN om autonome dræberrobotter d. 11/4 - 2016

Næste konference i FN om autonome dræberrobotter d. 12 - 16/12 - 2016

The Fifth CCW Review Conference will take place from 12 to 16 December 2016 in Geneva. It will be presided over by Ambassador Tehmina Janjua of Pakistan. In advance of the Review Conference a Preparatory Committee will be held from 31 August to 2 September 2016 in Geneva.

Sidens top

 

Læs om FN's møde om autonome dræberrobotter d. 11/4 - 2016

Klik her for at læse om det fra kampagnen mod dræberrobotter.

Følgende lande ønsker et forbud mod autonome våbensystemer, i alfabetisk orden:

1. Algeria
2. Bolivia
3. Chile
4. Costa Rica
5. Cuba
6. Ecuador
7. Egypt
8. Ghana
9. Holy See
10. Mexico
11. Nicaragua
12. Pakistan
13. State of Palestine
14. Zimbabwe

Klik her for at læse om det samme fra FN's webside

Red.: Det er ikke længere muligt at finde de lande, der er imod, på FN's side. Det er tydeligt, at man i FN blot debatterer og laver undersøgelser i øjeblikket (red. d. 13/5 - 2018).

Sidens top

FN - møde d. 11 - 15/4 2016

lethal autonomous weapons systems (LAWS)
"The Meeting decided to convene an informal meeting of experts of up to five days during the week of 11 to 15 April 2016 to discuss further the questions related to emerging technologies in the area of lethal autonomous weapons systems (LAWS), in the context of the objectives and purposes of the Convention. The Chairperson of the meeting of experts will submit a report in his personal capacity to the 2016 Fifth Review Conference of High Contracting Parties to the Convention. The meeting of experts may agree by consensus on recommendations for further work for consideration by the 2016 Fifth Review Conference."

se websiden.

 

--------------------------------------------------------------

 

Danmark bør tage “dræberrobotterne” alvorligt

DIIS Artikel, 19. marts 2018

Der er gode grunde til at deltage i FN-møder om autonome våben
Johannes Lang, Robin May Schott & Rens van Munster

I november 2017 mødtes en lang række lande og NGO’er for at diskutere, hvordan det internationale samfund bedst kan regulere udviklingen og anvendelsen af autonome våben. Ifølge eksperter udgør autonome eller selvtænkende våbensystemer, også kendt som ’dræberrobotter’, den tredje revolution i militærhistorien – efter opfindelsen af krudt og atomvåben. Teslas Elon Musk har kaldt kunstig intelligens den største trussel mod menneskeheden, og Vladimir Putin pointerede for nyligt i en tale, at det land, der fører an inden for forskning i kunstig intelligens, også vil komme til at styre verden.

Som et af få europæiske lande valgte Danmark ikke at deltage i FN-mødet om autonome våben sidste år. Det er beklageligt, for Danmark har en klar politisk og militær interesse i at påvirke diskussionen om disse våben. I en ny artikel i Jyllands-Posten argumenterer tre DIIS-forskere for at Danmark burde tage denne udvikling alvorligt og deltage i de kommende FN-møder om ”dræberrobotterne”.

Se artiklen hos DIIS

 

Four reasons why Denmark should speak up about lethal autonomous weapons
Johannes Lang, Robin May Schott & Rens van Munster

DIIS Policy Brief 13. marts 2018

Recommendations
- Denmark will co-finance UN meetings on autonomous weapons and should avoid the scenario of payment without representation.

- Regulation of lethal autonomous weapons is necessary. Denmark should participate in the UN meetings in order to prevent other countries from dictating the rules.

- Denmark should use the meetings to raise concerns about a new technological arms race.

- Denmark should seize this opportunity to show normative leadership, joining calls for improved weapons reviews.

...Denmark’s closest allies are keen on developing and using autonomous weapons, while the Danish military has acquired high-tech aircraft able to interact with such weapons. The United States is leading the way, investing heavily in robotics, electronic warfare, artificial intelligence and autonomous technologies. Early last year US military airplanes released a swarm of small drones, which successfully completed a series of simple missions and returned to base without human intervention. Other great powers, including Russia and China, have also begun to develop autonomous weapons. Considering the technical capacities and strategic ambitions of Denmark’s closest allies, as well as its possible future adversaries, the Danish military will eventually face autonomous weapons on the battlefield. Without clear international norms, states may feel little obligation to live up to humanitarian imperatives....

...In an open letter to the UN, tech entrepreneur Elon Musk and more than 100 leaders in the fields of artificial intelligence and robotics stated that: “Lethal autonomous weapons threaten to become the third revolution in warfare. Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend.”...

Læs hele artiklen

 

Autonomous Weapons: an Open Letter from AI & Robotics Researchers

underskriftsindsamling.

D. 13/5 - 2018: To date, the open letter has been signed by 3978 AI/Robotics researchers and 22541 others. The list of signatories:

Klik her

 

Report examines the issue of human control with regard to lethal autonomy

Rapporten er lavet af CNA,

Wikipedia: CNA, formerly known as the CNA Corporation, is a nonprofit research and analysis organization based in Arlington, VA. CNA has around 625 employees. Læs mere her.

Det ser ud til, at der er hentet folk fra militæret til CNA.

Se deres rapport

 

 

Dræberrobotterne kommer

- International debat

Johannes Lang, DIIS | Robin May Schott, DIIS | Rens van Munster, DIIS

Jyllandsposten d. 18/3 - 2018

Under den Star Wars-inspirerede overskrift ”The Force Awakens” diskuterede deltagerne på den årlige sikkerhedskonference i München i februar konsekvenserne af kunstig intelligens for international sikkerhed og strategi. Ifølge førende eksperter på området udgør autonome eller selvtænkende våbensystemer, også kendt som ”dræberrobotter”, den tredje revolution i militærhistorien – efter opfindelsen af krudt og atomvåben. Teslas Elon Musk har kaldt kunstig intelligens den største trussel mod menneskeheden. Og Vladimir Putin pointerede i en nylig tale, at det land, der fører an inden for forskning i kunstig intelligens, også vil komme til at styre verden.

Det var på denne baggrund, at en lang række lande og ngo’er mødtes i november i FN-regi for at diskutere, hvordan det internationale samfund bedst kan regulere udviklingen og brugen af de såkaldte Laws (Lethal Autonomous Weapon Systems). Danmark har – som et af få europæiske lande – valgt ikke at deltage i FN-mødet, og det kan undre. Danmark har nemlig en klar politisk og militær interesse i at påvirke diskussionen om dræberrobotter...

Læs hele artiklen

 

Artificial Intelligence: Autonomous Technology (AT), Lethal Autonomous Weapons Systems (LAWS) and Peace Time Threats
By Regina Surber, Scientific Advisor, ICT4Peace Foundation and the Zurich Hub for Ethics and Technology (ZHET)

Copyright 2018,

ICT4Peace Foundation

Zurich, 21 February 2018

1 Introduction

The main purpose of this paper is to inform the international community on the risks of Autonomous Technology (AT) for global society. AT can be said to be the essence of Lethal Autonomous Weapons Systems (LAWS), which have triggered a legal and policy debate within the international arms control framework of the United Nations Convention on Certain Conventional Weapons (UN CCW) that is now entering its fifth year. Since LAWS highly challenge existing International Humanitarian Law (IHL) due to their capacity of replacing a human operator on a weapons platform, the CCW’s tasks of, i.a., ensuring that the concepts of legal accountability and human responsibility do not become void, and assessing whether LAWS are legal under IHL, are of utmost importance.

However, LAWS are not the only manifestation of the security risks of AT. This paper will demonstrate further ways of the actual and potential weaponization of AT that are currently not yet fully addressed by the UN organizations. Moreover, AT not only poses risks to global society if weaponized, but can pose tremendous systemic risks to global society and humanity also when not weaponized. This potentially dangerous transformative power of AT, which is beyond the scope of the CCW’s mandate, will be the thematic core of this paper. Based on a risk assessment of not-weaponized AT, the paper will present thought-provoking impulses that can shape an international interdisciplinary debate on the risks of AT specifically and of emerging technologies more generally. In addition, this paper highlights risks underlying the application of terms originally referring to human traits to technological artefacts, such as ‘intelligence’, ‘autonomy’, ‘decision-making capacity’ or ‘agent’. It will argue that this unquestioned so-called ‘anthropomorphism’ leads to a premature overvaluation of technology and a simultaneous potential devaluation of human beings, and will present ideas for linguistic substitutes.

At the same time, the paper will illustrate that the ‘classical’ understanding of ‘autonomy’ as human ‘personal autonomy’ has, in fact, donated its meaning to the current technological use of the term. However, this fact risks to be obfuscated by the broadening pool of diverse definitions and understandings of ‘autonomy’ for technological artefacts. Thereby, the paper will unearth the current paradigm shift in human technological creation and self-understanding that underlies the ongoing debate on AT and LAWS: the fact that humans are creating technological artefacts that may lose their instrumental character because we gradually give away control and responsibility for the outcomes of their usage. Locating the core challenge of AT, AI and any emerging technology in this still subtle but pervasive change in the understanding of the human-technology relationship, this paper will also provide conclusions and recommendations that are of a more general and long-term character. The paper will be structured as follows: Chapters 2 and 3 will describe the current understandings, uses as well as the risks of those uses of the terms ‘Artificial Intelligence’ (AI) and ‘Autonomous Technology’ (AT). Chapter 4 will introduce the term ‘Lethal Autonomous Weapons System’ (LAWS), which will lead over to Chapter 5 on the international discussions within the UN CCW and this UN debate’s limitations. Chapter 6 will present further ways of weaponizing AT that are ignored by the UN CCW, yet need immediate attention. Chapter 7 shows threats of AT for global society during peace-time. Chapter 8 presents three arguments for shaping an international debate on AT, AI and LAWS. Chapter 9 concludes and presents a list of recommendations. Eleven lists of principles for ethical/responsible research on AI, AT and Robotics can be found in the annex.

Læs hele rapporten

 

Failure to define killer robots means failure to regulate them
States disagree on definition of lethal autonomous weapons

DIIS Policy Brief d. 2/2 2018 by Johannes Lang, Rens van Munster & Robin May Schott

SPEAR diving. There is international disagreement whether a weapon like the UK SPEAR missile, with its capacity to select and engage targets within a defined area without human intervention, might qualify as an autonomous weapon.

Recommendations
States should prioritize reaching agreement on how to define LAWS. Without agreement, international regulation risks falling behind technological developments.

A definition of autonomous weapons that focuses on the autonomy of critical functions such as target selection and firing is most likely to succeed within the context of the CCW.

The CCW focuses on the compliance of LAWS with international humanitarian law. However, international regulation should also address the broader effects LAWS have on military competition.

Disagreements on how to define “autonomy” are stalling formal UN discussions on the compliance of autonomous weapons with international humanitarian law. A pragmatic approach that focuses on the weapon’s critical functions, such as target selection and firing, can help move discussions forward in the future.

Læs hele artiklen

Sidens top

 

'Stop Killer Robots'-kampagne: Danmark var savnet ved mødet om 'dræber­robotter'

Danmark bør aktivt begynde at deltage og præcisere, hvor det sætter grænsen for autonomi i våbensystemer.

Politiken, debatindlæg d. 5. dec. 2017 kl. 17.50

...Efter fire års drøftelser af problemet i regi af FN’s konvention om konventionelle våben (CCW) er der udbredt skuffelse over det uambitiøse tempo og manglen på et klart mål for at tackle denne udfordring.

De 10 dage, der er afsat næste år til at drøfte spørgsmålet om dræberrobotter, er ikke tilstrækkelige, når lande med et højteknologisk militær investerer betydelige midler i udvikling af våben med stadig mindre menneskelig kontrol.

Hvis det ikke kontrolleres, vil der være en stigende risiko for et globalt våbenkapløb med katastrofale konsekvenser, der vil destabilisere sikkerheden og true menneskehedens overlevelse...

Læs hele artiklen

 

Dræberrobotter er lige om hjørnet. Her er, hvorfor du skal være bekymret

Zetland d. 5/12 - 2017

Du har måske ikke opdaget det, men menneskeheden står over for et eksistentielt valg.

Det handler om, hvem der skal have lov at slå os ihjel.

Vil vi tillade dunkle algoritmer at beslutte, hvem der skal leve og dø, eller vil vi selv beholde kontrollen over den slags valg?

Måske lyder det som science fiction, men det haster faktisk med at finde på et svar, for den teknologiske udvikling går rivende stærkt.

Lige om lidt står vi ansigt til ansigt med selvtænkende, dødbringende robotter, der både kan udvælge mål, bevæge sig hen til dem og dræbe med stor præcision – helt på egen hånd og uden nogen menneskelig indblanding undervejs.

De første, diplomatiske skridt på vejen mod en mulig regulering af teknologien blev taget i FN-regi midt i november, men det går mildt sagt langsomt, og spørgsmålet er, om diplomaterne kan blive enige om noget som helst, før udviklingen overhaler dem.

Og hvad der så vil ske, er et åbent spørgsmål.

Læs hele artiklen på Zetland

Sidens top

 

Slaughterbots

Stop Autonomous Weapons har lavet en video om emnet d. 12/11 - 2017.

Klik her.

Sidens top

FN-forbundet advarer om våbenkapløb

Lykkes det ikke at få vedtaget et globalt forbud mod autonome dræberrobotter, kan det forventes, at der kommer et nyt våbenkapløb

Næste skridt i våbenudviklingen, efter de ubemandede og stærkt omdiskuterede dronefly, er såkaldte dræberrobotter. Det er avancerede våben, der er udstyret med en så udviklet grad af kunstig intelligens, at de kan handle autonomt i krig uden at være styret af menneskehænder.

Det kan f.eks. være svært at placere et ansvar, når en dræberrobot slår et menneske ihjel. Er det ejeren, programmøren, hackeren eller den stat, den kommer fra, der har ansvaret?

I øjeblikket drøftes der i FN et forbud mod "Lethal Autonomous Weapons Systems", men Danmark er ikke repræsenteret i disse forhandlinger.

FN-forbundet opfordrer Danmarks regering til at støtte op om et globalt forbud og at sætte ressourcer af til dansk deltagelse i forhandlingerne inden for FN. Der er indtil videre 14 lande, der har underskrevet et krav om et forbud.

Hvis det ikke lykkes at få vedtaget et globalt forbud, kan det forventes, at der kommer et nyt våbenkapløb, som vil skabe en meget mere usikker verden.

Det er ikke i Danmarks eller andre landes interesse, at der indledes et nyt våbenkapløb. Det er på tide, at Danmark tager dette spørgsmål alvorligt.

Udtalelsen er vedtaget på FN-forbundets bestyrelsesmøde 10. september 2017

Link til udtalelsen

 

Elon Musk: Kampen om kunstig intelligens kan starte Tredje Verdenskrig

Manden bag Tesla frygter, at supermagternes våbenkapløb om kunstig intelligens kan ende med en ny verdenskrig.

Politiken Tech 5/9 - 2017

...»Måske bliver det ikke initieret af statsledere, men af et AI-produkt (kunstig intelligens), der beslutter, at et forebyggende angreb er den mest sandsynlige vej til sejr«, skriver Musk i et svar til en Twitter-bruger.

Med kunstig intelligens bliver vores teknologi efterhånden så reflekterende, at robotterne selv kan beslutte sig for den handling, der forekommer dem at være den hurtigste vej til sejr. Og når de tænker langt hurtigere end mennesker, frygter manden bag Tesla-koncernen altså, at deres handlinger kan løbe løbsk og forårsage krig i stor skala...

...Finansverdenen kender konsekvenserne af automatisering

I finansmarkedets verden af køb og salg kender man til både potentialet og risikoen ved at lade computere tage over. Her har det længe været kendt, at automatiseret handel, hvor computere handler selv på millisekunder, kan få markedet til at løbe løbsk og få en aktie til pludselig at styrte i værdi.

Af den grund arbejder man i finansmarkedet konstant med muligheder for også automatisk at lukke ned for alle salg, indtil mennesker kan tage stilling til de informationer, der har sat aktiviteterne i gang.
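Princippet bag en sådan automatisk nedlukning kan skitseres groft i kode. Nedenstående er et rent illustrativt eksempel og ikke en beskrivelse af virkelige børssystemer; klassenavn, metoder og tærskelværdi er antagelser:

```python
# Illustrativ skitse af en "circuit breaker": automatisk handel stoppes,
# når kursen falder for hurtigt, og kun et menneske kan genåbne den.
# Alle navne og tærskler her er opfundet til eksemplet.

class CircuitBreaker:
    def __init__(self, max_drop_pct: float = 7.0):
        self.max_drop_pct = max_drop_pct  # maksimalt tilladt kursfald i procent
        self.halted = False               # True = al automatisk handel er stoppet

    def on_price(self, reference: float, current: float) -> bool:
        """Returnerer True, hvis der fortsat må handles automatisk."""
        drop_pct = (reference - current) / reference * 100
        if drop_pct >= self.max_drop_pct:
            self.halted = True  # stop al handel, indtil et menneske griber ind
        return not self.halted

    def human_resume(self) -> None:
        """Kun en menneskelig beslutning genåbner for automatisk handel."""
        self.halted = False


breaker = CircuitBreaker(max_drop_pct=7.0)
assert breaker.on_price(reference=100.0, current=98.0)      # fald på 2 %: handel fortsætter
assert not breaker.on_price(reference=100.0, current=90.0)  # fald på 10 %: handel stoppes
breaker.human_resume()                                      # menneskelig stillingtagen
assert breaker.on_price(reference=100.0, current=99.0)      # handel genoptaget
```

Pointen er, at systemet selv kan stoppe, men kun et menneske kan genåbne det. Det er netop den type sikkerhedsmekanisme, artiklen påpeger mangler for våben med kunstig intelligens.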

Sådanne sikkerhedsmekanismer har man ikke for våben, der bruger kunstig intelligens. Fordi nationalstater ikke er bundet af normale juridiske regler, kan de helt uigennemsigtigt udvikle våben med kunstig intelligens, de ikke selv kan håndtere, eller ved magt overtage teknologi, private virksomheder har udviklet til andre formål...

Læs hele artiklen

 

Elon Musk leads 116 experts calling for outright ban of killer robots

theguardian d. 20/8 - 2017

Open letter signed by Tesla chief and Alphabet’s Mustafa Suleyman urges UN to block use of lethal autonomous weapons to prevent third age of war

Some of the world’s leading robotics and artificial intelligence pioneers are calling on the United Nations to ban the development and use of killer robots.

Tesla’s Elon Musk and Alphabet’s Mustafa Suleyman are leading a group of 116 specialists from across 26 countries who are calling for the ban on autonomous weapons.

Læs hele artiklen.

Sidens top

Robotter opfandt eget sprog og talte henover hovedet på forskerne. Er ny lovgivning måske nødvendig?

Nyhedscentrum d. 7/8 - 2017

Facebook valgte at lukke et eksperiment ned, hvor to kunstige intelligenser var sat til at kommunikere med hinanden. Efter et stykke tid begyndte de to robotter at opfinde deres eget sprog, så udviklerne ikke kunne følge med. Fænomenet er ikke ukendt inden for AI (Artificial Intelligence), hvor robotter hurtigt skærer ind til benet og fjerner al unødig sætningskonstruktion i en samtale, i dette tilfælde helt uforståeligt for andre end den kunstige intelligens.

Læs hele artiklen.

 

 

Esbjerg fredsbevægelse har henvendt sig til forsvarspolitiske ordfører i alle partier ang. autonome dræberrobotter:

Esbjerg d.06.11.2016
Kære xxx

Der arbejdes i disse tider på udvikling af fuldt autonome robotter til brug som våben på
slagmarken. Disse robotter vil blive udviklet, så de selv kan operere og træffe vigtige
afgørelser i væbnede konflikter, og afgøre spørgsmål om liv eller død.

Vi i Esbjerg Fredsbevægelse er meget bekymrede for udviklingen og forholder os meget
skeptisk over for at skulle overlade afgørelsen om, hvorvidt et mål skal tilintetgøres eller ej,
til en robot, der agerer uden indblanding fra et menneske.
Vi støtter op om Stop Killer Robots-kampagnen, der arbejder for et forbud mod autonome
robotter.

Der er indtil videre 14 lande, der har underskrevet og givet deres støtte til et forbud, og vi i
Esbjerg Fredsbevægelse håber, at Danmark vil tilslutte sig de 14 lande, der støtter op om et
forbud mod brug af fuldt autonome robotter.

Der vil fra d. 12. til d. 16. december blive afholdt en konference i Genève, hvor det vil blive
drøftet, hvad der skal gøres for at forhindre, at en menneskelig pilot fjernes fra ligningen i
brugen af armerede robotter.

Vi retter denne henvendelse i håbet om at få vished om jeres partis standpunkt i spørgsmålet
om udvikling og brug af autonome dræberrobotter. Mener I, at Danmark skal deltage i
konferencen i Genève, og vil I arbejde for et forbud?

Vi vedhæfter en pjece om autonome dræberrobotter, som vi har udarbejdet. Samtidig vil vi
opfordre til at søge information om Stop Killer Robots Campaign på deres hjemmeside

Eller information om mødet i Geneve her
og her

Kilder til folderen:
Youtube:
The Daily conversation : Future Military Robots
Motherboard: The Dawn Of Killer Robots
Advexon TV: (DARPA) Most Advanced Weapons
Doc Doc Ang: DARPA's Killing US Army Weapon Documentary
Educational Documentary: Secret Most Advance US Robots Army - Trillion Dollars Defense (Full Documentary)
Control of Mobile Robots: The US Army Top Secret / America Building Robots Army For Future (Full Documentary)
Andre kilder
Defense One Web magasin
Artikel af Patrick Tucker: The Pentagon is nervous about Russian and Chinese Killer robots
Boston Dynamics Hjemmeside bostondynamics.com
New Scientist Web magasin
Artikel af David Hambling: Armed Russian robocops to defend missile bases

Med venlig hilsen
Thomas Bækdal
Esbjerg Fredsbevægelse

Folder

 

Studier:

The International-Law Dimension of Autonomous Weapons Systems

Friedrich-Ebert-Stiftung, Study, June 2015

About the author
Robin Geiß is Professor of International Law and Security at the University of Glasgow. Previously he was Professor of International and European Law at the University of Potsdam. Prior to that, he had worked as Legal Adviser to the International Committee of the Red Cross (ICRC) in Geneva. He is the author of Failed States (Duncker & Humblot 2005) and co-author of Piracy and Armed Robbery at Sea (OUP 2011).
This publication is the English translation of the initial study Die völkerrechtliche Dimension autonomer Waffensysteme from June 2015

Klik her for at hente den fra peaceweb.dk

Eller hent den fra kilden

 

Dræber robotter og droner

Vores drone folder

Klik her

Sidens top

 


 

Den beskidte dronekrig

Information, leder, d. 19/10 - 2015

…Alle dræbte, som er mænd i den kampdygtige alder, bliver automatisk rubriceret som Enemy Killed in Action (EKIA) og med mindre andre beviser dokumenterer det modsatte, bliver det altså registreret som et legitimt mål…

…Kilden til oplysningerne er foreløbig anonym og motiverer selv lækagen på The Intercept:

»Denne oprørende stigning i at sætte folk på watchlister – overvåge mennesker, stable og sætte dem på lister, tildele dem numre, tildele dem ’baseball-kort’, tildele dem dødsdomme uden varsel på en verdensomspændende kampplads – det var helt fra starten forkert.«

Noget som de dokumenter, han nu har lagt frem i offentligheden, ser ud til at bakke op om…

Link til siden

Sidens top

 

Ny whistleblower lækker tophemmelige dokumenter om amerikansk droneprogram

DR d. 15/10 - 2015

…I nogle af de utallige dokumenter, der nu har set dagens lys, afsløres der angiveligt detaljer om en amerikansk operation kaldet ’Haymaker’ i det nordøstlige Afghanistan. Dokumenterne viser angiveligt, at op mod 90 procent af de drab, som dronerne foretog i en periode på næsten fem måneder, var på personer, som ikke var det tilsigtede mål…

Læs hele artiklen

Sidens top

 

The Drone Papers

The Intercept - onlinemedie - d. 15/10 - 2015

The Intercept has obtained a cache of secret documents detailing the inner workings of the U.S. military’s assassination program in Afghanistan, Yemen, and Somalia. The documents, provided by a whistleblower, offer an unprecedented glimpse into Obama’s drone wars.

Her er 10 artikler om dronepapirene

 


Drapsroboter: Fremtidens krigføring?

Norges Fredslag d. 6/10 - 2015

Nå er det bare én uke igjen til Fredslaget arrangerer seminar om drapsroboter på Fredshuset! Gerald Folkvord (Amnesty), Erik Reichborn-Kjennerud (NUPI) og Anders Kofod-Petersen (Alexandra Instituttet) kommer for å diskutere problemstillinger knyttet til menneskerettigheter, hvordan autonome våpen påvirker vårt syn på krig, og begrensninger ved autonome systemer. I dag har Fredslaget også lansert publikasjonen Kampen mot drapsrobotene: 5 argumenter for et internasjonalt forbud.

læs hele artiklen

Læs skriftet om dræberrobotter fra Norges Fredslag

Sidens top

 

Ensuring Human Control Over Military Robotics

Institute for Ethics and Emerging Technologies, IEET
By Wendell Wallach, New York Times d. 29/8 - 2015

...One robotic device accidentally starting a war, that would not have otherwise occurred, is sufficient to wipe out any benefits accrued. One expansionist political leader, willing to start a new war because he believes he could do so without loss of his own troops, puts a lie to any contention that in the long run robotic weapons will save lives...

...More important, there has been a successful international campaign to ban the use of indiscriminate weapons such as land mines and cluster bombs. One hundred and sixty-one countries have signed the Ottawa Treaty, which bans the use, stockpiling and manufacturing of anti-personnel mines. Creating the illusion that indiscriminate weapons could be made more acceptable through computerized enhancements would be a major setback to limiting atrocities during warfare...

...Let us stop looking at the challenges posed by the robotization of warfare piecemeal, and begin to reflect comprehensively upon the manner in which autonomous weapons alter the future conduct of war...

Læs hele artiklen

Sidens top

 

Foto: Steve Rhodes, Drones protest at General Atomics in San Diego, link til billede

 

 

Retired General: Drones Create More Terrorists Than They Kill, Iraq War Helped Create ISIS

The Intercept d. 16/7 - 2015

Retired Army Gen. Mike Flynn, a top intelligence official in the post-9/11 wars in Iraq and Afghanistan, says in a forthcoming interview on Al Jazeera English that the drone war is creating more terrorists than it is killing. He also asserts that the U.S. invasion of Iraq helped create the Islamic State and that U.S. soldiers involved in torturing detainees need to be held legally accountable for their actions.

Læs hele artiklen

Sidens top

 

Retired US general: Drones cause more damage than good

Aljazeera d. 16/7 - 2015

…Asked by Al Jazeera English's Mehdi Hasan if drone strikes tend to create more terrorists than they kill, Flynn, who has been described by Wired magazine as "the real father of the modern JSOC", replied: "I don't disagree with that", adding: "I think as an overarching strategy, it is a failed strategy."

"What we have is this continued investment in conflict," the retired general said. "The more weapons we give, the more bombs we drop, that just… fuels the conflict. Some of that has to be done but I am looking for the other solutions."

Commenting on the rise of ISIL in Iraq, Flynn acknowledged the role played by the US invasion and occupation of Iraq. "We definitely put fuel on a fire," he told Hasan. "Absolutely… there is no doubt, history will not be kind to the decisions that were made certainly in 2003."

"Going into Iraq, definitely… it was a strategic mistake," said Flynn on Head to Head…

Læs hele artiklen

Sidens top

 

Stephen Hawking og 1.000 andre forskere vil forbyde intelligente krigsrobotter

Videnskab.dk 28/7 - 2015

Kunstigt intelligente ‘dræber-robotter’ bør forbydes, skriver Stephen Hawking sammen med mere end 1.000 forskere i robotteknologi og kunstig intelligens i et åbent brev…

…»Vi er nu der, hvor vi er tvunget til at tage stilling til de etiske, eksistentielle og politiske spørgsmål, som vi har diskuteret teoretisk i mange år. Det varer ikke længe, før der kører selvkørende biler rundt i Danmark, og algoritmerne har længe styret aktiemarkedet. Kunstig intelligens har så stor indflydelse, at vi ikke bare kan udvikle systemerne uden at forholde os til deres konsekvenser. Erklæringen afspejler en bekymring, som er meget stærk i forskningsmiljøet,« siger Henrik Schärfe til Jyllands-Posten…

… Dræber-robotter kan føre til mere krig i verden - og en potentiel dommedag

På Future of Life Institute er de ikke bange for at bruge store ord om truslen fra robotterne, og på deres hjemmeside kan man blandt andet læse, »Teknologi har givet liv muligheden for at blomstre som aldrig før... eller til at selvdestruere.«…

Læs hele artiklen

 

Killer robots: The soldiers that never sleep

In the city of Daejeon, South Korea, an arms manufacturer has designed and built a gun turret that’s able to identify, track and shoot targets, theoretically without the need for human mediation. Who will teach these robot soldiers the rules of engagement?

BBC by Simon Parkin d. 16/7 - 2015

…Regardless of what’s possible in the future, automated machine guns capable of finding, tracking, warning and eliminating human targets, absent of any human interaction already exist in our world. Without clear international regulations, the only thing holding arms makers back from selling such machines appears to be the conscience, not of the engineer or the robot, but of the clients. “If someone came to us wanting a turret that did not have the current safeguards we would, of course, advise them otherwise, and highlight the potential issues,” says Park. “But they will ultimately decide what they want. And we develop to customer specification.”..

Læs hele artiklen

 

 

 

Killer robots are 'quickly moving toward reality' and humanity only has a YEAR to ban them, expert warns

Mail Online d. 17/6 - 2016

Robots doing the fighting would keep soldiers and officers out of harm's way

But experts say the threats to humanity would outweigh any benefits

Risk of harm or erroneous targeting of civilians would increase

Should start process on lethal autonomous weapons systems in 2017

New technology could lead humans to relinquish control over decisions to use lethal force.

As artificial intelligence advances, the possibility that machines could independently select and fire on targets is fast approaching.

Fully autonomous weapons, also known as 'killer robots,' are quickly moving from the realm of science fiction toward reality…

Læs hele artiklen

Sidens top

 

Nyt fra Campaign to Stop Killer Robots

Campaign to Stop Killer Robots d. 13/11 - 2015

Nations agreed today (November 13) to hold another week-long diplomatic meeting on 11-15 April 2016 to continue their deliberations on questions relating to lethal autonomous weapons systems, which are weapons that would select and attack targets without further human intervention. The Campaign to Stop Killer Robots sees this decision as positive in that talks on killer robots will continue […]

Campaign to Stop Killer Robots d. 27/10 - 2015

More states have raised autonomous weapons concerns at the UN General Assembly First Committee on Disarmament and International Security this year than in the past two years, according to a Campaign to Stop Killer Robots review of statements from the 2015 session, which concludes on 9 November. More than 30 states and five groups of states have included autonomous weapons […]

Læs mere her

 

Campaign to Stop Killer Robots
Offentliggjort den 9. jun. 2016

We shot this 3:22 film at the third Convention on Conventional Weapons or "CCW" meeting on lethal autonomous weapons systems at the United Nations in Geneva on 11-12 April 2016. It contains remarks by Michael Møller, director-general of the United Nations Office at Geneva, where the CCW and Human Rights Council meet. It features Pakistan's disarmament representative in Geneva, Ambassador Tehmina Janjua, who will preside over the CCW's Fifth Review Conference in December 2016. The film includes extracts from statements delivered by three campaign representatives with suggestions for the way ahead at these diplomatic talks on killer robots. Shot by Sharron Ward and edited by Sharron Ward and Andrew Labens for the Campaign to Stop Killer Robots.

se video klik her

 

How can you stop killer robots | Toby Walsh | TEDxBerlin

Offentliggjort den 8. okt. 2015

Se hans video på Youtube - klik her

 

Kunstig intelligens kan tage magten

JP d. 19/7 - 2015

Verdens førende specialister i kunstig intelligens har underskrevet en erklæring om at arbejde for, at kunstig intelligens skal gavne menneskeheden. Der er frygt for, at systemerne løber løbsk og til sidst udsletter os alle…

…I erklæringen fra Future of Life Institute, der arbejder for at afbøde eksistentielle risici, der truer menneskeheden, står der, at det er vigtigt at undersøge, hvordan man opnår de enorme fordele ved kunstig intelligens, samtidig med at man undgår faldgruberne.

Det er en af verdens førende forskere i kunstig intelligens, Stuart Russell fra University of California, Berkeley, der har skrevet udkastet til erklæringen.

Han mener, at vi skal beskytte os mod den kunstige intelligens’ negative sider.

»De potentielle trusler ved kunstig intelligens er autonome våben, der selv kan beslutte at dræbe, og en total omvæltning af arbejdsmarkedet og økonomien. Endelig er der på langt sigt en eksistentiel trussel i form af superintelligente systemer. Vi bliver nødt til at forbyde autonome våben, finde en økonomisk model, der forhindrer stor arbejdsløshed og endnu større ulighed, og sørge for kontrol med, at superintelligente systemer handler ud fra menneskelige værdier,« siger Stuart Russell….

Læs hele artiklen

Sidens top

 

 

Autonomous Weapons Systems & the Role of Law
The University of Adelaide - Faculty of the Professions
Offentliggjort den 7. maj 2015

The development of autonomous weapons systems continues unabated. In its broadest sense, these are systems that can operate outside of direct human control and can independently dispense lethal force in the battlespace based on internal programming. While not yet operational, it seems only a matter of time before such weapons can be deployed. While this weapon system, like all others, must comply with the law of armed conflict there seems something troubling about this prospect. There is growing advocacy asserting that these weapons can comply with the law, and even that they may deliver more humanitarian outcomes in their dispensation of violence. Such advocacy assumes much about the normativity of the law. While moral and/or ethical commitments are not overtly contained within the black letter rules, such considerations are understood to guide effective judgment when making targeting decisions in armed conflict. It seems an unarticulated assumption within the vast body of International Humanitarian Law that qualities of emotion and cognition do and should guide decisions about violence in armed conflict and should temper such violence. Such human qualities are naturally elusive in the context of autonomous systems, hence making ‘compliance’ with the law problematic. This presentation will canvass these issues and will interrogate how optimal judgment should be exercised in the battlespace by using the development of autonomous weapons systems as a useful case study.

se video

 

 

The Guardian view on robots as weapons: the human factor

The Guardian d. 13/4 - 2015

The future is already here, said William Gibson. It’s just not evenly distributed. One area where this is obviously true is the field of lethal autonomous weapon systems, as they are known to specialists – killer robots to the rest of us. Such machines could roam a battlefield, on the ground or in the air, picking their own targets and then shredding them with cannon fire, or blowing them up with missiles, without any human intervention. And if they were not deployed on a battlefield, they could turn wherever they were in fact deployed into a battlefield, or a place of slaughter…

…Although the slope to killer robots is a slippery one, there is one point we have not reached. No one has yet built weapons systems sufficiently complex that they make their own decisions about when they should be deployed. This may never happen, but it would be unwise to bet that way. In the financial markets we already see the use of autonomous computer programs whose speed and power can overwhelm a whole economy in minutes. The markets, in that sense, are already amoral. Robots may be autonomous, but they cannot be morally responsible as humans must be. The ambition to control them is as profoundly human as it is right.

Læs mere her.

 

Campaign to Stop Killer Robots

Se deres hjemmeside.

Se videoer fra Campaign to Stop Killer Robots -Youtube

Nedenstående er delvist taget fra deres hjemmeside og delvist omskrevet.

Gennem det seneste årti har brugen af ubemandede bevæbnede køretøjer dramatisk ændret mulighederne for krigsførelse og skabt humanitære og juridiske problemer. Hurtige teknologiske fremskridt resulterer i bestræbelser på at udvikle fuldt autonome våben. Disse robotvåben ville være i stand til at vælge og skyde på mål på egen hånd, uden menneskelig indgriben. Denne evne vil grundlæggende ændre forholdet til voldsudøvelse. Det bliver svært at beskytte civile og håndhæve overholdelsen af internationale menneskerettigheder og den humanitære folkeret. USA overtræder i øjeblikket mange internationale love med droneangreb i Pakistan.

Flere nationer med højteknologiske militær, herunder USA, Kina, Israel, Rusland og Det Forenede Kongerige, er på vej med systemer, der vil give maskiner større autonomi. Hvis en eller flere vælger at implementere fuldt autonome våben, et stort skridt videre end fjernstyrede bevæbnede droner, kan andre føle sig tvunget til at opgive politisk tilbageholdenhed, hvilket kan føre til et robotvåbenkapløb.

Der er behov for en aftale for at etablere kontrol med disse våben, før investeringer, teknologisk udvikling og nye militære doktriner gør det vanskeligt at ændre kurs. Der er allerede udviklet droner, der arbejder i grupper/sværme og kommunikerer indbyrdes uden menneskelig indblanding.

At tillade, at beslutninger om liv eller død foretages af maskiner, krydser en fundamental moralsk grænse. Autonome robotter ville mangle menneskelig dømmekraft og evnen til at forstå sammenhænge. Fuldt autonome våben vil ikke kunne opfylde kravene i krigens love.

Udskiftning af menneskelige tropper med maskiner ville kunne gøre beslutningen om at gå i krig lettere.

Brugen af fuldt autonome våben ville skabe et ansvarlighedshul, da der ikke er klarhed om, hvem der ville være juridisk ansvarlig for en robots handlinger: den øverstbefalende, programmøren, fabrikanten eller robotten selv?

Uden ansvarlighed kan man frygte, at nogle vil bruge disse robotter til udryddelse af modstandere i andre lande og politiske modstandere i eget land.

Sidens top

 

The Campaign to Stop Killer Robots

Over the past decade, the expanded use of unmanned armed vehicles has dramatically changed warfare, bringing new humanitarian and legal challenges. Now rapid advances in technology are resulting in efforts to develop fully autonomous weapons. These robotic weapons would be able to choose and fire on targets on their own, without any human intervention. The Problem describes numerous ethical, legal, moral, policy, technical, and other concerns with fully autonomous weapons.

Giving machines the power to decide who lives and dies on the battlefield is an unacceptable application of technology. Human control of any combat robot is essential to ensuring both humanitarian protection and effective legal control. A comprehensive, pre-emptive prohibition on fully autonomous weapons is urgently needed. The Solution outlines how a ban could be achieved through an international treaty, as well as through national laws and other measures.

In recent years, the benefits and dangers of fully autonomous weapons have been hotly debated by a relatively small community of specialists, including military personnel, scientists, roboticists, ethicists, philosophers, and lawyers. They have evaluated autonomous weapons from a range of perspectives, including military utility, cost, policy, and the ethics of delegating life-and-death decisions to a machine. Our Bibliography provides a list of recent publications about this challenge, while Statements contains documents issued by the Campaign to Stop Killer Robots.

Læs mere her

Sidens top

 

Press Conference by the Convention on Certain Conventional Weapons

UNOG - The United Nations Office at Geneva d. 16/5 - 2014

Meeting of Experts on Lethal Autonomous Weapons
Ambassador Jean-Hugues Simon-Michel of France, Chair of the Meeting of Experts

Læs mere her

Sidens top

 

Det er af mange grunde en forfærdelig tanke at indføre autonome våben.

Dette skal standses:

"Autonome våpensystemer har også blitt diskutert i FNs konvensjon for inhumane våpen ved to anledninger i løpet av de siste to årene. I november i år skal konvensjonens statsparter, inkludert Norge, møtes på nytt i Genève for å avgjøre hvordan verden skal forholde seg til disse våpnene. Skal vi innføre et internasjonalt forbud mot utvikling og bruk av autonome våpensystemer? Eller skal vi bare fortsette med uformelle møter for å belyse mulige problemstillinger knyttet til disse våpentypene?"

Link til ovenstående

Se også

Killer Robots diskutert i FN: På vei mot et forbud mot drapsroboter?

Se kampagnen mod disse våben

Sidens top

 

USA ønsker at dominere i rummet og på missilområdet:

Command Goal
To provide dominant space and missile defense capabilities for the Army and to plan for and integrate those capabilities in support of U.S. Strategic Command (USSTRATCOM) and Geographic Combatant Commanders (GCC) missions.

USA's mål er at skaffe dominerende rum- og missilforsvarskapabiliteter og at bruge disse kapabiliteter i missioner til støtte for U.S. Strategic Command og Geographic Combatant Commanders!

Se kilden

 

Space Domination: Pyramids to the Heavens

By Bruce K. Gagnon, Global Network Against Weapons and Nuclear Power in Space

…The Pentagon is so sure that whomever controls space will control the Earth and beyond that they are feverishly working to deploy anti-satellite weapons (ASAT’s) that will enable the U.S. to knock out competitors "eyes in the sky" during times of hostilities.

As the Space Command says in their slick brochure Vision for 2020, "Control of space is the ability to assure access to space, freedom of operations within the space medium, and an ability to deny others the use of space if required." …

…We are now building pyramids to the heavens and the aerospace industry know that they must convince the public that their "plans for space" are vital, exciting, and patriotic. The time has come for a rigorous international debate and campaign around the entire space program. Won’t you please join with us?…

Læs hele artiklen

 


Most people want fully autonomous weapons banned: UBC survey

Public opinion is against the use of autonomous weapons capable of identifying and destroying targets without human input, according to a new survey by researchers at the University of British Columbia.

More than eight out of every 10 individuals surveyed said such robots should not be used for aggression, and 67 per cent said they should be banned across the planet.

More than a thousand people from 54 countries, including the United States, Canada, South Korea, Mexico and the U.K. answered the survey. It was conducted by the Open Roboethics initiative (ORi), a UBC-based group that studies issues concerning robotics and artificial intelligence.

Læs hele artiklen

Sidens top

 

About IJCAI

IJCAI is the International Joint Conference on Artificial Intelligence, the main international gathering of researchers in AI. Held biennially in odd-numbered years since 1969, IJCAI is sponsored jointly by IJCAI and the national AI society(ies) of the host nation(s).

IJCAI is a not-for-profit scientific and educational organization incorporated in California. Its major objective is dissemination of information and cutting-edge research on Artificial Intelligence through its Conferences, Proceedings and other educational materials...

Læs mere her.

Sidens top

 

 

Welcome to the 25th International Joint Conference on Artificial Intelligence

IJCAI d. 9-15/7 - 2016

Welcome to New York City!

We are delighted to invite you to come to New York, one of the most exciting cities of the world, and take part in IJCAI, the leading conference on the thrilling field of Artificial Intelligence. AI today has a tremendous impact. It is in all the media and makes a real difference. At IJCAI-16, you will have the opportunity to meet some of the world's leading AI researchers, to learn first-hand about their newest research results and developments, and to catch up with current AI trends in science and industry. And, of course, IJCAI-16 will be the perfect forum for presenting your own achievements, both to specialists in your field, and to the AI world in general.

Læs hele artiklen

 

This open letter was announced at the opening of the IJCAI 2015 conference on July 28.

Journalists who wish to see the press release may contact Toby Walsh. Hosting, signature verification and list management are supported by FLI; for administrative questions about this letter, please contact tegmark@mit.edu.

Autonomous Weapons: an Open Letter from AI & Robotics Researchers

Autonomous weapons select and engage targets without human intervention. They might include, for example, armed quadcopters that can search for and eliminate people meeting certain pre-defined criteria, but do not include cruise missiles or remotely piloted drones for which humans make all targeting decisions. Artificial Intelligence (AI) technology has reached a point where the deployment of such systems is — practically if not legally — feasible within years, not decades, and the stakes are high: autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.

Many arguments have been made for and against autonomous weapons, for example that replacing human soldiers by machines is good by reducing casualties for the owner but bad by thereby lowering the threshold for going to battle. The key question for humanity today is whether to start a global AI arms race or to prevent it from starting. If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow. Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce. It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing, etc. Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group. We therefore believe that a military AI arms race would not be beneficial for humanity. There are many ways in which AI can make battlefields safer for humans, especially civilians, without creating new tools for killing people.

Just as most chemists and biologists have no interest in building chemical or biological weapons, most AI researchers have no interest in building AI weapons — and do not want others to tarnish their field by doing so, potentially creating a major public backlash against AI that curtails its future societal benefits. Indeed, chemists and biologists have broadly supported international agreements that have successfully prohibited chemical and biological weapons, just as most physicists supported the treaties banning space-based nuclear weapons and blinding laser weapons.

In summary, we believe that AI has great potential to benefit humanity in many ways, and that the goal of the field should be to do so. Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control.

Read more


"Intelligent" missiles can kill on their own: The UN wants to stop robot wars

TV2, 12 November 2014

A secret war, in which pre-programmed missiles and drones can launch themselves, find their own targets and kill thousands of people in major cities without any human hands involved - apart from the overall planning and programming.

This is not the plot of a new American blockbuster by, say, Christopher Nolan or Steven Spielberg, but the stark present-day reality that the UN on Thursday will try to address by limiting the spread of "intelligent weapons" through a so-called moratorium under the Convention on Certain Conventional Weapons. A moratorium would mean that the use of so-called intelligent weapons must be suspended all over the world…

Read the whole article


Humans, Not Robots, Are the Real Reason Artificial Intelligence Is Scary

The Atlantic, 14 August 2015

Intelligent weapons are too easily converted by software engineers into indiscriminate killing machines…

…The potential of these weapons has not escaped the imaginations of governments. This year we saw the US Navy's announcement of plans to develop autonomous-drone weapons, as well as the announcement of both the South Korean Super aEgis II automatic turret and the Russian Platform-M automatic combat machine…

Read the whole article


Canada’s leading robot company rejects ‘killer robots’ — updated!

International Committee for Robot Arms Control (ICRAC), by Nsharkey, 14 August 2014

The hi-tech Canadian robotics company Clearpath today issued a statement pledging not to manufacture autonomous weapons systems despite their commercial advantage, and urged other companies to follow suit, calling on "those who might see business opportunities in this technology to seek other ways to apply their skills and resources for the betterment of humankind."

Read the whole article / watch the YouTube video


One year in prison for protesting against drones

Arbejderen, 3 August 2014

50 peace activists have violated a police order not to approach the Hancock Field military base, from which the US military's drones are controlled. Now the first of them - Mary Anne Grady-Flores - has received a sentence of one year in prison.

Read the whole article


“The Satellite War”

1 October 2014


written and published privately by Norwegian journalist Bård Wormdal, deals with, among other things, Norwegian double standards on security policy in the Arctic. It is written by a journalist and so has an engaging and informative journalistic style. Although not all the statements and claims made are supplied with references for substantiation, it certainly paints a believable (if not desirable) picture of the current state of the art in 'sophisticated' high-tech warfare directed through satellite technology and gives some insights into how investigative journalists go about their business. Above all, though, it provides some new (to me anyway) and important information on how the increasing militarisation of space is presenting challenges to existing treaties and agreements and how we need always to be vigilant to ensure that their principles are not eroded.

The book begins, in a Foreword, by explaining the author's entry into the subject through the Vardø radar controversy. In 1998 a Raytheon Have Stare "high-resolution X-band tracking and imaging radar with a 27-meter mechanical dish antenna", which had been operational at Vandenberg Air Force Base in California since 1995, was quietly dismantled and moved to northern Norway. In California it was used in early development tests of the US National Missile Defense (NMD) program, and in Norway it was reassembled by the US and Norway under the project name "Globus II" at Vardø, just 40 miles from the Russian border. The US and Norway claimed that the radar would be used to monitor space debris, but Russian and US experts demonstrated how its principal use would be to collect detailed intelligence data on Russia's long-range ballistic missiles...

Read more here


Near-Term Risk: Killer Robots a Threat to Freedom and Democracy

LessWrong, 14 June 2013

…One thing he didn't mention in this video is that there's a difference in obedience levels between human soldiers and combat drones. Drones are completely obedient but humans can throw a revolt. Because they can rebel, human soldiers provide some obstacles to limit the power that would-be tyrants could otherwise obtain. Drones won't provide this type of protection whatsoever. Obviously, relying on human decision making is not perfect. Someone like Hitler can manage to convince people to make poor ethical choices - but still, they need to be convinced, and that requirement may play a major role in protecting us. Consider this - it's unthinkable that today's American soldiers might suddenly decide this evening to follow a tyrannical leader whose goal is to have total power and murder all who oppose. It is not, however, unthinkable at all that the same tyrant, if empowered by an army of combat drones, could successfully launch such an attack without risking a mutiny. The amount and variety of power grabs a tyrant with a robot army of sufficient power can get away with is unlimited…

Read the whole article


The Danish United Nations Association sounds the alarm:
The use of drones as weapons in the "war on terror" undermines human rights

Adopted at the board meeting of the Danish United Nations Association
on 10 June 2013

Read the full statement

 

 

Researcher: Ban killer robots now!

Future military robots will be able to start a war on their own. That is why they must be banned, says a researcher in artificial intelligence.

Metroxpress, 6 April 2013

Excerpt:

For decades, Hollywood has made films about robots that defy human will and kill masses of people on their own.

That scenario may not be so far away. Noel Sharkey, professor of artificial intelligence at the University of Sheffield, therefore urges the world's governments to ban killer robots.

What worries Sharkey in particular is a directive from the US Department of Defense from November 2012 that allows the military to develop autonomous (independent, self-governing, ed.) war robots.

- Is anyone considering how autonomous weapons could destabilize world security and trigger unintended wars?, he writes on cnn.com.

Read the whole article


Should we paint the Danish flag on armed drones?

DR, 6 March 2013

…So far only the USA, Britain and Israel use drones. But many other countries are on the way - including Russia, China, India, Iran and a number of EU countries…

…- Some are critical because they feel it upsets certain ethical balances: can it really be right that we can go to war without exposing ourselves to danger? Is it then only the others who risk being killed? So there are ethical and moral issues on the table. And then there are grey areas in international law that become even more apparent with this new way of using force - the question of where we are actually allowed to use force. It now becomes easier for a state like the USA to use force in Yemen, Somalia, Pakistan and so on, and that raises the question of whether this is really permitted under the existing rules, Anders Henriksen tells P1 Morgen…

Read the whole article and listen to the segment


"Who is responsible when robots kill civilians?"

etik.dk, 14 March 2013

…It has been estimated that more than 10,000 robots have been deployed in the wars in Iraq and Afghanistan, but it is especially the drone aircraft that currently attract the media's attention…

…The robots themselves cannot meaningfully be punished, so who should one then turn to: the manufacturer, the programmers, the officer who sent the robot off, or perhaps the American president? One morally unacceptable outcome, which nevertheless risks becoming reality, is that responsibility cannot be assigned at all…

Read the whole article


Help, the killer robots are coming

Information, 25 February 2013

After the drones comes a new generation of "autonomous weapons" that raise even greater moral and legal problems, an activist group warns. The first killer robots could be combat-ready before the end of the decade…

…The Stop the Killer Robots campaign will be launched in April in connection with a hearing in the House of Commons of the British Parliament.

»We are not talking about science fiction, but about a technological development that is in full swing,« says Sharkey.

»In the USA, the Pentagon's research division is working to develop the drone prototype X47B, which at supersonic speed will be able to perform abrupt manoeuvres under high g-forces. The X47B will be autonomous and able to take part in armed combat anywhere in the world.«…

Read the whole article


Norden i et rumperspektiv (The Nordic Region in a Space Perspective)

A series of pamphlets from the Swedish Peace Committee
By Agneta Norberg, 2012

Unfortunately we only have translations of 3 of the 4 pamphlets in the series - translated by Hanne Carlsson and Lisbet Skou.

No war can be waged successfully today without satellites in space, data downlink stations or radar installations on the ground. The USA's installations for warfare via space are found around the world. Even in the Nordic countries, and above all in Sweden and Norway, the development of space-controlled weapons and installations has taken place without much attention from either the mass media or the peace movements.

Sweden, like other countries, has a large space-technology industry and plays a significant role in the European Space Agency (ESA). Sweden is even far ahead of many other countries in the development of military space technology. In a report from January 2005, the Swedish Ministry of Industry notes the benefits of expanding space technology, for both military and civilian use, in the aviation and space industries. "An expansion of the space industry will become one of the driving forces for growth in Sweden and for a position as a high-technology nation," the report says, openly stressing the importance of cooperating with both the EU's and the USA's space industries. "Cooperation in research and technology must be developed to secure access to US research and technology development and to secure Swedish companies the opportunity to export high-technology systems." Important space-research companies are Ericsson, Saab Space, Rymdbolaget (the Swedish Space Corporation), Volvo Aero and the Ångström laboratories. The latter have become world-famous for the development of mini-satellites and GPS technology.

One of the world's very largest satellite downlink stations, Esrange, lies in Norrbotten, within the North European Aerospace Testrange (NEAT), Europe's largest flight training area. Unmanned aircraft, so-called drones, which depend on control from space via Esrange, train continuously in the area. In 2014 the European-built Neuron aircraft, a drone the size of a fighter jet, will be ready for testing at NEAT. SAAB has played an important role in the development of this aircraft.

Concern over the militarization of space is the reason for the Swedish Peace Committee's project Norden i et rumperspektiv.

Agneta Norberg 2012


Drones

- Remote-controlled and unmanned aircraft

Drones are trained and tested in Norrbotten at the large NEAT training area, the North European Aerospace Testrange. It is the largest military training area in Europe. In Sweden the area is called the Vidsel Robotbase. The development of UAVs (Unmanned Aerial Vehicles), or unmanned aircraft, has been under way since the Second World War. Exactly when the first unmanned aircraft carrying bombs or missiles appeared is hard to say. Presumably it was a couple of years after the USA launched its invasion of Afghanistan in the autumn of 2001. The rationale for developing drones for war is that they prevent one's own soldiers from being killed.

Drones are also used as tools for outright executions, above all in Pakistan. So far, people in six countries have been killed when drones launched their Hellfire missiles, controlled from bases at Fort Langley in Virginia, USA and Creech Air Force Base in Nevada, USA.

Drone bases are also found in Ethiopia, Somalia and on the Seychelles in the Indian Ocean. According to AlterNet, there are drone bases in at least 15 countries. Shortly after Muammar Gaddafi was killed, the USA, together with France, was building an airfield in Libya primarily for the use of drones.

A drone is an unmanned aircraft controlled via satellite communication. A drone can be as small as a toy aircraft. It can be launched from the hand to spy, for example on the other side of a hill, or to seek out targets inside a building. Swedish-made small aircraft of this type have been used by Swedish soldiers in Afghanistan since the turn of the year 2009. Other types of drones are large, like ordinary warplanes, and armed with munitions. These are also called UCAVs (Unmanned Combat Aerial Vehicles), such as the partly Swedish-built Neuron aircraft, which is ready for a test flight in 2012, i.e. this year. Neuron is a cooperative project between, among others, the French company Dassault Aviation, Italy and Sweden, which is the third-largest shareholder in the project. Other partners are Switzerland, Spain and Greece. The Swedish SAAB in Linköping received 600 million kronor from the Swedish state to take part in this project, and SAAB has been one of the main parties responsible in the cooperation.

NEAT has already been used to test-fly drones. As early as 2002, a Boeing 747 landed at Kiruna airport with a disassembled unmanned Israeli aircraft in its cargo hold, to be tested by Rymdbolaget and the Swedish armed forces.

In the upper part of Norrland there are ideal conditions for wind power. But the air force, and indirectly NATO, blocks the construction of wind farms, for example in the municipalities of Jokkmokk and Arvidsjaur. The armed forces have designated so-called stop areas where wind turbines may not be erected, because they would be in the way, first and foremost of the JAS 39 Gripen, but also of tests of unmanned aircraft.

The use of drones in war zones has come under harsh international criticism from, among others, Philip Alston, the UN Special Rapporteur on extrajudicial executions. He has reported to the UN Human Rights Council that with the use of drones, from which projectiles are fired and people executed, a video-game mentality has developed. He pointed in particular to the CIA as having, through its use of drones, the lives of several innocent people on its conscience.

Convenient Killing and the Playstation Mentality, Chris Cole, 2010

Drönare över Jokkmokk, Sveriges Radio Konflikt

TV: Premiärvisning av Saabs Neuron, Ny Teknik, 20 January 2012

UN official criticises US over drone attacks, BBC News US & Canada, 2 June 2010

Vi kan inte äventyra luftrummet över Arvidsjaurs kommun, letter to the editor in Norran, 29 October 2011

Drönare allt viktigare i framtiden, SvD, 20 January 2011


What does the Swedish Peace Committee think about drones?

Without a declaration of war, the USA has carried out drone attacks against targets in Pakistan, killing thousands of civilians - attacks that can be compared to executions. Since the mid-2000s drones have struck victims in Afghanistan, but Iraq, Somalia, Libya and Yemen have also been hit. The White House does not define attacks with unmanned aircraft as regular acts of war, since no ground troops or firefights are involved, and considers that congressional approval is not required.

Unmanned vehicles are presented as the next step in aviation technology; civilian applications are portrayed as exciting, and the gathering of information after, say, an earthquake is highlighted. That the driving force behind drone development is to gain military superiority is concealed. Drones are meant to reduce the risk to one's own (pilots) and to kill others.

The Neuron aircraft, the cooperative project between France and Sweden, has as additional partners and financiers debt-laden countries like Spain and Greece, which are betting on rearmament instead of on their citizens' livelihoods. The 600 million Swedish kronor could also be put to better use in health care, schools and social services.

The Swedish Peace Committee demands that the production and testing of UCAV aircraft be banned. Swedish territory must not be used as a test area.

We oppose Sweden cooperating with a country whose leadership considers itself above international law. Military violence is legitimized as a way of conducting politics - something that leads to a brutalization of international relations and to an arms race.


Esrange

in Kiruna – one of the world's largest downlink stations

The Esrange rocket station in Norrbotten, some tens of kilometres outside Kiruna, is one of the world's very largest downlink stations for images of the Earth's surface taken by satellites. If South Korea wants pictures of North Korea, or if Israel wants pictures of Gaza, they order photos, taken by satellites, which are downlinked at Esrange and sent to the customer.

In January 2005 the Swedish Ministry of Industry published a report arguing that Sweden should increase its investment in space technology and the space industry for both military and civilian use. The report concluded that expanding the space and aviation industries would become one of the driving forces of Swedish growth and of a strong position in the world as a highly industrialized nation. This investment is clearly visible at the Esrange space centre, where new radar installations are being built at a furious pace. Esrange lies within the large NEAT training area, the North European Aerospace Testrange. The Swedish Space Corporation controls and monitors 24 satellites via this station, which in practice means that Esrange handles 92 satellite passes per day. Esrange has always been marketed as a civilian project. But according to Loring Wirbel and Bruce Gagnon, the civilian programs serve as a cover for an extensive spy-satellite program. (Star Wars: US Tools of Space Supremacy, 2004; Bruce Gagnon, Flamman, 9 October 2008.) Bård Wormdal also writes in his book Satellitkrigen (The Satellite War), Oslo 2011, which exposes Norwegian military use of satellite data, that Esrange has the same capability.

The USA cannot on its own pay what it costs to militarize and arm space. For that reason the Pentagon has chosen to involve the countries lying close to Russia and China in its space-technology programs. That is why South Korea and Sweden, on either side of the Eurasian continent where China and Russia lie, are important partners for, for example, NASA in the USA.

The Swedish Space Corporation cooperates on space issues with, among other countries, the USA, South Korea, India, Taiwan and Israel. (U.S. Space Technology Controlling China and Russia, Bruce Gagnon, Peace Review, Volume 22, 2010; Space Daily, 2 June 2010; Agneta Norberg, The North Contribute to Space Militarization, Peace Review, 2010.)


Galileo – a European copy of GPS

In December 2010 the new Galileo ground station was inaugurated with a Lucia procession through the town of Kiruna. The public is meant to get the impression that this is a civilian project. But from the start Galileo has been intended for military use. Half of the encrypted signals are to be used in military contexts. Galileo means that the EU no longer has to depend on the USA's GPS system. Europe is now building its own satellite navigation system, its own GPS. Galileo will become an extremely important tool in future warfare.

Galileo will be used to guide bombs and missiles against terrorists and other kinds of enemies far from Europe, once it has been extended with ground stations around the world.

The information is taken from Frank Slijper's book From Venus to Mars: The European Union's Steps Towards the Militarization of Space. It can be ordered at www.space4peace.org.


What does the Swedish Peace Committee think about Esrange?

Since the beginning of the 2000s the Swedish Peace Committee has drawn attention to the ongoing militarization of space. To begin with, SFK subtitled the film Keep Space for Peace and distributed it under the title Lad himlen være i fred (Leave the Sky in Peace). As a member of the Global Network Against Weapons and Nuclear Power in Space, the Peace Committee, with a grant from the Bernadotte Academy, arranged a Nordic lecture tour in 2008 with the Global Network's secretary and organizer Bruce Gagnon, who lectured in Malmö, Copenhagen, Stockholm, Kiruna and Oslo.

Bruce Gagnon then visited Esrange and also met students from the Space Academy in Kiruna, who, according to him, were surprised by the dual use of space technology - civilian and military. Isn't it good, then, that Sweden is at the forefront of space technology? Yes, there are many civilian applications, but it is in their shadow that the military use expands, hidden from the public. We are misinformed into believing that Esrange merely launches weather balloons. SFK demands openness about all of the rocket base's activities.

And what does our astronaut Fuglesang mean when he says on SVT that Sweden brings European values into space? The EU is both friend and rival of the USA, and the statement suggests rivalry - us or them. We would recall the Outer Space Treaty of 1967, signed by 98 states, including Sweden and the USA. The treaty prohibits occupation, nuclear weapons, military bases and military exercises in outer space, and it was reaffirmed in 1992 in a UN resolution on the Prevention of an Arms Race in Outer Space (PAROS).

All nations commit themselves to respecting the resolution, yet we are nevertheless heading towards a space race. Sweden violates the spirit of the resolution by supporting the USA and other countries in their efforts to exploit space for military purposes. This is what the Swedish Peace Committee will help to expose.


Photo: Magne Thyrhaug, Vardø. This is taken from the fortress, with the church in the middle of the photo and the two radars on both sides in the back. Link to the photo.

 

The Vardø base

The USA's largest radar just outside the Russian border

One of the world's very largest radar installations, the Have Stare radar system, under the cover name Globus II, was secretly installed by the USA at Vardø in northern Norway, a few tens of kilometres from the Russian border. This happened in 1998 and triggered strong protests from the Russian side, since the act was a breach of the stabilizing ABM Treaty (Anti-Ballistic Missile Treaty) concluded in 1972. That treaty, from which President Bush unilaterally withdrew in 2002, was meant to prevent either party from placing equipment near the other great power's territory that could be used in a nuclear first strike.

The journalist Inge Sellevåg revealed in Bergens Tidende in 1998 that this radar installation, which the Norwegian government presented as intended to track space debris, had in fact been moved there from Vandenberg Air Force Base in California, USA. The radar had previously been used in tests of the USA's missile defense (Bulletin of the Atomic Scientists, March/April 2000). The Vardø radar is one of the very powerful radar installations that the USA has placed around the Earth to complete its missile defense system. This so-called defense can be used offensively. Radar installations serving this end have also been installed at Clear, Alaska; at Fylingdales and Menwith Hill, England; at the Thule base in Greenland; and in other countries around China and Russia.

Philip E. Coyle was a senior official in the US Department of Defense from 1994 to 2001. In an interview with the journalist Bård Wormdal he said: "Norway is important because it lies close to Russia and quite close to China. The shortest distance, if missiles are launched against the USA from Russia or China, is over the North Pole. A radar in Norway is therefore important, not just because it is close to Russia and China, but also because it lies on the path towards the USA."

Russian security-policy spokespeople see the Vardø radar as part of the USA's nuclear first-strike strategy and are deeply concerned. One of the few protests in Sweden against this dangerous installation - perhaps the only one - came from the non-violence network Ofog. In June 2004 seven Ofog activists drove the long way to Vardø in northern Norway to inspect the base. The base commander, Magne Tunestved, denied that the radar is part of the missile defense system or that Americans work at the base. But in Vardø it is said that everything to do with the base is secret, and that the Americans stay at the island's only hotel when they are there. According to Professor Theodore Postol of MIT in the USA, Vardø, together with the radar installation on the island of Shemya on the other side of the Russian continent, will be able to monitor the whole of Russia for missile launches. Raytheon confirms this in its documents.

Sources: Satellitkrigen (The Satellite War): The Militarization of the Polar Regions and Outer Space, Bård Wormdal, 2011, NRK, Norway

Peace Review, The North Contribution to Space Militarization, Agneta Norberg, 2010, Sweden

Photo: Amanda Graham, Radome at Vardø 1920, link to photo and text.
Globus-II radar in Vardø is a Norwegian, American cooperation. According to the Barents Observer, "Globus II is part of the Space Situational Awareness that encompasses intelligence and adversary space operations; surveillance of all space objects and activities; detailed reconnaissance of specific space assets; monitoring space environmental conditions; monitoring cooperative space assets; and conducting integrated command, control, communications, processing, analysis, dissemination, and archiving activities."

 

What does the Swedish Peace Committee think about the Vardø base?

The first three pamphlets in the project Norden i et rumperspektiv address Swedish participation in the militarization of space. NATO member Norway early on granted the USA and other countries access to its territory, as at Vardø and Fauske. The so-called missile defense system was launched by the USA under President Reagan on the grounds that states such as Iran, Pakistan and North Korea could be expected to attack the USA with missiles, and that these missiles therefore had to be intercepted and shot down before reaching their targets. Radar installations like those at Vardø, together with launch facilities in other countries, serve exactly such a function. This shows that the Nordic region plays an important role in the missile defense system.

Criticism of the system's offensive function is widespread internationally. Russia has protested in the strongest terms from the start, since it considers the system to be directed against Russia. Even in the Czech Republic the missile-defense installations planned by the USA met with such strong protests that the plans were scrapped. With a clever move, the USA has now shifted missile defense onto NATO as a whole, as supposed protection for all NATO states, and the radar installations and launch sites will now be set up in various places in Europe, e.g. Poland and Romania.

Norwegian territory is also used for military purposes on Svalbard, in breach of the Svalbard Treaty of 1920. The treaty establishes that the archipelago is Norwegian, but gives Norway international obligations to ensure that it is used for peaceful purposes. Article 9 of the treaty establishes that Svalbard must not be used for warlike purposes. Nevertheless, there are today satellite downlink stations there and two high-capacity fibre-optic links to the mainland. The civilian and military functions of the satellites are hard to keep apart, and this dual use is exploited to circumvent the international rules.

The Nordic peoples must be given guarantees that closer Nordic defense cooperation will not mean that our countries are drawn into NATO's war plans. The strong public opinion for peace and disarmament that characterizes the Nordic countries must be heeded, not met with secrecy. Instead of investing in military technology, civilian cooperation in the region must continue to be able to develop economically, culturally and on environmental issues. One urgent issue to cooperate on is the military radioactive waste in the Murmansk area, which threatens the population on both sides of the Norwegian-Russian border.

The Swedish Peace Committee believes that a kind of madness is developing "over people's heads", in both senses of the phrase. More and more countries are sending satellites into space. The Earth is photographed with one-metre accuracy, the smallest movement is registered, drones are guided towards human targets, and weather satellites inform not just you and me but also report the best weather conditions for the next military operation. Humanity's dream of reaching the stars turns into a catastrophe when the driving force becomes dominance in space. Leave the sky in peace!
