“The drone is the ultimate imperial weapon, allowing a superpower almost unlimited reach while keeping its own soldiers far from battle,” writes New York Times reporter James Risen in his important new book “Pay Any Price: Greed, Power, and Endless War.” “Drones provide remote-control combat, custom-designed for wars of choice, and they have become the signature weapons of the war on terror.”
But America’s monopoly on death from a distance is coming to an end. Drone technology is relatively simple and cheap to acquire — which is why more than 70 countries, plus non-state actors like Hezbollah, have combat drones.
The National Journal’s Kristin Roberts imagines how drones could soon “destabilize entire regions and potentially upset geopolitical order”: “Iran, with the approval of Damascus, carries out a lethal strike on anti-Syrian forces inside Syria; Russia picks off militants tampering with oil and gas lines in Ukraine or Georgia; Turkey arms a U.S.-provided Predator to kill Kurdish militants in northern Iraq who it believes are planning attacks along the border. Label the targets as terrorists, and in each case, Tehran, Moscow, and Ankara may point toward Washington and say, we learned it by watching you. In Pakistan, Yemen, and Afghanistan.”
Next: SkyNet.
SkyNet, you recall from the Terminator movies, is a computerized defense network whose artificial intelligence programming leads it to self-awareness. People try to turn it off; SkyNet interprets this as an attack — on itself. Automated genocide follows in an instant.
In an article you should read carefully because/despite the fact that it will totally freak you out, The New York Times reports that “arms makers…are developing weapons that rely on artificial intelligence, not human instruction, to decide what to target and whom to kill.”
More from the Times piece:
“Britain, Israel and Norway are already deploying missiles and drones that carry out attacks against enemy radar, tanks or ships without direct human control. After launch, so-called autonomous weapons rely on artificial intelligence and sensors to select targets and to initiate an attack.
“Britain’s ‘fire and forget’ Brimstone missiles, for example, can distinguish among tanks and cars and buses without human assistance, and can hunt targets in a predesignated region without oversight. The Brimstones also communicate with one another, sharing their targets.
[…] “Israel’s antiradar missile, the Harpy, loiters in the sky until an enemy radar is turned on. It then attacks and destroys the radar installation on its own.
“Norway plans to equip its fleet of advanced jet fighters with the Joint Strike Missile, which can hunt, recognize and detect a target without human intervention.”
“An autonomous weapons arms race is already taking place,” says Steve Omohundro, a physicist and AI specialist at Self-Aware Systems. “They can respond faster, more efficiently and less predictably.”
As usual, the United States is leading the way toward dystopian apocalypse, setting precedents for the use of sophisticated, novel, more efficient killing machines. We developed and dropped the first nuclear bombs. We unleashed the drones. Now we’re at the forefront of AI missile systems.
The first test was a disaster: “Back in 1988, the Navy test-fired a Harpoon antiship missile that employed an early form of self-guidance. The missile mistook an Indian freighter that had strayed onto the test range for its target. The Harpoon, which did not have a warhead, hit the bridge of the freighter, killing a crew member.”
But we’re America! We didn’t let that slow us down: “Despite the accident, the Harpoon became a mainstay of naval armaments and remains in wide use.”
U-S-A! U-S-A!
I can see you tech geeks out there, shaking your heads over your screen, saying to yourselves: “Rall is paranoid! This is new technology. It’s bound to improve. AI drones will become more accurate.”
Not necessarily.
Combat drones have hovered over towns and villages in Afghanistan and Pakistan for the last 13 years, killing thousands of people. The accuracy rate is less than impressive: 3.5%. That’s right: 96.5% of the victims are, by the military’s own assessment, innocent civilians.
The Pentagon argues that its new generation of self-guided hunter-killers are merely “semiautonomous” and so don’t run afoul of a U.S. rule against such weapons. But only the initial launch is initiated by a human being. “It will be operating autonomously when it searches for the enemy fleet,” Mark Gubrud, a physicist who is a member of the International Committee for Robot Arms Control, told the Times. “This is pretty sophisticated stuff that I would call artificial intelligence outside human control.”
If that doesn’t worry you, this should: it’s only a matter of time before other countries, some of which don’t like us, get these too.
Not much time.
(Ted Rall, syndicated writer and cartoonist, is the author of the new critically acclaimed book “After We Kill You, We Will Welcome You Back As Honored Guests: Unembedded in Afghanistan.” Subscribe to Ted Rall at Beacon.)
COPYRIGHT 2014 TED RALL, DISTRIBUTED BY CREATORS.COM
Comments
Even if AI drones “become more accurate” than their predecessor biological-intelligence drones have proven, they will be nowhere near accurate enough (see 28:1 article below) … assuming either type should exist at all.
Can’t get a 100 mpg car though.
Did you mean “mph”?
Assuming so, this does not mean it won’t be attempted.
I’m just pissed at what tech can do and what we don’t/can’t have.
That statement is not really precise. Actually, you just can’t get a 100 mpg car (past the Petroleum-Industrial-Complex) sold on the open market. Several have already been invented, bought out, and memory-holed.
DanD
Sorry, I thought you meant a drone can’t hit a car traveling at high speed. I was still in my “what possibly could go wrong with AI-controlled drones?” mode.