America Can’t Afford to Lose the Artificial Intelligence War

Imagine if the Nazis were the first with killer bots!

Today, the question of artificial intelligence (AI) and its role in future warfare is becoming far more salient and dramatic than ever before. Rapid progress in driverless cars in the civilian economy has helped us all see what may become possible in the realm of conflict. All of a sudden, it seems, terminators are no longer the stuff of exotic and entertaining science-fiction movies, but a real possibility in the minds of some. Innovator Elon Musk warns that we need to start thinking about how to regulate AI before it destroys most human jobs and raises the risk of war.

  • The prospect of sending killer robots out to deal with the Muslims sounds good. Of course, that is so long as the robot manufacturers do not sell their wares to Saudi Arabia and Iran, which of course they will. So yes, there is a danger here. Not to mention the danger if robots fall into the hands of a dictatorship for internal uses, e.g. “police” work to keep populations well in line. Terminator might well happen in this sense. As for the robots acting on their own account, that I find a bit difficult to believe. Machines have no goals of their own, so I do not see why they would do that (unless so programmed).

    • Exile1981

      Google had to shut down an AI experiment when two computers invented their own language to converse in and wouldn’t tell the programmers what they were saying in English.

      • Yes, I read that a couple of weeks ago. I am not sure what it means technically, or what bug was involved. But one thing I am sure of: the language suggesting that the AI machines have A WILL OF THEIR OWN is surely nonsense, the imaginative interpretation of people who have read too much science fiction.

  • Norman_In_New_York

    Israel has already developed drone missile boats and driverless reconnaissance vehicles. The latter are a prelude to remote-controlled tanks. It has also tightened its cyber defenses to the degree that it walked away unscathed from the recent ransomware attacks.

    • Yes. All that is true. However, note how drone technology has now become so widespread that ignorant terrorists can buy it in a store and use it at will. Same with mobile phones, same with all the apps that Israel helped develop. What is the use of high tech if it is sold to the enemy?

      • Norman_In_New_York

        Israel developed its aerial drone technology 40 years ago, so the rest of the world was bound to catch up over that period of time.

  • Hard Little Machine

    Machines will do no worse than people. People make mistakes all the time, or worse, refuse to fight. Estimates are that no more than 15% of US GIs in combat in WW2 ever fired their weapon. And of those who did, most fired wildly. It took thousands of rounds fired to kill one enemy soldier. Are we afraid of doing worse than that? Or are we afraid that machines will form their own SS and round up civilians AFTER the battle to exterminate them in mass graves? Are we worried about friendly fire? We do that now. Or are we concerned that we will field weapons as remorseless and merciless as the Islamic suicide maniacs they’re fighting? Because if not that, then we might as well nuke them. And if we have AI cops, what’s the worst that happens? Someone screams they’re racist too?