AI could boost small states, reshape power balance with great powers
Small states are using AI to gain autonomy and influence while cheap drones and AI targeting erode the cost advantages of big militaries.

Artificial intelligence is no longer just a force multiplier for the strongest powers. It is giving smaller states and even nonstate actors cheaper ways to influence, disrupt and defend, while making traditional great-power leverage harder and more expensive to sustain.
RAND has warned that nations could see their power rise or fall depending on how they harness and manage AI, and that governments will need new regulatory frameworks to identify, evaluate and respond to AI-enabled challenges. That makes control of AI development the key strategic question, not a side issue. In one path, U.S. companies lead the field; in another, states tighten their grip. Either way, AI is shaping competition in the same way military power once did.
The small-state playbook is already visible. A 2025 RSIS analysis says countries such as Singapore, Switzerland, Finland, Ireland and Sweden are using AI to secure autonomy, enhance security and expand global influence by embedding themselves in AI value chains and shaping the rules around AI governance. Singapore stands out as a launchpad: Google has committed more than US$850 million there, Amazon Web Services is investing an additional S$12 billion by 2028, and the country hosts more than 70 data centers and over 25 subsea cables. In practice, that infrastructure gives a small state more room to maneuver than raw geography would suggest.
The military picture is just as disruptive. NATO’s 2024 report on AI says experts believe that integrating AI into military systems could revolutionize warfare through faster decision-making, autonomous systems and changes in the role of soldiers, even as procurement, interoperability and legal questions slow deployment. RAND has gone further, arguing that cheap commercial drones have sharply tilted cost asymmetry toward the offense. The war in Ukraine, fighting in Israel and Gaza, and attacks in the Red Sea all show the same pattern: it is often cheap to strike and expensive to defend. In January 2025, the U.S. Navy said it had fired more than 200 missiles to repel Houthi attacks since November 2023, at a cost of hundreds of millions of dollars.
That imbalance matters because AI compresses the cycle between finding, deciding and acting. The Israel-Hamas war of May 2021 was described in the Israeli press as "the world’s first AI war," and the pace accelerated after October 7, 2023. Within the first few weeks of the Gaza war, an AI decision-support system called Lavender reportedly generated a list of 37,000 targets. For big militaries, that raises the possibility of faster strikes; for smaller actors, it lowers the threshold for meaningful effect.
For Washington, the lesson is blunt. Deterrence cannot rest only on bigger platforms, slower procurement and the assumption that superior force will always buy control. If AI lowers the cost of influence, then bargaining power shifts toward whoever controls the models, the chips, the data centers and the rules. Great-power leverage will not disappear, but it will depend more on governance, access and alliances than on scale alone.