{"id":21075,"date":"2026-05-01T09:12:06","date_gmt":"2026-05-01T08:12:06","guid":{"rendered":"https:\/\/inernews.online\/?p=21075"},"modified":"2026-05-01T09:12:06","modified_gmt":"2026-05-01T08:12:06","slug":"what-do-ukraines-robot-soldiers-mean-for-the-future-of-warfare-russia-ukraine-war-news","status":"publish","type":"post","link":"https:\/\/inernews.online\/?p=21075","title":{"rendered":"What do Ukraine\u2019s robot soldiers mean for the future of warfare? | Russia-Ukraine war News"},"content":{"rendered":"<div aria-live=\"polite\" aria-atomic=\"true\">\n<p>In a scene reminiscent of a computer war game, three battle-fatigued soldiers, dressed in white snow camouflage, emerge from a war-torn alley with their hands raised above their heads.<\/p>\n<p>They crouch down, following the orders being blasted at them, fear and shock etched across their faces as they stare down the barrel of a machine gun mounted on a so-called ground robot.<\/p>\n<p>This footage, released in January by Ukrainian defence company DevDroid, is said to show the moment Russian soldiers were captured by a Ukrainian robot using artificial intelligence.<\/p>\n<p>In April, Ukrainian President Volodymyr Zelenskyy said that, for the \u201cfirst time in the history of this war, an enemy position was taken exclusively by unmanned platforms \u2013 ground systems and drones\u201d.<\/p>\n<p>\u201cGround robotic systems have already carried out more than 22,000 missions on the front in just three months,\u201d he wrote in a post on X, alongside images of green machines with tank tracks and weapons mounted on top.<\/p>\n<p>But for analysts who have studied the intersection of artificial intelligence (AI) and warfare, the footage reflects an expected evolution 
\u2013 one that will unfold far beyond the front lines in Ukraine as the world wrestles with the ethical implications of controlling it.<\/p>\n<h2 id=\"uavs-naval-drones-and-robot-dogs\">UAVs, naval drones and robot dogs<\/h2>\n<p>For years, militaries have used ground robots primarily for bomb disposal and reconnaissance.<\/p>\n<p>But in Ukraine, their role has expanded rapidly, with some brigades reporting that up to 70 percent of front-line supplies are now delivered by robotic systems rather than soldiers.<\/p>\n<p>These machines transport ammunition, food and medical supplies, and evacuate wounded troops from dangerous positions.<\/p>\n<p>Yet the sight of robotic systems moving across the battlefield is part of a much broader shift in warfare \u2013 one that has been building for decades.<\/p>\n<p>The modern debate about AI in warfare was largely driven by the rise of US unmanned aerial vehicle (UAV) operations in the early 2000s.<\/p>\n<p>In 2002, the MQ-1 Predator drone was used by the US to carry out one of the first targeted air strikes in Afghanistan, marking a turning point in how wars could be fought remotely.<\/p>\n<p>Its use expanded rapidly, peaking between the late 2000s and the mid-2010s, particularly in Pakistan, Yemen and Somalia.<\/p>\n<p>As AI has advanced, the debate has moved beyond remote-control operations.<\/p>\n<p>The focus has shifted towards systems which can help identify targets, prioritise strikes and guide battlefield decisions, raising deeper questions about how much autonomy should be delegated to machines.<\/p>\n<p>Analysts say the question of autonomy must remain central, rather than being overshadowed by rapid technological developments, however striking the sight of increasingly anthropomorphic machines on the battlefield may be.<\/p>\n<p>\u201cThese technologies are here to stay,\u201d Toby Walsh, an AI expert at the University of New South 
Wales, told Al Jazeera. He described AI-driven military operations as \u201cthe third revolution of warfare\u201d.<\/p>\n<p>The transformation is also spreading beyond land targets.<\/p>\n<p>Naval drones packed with explosives have already reshaped battles in the Black Sea, while autonomous underwater systems are being developed for surveillance, mine clearance and sabotage missions by militaries worldwide.<\/p>\n<p>Robotic dogs, meanwhile, are already being tested for surveillance, reconnaissance and bomb-disposal missions, with some experimental versions even fitted with weapons.<\/p>\n<h2 id=\"human-involvement\">Human involvement<\/h2>\n<p>In recent years, the emergence of fully autonomous drones or so-called \u201ckiller robots\u201d has triggered a fierce debate after a United Nations report suggested that Turkish-made Kargu-2 loitering munition drones, operating in fully autonomous mode, had identified and attacked fighters in Libya in 2020.<\/p>\n<p>The incident prompted intense discussions among experts, activists and diplomats worldwide, as they grappled with the moral and ethical implications of a machine making \u2013 and executing \u2013 the decision to take a human life.<\/p>\n<p>However, the regulatory debate needs to focus more on semi-autonomous weapon systems, \u201cwhere humans are still so-called in the loop\u201d, Anna Nadibaidze, a postdoctoral researcher in international politics at the Centre for War Studies, University of Southern Denmark, told Al Jazeera.<\/p>\n<p>A major concern, she said, is whether \u201cenough time and space\u201d is being given to the \u201cexercise of human judgement that\u2019s necessary in the context of warfare\u201d.<\/p>\n<p>The extent of human involvement is often something observers have to take militaries at their word on, a difficult task when their actions leave trust in short supply, said Toby Walsh.<\/p>\n<p>In the case of ground robotics in Ukraine, a human operator has, so far, 
remained in control, directing machines that can still be halted by obstacles such as uneven terrain.<\/p>\n<p>However, when AI is involved in the decision-making process, as is the case in Israel\u2019s attacks on Gaza and the wider region, the scale of attacks which have resulted in \u201chuge collateral damage and civilian casualties for a small number of military targets\u201d challenges the rules of international humanitarian law and, in particular, the idea of proportionality, Walsh said.<\/p>\n<p>The issue, Nadibaidze said, is that it is hard to enforce rules on the use of AI in warfare as it is essentially \u201ca matter of each military to decide what they consider to be a sufficient role for the human, and there isn\u2019t enough international debate on that\u201d.<\/p>\n<p>An April report by the Stockholm International Peace Research Institute warned that the AI supply chain is also fragmented, global and heavily dependent on civilian technologies, further complicating efforts to govern or control military uses of AI.<\/p>\n<p>The United States Department of Defense is steadily incorporating privately developed software systems into its war apparatus.<\/p>\n<p>In the middle of last year, the Defense Department awarded OpenAI a $200m contract to integrate generative AI into the US military, alongside $200m contracts for xAI and Anthropic.<\/p>\n<p>\u201cIf we\u2019re not careful, warfare will be much more terrible, much more deadly, a much quicker, much faster thing that humans can no longer actually really be participants in, because humans won\u2019t have the speed, won\u2019t have the accuracy or the ability to respond,\u201d Walsh warned.<\/p>\n<h2 id=\"ukraine-as-a-testing-ground\">Ukraine as a testing ground<\/h2>\n<p>Technology and AI are not inherently harmful, experts say \u2013 it is how they are used that matters.<\/p>\n<p>In Ukraine, ground robotic systems have also been used to rescue civilians and provide logistical 
support in heavily mined and treacherous conditions.<\/p>\n<p>Yet what is unfolding on the front line is, in many ways, a testing ground, and the international community will need to look ahead to how these technologies might be applied and regulated in future conflicts.<\/p>\n<p>There is also room for cautious optimism. Despite the \u201cmoral failure\u201d over Israel\u2019s actions in Gaza, Walsh said, there is a recognition in the international community that these issues must be addressed, reflected in a series of UN meetings focused on regulating Lethal Autonomous Weapons Systems.<\/p>\n<p>The United Nations Institute for Disarmament Research (UNIDIR), an autonomous body within the UN which conducts independent research on disarmament and international security, is set to meet in June to examine the implications of AI for international peace and security.<\/p>\n<p>It is not the first time new weapons technologies have threatened to upend the rules-based order, said Walsh, pointing to chemical weapons as an example. While imperfect, international agreements were eventually put in place to bring those under some level of control.<\/p>\n<p>\u201cThere are a lot of actors based in the Global South that do want regulation, so there might be regional initiatives forming,\u201d said Nadibaidze, adding that even if such efforts do not initially include major powers or leading tech developers, they could still help to shape emerging norms.<\/p>\n<\/div>\n","protected":false},"excerpt":{"rendered":"<p>In a scene reminiscent of a computer war game, three battle-fatigued soldiers, dressed in white snow camouflage, emerge from a war-torn alley with their hands raised above their heads. 
They crouch down, following the orders being blasted at them, fear and shock etched across their faces as they stare down the barrel of a machine gun [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":21076,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[6],"tags":[],"class_list":["post-21075","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-europe-news"],"_links":{"self":[{"href":"https:\/\/inernews.online\/index.php?rest_route=\/wp\/v2\/posts\/21075","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/inernews.online\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/inernews.online\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/inernews.online\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/inernews.online\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=21075"}],"version-history":[{"count":0,"href":"https:\/\/inernews.online\/index.php?rest_route=\/wp\/v2\/posts\/21075\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/inernews.online\/index.php?rest_route=\/wp\/v2\/media\/21076"}],"wp:attachment":[{"href":"https:\/\/inernews.online\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=21075"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/inernews.online\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=21075"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/inernews.online\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=21075"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}