Stark Unveils AI-Powered OWE-V Drone as Global Autonomy Arms Race Accelerates

Image Source: Stark

Stark, a Munich-based German startup, has announced significant progress in developing its One Way Effector-Vertical Take-Off (OWE-V) drone, also known as Virtus, which uses artificial intelligence for autonomous navigation and target tracking.

According to Stark’s managing director, Phil Lockwood, the OWE-V can strike targets up to 100 kilometers away, navigating independently in GPS-denied areas and in environments where communication with its operator is limited or unavailable. The drone’s AI enables real-time decision-making to evade enemy defenses, marking a step toward fully autonomous strike capabilities.

The OWE-V, designed with a vertical take-off and landing (VTOL) system, can carry a 5 kg payload and operate for up to 60 minutes, making it suitable for missions like deep strikes or electronic warfare. Stark’s proprietary Minerva software allows a single operator to control multiple drones, potentially enabling swarm operations. The company’s focus on European-sourced components and automated production facilities positions it to meet growing demand for advanced drone systems.

This development follows Stark’s selection, alongside competitor Helsing, to supply kamikaze drones to the German Bundeswehr under a €100 billion rearmament plan, reflecting Germany’s push to bolster its defense capabilities amid evolving global conflicts.

[Read More: Germany's Helsing Urges NATO to Build AI-Integrated 'Drone Wall' on Eastern Flank]

Designed for the Battlefield: Lessons from Ukraine

The ongoing war in Ukraine has been a catalyst for rapid advancements in AI-powered drone technology.

Ukrainian forces have deployed AI-enhanced drones, such as those developed by the startup Swarmer, which use AI to coordinate swarm operations and improve strike accuracy from 30-50% to 70-80%.

These drones, often costing as little as US$100 to US$200 to upgrade with AI, have become critical in countering Russian forces. Both sides have integrated AI for tasks like target recognition and navigation in environments disrupted by electronic warfare.

Stark’s OWE-V, designed with feedback from Ukrainian battlefield experiences, reflects this trend toward autonomy and resilience.

[Read More: Ukraine's AI-Powered Drones: Leading Autonomous Warfare Against Russia's Electronic Defenses]

Ethical and Strategic Concerns Over AI in Combat

The European Union and countries like Austria advocate for banning fully autonomous weapons, citing risks of unintended escalation or violations of international humanitarian law.

A 2021 UN report noted the use of a lethal autonomous weapon in Libya, raising alarms about machines independently selecting and engaging targets.

In Ukraine, while AI drones currently require human approval for strikes, developers like Saker have confirmed limited autonomous attacks in jammed environments, signalling that full autonomy is technologically feasible.

Critics argue that autonomous systems could lower the threshold for lethal force.

The lack of a global treaty on lethal autonomous weapons, despite UN discussions since 2014, leaves a regulatory gap as nations like the US, China, and Russia also pursue AI-driven drone technology.

Stark’s Lockwood has stressed maintaining human oversight for now, but warned that fully autonomous systems are “not far off”, urging responsible innovation to balance capability and accountability.

[Read More: AI Warfare: Autonomous Weapons Changing the Battlefield]

Global Drone Arms Race Intensifies

The race to develop AI-powered drones is reshaping global defense strategies.

Stark’s OWE-V competes with systems like Helsing’s HX-2 and Anduril’s Altius-600M, used by Ukraine and the US, respectively, highlighting a crowded market driven by demand for precision and affordability.

Russia aimed to produce 1.4 million drones in 2024, while Ukraine is targeting four million in 2025, demonstrating how central drones have become to modern warfare.

[Read More: OpenAI and Anduril Forge Strategic Partnership to Enhance U.S. Defense AI Capabilities]

Human Oversight vs. Full Autonomy

While Stark and others still emphasize human-in-the-loop models, the technological momentum toward full autonomy is undeniable.

Whether the global community will act in time to regulate this transformation—or allow market and military forces to shape it—remains a critical question.

[Read More: AI on the Battlefield: Navigating the Ethical Minefield of Autonomous Weapons]

Source: UK Defence Journal, Lawfare Media, Business Insider
