
The development of autonomous weapons technologies in the military domain is described by academics and analysts as the third revolution of warfare, with rapidly increasing autonomy in weapons well underway.
Australia is in a group of countries leading the charge along with the United States, Britain, Russia, China, Israel, India and South Korea.
A range of aerial, land and under-water systems with autonomous capabilities are being developed and deployed, including in current conflicts. Weapons that would operate without human control over the selection of targets and the decision to attack, referred to as fully autonomous weapons or "killer robots", are only a few steps away from current reality.
Such weapons pose legal, ethical and security risks as decisions of who to kill are delegated to machines. Australia is, however, embracing this rapidly developing industry with barely any public discussion.
Australian company DefendTex supplied 300 of its Drone 40 loitering munitions to Ukraine last year. There have been mainstream media reports on the use of drones, loitering munitions and other autonomous capabilities in Ukraine, and on how the conflict has become a vital testing ground for Western weapons.
This is mostly framed as a positive development, with only limited criticism.
DefendTex is just one of many Australian companies in an expanding landscape of Australian development in AI for defence, involving private industry and universities. To foster collaboration between these sectors, the former Coalition government from 2017 established defence cooperative research centres.
The inaugural centre, Trusted Autonomous Systems, based in Brisbane, was awarded an initial $50 million investment from the government for its first seven years.
Universities are active in projects facilitated by Trusted Autonomous Systems, as well as in other initiatives in partnership with Australian defence or arms companies that focus on autonomy and related capabilities. For instance, a network connecting defence, industry and Victorian universities has autonomy as a focus area, as does STELaRLab.
STELaRLab is a partnership between the University of Melbourne and the international arms company Lockheed Martin, which has faced accusations of complicity in rights abuses.
A core area of work for STELaRLab is autonomy, robotics research and development. A student activist group in Australia, Lockout Lockheed Martin, protested against the partnership involving university collaboration in weapons production.
Given the nature of autonomy-related technologies, the potential end uses of university projects or research are often unclear when these are carried out in collaboration with arms companies or defence.
Some researchers who are concerned about these ethical risks will turn down opportunities to work on projects connected to defence, while others lack awareness and ethics education in fields such as computer science.
Australian companies on the frontline
Large arms manufacturers, as well as smaller Australian arms companies, undertake collaborative projects supported by Trusted Autonomous Systems, many pushing the autonomy envelope.
DefendTex, the creator of the Drone 40 loitering munition being supplied to Ukraine, is also developing swarming capabilities. A swarm is where numerous weapons are deployed in a connected group, moving en masse as they find targets. Without limits on their geographic area and duration of operation, these would be difficult for operators to adequately control, increasing risks to civilians as well as intensifying the pace of warfare.
Skyborne Technologies and Cyborg Dynamics are two Australian arms companies based in Queensland developing weapons which skirt moral and ethical red lines, in the absence of any specific limits on autonomy.
Skyborne Technologies is developing an autonomous drone carried in a backpack and equipped with rapid multi-shot grenades. Cyborg Dynamics is co-developing an armed ground vehicle with Australian robotics company BIA5, also equipped with various munitions. These weapons have been exhibited by both companies at Australian and international arms fairs, and will feature at an upcoming US convention for military and industry.
To avoid disastrous consequences, particularly for civilians, these weapons must be controlled by human operators who can understand and evaluate the environment, and must not be unleashed in conflict zones, which are increasingly urban areas. This is most crucial in decisions about targeting and whether or not to attack.
The above two companies also share a collaborative venture, Athena AI, which focuses on the development of an AI-enabled targeting system to track, identify and select targets. These targeting capabilities can be integrated into other weapons and are a dangerous step towards fully autonomous weapons.
Athena AI capabilities are being used by Red Cat, which makes drones for the US used for the protection of military bases and border control.
There are risks in exporting systems, components or software for use by other companies or countries, especially as they may be used or adapted in new ways that are not lawful. It has not yet been clearly delineated what uses of autonomy in weapons are legally and morally acceptable.
When this concern was put to the Anthony Albanese government in Questions on Notice in April, the written response on behalf of the Minister for Defence Richard Marles avoided confirming whether or not Australian arms companies or projects from Trusted Autonomous Systems were being exported, and asserted that: "There is no widely agreed definition of autonomous weapons, and Defence exports a range of goods and technologies including for training and operations".
There are some significant projects where Australian defence has partnered with large arms companies. Recently, a partnership for autonomous submarines was announced between the Royal Australian Navy and Anduril, a company that also develops autonomous systems for the US military.
Anduril founder Palmer Luckey made headlines for developing a VR headset that would kill its wearer if they died in a video game. Anduril launched an Australian arm last year, with the submarines central to that venture.
"Ghost Bat" is an autonomous aircraft developed in a flagship project by Boeing Australia with the Royal Australian Air Force. Boeing is a multinational aerospace company that works in civil areas, such as commercial aircraft and communication satellites, as well as defence. The company sells equipment to states such as Saudi Arabia.
The Ghost Bat project has facilitated a new Boeing manufacturing facility near Toowoomba, in Queensland, as part of a planned aerospace precinct. The site is the first Boeing manufacturing site outside of the US, and the aerospace hub currently centres on the Ghost Bat contract with the Australian government.
When the hub was announced in 2021, Queensland Treasurer Cameron Dick said: "Our vision for this precinct is to be the epicentre of aerospace and defence development, advanced manufacturing, globally, research and development and education."
Autonomy is integral to innovation in aerospace across defence and civil domains, but there is currently an absence of regulation in these areas. To innovate responsibly, clearer guardrails are needed from government and within the private sector.
AUKUS embraces AI
The innovation in autonomy and investment in AI for defence is shared by Australia's allies, especially the US and Britain. Cooperation between these countries on autonomous capabilities is set to increase through the AUKUS alliance.
The alliance was announced to foster cooperation for regional security between these partners, but has received criticism in Australia from analysts, academics, past government leaders and the public.
AUKUS's next phase, or "second pillar", focuses on technology sharing of "advanced capabilities", a major aspect being AI. Cooperation on advanced capabilities is intended to increase security and the ability of the three partners and their defence forces to work together. The advanced capabilities pillar was recently showcased in a joint trial hosted by Britain, in collaboration with all three militaries. This is just the outset of collaboration on AI-enabled capabilities through the alliance.
Recently, the Australian government announced a new scheme, the Advanced Strategic Capabilities Accelerator, with autonomy as a priority area. This was in response to the findings of the recent Defence Strategic Review and to AUKUS's second pillar.
The Defence Strategic Review contained only one other reference to autonomy: a general one to air capabilities, specifically Ghost Bat. Given the extensive landscape of development in autonomy for defence, such limited reference seems unusual, as autonomy is repeatedly articulated as a priority.
By contrast, defence planning documents from 2020 outlined how "emerging technology will be rapidly utilised and incorporated into the new strategic framework, with autonomous weapon systems and long-range weapons being increasingly developed, researched and tested".
The last few years have illustrated the pursuit of this. Deputy Prime Minister and Defence Minister Richard Marles commented on the Advanced Strategic Capabilities Accelerator, saying: "Australia must invest in the transition to new and innovative technologies for our Defence Force."
Autonomy is seen as central to these goals.
Question of ethics
Some of the ethical concerns of developing this area have been recognised in a report commissioned in 2021 by the Australian Department of Defence.
It proposed three different tools: an Ethical AI for Defence Checklist, an Ethical AI Risk Matrix, and a Legal and Ethical Assurance Program Plan, but these do not reflect current defence policy.
The government uses a single framework for the development of all weapons. This framework has no specific considerations related to autonomous capabilities in a weapon's design or how the weapon operates. Policy is lacking on limits to how autonomy is used in weapons and the amount of human control required, in particular over the "critical functions" of selecting targets and deciding to attack.
Without the establishment of clear policy, development is unfettered. The legal, ethical and security risks are not being adequately addressed.
An open letter in 2017 called on the government to support a ban on lethal autonomous weapons. A parallel open letter from global AI and robotics researchers and companies included high-profile endorsements such as Elon Musk and American tech entrepreneur Steve Wozniak.
Recently, international bodies have urged the prohibition of lethal autonomous weapons. Australia is yet to heed such calls, which are echoed globally.
In response to the many legal, ethical, security and humanitarian concerns raised by autonomous weapons, the international community has called for new international law to be established. This call has come from states, the tech sector, civil society and the Stop Killer Robots campaign.
The UN Secretary-General has called for negotiations on a new legally binding instrument to address autonomous weapons to conclude by 2026.
A legally binding international instrument on autonomous weapons would establish specific prohibitions and other obligations. This may include prohibitions on weapons that select and apply force to targets without human control, or obligations regarding the duration of time and geographical space where a weapon with autonomous capabilities is used.
These regulations would seek to address the legal challenges of accountability and international humanitarian law, such as ensuring distinction between combatants and civilians and the proportionality of an attack. These require inherently human evaluations and cannot be achieved by a machine.
It would also establish a strong precedent for responding to ethical concerns, notably the delegation of life-and-death decisions to machines and digital dehumanisation.
Digital dehumanisation is the process whereby humans are reduced to data, which is then used in automated decisions that may have negative effects. Autonomous weapons that decide to attack and kill illustrate the most acute harms. Such regulation would also mitigate security risks including the acceleration and intensification of conflict due to the potential pace and scale of these weapons or machine error.
International talks have sought to address autonomous weapons for almost a decade. Since 2014, dedicated diplomatic meetings have been held each year at the United Nations in Geneva under the framework of the Convention on Certain Conventional Weapons (CCW).
To date, a growing number of countries have called for new international law to be established. However, countries leading in AI development for their militaries, especially the US, Russia, Israel and India, have opposed any regulation.
Due to consensus rules, this has blocked progress towards any kind of concrete action. Stymied processes in such diplomatic meetings have precedents, such as on landmines, where international law was eventually established in line with global momentum.
Australia, along with the US and Britain, rejects the need for new international law on autonomous weapons. These governments have often acted as a group at the diplomatic meetings, along with Canada, Japan and South Korea, offering proposals which disregard ethical concerns and obfuscate human control.
Since the Albanese Labor government was elected in May last year, Australia has started to participate more constructively, engaging with proposals presented by other countries. However, it insists that any measures must not reflect a legal obligation.
In previous cases Australia's position has diverged notably from the US: on landmines, Australia joined global efforts to establish a new treaty, signing the Mine Ban Treaty in 1997, while the US did not.
Australia is currently out of step with global progress in favour of new international law to address autonomous weapons. But momentum is building towards a new legally binding instrument.
Last year, at the United Nations General Assembly, a joint statement on autonomous weapons was delivered. This was the first time for such engagement outside of the continually stagnated CCW meetings.
This year a number of regional conferences have been organised, hosted by countries including the Netherlands, Luxembourg and Costa Rica (the latter for the Latin America and Caribbean region). These were in addition to the continued but still stymied CCW meetings.
The United Nations General Assembly is also approaching, and it is likely many countries will use the opportunity to take further action to address autonomous weapons, such as advancing a resolution.
Policy and international law typically trail behind the advent of new technology. But decisions are always required around what can and should be pursued for humanity鈥檚 betterment. Autonomous weapons are no different.
[Matilda Byrne is the National Coordinator of the Australian Stop Killer Robots campaign. This article was first published elsewhere and has been reprinted with permission.]