
Society and technology – Take part in the international survey by DEFTECH / armasuisse

WHAT BOUNDARIES WOULD YOU SET ON THE USE OF TECHNOLOGIES IN BOTH CIVILIAN AND MILITARY CONTEXTS? WHICH APPLICATIONS WOULD YOU ACCEPT, AND WHICH WOULD YOU NOT?

The Technology Foresight research programme by armasuisse Science + Technology asks for your opinion on the subject. The project aims to explore the limits of the social acceptance of technologies.

Our time is characterized by technologies that develop very quickly. These developments allow for applications that were still considered pure science fiction a few years ago. A successful market launch of such products depends largely on their social acceptance. This applies to both the civil and military sectors. (DEFTECH)

CLICK HERE TO PARTICIPATE IN THE INTERNATIONAL SURVEY

You can express your views on a series of topics of your choice and in different languages. Everyone is invited to participate in the survey.

Autonomous Systems
Energy Consumption
Future Weapons
Genome Editing
Human Enhancement
Human-Machine-Teaming
Identity Recognition
Privacy (limits of privacy)
Space Activities

In democratic states, an army cannot avoid considering the social acceptance of technologies and their applications in its planning processes. (DEFTECH)

Results will be made available at a later stage. Thank you!


Swarming and Machine Teaming – Defence Future Technologies DEFTECH

A workshop in Thun (Switzerland) to assess the state of the art in technology and research

Chiara Sulmoni reports.

On Wednesday, 21st November 2018, armasuisse S+T (Science and Technology) organised a day-long international workshop for military personnel, researchers, specialists and company representatives, addressing the subject of ‘swarming and machine teaming’.

The event is part of a series which armasuisse Science and Technology organises on a regular basis under the trademark DEFTECH (Defence Future Technologies). These high-profile meetings allow military and civilian experts to share insights, anticipate technology trends and make informed decisions in the field of security.

Swarming indicates the deployment of low-cost, autonomous elements acting in coordination with one another to carry out a specific task; these elements are generally small drones or robots.

Swarms are common in nature, where many species, such as birds and fish, move or operate in vast groups. Research into ‘artificial’ swarming often starts with the observation and study of animal behaviour.

Swarming is dual-use, meaning that it can take shape in the civilian environment (for instance, with commercial drones flying in formation) or in the military one, where it is principally a battlefield tactic and is associated with the issue of lethal autonomous weapons systems (LAWS), whose ethical aspects are discussed at UN level. Given the rapid development of the technology, and the lack of an efficient defence system should a swarming attack take place, armasuisse wished to gain a better understanding of the related challenges and risks. The workshop was therefore aimed at getting to know the ‘state of the art’ within this domain, and experts from different fields were called in to provide their perspectives. What follows is a brief report of some key points touched upon during the meeting, which was organised under the supervision of Quentin Ladetto, Research Programme Manager at armasuisse S+T, and introduced by Director Thomas Rothacher.

Switzerland is a global leader in drone technology. Markus Hoepflinger from the Swiss Drones and Robotics Centre (affiliated to armasuisse S+T) was keen to underline from the start that it is not only domestic and foreign media who dub the country the “Silicon Valley of robotics” or the “drone centre of the world”; the Federal Department of Foreign Affairs itself is an eager proponent of Swiss expertise (for more information, visit www.homeofdrones.org). Swiss research involves academic and technical institutes in all regions, as well as industry. Today’s environment is mainly mobile robotics, with autonomous flight as the strongest capability. A series of potential future military applications is nevertheless being looked into, with a view to enhancing search and rescue operations, for instance, or supporting engineering work. Hoepflinger also explained that swarming could come to dominate future warfare, with experiments underway in Russia, China, the US, Israel and, to a lesser extent, the EU (e.g. the Royal Air Force and Airbus). But swarm warfare is not yet happening, despite what has been (questionably) described as the first swarm attack, against a Russian base in Syria in January 2018.

Despite rapid progress in all fields, Vincent Boulanin of the Stockholm International Peace Research Institute (SIPRI) emphasized how misconceptions and myths around autonomous systems and artificial intelligence represent a problem, insofar as they tend to make policy discussions unproductive and blind us to the true possibilities and limitations of these technologies. Programming machines for general tasks is difficult, as they cannot generalise from previous situations: while they do process images and understand words (consider the mobile phone application ‘Siri’, for instance), common sense is not one of their assets. Autonomous navigation, for its part, is very context-dependent, with air or underwater environments presenting fewer obstacles than land. ‘Teaming’ is an important aspect of swarming, as machines must communicate with each other and with their operator; such systems can share information and perform collaborative tasks (such as flying in formation, completing surveillance assignments or inspecting buildings in uncomplicated environments). But human-machine communication is not symmetrical, and finding the right human-to-machine ratio can also be complex. As for the military field proper, Boulanin pointed out that targeting remains the most critical application of autonomy, as systems are not able to distinguish between a civilian and a military target. In the end, autonomy is much easier to achieve for commercial applications.

Martin Hagström from the Swedish Defence Research Agency underlined that having ‘many of something’ does not in itself constitute a problem; the objective is to be able to deploy cheap, efficient sub-systems with a reduced number of ground operators. He also recalled that the adversarial perspective is considerably different from the civilian one. Swarms rely on satellite navigation (GPS) and are therefore vulnerable to adversaries with a high level of technological capability, who could disrupt communication in a contested environment. ‘Robust’ systems are quite expensive, and Hagström is therefore persuaded that it might take some time before swarming can be adopted by the military. Other issues to take into account when thinking of flying objects are flight safety rules and policies (air space is not free) and, last but not least, the complexity of testing. Stability and predictability are paramount in military applications; since a system acts within its own design space, adding autonomy means making that design space very large, so that it covers many potential events. But outside of (software-based) simulation, testing a system remains hard.

Georg Dietz works for the German group IABG mbH and focuses on military airborne platforms. He explained that air operations today are increasingly complex for a number of reasons: the sheer number of players in the world, faster conflict dynamics, the speed of technological advances and information exchange, the rapid growth of sensor ranges and so on. Capabilities such as platforms or systems can be insufficient while costs are high, with each new fighter aircraft, for instance, being at least twice as expensive as its predecessor. Future combat air systems will be designed as a system of systems (SoS) consisting of a variety of different components, both manned and unmanned, enabling swarming operations. Design and control do open up a series of questions, though: about the number and type of platforms needed, the degree of autonomy and technology gaps; about communication in highly contested areas; about the human-machine interface; and so on. Nevertheless, swarming represents the near future of air operations.

Jean-Marc Rickli from the Geneva Centre for Security Policy (GCSP) expounded the idea that swarming is the fifth evolution of military strategy and, together with autonomy, represents a key characteristic of the battlefield of the future. The other strategies (or ways to use force) are denial, whose main target is the military; punishment, which hits civilians and infrastructure to exert indirect pressure (terrorism is a form of punishment); risk, which consists in threatening escalation, as in the US-USSR Cold War; and decapitation, which relies on technology such as drones to eliminate the enemy leadership. But a large number of small units with sensory capabilities, easy to manoeuvre and able to act in coordination, which is the description of a functioning swarm, can concentrate firepower, speed and forces in a way previously unseen. Swarming tactics are a means of waging asymmetric war, and cyber manifestations of them have already been encountered. 3D-printing of gun components and drones will have important implications, explained the expert. In 2017 in Mosul, several Iraqi soldiers were killed by drones operated by ISIS in what was the first instance of the West losing tactical aerial supremacy. Should swarming become a mainstream strategy, we should expect a more conflictual international environment, concluded Rickli.

Marco Detratti from the European Defence Agency (EDA) underlined how, according to estimates, the market for autonomous systems’ products and technology in non-military sectors will be in the order of €100Bn by 2025, with defence playing only a minor part. But swarms have disruptive potential in many fields, and while defence is not yet impacted, it expects to be in the future. In defence (from a non-offensive perspective), swarms can change and improve capabilities. Specifically, they can offer ubiquity, resilience and invisibility, and are therefore being considered for all tasks and all domains: land, air, maritime and cyber. From swarms, the military expects cost reductions, decreases in manpower and risk, and technical advantages. Since 2010, EDA has been trying to identify scenarios where swarm and multi-robot systems could ‘deliver’, and has started a series of projects accordingly. Despite technical evidence of feasibility and noteworthy research, problems and challenges persist: Detratti went on to explain that there are no truly autonomous systems in operation; systems are not resilient enough (power consumption); they are not ‘smart’ enough; and more progress is needed in testing the unpredictable (to be sure, for instance, that things continue to work when communication is interrupted and that information is not manipulated). There are also non-technical issues to take into account, such as the need for a big shift in military culture, doctrine and training; public perception; and ethics.

Autonomous (lethal) weapons have been raising ethical issues for years. George Woodhams gave an insight into the discussions and initiatives taking place at UN level and within UNIDIR (the UN Institute for Disarmament Research), which has been dealing with UAVs (unmanned aerial vehicles) since 2015. A specific concern regards the use of Reaper and Predator drones. The Institute has been encouraging the international community to consider what new challenges may emerge from the proliferation of this technology, and it also looks into the strategic implications of unmanned systems. An issue for the UN to consider in the long term is whether, due to their low risk and cost of deployment, these systems might lead to problematic military practices. Woodhams went on to illustrate the lines of debate within the framework of the Convention on Certain Conventional Weapons, a UN negotiating body designed to address weapons systems with implications for international humanitarian law. A Group of Governmental Experts to address Lethal Autonomous Weapons Systems (LAWS) was established in 2014, with military advisors regularly invited in. It focuses on what is called ‘meaningful human control’ and its ethical foundations, such as retaining human agency in decisions over the use of lethal force, preserving human dignity and ensuring human accountability. Talks can be difficult, as the 84 States involved in the discussions have different military capabilities and levels of insight, but everybody seems to agree on the need to identify best practices and practical measures for improving compliance with international law. Though swarming has not been mentioned specifically over the last four years, concluded Woodhams, it is the one area of autonomy that most catches the imagination.

From the implications of the concept of swarming, the workshop then turned to the practical side: understanding the many ways in which it can take shape. There is a flurry of exciting and ground-breaking research going on in laboratories, aimed at addressing limitations and constraints, with a view to developing a higher degree of autonomy and coordination.

We have already mentioned how research takes its ‘inspiration’, so to speak, from nature. In introducing his line of work, Nicolas Bredeche from Pierre and Marie Curie University explained that methods used to study natural systems (like animal behaviour) can also be used to study artificial systems, and that solutions for artificial systems are often a simplified version of what can be observed in nature. Bredeche oversees research on ‘adaptive mechanisms for collective decision-making in populations of simple individuals’ (such as insects or small animals). Simply put, he tries to understand the principles of collective behaviour, to see how single members adapt to group strategies, and to reproduce these in the lab in a way that is useful for artificial intelligence. With tigerfish and collective hunting as models, his studies reveal the importance of symbiotic behaviour and lead to the conclusion that a version of natural selection, with the ‘fittest’ individual winning over the rest of the population, can be transferred into robotics as well.
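To make that last idea concrete, here is a minimal, hedged sketch of how “the fittest individual winning over the rest of the population” can be turned into code. It is not Bredeche’s actual method: the controller encoding, the placeholder fitness function and all parameters are illustrative assumptions, and in a real setup the fitness score would come from a swarm or collective-hunting simulation.

```python
# Toy evolutionary loop: the fittest candidate controller replaces the
# population each generation, surviving unchanged alongside mutated copies.
import random

POP_SIZE = 20        # number of candidate controllers
GENOME_LEN = 5       # controller parameters (e.g., weights of a simple policy)
GENERATIONS = 50
MUTATION_STD = 0.1

def random_genome():
    return [random.uniform(-1.0, 1.0) for _ in range(GENOME_LEN)]

def fitness(genome):
    # Placeholder: a real setup would run a swarm simulation and score
    # collective behaviour (e.g., prey captured in collective hunting).
    target = [0.5, -0.2, 0.8, 0.0, -0.5]
    return -sum((g - t) ** 2 for g, t in zip(genome, target))

def mutate(genome):
    return [g + random.gauss(0.0, MUTATION_STD) for g in genome]

population = [random_genome() for _ in range(POP_SIZE)]
for generation in range(GENERATIONS):
    best = max(population, key=fitness)
    # Selection: the fittest individual "wins over" the rest of the population.
    population = [best] + [mutate(best) for _ in range(POP_SIZE - 1)]

print("best fitness after evolution:", fitness(max(population, key=fitness)))
```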

Dario Floreano from the Swiss Federal Institute of Technology in Lausanne described how animals in a swarm use different types of sensors, such as vision, a magnetic compass for orientation and sound; they can also make use of local information, unlike drones, which rely on information from ‘vulnerable’ GPS. The question is: can we have swarms that, despite resorting to available technology like GPS, also follow their own rules instead of being controlled by a computer on the ground? Floreano recalled how the computer-graphics rules for animating swarms with a certain degree of autonomy were already laid down in the 1980s by Craig Reynolds. Briefly put: when a drone is too close to the others, it will move away (repulsion); when a drone is flying in a different direction with respect to the rest of the flock, it will tend to align with the others (alignment); and when a drone is too distant, it will be attracted back towards them (attraction). Other variables, like the ability to communicate, power capabilities (batteries) and agility (quadcopters vs. fixed-wing drones), can greatly affect swarming and continue to be actively researched. Most importantly, one strand of Floreano’s research (commissioned by armasuisse and related to rescue drones’ ability to operate without GPS) has confirmed that sensor-based flight is possible and deserves attention.
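As an illustration, below is a minimal sketch of those three Reynolds-style rules applied to a small simulated flock. The 2-D setting, neighbourhood radii and gains are assumptions chosen for readability, not parameters from Reynolds’ original boids or Floreano’s experiments.

```python
# Minimal flocking sketch: repulsion when too close, alignment with
# neighbours' headings, attraction towards distant neighbours.
import numpy as np

N = 30                    # number of drones
NEIGHBOUR_RADIUS = 10.0   # who counts as a neighbour
SEPARATION_RADIUS = 2.0   # "too close" threshold
DT = 0.1                  # integration time step

rng = np.random.default_rng(0)
pos = rng.uniform(0, 20, size=(N, 2))   # 2-D positions
vel = rng.uniform(-1, 1, size=(N, 2))   # 2-D velocities

def step(pos, vel):
    new_vel = vel.copy()
    for i in range(N):
        offsets = pos - pos[i]
        dists = np.linalg.norm(offsets, axis=1)
        neighbours = (dists < NEIGHBOUR_RADIUS) & (dists > 0)
        if not neighbours.any():
            continue
        # Repulsion: steer away from neighbours that are too close.
        too_close = neighbours & (dists < SEPARATION_RADIUS)
        separation = -offsets[too_close].sum(axis=0) if too_close.any() else np.zeros(2)
        # Alignment: steer towards the average heading of neighbours.
        alignment = vel[neighbours].mean(axis=0) - vel[i]
        # Attraction: steer towards the neighbours' centre of mass.
        cohesion = pos[neighbours].mean(axis=0) - pos[i]
        new_vel[i] += 0.05 * separation + 0.05 * alignment + 0.01 * cohesion
    return pos + DT * new_vel, new_vel

for _ in range(100):
    pos, vel = step(pos, vel)
print("flock spread after 100 steps:", pos.std(axis=0))
```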

Cooperation and teaming (human, robot and dog) in rescue operations in disaster areas is also a line of research at the Dalle Molle Institute for Artificial Intelligence (Lugano). Within this context, maintaining connectivity, both within the swarm and between drones and people, is crucial. Researcher Alessandro Giusti explained how another important strand of work focuses on interaction between humans and robots; specifically, it is about exploring ways of exerting control over a drone. The lab came up with the idea of pointing at it, an easy and quite natural gesture for people; the technological options for implementing this solution are wearable interfaces such as bracelets, laser pointers or a smartwatch, which make it possible to direct the robot to perform its task by moving one’s arm. Vision-based control is also being actively tested.
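Purely to illustrate the kind of geometry such an interface has to solve, here is a hedged sketch that turns an arm direction (for example, yaw and pitch estimated from a bracelet or smartwatch IMU) into a point on the ground a robot could be sent to. The frame conventions, inputs and function name are assumptions for illustration, not the lab’s actual interface.

```python
# Sketch: intersect a pointing ray (from the user's wrist) with the ground plane.
import math

def pointing_target(wrist_xyz, yaw_deg, pitch_deg):
    """Return the ground point (z = 0) the arm is pointing at, or None.

    wrist_xyz: (x, y, z) wrist position in a world frame, in metres.
    yaw_deg:   arm heading, measured from the x-axis.
    pitch_deg: arm elevation; negative means pointing downwards.
    """
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    # Unit direction vector of the pointing ray.
    dx = math.cos(pitch) * math.cos(yaw)
    dy = math.cos(pitch) * math.sin(yaw)
    dz = math.sin(pitch)
    x0, y0, z0 = wrist_xyz
    if dz >= 0:
        return None  # pointing level or upwards: no ground intersection
    t = -z0 / dz     # ray parameter at which the ray reaches z = 0
    return (x0 + t * dx, y0 + t * dy, 0.0)

# Example: wrist 1.4 m above the ground, arm pointing 30 degrees below horizontal.
print(pointing_target((0.0, 0.0, 1.4), yaw_deg=45.0, pitch_deg=-30.0))
```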

From human-robot interaction to situational awareness: this is the project Titus Cieslewski (University of Zurich) is involved in. The motivating question: how can drones know where they are when a team of agents operates in an unknown environment, the agents cannot see each other directly (unlike in classic swarms!) and the further they move in exploration, the harder it becomes to communicate? GPS, explained Cieslewski, does not work indoors, can be reflected in cities and is subject to jamming and spoofing in a military context (jamming and spoofing are part of electronic warfare and consist, respectively, in disrupting the enemy’s wireless communication and in sending out wrong positioning). Computer vision can offer a way out, maintained the researcher: through the images captured by their cameras, drones can build ‘sparse visual maps’ resulting from processes like place recognition, pose estimation and optimisation. What Cieslewski is currently working on is reducing the amount of data exchanged in the process, which would make it possible to enlarge the team of robots.
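To give a feel for the data-reduction idea, here is a toy, hedged sketch: instead of exchanging full images or maps, each drone broadcasts a very compact binary ‘place signature’, and candidate overlaps between robots are detected by Hamming distance. Real decentralised place recognition relies on far more sophisticated descriptors; the hashing scheme and sizes below are purely illustrative assumptions.

```python
# Toy place-recognition handshake: exchange 64-bit signatures instead of images.
import numpy as np

def place_signature(image, bits=64):
    """Compress a greyscale image into a compact binary signature."""
    img = np.asarray(image, dtype=np.float32)
    h, w = img.shape
    # Coarse 8x8 average-pooled thumbnail -> 64 cells.
    thumb = img[: h - h % 8, : w - w % 8].reshape(8, h // 8, 8, w // 8).mean(axis=(1, 3))
    # Each cell contributes one bit: above or below the thumbnail's mean.
    return (thumb.flatten() > thumb.mean()).astype(np.uint8)[:bits]

def hamming(sig_a, sig_b):
    return int(np.count_nonzero(sig_a != sig_b))

# Two drones observing roughly the same place should produce similar signatures.
rng = np.random.default_rng(1)
view_a = rng.uniform(0, 255, size=(64, 64))
view_b = view_a + rng.normal(0, 5, size=(64, 64))   # same place, slight noise
view_c = rng.uniform(0, 255, size=(64, 64))         # a different place

sig_a, sig_b, sig_c = map(place_signature, (view_a, view_b, view_c))
print("same place distance:     ", hamming(sig_a, sig_b))   # expected: small
print("different place distance:", hamming(sig_a, sig_c))   # expected: about 32
```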