Nelly Kibet & Michelle Digolo
Drones, also known as Unmanned Aerial Vehicles (UAVs), autonomous robotics, and automated systems have changed the landscape of modern-day warfare. As these new technologies reshape our world and the conflicts we engage in, they raise a host of ethical questions that must be addressed. UAVs such as the General Atomics MQ-1 Predator and MQ-9 Reaper, along with semi-autonomous weapons of varying degrees of lethal force such as the United States' Sensor Fuzed Weapon, Israel's Harpy, and the United Kingdom's Taranis, are already in use on the modern battlefield. Advocates of automated systems praise the efficiency, expediency, and effectiveness of such technologies. Yet the use of advanced artificial intelligence and machines to develop drones and other autonomous weapons carries numerous implications, which warrant moral reflection and ethical analysis.
Contrary to popular belief, UAVs have also been used for humanitarian purposes, delivering food and medical aid in remote and dangerous areas such as Syria and Darfur. Medical drones have improved services in the health sector by delivering emergency life-saving blood, plasma, and coagulants to transfusion clinics across western Rwanda. Humanitarian drones are relied upon heavily because they are cost-effective, weather-resistant, and operate around the clock. UAVs have also been used for surveillance and reconnaissance in Mali, the Central African Republic, Somalia, Pakistan, Iraq, and Afghanistan.
From a militaristic point of view, weaponized drones reduce casualties on the operating military's side, since they require no boots on the ground, and they are cheaper to build than manned aircraft. Unlike a traditionally piloted aircraft, they carry an array of sensors and cameras that can watch both day and night. Because UAVs remain aloft for extended periods, they allow sufficient time to identify a target before launching an attack. This persistence enables drone operators to make morally considered decisions about when to strike and reduces the risk of civilian harm, improving the moral and ethical outcomes of battlefield encounters.
An asymmetric war in which only one side uses drones cannot be morally justified because of the lack of reciprocity of risk between the combatants. The permission to kill and the liability to be killed come apart in such a war: the technologically advanced belligerent with drones is permitted to kill but is not itself exposed to being killed by the other party. This lack of mutual risk is morally unacceptable because drone pilots and the programmers of autonomous machines do not enter into a reciprocal relationship of risk with enemy foot soldiers. The situation is even more dire when an enemy combatant decides to surrender, because a drone affords no opportunity to surrender and can thus result in the elimination of a state's combatants who lack drones of their own.
Drone strikes in targeted killing missions typically fall into one of two categories: "personality strikes", where the target is known by name and deemed a high-value or particularly dangerous individual, and "signature strikes", which target unknown persons based on anomalous and thus suspicious behavioral patterns and characteristics. Although drone programs are said to focus on personality strikes against targets already known to the intelligence community and believed to pose an elevated terrorist threat, signature strikes have increased. They have taken place in Somalia, Yemen, and Pakistan. In Afghanistan, American intelligence mistook a wedding celebration for a Taliban gathering, and forty-three members of one family were killed. A similar error occurred in Syria, where a hospital was attacked. The targeting of individuals whose identities are unknown should be unlawful and considered a war crime.
The desensitization of drone operators has led to the coining of the term 'cubicle warriors'. The military selects and recruits operators partly on the basis of their gaming skills, and they are then mandated to undertake operations entirely through computer screens and remote audio feeds. As such, there is a risk of developing a "PlayStation" mentality toward killing. Yet despite firing missiles far from the battlefield, drone pilots and sensor technicians who witness civilian casualties or the horrors of war experience combat stress and Post-Traumatic Stress Disorder (PTSD). On the flip side, living under the constant threat of drone attack inflicts psychological trauma on civilians, as the aircraft hover overhead twenty-four hours a day and strike without warning.
The advancement of technologies for autonomous manned and unmanned aerial systems risks fueling an arms race that could undermine international law and world order. Drones pose not only a military danger but also a threat to civil liberties, and they perpetuate violence rather than deter it. The chances of drones falling into the hands of rogue nations or terrorist groups are heightened by the lack of international legal frameworks regulating their use. Which body will regulate this type of warfare? Where will the private companies that manufacture such weaponry lie on the continuum of conflict perpetuation and the wanton killing of civilians? How will states be held accountable? Hence, for the sake of international security, stability, and peace, it is prudent to develop governance mechanisms and settle on regimes of constraint for UAVs.