The creation of intelligent robots with human-like qualities has long been a theme in literature and cinema, beginning with Mary Shelley’s Frankenstein, written in 1818, and achieving particular fame with the “Robot Series” of stories and novels by American writer Isaac Asimov (1920-1992).
Like Frankenstein, Asimov’s most famous work, I, Robot, was made into a film (released in 2004, starring Will Smith). As artificial intelligence (A.I.) and autonomous systems have advanced in recent years, the film industry has introduced increasingly sophisticated, human-like androids, such as the child character David in Steven Spielberg’s “A.I. Artificial Intelligence” (2001). Now comes a new variation on this theme, “I’m Your Man,” directed by Maria Schrader and starring Dan Stevens as a super-intelligent robotic human replica.
Like many of those earlier books and films, “I’m Your Man” asks deeply probing questions about what it means to be human and whether we humans should create robots with the ability to look, act, think, and emote like us. In Schrader’s smart and surprisingly touching movie, the Stevens character, Tom, is designed in every respect (including sexual performance) to be the perfect partner to Alma (Maren Eggert), a withdrawn and sorrowful archaeologist at Berlin’s Pergamon Museum.
As a favor to her boss, Alma agrees to cohabit with Tom for three weeks to test his A.I. system’s ability to adapt to a female human companion and provide her with pleasure and contentment. At the end of the experiment, Alma is expected to provide the company that produces Tom and other such androids with extensive feedback on the adaptive capabilities of its device.
This, in turn, will help the company (never named or pictured) to perfect robots intended to serve as life-companions to humans who lack a mate and, fearful of never finding a human one, are prepared to partner with a human look-alike. (In the film, all these couplings are heterosexual.)
Two key questions drive the film forward: Can Tom’s operating system adapt to Alma’s quirky personality and win her confidence and affection? And will Alma overcome her distrust of a human-like machine, however “perfect” in all respects, and come to embrace Tom as a life-partner? The two leading actors, Dan Stevens and Maren Eggert, do a superb job of conveying their characters’ respective transformations, with Tom gradually learning to comprehend human emotions (and respond to them appropriately) and Alma slowly overcoming her misgivings about relating to a human-like machine (and embracing Tom as a welcome partner in her life).
Throughout all this, however, the movie raises important ethical and legal questions about endowing machines with the ability to reproduce human thoughts and emotions. Is it truly in our best interests, Alma asks at one point, to design machines that perfectly satisfy our emotional and sexual needs if that means we will never have to interact with human partners, never be compelled to accommodate their needs, quirks, and desires, and so never become more sensitive, adaptive, and quintessentially human ourselves?
And, in a throwaway line with myriad ramifications, Alma’s boss reminds her of the importance of her evaluation of Tom’s performance, saying that important decisions will have to be made: “whether these things will be allowed to marry, to work, to get passports, human rights, or partial human rights.” Weighty questions indeed!
In the end, the movie does not provide answers to these questions, but leaves them to the audience to struggle with. In this respect, the movie succeeds both as a cinematic experience, entertaining us with the twists and turns of Tom and Alma’s odd romance, and as an intellectual exercise, inviting us to think about questions with real-world implications.
Human-like robots with the ability to think on their own are not yet within our technological reach, but giant corporations and the world’s major military forces are moving to create such systems, and development is proceeding swiftly. It will not be long, then, before we are confronted with questions like those posed by Alma’s boss.
By far the most challenging of the questions posed by A.I. and autonomous systems are likely to arise over the development and deployment of military robots of various sorts. The United States, Russia, China, and other technologically advanced countries are currently developing a wide range of A.I.-empowered robotic combat systems, collectively known as autonomous weapons systems, or AWS.
These include unmanned aerial vehicles (UAVs), or armed drone aircraft; unmanned ground vehicles (UGVs), or robotic tanks and combat vehicles; and unmanned surface and undersea vessels (USVs and UUVs), or drone warships and submarines. At present, the U.S. Department of Defense (DoD) is spending billions of dollars to field prototype versions of each of these types, as are the military forces of America’s principal rivals.
Although “unmanned” (the military’s term for crewless combat systems), every autonomous weapon in the U.S. arsenal is expected to be subject, in one fashion or another, to human oversight. Under prevailing Pentagon policy, as enshrined in DoD Directive 3000.09, “autonomous and semi-autonomous weapons systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force.” But exactly what “appropriate” means has yet to be spelled out, and the autonomous systems now being developed are expected to operate with considerable independence from human supervision.
Consider, for example, efforts by the U.S. Air Force to field a “loyal wingman”—an A.I.-empowered UAV designed to accompany and assist advanced piloted aircraft like the F-35 stealth fighter on high-risk missions in contested airspace above or adjacent to Russian or Chinese territory. Just as Tom was designed to protect and nurture Alma in “I’m Your Man,” the “loyal wingman” will be expected to mirror the F-35 pilot’s every move and protect him or her (but probably him) when exposed to enemy fire.
This might involve intercepting enemy fighter planes or attacking heavily defended anti-aircraft radar stations; if necessary, the drone is expected to sacrifice its existence to protect the human pilot, for example by luring enemy fighters away so the piloted aircraft can escape.
To advance this concept, the Air Force is testing a prototype “loyal wingman,” the XQ-58 Valkyrie, and a sophisticated software system, called “Skyborg,” to control such aircraft when operating on their own. Skyborg, known more formally as the Skyborg Autonomous Control System (ACS), is still in the development stage, but the Air Force has already conducted tests in which the ACS assumed the role of a human pilot in actual flight operations.
On April 29, 2021, Skyborg took control of a drone aircraft, the Kratos Unmanned Tactical Aerial Platform (UTAP-22), for the first time. On this occasion, and during a second test held on June 24, the ACS conducted basic flight maneuvers on its own, albeit while being supervised by human controllers on the ground. Eventually, Skyborg is intended to control multiple UAVs simultaneously and allow them to operate in “swarms,” coordinating their actions with one another with minimal oversight by human pilots.
The U.S. Army and Navy are conducting similar experiments. In April 2021, for example, the Navy conducted its first-ever maritime combat exercise fought almost entirely by unmanned surface and undersea vessels. Known as the Unmanned Integrated Battle Problem 2021, the exercise was conducted by the U.S. Pacific Fleet in waters off San Diego and included participation by two prototype USVs, Sea Hunter and Seahawk, and several small UUVs. Equipped with advanced sensors and computing gear, these uncrewed, autonomous weapons were set loose on the high seas to locate and identify simulated enemy warships and relay this information to manned warships for live missile strikes on the mock targets.
As is true of cinematic experiments like “I’m Your Man,” these military exercises raise critical ethical and legal issues.
What happens when an android malfunctions, or evolves, and fails to operate in compliance with its original software? In the movie world, this can lead to human suffering and large-scale violence, as in the “Terminator” series. In the real world, similar dangers may arise. An armed drone or an unmanned submarine, for example, could lose touch with its human overseers and, its software having determined that an attack is underway, fire its weapons at an adversarial warship, thereby provoking an international crisis, even though no such attack was intended by human military officers.
In the event of actual combat, moreover, military commanders are legally responsible under national and international law for the outcome of their actions. Should an atrocity occur—for example, the slaughter of unarmed civilians in the course of a protracted battle—they are vulnerable to indictment and punishment. But what happens when a robotic plane or tank goes rogue and commits an atrocity? Who bears responsibility then? These are difficult issues that may well arise in the not-too-distant future as the U.S. and other major military powers begin to deploy such systems on the battlefield.
In “I’m Your Man,” Alma advises against producing any human-like companion robots, saying they will stifle human interaction, daring, and creativity. In a similar vein, many governments, human rights groups, and other entities have called for autonomous weapons systems to be banned outright, arguing that humans must never delegate life-and-death decisions to robots, however intelligent.
In a 2018 statement, for example, UN Secretary-General António Guterres said that “machines with the power and discretion to take lives without human involvement are politically unacceptable, morally repugnant, and should be prohibited by international law.”
My recommendation: Go see “I’m Your Man” and enjoy an engaging, well-acted movie. And give some thought to the important questions it raises about human nature and the imminent arrival of super-intelligent robots.