A somewhat unique “morality play” is being staged . . . and the Artificial Intelligence (AI) community is all abuzz. It involves the idea of programming military robots with “morality.” We’re not talking about the lovable “Star Wars” duo of C-3PO and R2-D2 here, but robotic killing machines programmed with a “morality” that enables them to be autonomous and effective on the field of combat . . . all according to certain prescribed rules of warfare. Writing in a recent issue of Defense One, Patrick Tucker reports that the US government has made a number of grants to universities with the goal of seeing whether it is indeed possible to program into a machine a “moral code” that will govern its conduct under combat conditions.
How might Christians respond to this whole idea? If you are a Christian pacifist, your response is an easy one. Since the whole concept of war is immoral, any talk of “moral” robotic soldiers is simply ludicrous, and a waste of time and effort to even consider.
But I am not a Christian pacifist, so I find myself more than a little intrigued by the whole idea. So, are there biblical and theological clues that might inform an authentic Christian response to what may soon become a high-tech reality in the area of warfare? First of all, we must ask where the whole idea of “morality” comes from. “Morality”—the idea that there are certain standards and/or behaviors that are “right” or “wrong”—is very much a biblical idea, with the source and final arbiter of what is “moral” being a Supreme Judge, none other than God Almighty, who reveals His moral standards in His written Word, the Bible. I’m fairly certain that the university research teams will not be using Scripture as their moral guidepost, which raises the question: “Whose morality will these robots be programmed with?”
A deeper theological question is this: Is any part of creation—animate or inanimate—capable of exercising genuine morality, other than the one being created in the image of the ultimate Source of morality? Only human beings possess—as part of being made in the image of God—a conscience linked to a soul, linked to a rational intellect capable of moral decision-making. No other species—no other thing in the entire universe—possesses these attributes. Thus, it can be argued that nothing in all of creation, other than human beings, can act in a moral or immoral way. Morality involves an act of the will in the face of an ethical dilemma, not the fine-tuning of robotic electrical circuitry, no matter how technologically refined.
It’s interesting that even an enthusiastic AI proponent and expert such as Noel Sharkey bucks against the whole idea of “moral robots.” Although he doesn’t make his argument from Scripture or theology, Sharkey is on to something very biblical when he says, “I don’t think that they will end up with a moral or ethical robot. For that we need to have moral agency. For that we need to understand others and what it means to suffer. The robot may be installed with some rules of ethics but it doesn’t really care. It will follow a human designer’s rules of ethics.” We know that the oldest book in the Bible is Job. It’s as if, chronologically, God is saying, “True faith, out of which authentic moral discourse arises, begins with the whole concept of suffering.” My mentor in seminary, Dr. John Leith, often said, “Gentlemen, you will never be good pastors until you have looked into the abyss.” Robots cannot look into any abyss. Robots, no matter how well programmed, will never be able to feel pain or experience loss, and, thus, will never be able to empathize with or care about anyone or anything . . . not to mention “love” anything or anyone. The very essence of “morality” is wrapped up in the love of God and neighbor—something the most highly refined robot cannot ever even approximate.
After the American Civil War ended, Dwight L. Moody held a number of evangelistic meetings across the South. At one of the meetings, after Moody had preached, his song leader, Ira Sankey, sang a beautiful hymn. After the meeting had ended, Sankey was approached by a man who asked him if he had served in the Union army during the war. Sankey said that he had. The man then asked him if he had been on guard duty on a certain night during a certain battle. Sankey said that, as a matter of fact, he was on guard duty that very night. The man then informed Sankey that he had been a Confederate scout during the war, and on that night he had been spying out the Union position when he saw a Union soldier standing guard duty, silhouetted in the moonlight. He had decided to shoot him. As he lined the guard up in his rifle sights, suddenly the soldier began to sing the very hymn that Sankey had sung at the meeting. “Did you sing that hymn the night you stood guard duty?” the former scout asked Sankey. “Why, yes, I did,” he replied. “Well, that hymn saved your life,” said the Confederate veteran. “After you finished singing, there was no way I could pull the trigger.”
I wonder how even the best morally programmed robot would have played out that scenario.
Dr. Ron Scates has been a Presbyterian pastor for 34 years, serving the First Presbyterian Church of San Antonio for 10 years, the Central Presbyterian Church of Baltimore for 11 years, and the Highland Park Presbyterian Church of Dallas for the past 13 years. Holding a BA in Pre-veterinary Medicine and an MS in Cell Biology, Dr. Scates was involved in medical research at Baylor College of Medicine before going into pastoral ministry. Upon graduation from college, he also served for two years as Assistant Baseball Coach at his alma mater—Trinity University in San Antonio. He has authored numerous articles in both the scientific and theological disciplines. He is presently on sabbatical at Redeemer Seminary in Dallas, studying the Incarnation through the eyes of the early church fathers.