In light of our culture’s cantankerous disagreements about the extent to which our biological and social nature is fixed and therefore inaccessible to radical change, it is worth noting how greatly our Founding Fathers’ defense of independence from Great Britain was grounded on a fixed human nature. In fact, if we were today debating the same decision to break off from our mother country’s perceived tyranny, during a period more agnostic about both moral absolutes and natural law, we would likely need to rely on less objective and less rational justifications than the appeal to the “laws of Nature and of Nature’s God.”
The most well-known expression of our Founders’ sense of the legitimacy of the American Revolution is the Declaration of Independence, which directs our attention to the previously mentioned “laws of Nature and of Nature’s God,” apparently without seeing the need to prove the existence of either.
And those writing and deliberating about the Declaration were not alone in this. Revolutionary-era pamphleteer Tom Paine’s Common Sense (published, like the Declaration, in 1776) accuses Great Britain of “declaring war against the natural rights of all mankind, and extirpating the defenders thereof from the face of the earth.” He defends his preference for a restricted form of government by referring to “a principle in nature, which no art can overturn, viz., that the more simple any thing is, the less liable it is to be disordered.” Enlightenment political thinkers like Locke, whose thought the Founders respected and absorbed, acknowledged natural human rights, principal among them being the rights to life, liberty, and property. In our own time, such “natural rights” ascribed to a God-given, stable human nature may strike many of our citizens as uncomfortably dogmatic or unnecessarily transcendent, but in the Enlightenment era (and long before that—reaching back in conscious thought at least to classical Greece and the Hebrew Scriptures), this understanding of a stable human nature with defined capabilities, characteristics, and rights was both deeply embedded in Christian Europe and, in its particular political implications, cutting edge.
So ubiquitous were natural law-style arguments for human rights throughout not only the pre- and post-Revolutionary period of U.S. history but also the nineteenth century that even apologists for human rights abuses turned to natural rights arguments. One example was slavery. While many of the slave-owning Founding Fathers admitted the inconsistency of defending this institution while resisting, on grounds of a natural human right to liberty, Britain’s curtailment of their own political freedoms, a later generation of Southern slaveholders preceding the Civil War defended slavery as a positive good for the slaves as well as the slaveholders. John C. Calhoun, among lesser-known Southern contemporaries, drew on Aristotle’s theory of natural slavery (briefly, that slavery was a state naturally suited to human beings less capable of independent living). What is interesting for our purposes is the resort to arguments on behalf of slavery that interlock with the accepted idea of a stable and identifiable human nature, rather than merely appealing to pragmatism and economic necessity. This human-nature defense of slavery distorts natural law in attempting to defend what the slave-owning Washington and Jefferson had earlier conceded was morally indefensible, but the inclination of slaveholders to justify their “peculiar institution” in this way is telling.
When we skip ahead to today’s human rights issues, we often find people expressing agnosticism about a stable and dependable human nature, though these expressions are frequently accompanied by emotional appeals to people’s unanchored rights to autonomy, choice, and happiness. Such rights, however, are unstable and cannot be legitimately appealed to if they are grounded in nothing more substantial and enduring than emotion encoded in positive law, resting upon fluctuating majorities and a changeable Constitution. At that point it is not clear whether these fiercely defended rights and liberties are based on anything more than a distaste for tyranny. (This is generally a healthy distaste, I would agree, but we can only conclude that if we understand what it is based on.)
One of the popular justifications for treating the lives of unborn human beings differently from those of the safely born is that the unborn human (or sometimes the unborn up to a certain milestone of development, such as heartbeat or quickening or the possibility of surviving outside the womb) is not yet really one of us, the safely born. Often this attitude is couched in terms of the unborn’s attainment of merely “potential life,” distinguished from “actual” human life through the standard of personhood. We possess certain rights because we are no longer merely fertilized ova journeying, cell division by cell division, toward the status of human personhood, but actual persons. Once we have achieved personhood, we are henceforth invested with these precious human rights (which may, however, still be lost under certain conditions, such as extreme physical disability or dementia).
Basing human rights on our discernible status as human persons—a status ascertained by, perhaps, perceived emotional and intellectual responses to stimuli, or even by crude physical markers such as breathing outside the womb—makes them precarious and unstable. They are unstable objectively, because our condition or others’ perceptions of that condition can change. However, they are also unstable subjectively, because of the ambiguity that surrounds human personhood. Right off the bat, if we are relying on our perception of someone’s similarity to us before awarding human rights, we may (it has happened often enough) exclude certain races, or those with low IQs, or those with deformities in appearance, or those with physical and mental disabilities that hamper their full participation in the life and work of society. And at many times and in many places, using such markers to judge whether someone has achieved full humanity has proven convenient to some group or groups in order to leverage them above others or gain some advantage. Holding fast to a stable standard of human definition—and to a correspondingly stable standard of moral behavior—would close off those easy outs, those convenient escapes from responsibility to others.
As we once understood, recognizing a stable human nature that operated according to the laws of Nature and of Nature’s God helps protect against capricious, self-interested, or emotional evaluations of our rights and the duties we have towards one another. The dissolution of our corporate agreement about what human nature is, therefore, not only threatens the vulnerable categories we have already singled out—the unborn, the unproductive, the expensive, the senile, and those who, because they look or act differently from us, we judge to be subhuman—it also potentially threatens the rest of us.
Recently I was reflecting on this question of human nature from the vantage point of mythological and fictional depictions of created beings. I was considering how the nature of various beings played out in The Silmarillion, the foundational mythology on which J.R.R. Tolkien’s books The Hobbit and The Lord of the Rings rely. The Silmarillion opens with the creation myth of his legendarium, continues with the introduction of evil into this fictionalized version of our prehistorical world, and then spirals into the tragedies that ensue when elves and mortal human beings attempt to surmount or violate the laws of their nature.
Much of the tragedy of Tolkien’s humans during the fictional millennia preceding The Lord of the Rings derives from their desire to escape death—the so-called “gift of Iluvatar” (the name of their rather distant Creator) to human beings. Their desire is exacerbated by their envy of the seeming immortality of the elves.
Technically, as Tolkien explains in The Silmarillion and posthumously published work, the elves’ gift from Iluvatar is not precisely earthly immortality but “limitless serial longevity” (Letter 208)—the long endurance of life, barring violent death, throughout the long ages of the world. And this built-in consequence of their elven natures carries its own burden of sorrow and temptations to escape. Or as Tolkien explains the separate challenges of men and elves in Letter 186: “The real theme for me is . . . death and immortality. The mystery of the love of the world in the hearts of a race doomed to leave, and seemingly lose it; the anguish in the hearts of a race doomed not to leave it until its whole evil-aroused story is complete.”
Now, the hankering of Tolkien’s mortal human race for that “limitless serial longevity” and their quest to escape their own mortal fate by toppling the laws of nature and of nature’s God resonate strongly with aspects of our own times. Consider, for example, our speculations about how far medicine can expand the bounds of the human lifespan; the preoccupation of Ray Kurzweil and much of the Silicon Valley crowd with achieving the Singularity (the point at which technological growth accelerates to such an extent that, among a mixed bag of consequences, human bodies can be shielded from the effects of aging, effectively launching our own less picturesque version of elvish limitless serial longevity). And then there is the geekier, less viscerally satisfying version of immortality through uploading of the brain’s contents into a computer (the sort of sidestepping of death that surely only a socially challenged techie would find appealing).
While privileged pockets of humanity plot to cheat death in various ways through the progress of medicine and technology, there are other less extreme but more broadly applicable sorts of grappling with a heretofore fixed human fate. Since the goal of euthanasia by definition is death, mercy killings and assisted suicide cannot be labeled as solutions to mortality. However, they are promoted as ways to give human beings greater control over the timing and circumstances of death. In a sense (if only symbolically), they wrest control over our mortality from God (who, whether we who were banished from Eden consider death his gift or his punishment or a combination of both, is as little esteemed by our age as Iluvatar was by the Numenoreans rebelling against the natural order in Tolkien’s mythological pre-history).
But we post-moderns have developed much more imaginative ways of denying the constraints of human nature. We now insist that gender itself is not fixed, and in pursuit of the power of self-creation pump our bodies full of hormones and wield scalpels to chisel them into shapes and functions intended to align with our imaginings. Although a great many people see the insanity of this project and back away from its Tower-of-Babel implications, it is astonishing how many go along with it. If a lion in the wilds of Africa gave up catching prey to live as a vegetarian, we would perceive the obstinate self-destructiveness of its dietary choices, because we know a lion is by nature carnivorous. If our pet dog leaped from our second-story window under the delusion that it was a bird, we would not benignly endorse its choice of identity. The truth is clear enough when we picture other species choosing categories of behavior that do not accord with their nature and can therefore only end in disaster. Why don’t we react to our fellow deluded humans with similar seriousness and a determined grip on (stable) reality?
The Numenorean rebellion against mortality was also accompanied by a turn to devil worship, as they “made [human] sacrifices to Melkor that he should release them from Death. . . . But for all this Death did not depart from their land, rather it came sooner and more often, and in many dreadful guises” (The Silmarillion, pp. 273-274).
In the real world of the late twentieth and early twenty-first centuries, we have adopted our own version of those human sacrifices. The enormous numbers of those aborted in modern times must surely cast into the shade all those sacrificed by such earlier practitioners of human sacrifice as the Aztecs, Incas, some of the peoples of the ancient Near East, and many other primitive cultures. A mountain of corpses reaching the height of Everest likely would not deplete the available inventory, which grows daily. The ancient pagan human sacrifices were generally offerings to obtain good fortune of some kind from the gods, to accompany the deceased king to the underworld, or to ward off evil. Although our own era’s abortions are not intended to placate actual pagan idols, they may perhaps be understood as sacrifices to our contemporary idols of human autonomy and sexual fulfillment.
It is not that the immediate motivations of the women sadly occupying molded plastic seats in a Planned Parenthood waiting room can easily be identified with the idol worship of sexual pleasure or moral relativism. Many of the women who wind up in abortion clinics are abandoned by partners or family, young, and frightened by the life-altering implications of their pregnancy. Some have swallowed our contemporary fairy tale about how “natural” sexual desire is, and therefore (in a conclusion even those ancient cultures would not have arrived at) how inconsequential and unprofound it is, how unnecessary to restrain and, indeed, how necessary to indulge. One result of accepting this fairy tale is the funeral march of hundreds of thousands of American women to abortion facilities each year. There they take part in a great national act of sacrifice at the altar of sex free from unintended consequences.
The ancient pagan societies that tolerated acts of abortion or infanticide at the margin or in exceptional times or circumstances still understood better than we do the laws of nature and of nature’s God, though that knowledge remained partial and restricted. They understood that sex was an activity specially set apart from other human pleasures and pastimes like eating and drinking because it generated life, and therefore was in some sense sacred. In tandem with this understanding was their grasp of the critical social importance of properly caring for the children that were not sacrificed to those pagan gods. From this context arose the double standard under which fornication or adultery was judged more damning for women than for men: To their minds, this was a necessary adjunct to the effort to guarantee (as far as possible) that children would be reared in a family capable of handing on to them the laboriously accumulated customs, skills, and cultural wisdom required to extend into the future the life of their society. All the effort of each generation to pass on to the next the tools and traditions it had acquired represented a desperate flinging into the future of the hope for a bloodline that would survive and multiply. If each generation did its part, there would be an increasing store of history and wisdom to hand on, as a relay racer hands the baton to the runner of the next leg of the race.
The stakes are very high, as many of us can see by our own civilization’s bobbling of the baton transfer. If one generation, one link in the chain of tribal history, decided to jettison its treasury of knowledge about how to live and work and worship, what would happen both to that generation and to those who would have been receiving the precious legacy? What would happen if one of those ancient tribes decided, for example, not to protect and support and encourage marriage, not to develop among themselves strong kinship bonds, reciprocal relationships of support, and an ethic of duty towards parents and elders? The tribe would fail to cohere, fail to offer meaning, purpose, and belonging to its members. But also, it would fail in very material and even economic ways. Members would face a predictable crisis in old age, if their children had not been taught to care for those who had given them life—or if they had never chosen to have children, or had reproduced irresponsibly and lived chaotic, unstable lives.
Today, in modern societies with social welfare nets and 401(k)s, where almost no one but the Amish attempts to make a go of the kind of family farm in which even pre-adolescent youngsters can be an asset, we may be tempted to view this primitive dependence on children as outmoded. But as we age, our own social and emotional dependence on the generation after us remains a greater staff of support than we like to acknowledge. And despite the political, economic, and scientific advantages of our contemporary lives, we too continually require a rising generation of adults in their peak productive years to provide for those aging out. Younger replacements are needed to produce our food and goods and to pay for our Social Security, Medicare, and pensions. But in addition, no matter how prudent, self-sufficient, and financially successful we may be, in old age we will still need the services of police officers, firefighters, emergency personnel, and soldiers to protect us. We need doctors and nurses. Eventually, we will need someone to bury us, and we may hope for someone to pray for our souls.
If there aren’t enough younger people to fill all those roles—and to begin the process of bearing and bringing up the next generation—then we (and they) suffer. That’s an operation of natural law, too. Even China has recently recognized the disastrous consequences looming as its population begins to tip into decline. Attempting to reverse the effects of decades of draconian limits on family size, the Chinese government is now encouraging larger families, though it is unclear what effect this will have. In any case, even if this about-face is wildly successful, it will not relieve the next couple of generations of the struggle to survive the stresses of their inverted population pyramid.
Sometimes a society will misinterpret or misread or just plain revolt against a particular aspect of reality because it constrains us in an area where we refuse to be constrained. And sadly, almost every society I can think of has singled out some group or groups to treat inhumanely—racial or ethnic minorities, the poor, religious minorities, political opponents, foreigners, the very young or very old, the mentally ill or disabled. We may wish to try to minimize our own shortcomings by dwelling on the record of historical violators of human rights. However, we might take less comfort in comparisons with imperfect societies of the past if we took into account the numbers (and percentages) of those sinned against.
Let’s consider that mountain of the aborted I talked about earlier. Comparisons with primitive societies of the past are particularly telling when we remember their extreme precariousness. Although we still have people going hungry and living without basic necessities in our country, our agricultural, political, social, and technological development makes possible the kind of safety nets that a pre-modern subsistence society subject to famines, floods, disease, and other assaults of nature could not even have conjured up as a fairy tale. If members of one of these pre-modern pagan societies could speak to us, they might say something like this: “We sacrificed our children in cataclysmic times, or if we deemed them incapable of bearing their share of the necessary burdens of life, or if we worshipped brutal gods who demanded this of us. What is your excuse? Why do you sacrifice yours?”
I suppose that a partial answer relates to the very prosperity that makes our lives (up until their end, as we await the Singularity) less precarious, less vulnerable to the variations of nature, even in the era of Covid. Despite the deaths and hardships caused by the pandemic, our losses have been relatively minor compared to those societies that underwent any of the great epidemics before the era of modern medicine. In fourteenth-century Europe, the Black Death killed one-third to one-half of the population. However, the same medical and economic progress that enables us to better provide for the unborn and the just born and young children also raises their price tag and postpones their productivity, encouraging us to regard them more as costs and less as assets, at least in the short term. And our progress also opens us up corporately and individually to the illusion that we can more and more wholly devote our lives to individual fulfillment and personal pleasures without putting either society or ourselves at risk.
One of the reasons we can be blind to the consequences of violating natural laws related to our human nature and to our moral duties is that these laws do not necessarily prevent us from violating them, as physical laws like gravity do. Instead, our fixed boundaries and the moral map by which we are intended to navigate life become more apparent to us after we have violated the natural law, in the form of unpleasant or unforeseen consequences. We are physically capable of aborting a child or ending our own lives early or blocking puberty or producing male secondary sex characteristics in a girl. We are not capable of altering moral law to make these things be right, and therefore we are not capable of preventing them from having adverse consequences.
Like overwrought two-year-olds enthralled by the word “no,” we in our era have largely been seeking our own way, screaming our freedom to choose who we are and how we will live. Our denial of the need to live within the limits of the “laws of Nature and of Nature’s God” is our rebellious “No!” to the Universe and, ultimately, to its Creator. And we do have that freedom to choose and that right to scream “No!” to the Universe. But not without consequences. Closing our eyes to reality or putting our fingers in our ears to block out God’s voice is not in the end a viable strategy for either happiness or survival. Holding our collective breath until we all turn blue in the face will not render our bodies capable of living without oxygen, and it will certainly not transform us into Masters of the Universe. It is more likely to earn us a lengthy time-out.
Ellen Wilson Fielding, a longtime senior editor of the Human Life Review, is the author of An Even Dozen (Human Life Press). The mother of four children, she lives in Maryland.