Psychology, as a discrete area of scientific inquiry, came into being in the late 19th century. Prior to that, interest in the “human condition” was primarily the province of philosophy and religion, neither of which attended much to the subjective lives of human beings--their emotions and thoughts, the personal experience of one’s self in relation to others and to oneself. Ancient Greek philosophers contemplated big-picture issues of ontology (the study of “being”), epistemology (how we know what we know), and aesthetics (what is beauty). In terms of the human experience, normative behavior was governed by logic and reason, faculties essential to the expansion of civic life and the vanquishing of the barbarism from which civilization had only recently emerged. Aberrations in human behavior continued to be viewed as the product of capricious mythological gods who, in the name of sport or retribution for human affronts to their dominion, afflicted people with “madness”; madness was conferred from “without” rather than arising from “within.”
Philosophy clearly being the handmaiden of more advanced societies, the decline and eventual fall of Rome in the 4th-5th century CE, with the nail in the coffin hammered by St Augustine, effectively put this branch of inquiry to sleep in the West for about the next 800 years. What became entrenched, with the expansion of Christianity and the associated hegemony of the Catholic Church, were god-centric rationales for the existence of things populating the physical world, while variations in human character could be accounted for in the epic battle between good and evil, personified by a benevolent deity and a malevolent demon, respectively.
St Thomas Aquinas, a 13th century theologian, unsuccessfully sought to reconcile the schism between Aristotelian “reason,” which had been kept alive as a formative concept by Islamic scholars in the East, and “revelation,” the cornerstone of religion, in support of church doctrine. While that particular effort bombed, his ideas became the substrate for the Humanistic philosophy that would follow, influencing and being influenced by the Renaissance and Enlightenment, the Protestant Reformation, and the Scientific Revolution. Humanism, or “secular humanism” as it is often called, turned the conversation toward the value, dignity, and rationality of human beings, and in so doing set the stage for what would emerge hundreds of years later: the focused inquiry into how, in what ways, and under what circumstances the inherent rationality, dignity, and value of an individual are compromised by his or her experiences at critical periods of development. In short, the field of psychology.
Putting philosophy and religion aside for a moment, it is my perspective that we have the Scientific Revolution most to thank for how we began to know it’s “hard to be human.” Before Copernicus, people believed they were “special,” occupying center stage in the universe as god’s most special project. As such, there was a reason for being, a sense that human life had inherent value, purpose, and direction, and, best of all, there was the promise of an afterlife in which we would be reunited in an otherworldly paradise with all the people and pets that were similarly “good”; all of this was a soothing balm for jangly fears of death. Copernicus’s heliocentric theory--that the earth and other planets revolve around the sun rather than the heavens around us--ruptured our special-ness in the cosmos. As god began, with Copernicus, Kepler, Newton, and others, to become irrelevant to any process by which the physical world and human life are brought into being and/or sustained, the burden for the creation of meaning, direction, and purpose--not to mention managing our own freakin’ fear of death--shifted to the individual. What could be harder than that?
Much thought has been given to how the rise of modernity itself was correlated with the emergence and creep of what we now think of as psychiatric disorders. By the 19th century,
"…progress toward individuation gave rise to new subjective forms of suffering. People were obliged to fashion images of themselves, and those images caused dissatisfaction. Little by little, birth ceased to provide a clear definition of someone’s place in the world….the increase in social mobility, the vagueness and vulnerability of social hierarchies, and the growing complexity of signs of social rank not only complicated the question of ambition but also resulted in indecisiveness, confusion, and anxiety...." (Aries & Duby, 1990)
The ever-increasing pace of change and pressure for adaptation, the competition fostered by expanded opportunities and (relative) prosperity that could be achieved in an increasingly urban society, and the novel cultural construct of happiness as a goal for human endeavor, swamped the sensibilities of societies that had not-so-long-ago been far more constrained.
This creeping sort of uncertainty is an essential theme at the dawn of the 20th century, just described from a socio-cultural perspective but one that began to surface in other domains of knowledge and consciousness as well. The 19th century had unknowingly witnessed the apogee of the idea that history was progressive--a notion that took root during the Enlightenment but had seeds in Greek philosophy--that civilization was on an inexorable course from the primitive to the perfect, to be achieved through the application of reason and science. A sequence of events and ideas would, within the first half of the 20th century, make rubble of the progressive certainty of the five millennia prior.
Freud’s elucidation of the unconscious, and his identification of the “irrational” as a driver of human emotion, thought, and behavior, eroded the notion that we are consistently capable of exercising the “right reason” essential to well-ordered societies. Artists of the early 20th century visually toyed with imagery challenging the permanence of underlying “structures,” depicted the destructive capacity of the unconscious, and abandoned rules regarding that which is beautiful, in defiance of the aesthetic consensus that had reigned since ancient times. Nietzsche pronounced “god is dead,” a pronouncement that could reasonably be made after Darwin’s theory of evolution de-necessitated a “god” from the account of human existence.
While most common folks weren’t sitting around their fireplaces reading Zarathustra, philosophy of the modern age had to tackle the nihilistic conclusion that without god, human life was--as the aforementioned Copernicus and others foreshadowed--not special. It was not purposive. It was not teleologically headed in any particular direction based on the goodness or evil one exhibited across the life span. Existential and phenomenological philosophers concerned themselves with exploring the experience of being-in-the-world, the burden of meaninglessness, and the effort to posit metaphysical schemes that made room for a god-concept in spite of our scientific understanding that he/she/it was not required. A final blow to certainty was provided by Einstein, whose theories of relativity upended the notion of absolutes in the physical realm, explaining how--because nothing in the universe is at rest--motion and gravity and time and space create a four-dimensional model of reality wherein one can only be understood in relation to another; everything is relative.
World events in the first half of the 20th century reinforced the uncertainty, irrationality, and meaninglessness of human life. The magnitude of destruction, barbarism, and human casualty wrought by the First and Second World Wars (and, as a subset thereof, the Holocaust) and the Spanish Civil War further defied confidence in reason as the go-to governing principle for civic life, in any remaining notion that modern civilization was on an upward trajectory to enlightenment, and in the idea that people were inherently rational and/or good. By mid-century, it was clear it was very, very hard to be human. And no one gave a shit.
In the latter half of the 20th century, consolation from such profound existential dilemmas was sought through “returns” and “escapes.” We “returned” to family values and the fantasy of simpler times in both Eisenhower’s America of the 1950s and Reagan’s of the 1980s. We “escaped” in the 1960s and ’70s--and basically ever since--through booze, drugs, sex, food, gambling, video games, music, television, self-help books, and Oprah and Phil and Deepak. Over these decades, as our “feelings” became pre-eminent in our cultural consciousness, we concurrently demonstrated we couldn’t much bear them.
Consequently, here in the opening decades of the 21st century, we doubled down on our escapism as access to high-speed Internet connectivity became universal and, in each and every pocket, purse, and pram, we carry the mobile version of this parallel universe, generally preferring the World Wide Web to the actual wide world. And here we are, at a moment in history in which people have never been more connected, and yet the consensus seems to be that we’ve never been more lonely, more estranged from ourselves, more empty, and more afraid. The work of being human--creating a sense of purpose and direction; taking responsibility for one’s own happiness or misery; transforming personal traumas into personal growth; knowing and respecting ourselves; making authentic connections with partners and kids and parents and friends and colleagues and followers on social media; residing in the always-precarious awareness that we don’t actually matter in a cosmic sense and that we’re going to die and that’s just it--is so immense a task that it defies my ability to fully describe it, facilitate it, or complete it myself.
But I wouldn’t trade it, the imperative to do the hard work. And I won’t stop doing it, either. And the more I have worked, the bigger and deeper and more hopeful my life has become. And herein lies the existential value of therapy: not having to do the hard work of being human...alone.
Aries, P., & Duby, G. (Eds.). (1990). A History of Private Life, IV: From the Fires of Revolution to the Great War. Harvard University Press, Cambridge, MA & London, England, pp. 615-616.