There is an interesting relationship between what we have come to call “mind” and the world in which we live. The purpose of this note is to explore that relationship in further detail. My thoughts on these topics are a synthesis of, or distillation from, many different sources. While the original component insights therefore are not mine, I believe there may be several ways to assemble or re-configure them that have not yet been clearly stated.
I. Synopsis of Argument
1. As much as we have impacted it, we – people – have not “created” the world. Rather, from an evolutionary standpoint, the world has evolved us to its specifications. For example, we walk upright, our eyes are at the front of our heads (as opposed to, say, whales, where they’re on the side), and our ears are on the side of our heads. We are warm-blooded, and many of our features (such as arms and legs) are bilaterally symmetrical. There is no “necessity” or “imperative” to these aspects of our physical bodies; they surely could have been different in some other world. That world, however, would not be our world. For we are creatures “in,” and beings “of,” this world.
Reciprocally (and motion pictures like “Stargate” and “The Matrix” notwithstanding), it would not be possible for us to exist in some other possible world, even though such a world might meet all of the requirements and specifications of a “world,” such as having phenomenological texture and consistency. The reason is that we are not adapted to it. We might be able to survive “physically” in such a world, for a while at least. Sooner or later, however, it is likely its predators would consume us, or we would succumb to its viruses or microbes. We simply are not evolved to its specifications. Even in the interim, it would be impossible for us to navigate it with skill and dexterity, circumspection and perspicuity.
2. Our brains, with their associated neurochemistry, have evolved along with us. Without detracting from its important role, the brain is but one component of the complex organism that is the human being. The brain enables us, among other activities and functions, to conceive of abstract ideas, to interact with others in group and community settings, and to communicate.
3. The Enlightenment bequeathed to us the notion that we have a “self.” Among other complications, the notion of the “self” has led us to believe we are “conscious” and have “minds.”
Except in certain limited contexts, however, the idea of the “self” – while not entirely useless – is subordinate to a more pervasive modality, one that is more characteristic of us as human beings, that is, creatures of evolution who are being-in-the-world. Which is: in our interactions with the world, we unthinkingly and non-reflectively comport ourselves primarily in accordance with its needs and requirements, not our own.
On a personal scale, for instance, we go about our daily business and routines in a non-cognitive, pre-reflective way. We don’t stop to “think about” what we’re doing; we just go ahead and do it. We are at one with, or transparently absorbed by, the world, in which the activity takes place.
This also is the case even on a much larger scale. For example, the Federal government built Hoover Dam in a gorge where the banks of the Colorado River were particularly narrow. In principle, they could have built it somewhere else. However, we accommodate our human purposes and objectives to the exigencies and realities of the environment in which we live.
4. One of the artifacts resulting from our conception of “self” is that we are prone to imagine other hypothetical worlds where we might speculatively reside, or which we might inhabit. In point of fact, though, we don’t.
These alternative “other” worlds can be innocent fantasies, akin to daydreams – the product of wishful thinking, an outcome of projective thought. Or, they may be phantasmagorical nightmares, entirely outside of our control, a result of aberrant brain chemistry, itself a feature of our condition as human beings who have evolved in the world over time.
In its mild form, our brain chemistry might induce in us something like a mood disorder – depression, mania, or bipolarity. In its more extreme form, it might result in a delusional compulsion, or even schizophrenia (an actual “other personality” inhabiting the “other world”).
5. Much of how we non-consciously, unhesitatingly interact, or are involved, with the world is culturally influenced, that is, affected by a myriad of conventions, assumptions, background information and social expectations. Among other consequences, these are responsible for our pre-reflective understanding of who we are, how we fit in, how we are supposed to react, and what is contextually appropriate.
6. One of the main drivers of this set of cultural determiners in turn is “pop culture,” that is, an interpretation of culture or style that is disseminated primarily by media (by which I mean to include all forms of mass communication).
People don’t become enamored of particular movie stars, rock groups, or other celebrities because of who they are, or their intrinsic qualities or features as performers, or even as individuals. Rather, the persons who inhabit those roles primarily are incumbents, or avatars, in the sense that, if it weren’t them, it would be somebody else. Nonetheless, they become paradigms of potential roles that a person with a “self” can assume, or undertake, in these other possible worlds. In fact, however, they simply are frothy concoctions created by the media, for its own purposes.
In a capitalistic society, one of those purposes is making money. Another more subtle, yet potentially pervasive purpose is to communicate, or reinforce, a shared sense of style. Not “style” like a fashion model, but rather, the specific and culturally-appropriate texture of our interactions with the world.
There is a kind of neural-feedback mechanism at work here, though, because the media needs consumers who require celebrities in order to envision alternative worlds, just as much as those consumers desire celebrities who in turn are created by the media.
A. We Have Evolved the Ability to Formulate Abstract Concepts
In Religion Explained (2001) and In Gods We Trust (2002), Pascal Boyer and Scott Atran, respectively, analyze how our ability to formulate abstract concepts results from our having evolved in the world. Even though they were released around the same time, each book evidently was written independently. Each cross-cites the other in its respective appendix, to the effect that “here’s somebody else who has ideas similar to mine.” Both set forth original and exceptional insights.
While not uninteresting, the “God” and “Religion” parts are not the most innovative features of these books. Rather, what Boyer and Atran really look at is how it came to pass, from an evolutionary perspective, that human beings can think abstract thoughts – those not immediately related to, or in furtherance of, their basic personal needs such as food, clothing and shelter. All references to “God” and “Religion” therefore can be reformulated in these terms.
Because they do not take an ideological point of view on topics such as “Is there a God,” both books are informative in a way that more theologically or polemically oriented treatises are not. This makes them more rigorous, and less conjectural, than recent books such as The God Delusion (2006) by Richard Dawkins and Breaking the Spell: Religion as a Natural Phenomenon (2006) by Daniel Dennett. To me, at least, Dawkins and Dennett take a more conceptual, but less nuanced, approach.
B. The World Has Evolved Us, Not the Other Way Around
Boyer’s and Atran’s books should be read in conjunction with Guns, Germs and Steel (1997) by Jared Diamond. Although there is much in Prof. Diamond’s book that is speculative, his main accomplishment is that he manages to explain the inception and growth of culture in non-cultural terms. He traces its origin back to natural phenomena, that is, phenomena found in our world, whether we are here or not.
Humans undoubtedly have impacted the environment in ways that, in principle, could affect the evolution of species. Precursors of today’s humans, for example, could have set fires that wiped out the food stocks of neighboring tribes, depriving them of the opportunity to reproduce.
Even today, we alter the environment in ways that potentially might affect the ability of individual humans to reproduce. For example, unless they have done so already, a person killed by a flood or a nuclear power plant meltdown will not have the opportunity to procreate. Less spectacularly, reproductive capacity may be impaired by industrial compounds (such as asbestos), or drugs (such as cyclophosphamide), which have sterility as an unfortunate and undesirable side-effect.
The evolution of human beings (or their precursors) also has been impacted by the activities of other human beings (or their precursors). For example, Tribe A, with a certain sub-set of evolutionary characteristics, may have killed off Tribe B, with a different set of evolutionary characteristics. It isn’t necessary to hypothesize Tribe B was violently overthrown; for example, they simply could have been genetically diluted to the point where they no longer were a viable sub-species. This may, for example, have been what happened to the Neanderthals. While natural forces certainly were a factor in their extinction, it also is possible they succumbed to predation or miscegenation from alternative formulations of early-human precursors.
Speculatively, over time, Tribe B’s evolutionary characteristics may have been more “desirable,” in that they would have provided a better evolutionary “fit” between man and the world. Tribe B might have been better adapted to survive the dynamics and exigencies of the environment, or become more capable in dealing with it in ways that we might not even be able to imagine.
Needless to say, it is important to distinguish the ability of particular humans to procreate, as opposed to the species of humans. Evolutionary genetics is concerned with the overall gene pool, not personal traits or characteristics. However, it stands to reason that the further one meanders back in time, the more likely it is there might have been a species-wide consequence.
Diamond is not concerned with any of these forms of interaction. Rather, his key observation, understood correctly, is that the bare features of the world itself dramatically have affected the evolution of human culture. For example, societies aligned along an east-west axis, in trade-wind zones, have tended to flourish, because they found it easier to cultivate nutritious grains than societies aligned on a north-south axis. Compare, e.g., the Americas (oriented north-south) and Africa (also oriented north-south) with the Mediterranean (oriented east-west). Culture as we know it (the ancient Greeks, the archaic Israelites) developed primarily in the latter, not the former. Phenomena such as this also explain why the Aztecs were mystified to the point of non-comprehension by Cortés, and why Europe wasn’t conquered by hordes of Zulus riding rhinoceroses.
These simply are examples. Properly understood, Diamond’s point is that human culture evolved in compliance with world-imposed constraints (as did, of course, humans themselves).
C. The Evolution of Language Also Has Been Constrained by World-Imposed Features
Language also is evolutionarily constrained. For example, when speaking or writing, we use words serially. Sentences, when uttered or read, have a beginning in time, and then end later (depending on the length of the sentence). We are unable to communicate all of our thoughts at once, in a kind of “thought burst.”
In her book The First Word (2007), Christine Kenneally traces the evolutionary development of language, and its relationship to specific, world-imposed constraints on human development. To begin with, the brain solicits language – “In a sense, language simply ‘happens’ when you have a machine complex enough to accommodate it,” 55.
However, it is not the case that “human language can be acquired so long as you have a sufficiently powerful brain,” 88. Rather, “[O]ur vocal tracts are shaped to produce speech, just as our hearing is specialized to register it,” 64. Furthermore, “the physiology of breathing structures how we speak,” 70, as does the physiology of the tongue and the larynx.
These biological, evolutionarily-determined characteristics result in syntax, with subjects, objects, predicates and verbs. The “building blocks of serial communication … allow us to talk about events, objects, places and times, agents and patients, our intentions and others’,” 60.
All of which in turn implies the notion of a “self,” and a separate world in which it dwells. We have a “human tendency to believe that all of our complex ideas and ways of carving up the world are a result of the fact that we have language,” 92, which results in an awareness of self. This in turn has profound ramifications for “thought, material culture, and social structure,” 104.
D. We Are Evolutionarily Predisposed to Communicate about Certain Classes of Substantive Topics
One of Boyer’s most interesting theories is that, as a matter of the evolution of brain chemistry, we are predisposed to receiving and transmitting messages in the nature of gossip, rumor and innuendo. As an evolutionary imperative, we are intrigued by, if not vitally interested in, the doings and affairs of our fellow human beings.
Any discussion of gossip necessarily must commence with Book IV of Virgil’s Aeneid. Dryden’s translation starts: “The loud report thro’ Libyan cities goes. Fame, the great ill, from small beginnings grows: Swift from the first; and ev’ry moment brings, New vigor to her flights, new pinions to her wings.” And so on, in a similar vein. A more literal rendering of the Latin might be something like “Rumor winds its way through Libya’s great cities.” Whichever translation you prefer, media and celebrity culture are the modern expression of this same phenomenon.
E. The Erosion of the Concept of a “Self”
Martin Heidegger was the instigator of the critique that Descartes’ idea of an opposition between res cogitans and res extensa is a misconceived notion of the Enlightenment. His magnum opus is Being and Time, which, unfortunately, is obscure and confusing. Being-in-the-World (1991) by Hubert Dreyfus is the most cogent explanatory work. Dreyfus is Heidegger’s leading exponent in the U.S. – in fact, for some time, he was the only one. Heidegger Studies now, however, is a small industry, as Dreyfus’ graduate students have gone on to populate Philosophy Departments everywhere.
It’s a mistake to think of Heidegger as an “existentialist,” or as some kind of a weird Nazi sympathizer. The insights he has in Being and Time, when parsed correctly, are revolutionary.
A typically Heideggerian perspective might be to consider what little role the “self” plays in our daily interactions in, with and around the world. Upon circumspection, it is a subordinate notion, not a primary one. Consider, for example, the simple act of turning a doorknob. You don’t “think about” it, when you open the door. Rather, you simply use it as a tool, to respond to an affordance solicited by, or offered to you by, the world.
You utilize the doorknob (non-consciously) to accomplish a “for-the-sake-of-which,” that is, leaving the room. Then there is a broader “for-the-sake-of-which,” for example, to attend a meeting, which (like nested Russian dolls) in turn serves an even more general “for-the-sake-of-which,” such as being a university professor. Ultimately, this is how you define (non-consciously) the meaning of your being, that is, who you are. There are a myriad other examples along the same lines.
Put slightly differently, whereas Descartes primarily was concerned with ego cogito, Heidegger’s main concern is ergo sum. Instead of a thinking “self” understood in opposition to the world, what we have taken to calling the “self,” as a kind of proxy for many different modes of being, exists in, depends upon, and is constrained by, the world.
It certainly is possible for it to seem as though there is a “self,” particularly when equipment breaks down, or becomes non-functional. For example, the head flies off of the carpenter’s hammer, causing a disruption of the carpenter’s “maximum grip” on the world.
This concept was introduced by Heidegger’s protégé Merleau-Ponty in his treatise Phenomenology of Perception. Merleau-Ponty particularly was interested in how the human body interacts with the world, and what it is to have a body, to begin with.
When the head flies off the hammer, or something similar happens, rather than simply using the hammer and his knowledge of carpentry to cope transparently with the world (by hammering in the nail), the carpenter is called upon to pause and reflect on the nature of the hammer itself, as a self-sufficient (and temporarily dysfunctional) object, a “thing” that is “present-at-hand,” in-and-of-itself.
Although Heidegger doesn’t make a case for it, we as humans also are vulnerable to the same phenomenon. An extraordinary event happens in our lives, for example, breaking the flow or rhythm of our day-to-day involvement and engagement. Such an incident materializes a “self,” which then is capable of examining the situation retrospectively, planning a course of action, and engaging in similar feats of psychological prestidigitation.
F. We Are Beings in the World
In a weird kind of way, Heidegger and Ludwig Wittgenstein were on the same path, although both are willfully (some might say belligerently) obtuse and difficult to understand. One of the main differences between them is that, even though both are German-speaking (Heidegger German, Wittgenstein Austrian), Heidegger somehow got characterized as “Continental,” whereas Wittgenstein is “analytic,” thus more palatable to the English-speaking philosophical world. For considerable time, this affected their intellectual reception (pro Wittgenstein, anti Heidegger).
As with Heidegger, Wittgenstein Studies now is a small industry. There are several treatises comparing their respective views, though none (which I have read, at least) particularly are noteworthy. Wittgenstein’s Philosophical Investigations tends to frustrate the reader, because it is so allusive, so suggestive – but never conclusive. He specializes in raising questions, turning them inside and out, and interrogating imaginary interlocutors – but never answering them, or finalizing his inquiry.
Where both Heidegger and Wittgenstein end up, though, is that human culture and practices – the “background,” or the “world” – are decisive to our concepts of intelligibility, meaning-conferring activity, and the like.
To take just one example, at Philosophical Investigations §202, Wittgenstein introduces the concept of a “practice”: to “think one is obeying a rule is not to obey a rule.” In other words, we just go about our daily business, doing whatever it is we do. We are familiar with the world, its procedures and constraints, and we comport or conform ourselves to them. We certainly don’t “think” about what we are doing (except in those rare instances when we do).
Then compare this with what Heidegger says in his book History of the Concept of Time (a kind of precursor work to Being and Time) at 188: “In order to give a more accurate portrayal of the phenomenal structure of the world as it shows itself in everyday preoccupation, it must be noted that what matters … is not so much anyone’s own particular world, but that right in our natural preoccupation with the world we are moving in a common totality of surroundings” (emphasis in original).
An example of what both Heidegger and Wittgenstein are talking about might be Aristotle’s phronimos, who unthinkingly (non-cognitively) navigates the world with a practical understanding of its dynamics, and effortlessly responds to its challenges and stimulations.
Think of all of the elements the phronimos would have to enumerate, in order to specify exactly what he is doing. There would be a very large, potentially infinite, number of elements. And, most likely, such an exercise could not be accomplished in principle, because it doesn’t account for “common sense.” The whole undertaking is flawed.
Certainly in practice we don’t do anything like this, at all. Rather, like Wittgenstein says at Philosophical Investigations §199, we “master a technique.” But the technique we master isn’t simply a “language;” rather, it’s the technique of the “practices.” There is no “mental process of understanding,” §153.
Properly understood, although they were methodologically radical, both Heidegger and Wittgenstein were substantively conservative. Heidegger particularly emphasizes the role of social “convention” (his views on this topic partially are qualified by his notion of “authenticity,” a concept to which Wittgenstein seemingly is oblivious). In any event, both Heidegger and Wittgenstein must be cited in any discussion about “world;” it would be incorrect to refer to one, without also mentioning the other.
G. What Is a “World,” to Begin with?
With all of this talk about worlds, it’s important to specify the requisites of or criteria for what counts as a “world” per se – the components or elements a world must have, in order to qualify as one.
In his book Disclosing New Worlds (1997), Charles Spinosa discusses this topic in detail. It turns out (à la Heidegger) there are three essential ingredients. “It is a totality of interrelated pieces of equipment, each used to carry out a specific task such as hammering in a nail. These tasks are undertaken so as to achieve certain purposes, such as building a house. Finally, this activity enables those performing it to have identities, such as being a carpenter. These identities are [the] meaning or point of engaging in these activities,” 17.
These congeries of circumstances in turn result in “shared human practices” which “tend to gather together into organizations that we recognize as worlds, people, and selves,” 16. In fact, “things show up for us in terms of our familiar practices for dealing with them,” 18, and would not otherwise. The way all of a community’s practices fit together might be called its “style” or “disclosive space.” It organizes and coordinates our “interrelated set of equipmental relations,” thereby yielding “roles that give a point to the activity of using that equipment,” 19. It is, in short, the “background” or “world” in which we exist.
H. The “Self” Existing in the World
Heidegger was interested in the concept of “Being” (upper case “B”) in general, that is, what it entails, how and why we think about it, etc. He was not particularly concerned with the “being” (lower case “b”) of particular individuals. Heidegger thought this inquiry not uninteresting, but by the same token, not pertinent to his own investigation.
Rather, it requires a different orientation, which is the discipline of phenomenological psychiatry. Sometimes (erroneously, but more fetchingly) called “existential” psychiatry, it was developed by Medard Boss in his book Psychoanalysis and Daseinsanalysis, and by Ludwig Binswanger in his book Being-in-the-World (coincidentally, the same title as Dreyfus’ book). Borrowing from Greek, Heidegger calls what he does “ontological,” whereas what Boss and Binswanger do is “ontic,” that is, concerned with the being of particular individuals.
Both Boss and Binswanger specialize in lengthy, somewhat tedious case histories, the central point being to try to comprehend and articulate the patient’s “world.” Both are in implicit (somewhere, probably, express) opposition to Freudian psychoanalysis, or anything having to do with the “unconscious” or the “subconscious” – which they pick up directly from Heidegger.
Although they would like to pretend their techniques and methodology work for more pathological cases, it seems clear they would be bewildered if confronted by a person with severe mental illness. In such an instance, only strong pharmacology can help – which is conceptually appropriate, because it involves an adjustment to brain chemistry, and the brain is an organ, just like the heart or the lungs. The mechanism of genuine “brain disease” (by which I mean to include all forms of mental malfunction, from Phineas Gage forward) is not, in principle, different from the malfunction of other parts of the physical body.
One of the great accomplishments of modern psychiatry is to refine and extend pharmacological techniques back from the severest cases to those that are more subtle or nuanced – those which involve finer, or more granular, adjustments to brain chemistry. These include, in particular, mood disorders such as depression, mania, bipolar disorder, milder forms of schizophrenia, and the like. The outcome of these interventions is that the patient better is able to manage all of the complex protocols, conventions and understandings involved in transparently coping with, and being in, the world.
I. Constraints on the “Self”
In my opinion, the main problem with this entire group – Heidegger, Wittgenstein, Boss and Binswanger – is that they have no concept of (a) brain chemistry, (b) evolution, or (c) what might be characterized as “utility,” in a practical economics sense. That is, we are utility-maximizing creatures – not rational ones, or ones created by God, or subject to a Hegelian-Marxist dialectic, or in the grips of libido, etc.
There now, of course, is a thriving discipline of “cognitive science” that attempts to address these dynamics. As promising a step forward as it may seem, my working hypothesis is that our understanding of brain chemistry never will be adequate to address decisively phenomena such as “consciousness” and “mind,” simply because they are illusions of “self,” whereas the brain is “of the world.”
One of the few people who starts to address them, however partially, is Antonio Damasio, in his book Descartes’ Error (1994). Even so, I don’t think he approaches the problem adequately: you can read his book from start to finish and still not appreciate its true nature.
To me, “cognitive science” seems to rest on premises similar to “artificial intelligence,” that is, the effort to develop a syntactically and semantically correct simulation of human experience. Such an initiative inevitably must fail, if it has not done so already, if only in principle, because there are infinitely many variables, frames of reference, notions of context, etc. which it would have to (and never will be able to) accommodate. Dreyfus makes this point over and over in published papers and in his books What Computers Can’t Do: The Limits of Artificial Intelligence (1979) and What Computers Still Can’t Do: A Critique of Artificial Reason (1992).
Several works export this “incompleteness” problem over to cognitive science. That is, it seems unlikely our understanding of neurophysiology and brain chemistry, however advanced, ever will be adequate to explain the notion of “mind,” much less “human behavior.” Nor should that necessarily be its purpose, particularly with respect to “mind,” which has the peculiar quality of appearing whenever you go looking for it (and then, invariably), and never any other time.
Ironically, Dreyfus himself is somewhat of a skeptic when it comes to the Internet, primarily on the grounds it promotes disembodied communication between individuals, thereby fostering the notion of a “self” that is independent from “others” in the world. This is ironic because, to the general public at least, he is best known for his podcasts on topics such as existentialism and literature, and the concept of “God” in Western society – which are available only on the Internet.
A good argument can be made that Dreyfus has it precisely backwards. That is, the Internet has promoted communications between niche affinity groups that are so geographically dispersed, they never would have been able to make contact otherwise. This in turn has fostered a sense of shared community, which is what a “world” is all about. The “heavy metal” music fan on the Sunset Strip in Hollywood, for example, has much more in common with his counterpart in Tokyo, than he does with his next-door neighbor, even though the latter is more geographically proximate.
J. A Case Study for “Other Worlds”
Probably the best case study for “other worlds” is Disneyland, with its various sub-domains including Fantasyland, Tomorrowland, Frontierland, Adventureland, and Main Street. Several Derrida-esque works analyze the semiotics of Disneyland. One of the best is Golden State, Golden Youth (2002) by Kirse Granat May. I grew up in La Jolla, California. My childhood hero was Davy Crockett. In-between “duck-and-cover” drills, in anticipation of nuclear holocaust, one of the high-points of the year was a trip to Disneyland. So Prof. May’s work particularly resonates.
A related topic is Disney art. For example, all Disney heroines have peculiarly large eyes. Why, one might ask? Answer: the better to enable you to peer “into” their “souls.” And why is that important? To reinforce in you, the passive viewer, the illusion that you too have a “self,” that is, a congeries of mentally-related events, the primary characteristic of which is that it stands in opposition to the world. This self in turn can participate, however vicariously, in the various worlds Disney has created, for your self’s delectation and benefit.
Primarily for this reason, a good case can be made that Disneyland is, if not a cause, certainly an abettor of mental illness (mood disorders, mild cases of schizophrenia, anything involving “other selves” or “other worlds”). This trend began initially in Southern California, but then spread internationally as it was propagated by Western media and pop culture.
Disneyland collaborates with, or even co-opts, evolved features of the human brain (e.g., dopamine receptors), in order to create what only can be characterized as a mind-altering experience, particularly from the standpoint of brain chemistry. This experience in turn reinforces the notion of a “self” that is able to explore, and adapt to, alternative realities.
By way of clarification, creative authors always have endeavored to disclose other worlds, for the enjoyment of their readers. One of the hallmarks of great literature, such as Moby Dick by Herman Melville and The Brothers Karamazov by Fyodor Dostoyevsky, is that it convincingly evokes an alternative world, and then transports the reader into it.
Much, however, is left to the reader’s imagination. The vagueness and ambiguity of words, together with the impossibility of specifying phenomenological detail with sufficient clarity and comprehensiveness, enable the reader in effect to collaborate with the author to imagine and create the fictional world.
Disneyland, however, builds on this tendency by making all of these presumptions, associations and connections explicit, encrusted with highly-specific cultural detail. For example, it took Euro Disney some time to establish itself financially, and perhaps it still has not done so, simply because the cultural references are too “America-specific” (i.e., Western pop culture has yet to penetrate the thick Gallic mind, just as the thick American mind has difficulty comprehending Edith Piaf, Jean-Paul Sartre, and other exemplars of French culture).
And, it seems to me that one of the main points of Islamic fundamentalism, at least from an “existential” perspective, is to resist the encroachment of Western pop culture. It isn’t so much that Western pop culture is “bad,” per se. Rather, it propagates an alternative reality, revolving primarily around the concept of a “self.” This is ill-suited to the demands, precepts, constraints and requirements of the historical Islamic world. Until modern times, that world primarily involved traipsing about in harsh desert environments, which calls for a more disciplined, rigorous approach, one where the notion of a “self,” as opposed to a “community,” could be disastrous. Reza Aslan makes this point, though with a somewhat different purpose, in his book No god but God: The Origins, Evolution, and Future of Islam (2005).
K. The Role of “the Media”
Celebrity culture is today’s version of Virgil’s rumor, the importance of which Boyer highlighted. Celebrity (or “pop”) culture is disseminated not only by the kind of tabloids one encounters, vividly displayed, at grocery-store checkout lines – but also by “mainstream” media. The Internet amplifies its influence. It is likely, from a topical standpoint, that the vast preponderance of Internet “content” and searches is devoted to people who are cosmopolitan swingers, such as Paris Hilton, Britney Spears and Lindsay Lohan.
In so observing, it is not my intention to make fun of Ms. Hilton, Ms. Spears or Ms. Lohan, all of whom are interesting and unique people, facing their own sets of challenges in relating to the world. However, it seems clear their primary skill is to “make the scene.”
Which is to say, the party somehow is inferior, unless they show up; the club comes alive, when they enter; and the restaurant is validated in the eyes of the cognoscenti, by their presence. In other words, by deploying their personality, they transform an ordinary geographic location – a humdrum, mundane place (in Cartesian space) – into one sizzling with excitement, potential and opportunity (when perceived as “existential” space). In Heideggerian terms, they unfold a clearing, which in turn permits that place to be seen as something more than what it is – say, a dreary nightclub – and instead, interpreted in the semiotics of modern pop culture.
Critical to their success is the media, both tabloid and mainstream. The media needs something to write about. It comprises an entire infrastructure of reporters, editors, designers, printers, delivery persons, and retail outlets (such as grocery-store checkout lines, subscription services, and stand-alone kiosks).
By degrees, these in turn report, eventually, to powerful, international, multi-media conglomerates, such as: Disney (which owns, in addition to Disneyland and Disney Studios, ABC Television); News Corp. (which, in addition to numerous other media outlets, owns 20th Century Fox and MySpace); Time-Warner (Warner Bros.); CBS and Viacom (CBS Television and Paramount); Sony (Columbia Pictures); and General Electric (of all companies) (which owns Universal Pictures and NBC). Google (which owns YouTube), Yahoo, and Apple vie to join this club. Without personalities such as Paris, Britney and Lindsay, these structural elements would wither.
Which is to say, if Paris, Britney and Lindsay were not there to service the media’s architectural framework, “the media,” understood as an organism requiring constant servicing in order to maintain its infrastructure, would invent them. Paris, Britney and Lindsay simply are the deer caught in the headlights. They pose fetchingly and assume their allotted roles, bewildered at why they are not regarded as anything more than zombies.
The media in turn offers up personalities such as Paris, Britney and Lindsay to the public, in order to service the public’s insatiable demand for scandal, gossip, curiosity, perversity, titillation, and other prurient indulgences. It solicits its readers to experience a cheap vicarious thrill by perusing its wares, all the while pandering to a lowest common denominator of public taste and opinion, in order to achieve the broadest possible circulation.
Celebrity culture long has been recognized as one of the primary – and tastiest – ingredients in the media food chain. It accumulates personalities in a kind of self-reinforcing media centrifuge. Jeanine Basinger captures this dynamic in her book The Star Machine (2007), in which she characterizes the phenomenon of the same name as a “practical business plan that manufactured illusions.” Characterizing Louis B. Mayer, head of Metro-Goldwyn-Mayer during the “classic” studio years, Prof. Basinger observes: “Mayer knew it was good business to let the public feel they were the most important part of the star selection process – that stardom happened without calculation.”
A star, however, is “created, carefully and coldbloodedly, built up from nothing, from nobody” (p. 11). “The star machine process was not suddenly invented – it evolved. … The system institutionalized a natural set of circumstances … when moviegoers discovered they especially liked certain performers – and the movie business discovered the moviegoers’ discovery.”
Are Paris, Britney and Lindsay the latest incarnation of the “star machine” in action, or are they exceptions to it? In a recent article, the feminist theorist Camille Paglia states: “These are women who are clearly out of control because the old studio era is over. The studio system guided and shaped the careers of the young women who it signed up. It maximized their sexual allure by dealing it out in small doses and making sure you don’t have — what has become here — a situation of anarchy.”
Ms. Paglia’s assessment is incorrect. Paris, Britney and Lindsay are as much of a frothy media concoction as were Lana Turner and Marilyn Monroe, in their day. The media needs them, and if those particular persons were not there, as incumbents for a role that was required to be cast, then someone else would have taken their place.
Writers like Tyler Cowen adopt a “bottom-up” approach. In his book What Price Fame? (2000), he theorizes that fame proliferates because fans create stars, who then become famous. “Fans use stars as a way of advertising their tastes, distinguishing themselves from others, signaling their cultural standing, and seeking out the like-minded” (p. 3). Fans “stimulate fame production” because their “desires for information, evaluation, and enthusiasm create incentives for critics to coin new awards, augment the meaning of old ones, and promote new stars” (p. 114) (for “critics,” substitute the word “media”).
This, however, is precisely backwards. Rather, it is society’s need for fame that invites or solicits certain people to become stars, which in turn catalyzes their fans. In their role as media spokespersons-cultural icons, Paris, Britney and Lindsay powerfully affect the common meanings we take for granted – the ways in which we “attune” ourselves to modern society.
As expressed by Prof. Spinosa, albeit in a different context, the best way to refer to them might be as “agents of common meaning,” because they pervasively influence, for example, what is erotically appealing, or our perception of Gen Y women. They have absorbed the prevailing cultural ethos of the time, and processed it – translated it, synthesized it, distilled it – and then fed it back to the rest of society, consuming themselves in the process. They are “cultural figures who cultivate solidarity” (p. 1), figments of the “cultural power elite that determines which imaginative works structure our aspirations and self-definitions” (p. 9).
One might object to their influence as potentially negative female role models. This factor, however, is far outweighed by everybody else’s fascination with their various contretemps.
Paris, Britney and Lindsay are only temporarily luminescent, at a certain moment in space and time. They are only as much in control of their celebrity as (speaking figuratively) the culture-gods will allow. The montage of attributes they present is significantly different from that of, say, Madonna, who in turn was significantly different from, say, Marilyn Monroe. But Madonna’s influence has waned, and Monroe’s is so pervasive that it now is difficult to detect, except in a post-modern, ironic sense.
Heidegger completely understood these relationships. In his essay “The Question Concerning Technology,” he stated: “The forester who, in the wood, measures the felled timber and to all appearances walks the same forest path in the same way as did his grandfather is today commanded by profit-making in the lumber industry, whether he knows it or not. He is made subordinate to the orderability of cellulose, which for its part is challenged forth by the need for paper, which is then delivered to newspapers and illustrated magazines. The latter, in their turn, set public opinion to swallowing what is printed, so that a set configuration of opinion becomes available on demand.”
I don’t want to over-emphasize the role of “the media” in the pop culture equation. The more accurate perspective probably is that it exacerbates a pre-existing tendency or dynamic – some aspect or feature of our brains – that is evolutionarily desirable. This in turn directly implicates the neurochemical functioning of the human brain, which itself has evolved from, and with, the world. Properly understood, “celebrity” itself is a kind of narcissistic personality disorder – a condition exacerbated by specialized brain chemistry, for fan and celebrity alike.