Friday, May 30, 2014

Acronym Acrimony

So, my last post, Filler Up!, was on an area of nonverbal communication, and it's only fair that I follow it up with a post on verbal communication. In this case, it's on abbreviations, and especially acronyms, which are abbreviations that form a word—for example, SNAFU is pronounced as one word, and is therefore an acronym, as opposed to vocalizing it as ess-en-ay-eff-yu. And in case you don't know what it stands for, and I hope you will excuse the profanity, it's Situation Normal, All Fucked Up, and like FUBAR, Fucked Up Beyond All Repair (or Reason or Recognition), it's an example of military slang.

But note that while slang traditionally refers to a form of speech that has not been legitimized by appearing in print, acronyms and abbreviations are by-products of written communication, and of the alphabet. (By the way, it seems that the story that the curse word spelled eff-yu-cee-kay is an acronym, something I recall hearing about in the schoolyard back when I was in elementary school, is just an urban legend, at least according to Wikipedia.)

So, if the abbreviation can't be pronounced, it can't become an acronym. For example, HTML lacks any vowels, and it would be awkward and not very compelling to pronounce it as something like hitmul, so we just say aych-tee-em-el. If it is pronounceable, then it may be turned into an acronym, but it's not inevitable that it will be. For example, the abbreviation USA could be pronounced as oossa or youza, but it isn't; it's you-ess-ay (repeated ad nauseam as a cheer if you like).
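
And just for fun, that rule of thumb is simple enough to turn into a toy program. Here's a minimal sketch in Python; the vowel test and the function name are my own crude inventions, of course, since real pronounceability depends on the sound patterns of English, not just the presence of a vowel:

```python
VOWELS = set("AEIOU")

def could_be_acronym(abbreviation: str) -> bool:
    """Crude heuristic: with no vowels, there is no ordinary way
    to say the abbreviation as a single word."""
    return any(ch in VOWELS for ch in abbreviation.upper())

for abbr in ("SNAFU", "HTML", "NFL", "USA"):
    if could_be_acronym(abbr):
        # USA shows the limit of the heuristic: pronounceable
        # in principle, yet never actually spoken as a word.
        print(f"{abbr}: pronounceable, could become an acronym")
    else:
        print(f"{abbr}: spelled out letter by letter")
```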

I should add that some acronyms are back-formations that take an actual word and work backwards to a phrase that it can serve as an abbreviation of. For example, a simple computer programming language that was popular once upon a time was dubbed BASIC, which stands for Beginner's All-purpose Symbolic Instruction Code. Similarly, Marvel Comics turned the word SHIELD into an acronym for its fictional spy agency, originally having it stand for Supreme Headquarters, International Espionage, Law-Enforcement Division, then later changing it to Strategic Hazard Intervention Espionage Logistics Directorate, and then in their films and television programs to Strategic Homeland Intervention, Enforcement and Logistics Division.

Speaking of which, I think they did a pretty good job on the new Agents of SHIELD television program on ABC, which recently completed its first season. And speaking of comics, one of the all-time great acronyms is SHAZAM, which stands for the wisdom of Solomon, the strength of Hercules, the stamina of Atlas, the power of Zeus, the courage of Achilles, and the speed of Mercury. Shazam was the name of the wizard who gave the superhero Captain Marvel his powers, or rather who gave young Billy Batson the ability to turn into Captain Marvel by shouting, Shazam! 

Captain Marvel had powers similar to Superman's, so DC (an abbreviation of Detective Comics, although it was then known as National Comics) sued Fawcett Publications, and Fawcett eventually agreed to stop publication of Captain Marvel. DC later bought the rights to publish Fawcett's superheroes, including Captain Marvel, but in the meantime Marvel Comics had introduced their own superhero with that name, naturally enough given the name of that company, and DC consequently has been using Shazam more and more over the past few decades in reference to the character instead of Captain Marvel, and recently renamed the hero Shazam, dropping Captain Marvel altogether. 

Anyway, the point here is that the exclamation, Shazam!, worked its way into popular speech, and while its meaning as an acronym is still present in the comics, in other contexts that meaning was lost (which is reflected by the fact that it's printed as Shazam and not SHAZAM). This compilation of clips from the 1960s CBS sitcom, Gomer Pyle, USMC (pronounced yu-ess-em-cee), starring Jim Nabors, while a bit obnoxious, demonstrates how he used shazam! as an exclamation alongside golly and gosh (slang words that are not abbreviations of anything), utterly devoid of its acronymic meaning. This, by the way, was the first I ever heard of the expression, as the original Captain Marvel comic ceased publication several years before I was born, and he wasn't resurrected by DC until I was a teenager. Anyway, through Gomer Pyle, Shazam! was transformed into a kind of military slang, at least on television. And also on a record album, it appears:

But this winding and long-winded introduction is meant to bring up the topic of abbreviations and acronyms as a part of the public discourse of our culture. Following the same pattern as my last post, my comments on the topic began with a query from Palash Ghosh of the International Business Times, which I'll provide first, along with my response. In this instance, Ghosh sent me the following query:

Can you make a comment on the use of acronyms and abbreviations in media?

I have noticed that media/newspapers/magazines in my native India use abbreviations and acronyms all the time–perhaps even excessively. I realize the acronyms and abbreviations are necessary as short-cuts and they are also used in western media–but do you [think] such things are used too much in media? If so, do they serve to confuse the reader? Do most media have to use these things due to the urgency of instantaneous communications?

And here now is my reply:

Acronyms and abbreviations are quite common in American culture. For example, POTUS recently delivered the SOTU, many members of the SCOTUS were present, and it went off without a SNAFU, and most Americans who watched it on TV thought it was OK. It seems as if we never pass up an opportunity to shorten a word, either by cutting off syllables, which is why the head of a committee is called a chair now, rather than substituting chairperson as a non-sexist alternative to chairman, or by substituting initials for full names and terms. Sometimes I think us Americans would be happiest if we could only reduce our words down to a series of grunts.

Abbreviations and acronyms are unintended consequences of the invention of the alphabet, and have their origins in antiquity. When words could only be written out by hand, or even more laboriously chiseled out of stone, abbreviations were welcome, sometimes necessary to make words fit into a limited space, and had the added utility of functioning as icons for the illiterate, who could recognize the look of the symbols and recall their meaning without actually reading, that is, sounding out, the letters. For example, in ancient Israel, the name Maccabee, associated with the holiday of Hanukkah, is an acronym that stands for Mi Chamocha Ba'elim Adonai, the beginning of one of the oldest prayers in the Bible, translated as, "Who is like unto You among the gods (that are worshipped), O Lord?" In the early Christian church, abbreviations such as IHS were used as symbols for Jesus Christ—in this case IHS corresponds to the first three letters of the name Jesus in the Greek alphabet, iota, eta, and sigma, and is also said to stand for the Latin phrase, Iesus Hominum Salvator, which means Jesus, Savior of Men.

Gutenberg's invention of printing with moveable type made the need for abbreviations less pressing than before. While there still would be a degree of economy achieved in using initials and acronyms when typesetting a page, the investment made in spelling things out would be returned in the mass production of the document, as the mechanization of what was once written out by hand would yield economies of scale in the form of multiple, identical copies with increased legibility, clarity, and accessibility. The invention of the telegraph in the early 19th century restored the need for economy of expression, as pricing often was based on the number of words used. As this first of the electronic media made instantaneous communication possible, speed went hand in hand with brevity, and the use of abbreviations and acronyms was intensified as never before. Telegraphic discourse not only was seen in the exchange of messages among individuals via telegrams, but in the transmission of news reports over the wires, which led newspapers to adopt the inverted pyramid structure for articles (beginning with the most important information in the first paragraph, and continuing in descending order, rather than telling a story in linear fashion from beginning to middle to end), the mosaic look of the front page (again, nonlinear and anticipating the current hypertextual interface of the World-Wide Web), and the big, bold headline, where abbreviations and acronyms were visually trumpeted. This made telegraphic discourse, including the use of abbreviations and acronyms, a central part of the culture.

Telegraphic discourse was further intensified by the addition of wireless telegraphy, radio, and television, but it is especially in communication via electronic text that it is most apparent. This begins with the use of email going back to the 70s, and is reinforced a little later by the addition of synchronous messaging, aka chat, which puts even more pressure on participants communicating in real time to use the fastest, most efficient means of sending messages. The further economy imposed by the 160-character limit of SMS or text messaging via cell phones, and Twitter's 140-character limit piggybacking on top of SMS, has led to an explosion of such textual devices as lol, brb, and bff, not to mention the innovation of emoticons, the use of typographical symbols to convey facial expressions and emotional states.

There are times when speed and efficiency are necessary, and times when the available writing space is limited, so that economical use of abbreviations and acronyms can be helpful. Acronyms also can serve mnemonic functions, as for example when students memorize the names of the Great Lakes through the acronym HOMES (Huron, Ontario, Michigan, Erie, Superior). But in American culture, and in technological societies in general, the extent to which telegraphic discourse has come to dominate contemporary communication is a symptom of what Jacques Ellul called "the humiliation of the word." It represents a loss of the richness of language, a disregard for poetics and rhetoric, for eloquence in human communication, as well as a loss of nuance and precision, and consequently an increase in the likelihood of misunderstanding. Our overuse of abbreviations and acronyms is one of several facets of the degraded form of discourse that constitutes communication in the 21st century.

So that was my take on the media ecology of acronyms, and how we may be abbreviating ourselves to death, as well as informing, amusing, and amazing ourselves to death.

So, on February 5, 2014, the International Business Times published Ghosh's article on the subject, entitled Alphabet Soup: Why Is Indian Media So Obsessed With Acronyms And Abbreviations? And while he did bring up India in his query, I will admit to being surprised that his article focused on India, which made for an interesting cross-cultural comparison:

Consider the following passage from a recent article published in The Hindu newspaper of India: “CPI(M) general secretary Prakash Karat on Monday expressed hope that the AIADMK-Left alliance would ensure success in Tamil Nadu.”

Or this paragraph from Indian Express: “The allegations against Jaitley and Modi came from AAP MLA from Kasturba Nagar, Madan Lal… Reports suggested that Lal was one of the MLAs who would support expelled AAP MLA Vinod Kumar Binny. Along with JD(U) MLA Shoaib Iqbal and Independent MLA Rambir Shokeen, Binny had threatened to pull down the AAP government.”

Or this ditty from India Today: “Final postmortem report from the medical board of the doctors of AIIMS is awaited. This final report will also take toxicology report from CFSL/CBI into consideration… On January 30 at around 5 pm, he was brought dead in AIIMS accordingly a case u/s 302 IPC & 3 of SC/ST Act was registered."

With respect to the first example, CPI(M) refers to the Communist Party of India (Marxist) – to distinguish it from the regular Communist Party of India, which is usually labelled as just CPI.

The AIADMK refers to the extravagantly named All India Anna Dravida Munnetra Kazhagam, the political party that currently runs the southern Indian state of Tamil Nadu.

In reference to the second example, AAP refers to the Aam Aadmi Party, a new anti-corruption party that swept into power in the capital city of Delhi. MLA means Member of Legislative Assembly – an office that is just below MP (Member of Parliament). JD(U) refers to Janata Dal (United), a center-left Indian political party.

In the incomprehensible third example, AIIMS refers to the All India Institutes of Medical Sciences (a group of public medical schools across the country), while CFSL is the Central Forensic Science Laboratory, a branch of the Indian Ministry of Home Affairs. Also, CBI stands for the Central Bureau of Investigation, India’s principal police investigation agency. The last sentence in that paragraph simply may seem to defy translation – but 302 IPC refers to Section 302 of the Indian Penal Code, while SC/ST Act refers to the Scheduled Castes and Tribes (Prevention of Atrocities) Act of 1989, a law designed to prevent abuse and mistreatment of lower-caste Indians.

In any case, these three passages underline one of the fundamental realities of Indian media and communications – the obsession with acronyms and abbreviations. For the uninitiated, reading an Indian newspaper, magazine or government publication could become quite daunting.

At this point, we do get a connection to American culture, and perhaps to the English language in particular:

Of course, such devices are used to save space and time – and they are also widely used in Western media. For example, in the United States, John F. Kennedy has become the iconic “JFK,” while in Europe the notorious Dominique Strauss-Kahn has metamorphosed into the more familiar “DSK.”

Still, in the English-language media of the Indian subcontinent (which also includes Pakistan and Bangladesh) the use of acronyms and abbreviations in communications seems almost pathological. Within India itself, the plethora of political parties, political groups and titles for lawmakers, military personnel and educators has created an immense reservoir of acronyms and abbreviations that boggle the readers’ mind and threaten to drown the meaning behind the text of media pieces. The inundation of so many acronyms likely looks like gobbledygook to non-Indian readers.

Patralekha Chatterjee, a New Delhi-based journalist, admitted that Indian media rely too heavily on the use of acronyms, but explained the realities behind this practice. “Sometimes, it is also because of the constraints of space,” she said in an interview. “Space is at a massive premium because of [advertisements], etc.” In addition, the name of some Indian political parties and luminaries are so long, like the aforementioned All India Anna Dravida Munnetra Kazhagam, that abbreviations become necessary.

Indeed, English may be the lingua franca of Indian media, finance and politics, but it is a second language for hundreds of millions of people who speak hundreds of other different languages and thousands of dialects – rendering the use of an alien tongue a questionable and imperfect attempt to foster a kind of national unity.

But something has gotten lost in the translation.

Chatterjee also noted that most Indian journalists try to squeeze in some context and write the full name of an organization (or political party or branch of government, etc.) at the start of an article and then follow with acronyms. “But it ultimately depends on the readership,” she said. “If the [editorial] desk thinks [that] most of your readers would be familiar with a certain acronym, then it is used. But if you are referring to something which would be unfamiliar to the average reader of the newspaper/magazine, then it is the full name.”

Since Chatterjee writes for the international and national media, she tries to minimize the use of acronyms in her works. “I can imagine all this being extremely confusing to a foreigner or someone who is not familiar with what is going on in India and picks up a newspaper/magazine,” she conceded.

It's an interesting question. Certainly, the use of acronyms is a general function of the alphabet, and in some ways amplified by typography. And it does seem as if the English language in its written form makes greater use of abbreviations than other languages, perhaps because of the idiosyncrasies of English spelling? But their overuse does strike me as an effect of electronic communications, and the technological drive towards greater and greater efficiency. Those factors, I would think, are what have pushed this usage over the edge.

The practice has also become epidemic in the United States. On a blog for the Baltimore Sun newspaper, John McIntyre called for a limitation on the use of abbreviations, amidst concern that their presence may compromise readability. “Professional publications embrace abbreviations and acronyms much more readily as a kind of lodge handshake identifying who is in the club,” McIntyre wrote. “Lawyers and civil servants are particularly addicted to the practice.” McIntyre suggested that his personal preference is to minimize abbreviations and acronyms “because they distract me and quickly convey a leaden bureaucratic tone to articles.”

As in India, documents produced by the U.S. government, medical, military and scientific organizations are typically overwhelmed with acronyms. As a blogger on the Baltimore Sun complained: “As one who has spent many hours editing Defense Department documents, I have seen too many pages reduced to incomprehensible alphabet soup by acronyms. Their use should be minimized; that's not even a question.”

What about my response, you may be wondering at this point. Or probably not, but I'll remind you anyway, and here it comes:

“It seems as if we never pass up an opportunity to shorten a word, either by cutting off syllables, which is why the head of a committee is called a chair now, rather than substituting chairperson as a non-sexist alternative to chairman, or by substituting initials for full names and terms,” commented Dr. Lance Strate, professor of communication and media studies and associate ‘chair’ for undergraduate studies at Fordham University in New York. “Sometimes I think us Americans would be happiest if we could only reduce our words down to a series of grunts.”

And that's all, at least for now, as we get a second opinion on the inhospitality of the practice:

No less a figure than J.W. “Bill” Marriott, the executive chairman and chairman of the board of the hotel chain Marriott International Inc. (NASDAQ:MAR), has spoken out against the excessive use of abbreviations. In a witty blog he titled “T.M.A. – Too Many Acronyms!” Marriott bemoaned the infiltration of acronyms and abbreviations in daily communications. “We have far too many acronyms,” he lamented. “It started with government agencies (GOV), and it has invaded corporate (CORP) headquarters (HQ). Like an invasive species, it’s threatening to choke off innovation.”

As a native of Washington, D.C., he noted, acronyms are part of his very blood. “There’s DC [District of Columbia], DOD [Department of Defense], DOT [Department of Transportation] and DOJ [Department of Justice],” he quipped. “Don’t confuse the FCC [Federal Communications Commission] with the FEC [Federal Election Commission] or the FAA [Federal Aviation Administration] and FDA [Food & Drug Administration]. We all know the CIA [Central Intelligence Agency], NIH [National Institutes of Health] and EPA [Environmental Protection Agency]. In the corporate world, we tremble when we hear SEC [U.S. Securities and Exchange Commission], IRS [Internal Revenue Service] or FTC [Federal Trade Commission].”

Marriott noted that the three-letter acronyms seem to have a “prestige and status” over the four- or five-letter ones, citing that the longer acronym usually salutes the shorter one – i.e., NHTSA [National Highway Traffic Safety Administration] to DOT or DARPA [Defense Advanced Research Projects Agency] to DOD. “I’ll leave you with my favorite acronym: STML – short-term memory loss,” Marriott concluded. “Let’s forget these useless acronyms. Chances are we already have. We don’t understand them.”

Now the emergence (and seeming omnipresence) of social media will likely give acronyms an even higher place in our daily lives – and that’s nothing to LOL about.

Ghosh does give me the last word, however:

But Strate cautions that acronyms have a long history and are likely to remain as long as humans communicate.

“Abbreviations and acronyms are unintended consequences of the invention of the alphabet, and have their origins in antiquity,” he noted. “When words could only be written out by hand, or even more laboriously chiseled out of stone, abbreviations were welcome, sometimes necessary to make words fit into a limited space, and had the added utility of functioning as icons for the illiterate, who could recognize the look of the symbols and recall their meaning without actually reading, that is, sounding out, the letters.”

Still, as a scholar and one who wishes to preserve the beauty of language, Strate laments: “Our overuse of abbreviations and acronyms is one of several facets of the degraded form of discourse that constitutes communication in the 21st century.”

And there you have it! Acronyms are, in moderation, a bit of a SHAZAM!-like magic transformation. But use them too frequently and it's a SNAFU, with language and culture threatening to go entirely FUBAR. At least, that's my abbreviated take on the matter.

Monday, May 26, 2014

Filler Up!

So, time for some filler. But I don't mean filler of the fluff variety, but rather the nonverbal communication phenomenon, ummm, ahhh, well, like, you know, yeah, okay, right? 

And maybe you're wondering what this has to do with my usual commentary on media, technology, symbolic form, consciousness and culture? If you are, well, you know, I am in the field of communication after all, and nonverbal communication is part of that field. 

And as for media ecology, folks sometimes miss the very wide definition of medium that we use in our intellectual tradition, one that encompasses all modes of communication, including our bodies as media, both sensory organs and physical presence and movement, and also, as a component of speech, the sound of our voices and all of the sounds that come out of us, including those that are not linguistic in nature.

So, back in January I was interviewed by Palash Ghosh on the subject of fillers, and subsequently was quoted in an article published in the International Business Times on January 29, 2014, entitled Like, Uh, You Know: Why Do Americans Say 'You Know' And Use Other Verbal Fillers So Often?

And as is the custom here on Blog Time Passing, you can click on the link and see the article in its original context, or see it now, right here. But before you do, how about I share with you the original interview, since only a small part of it was excerpted? In some of my previous posts, I've done that as well, so that all of the comments I provided that were not used in the article wouldn't go entirely to waste.

This time around, I thought I'd give you the raw material first, which might provide a better angle on how journalists abstract quotes from interviews and construct their articles. And of course it'll give you a better sense of my own thoughts on the matter, which is, after all, what this blog is all about.

So here now is the interview:


Palash Ghosh: Are the use of “filler” words like “you know” more common now among English-speaking people than in years past?

Lance Strate: The term "fillers" is short for "filled pauses," as they are pauses in speaking that use some form of sound instead of silence, which is why they are also known as "vocalized pauses" and, alternately, as "interjections," as they are typically inserted between bits of speech, or sometimes at the beginning or end of an utterance. They are a form of paralanguage, the nonverbal dimension of speech, and every spoken language has its own accompanying paralanguage—the two are inseparable.
The use of fillers in speech is perfectly normal and quite common, and the degree to which they are used today is probably no different than the extent that people used them in the past. It is not the frequency that changes so much as the actual fillers themselves, so that "you know" and "like" became much more common over the past several decades than they were in the past.

Palash Ghosh: What is the reason for the excessive use of 'filler'? Is it nervousness, indecision, lack of confidence, poor vocabulary?

Lance Strate: There is no one reason for their use, but nervousness is certainly one reason, which goes hand in hand with a lack of confidence. Indecision can be a different reason, not just as an expression of hesitancy, but as a means of filling the "dead air" while providing the individuals with a moment to think about what they are going to say next. For that reason, fillers are sometimes an indication that the person is lying, but only sometimes, and that can only be evaluated as part of a larger context, and in conjunction with other nonverbal cues.
And much depends on the speaking context. For example, teachers, professors, lecturers, and the like often make extensive use of fillers to provide space for thinking about what they're going to say, and sometimes that can be a habit as well, but the point is that it in no way is a sign of a poor vocabulary. By the way, politicians often resort to fillers in press conferences, when they are not reading from a prepared statement or speech. Ronald Reagan, who was dubbed "The Great Communicator," was able to speak very fluently when he was reading from a teleprompter, but when he held a press conference and it came time for questions and answers, his use of fillers skyrocketed. Quite often he would respond to a question by beginning with "well" uttered in a long, drawn out manner, which again gave him time to think about what he was going to say.

Palash Ghosh: Do you think phrases like “you know” degrade the language and our communication skills? Do they reflect a decay in western education?

Lance Strate: I think that is much too strong an indictment. As I've said, it's perfectly normal to use fillers, and their use does not reflect a lack of intelligence or education, but what the current state of speech reflects is a decline in emphasis on public speaking. And that is unfortunate, and ought to be rectified. It's not just the use of fillers, but proper pronunciation, the sort of thing they sang about in My Fair Lady, the rain in Spain stays mainly in the plain, the sort of thing that was an ideal in Jesuit education traditionally, known as eloquentia perfecta, perfect eloquence.
But speech used to be taught in the public schools, and it combined proper pronunciation and enunciation, fluency of language, avoiding fillers, and also speaking with lucidity and logic. The ability to stand up and speak in front of people is very important; it is something individuals need to do in a variety of organizational settings, but today most people focus on putting together a good PowerPoint presentation, and not on being able to speak well, and that is truly unfortunate. And it is commonly said that surveys in the US indicate that public speaking is the thing people fear the most, followed in second place by death. And in the context of western education, going back to the medieval trivium, and into the 20th century, yes, it does reflect a serious loss, and one that also impacts reading and writing skills, as vocabulary, for example, is as much a function of the oral as the literate.

Palash Ghosh: How do “fillers” differ from slang?

Lance Strate: Fillers are not really words, at least not when they are used as fillers. Sounds like "um" and "ah" are not words, not symbols that stand for specific meanings, and when words like "well," "like," and "you know" are used as interjections, not used in any relation to their dictionary meaning, then they are not language, they are paralanguage, nonverbal vocalizations.
Slang on the other hand refers to actual words, which is why we can have dictionaries of slang. The old slang word "ain't" means the same as "isn't" for example. The reason why some words are considered slang is that they are only used in spoken language, and have not formally appeared in print. All human cultures have spoken language, but only some have writing systems, and in cultures where there is no writing, there is no slang—the words people speak are the words of the language. So slang does have a vague connection to fillers in that both are related to orality much more so than literacy, but otherwise slang is a form of verbal communication.

Palash Ghosh: Do non-English speakers also use “fillers”?

Lance Strate: Absolutely. Fillers are used in every language group, every dialect, every human population. Or as the saying goes, to fill-err is human...

Palash Ghosh: Is there anything good or beneficial about the use of “filler” words?

Lance Strate: Indeed there is. As I mentioned, it gives speakers time to think about what they are going to say next. And they often function in speech in the same way that punctuation marks function in writing. Moreover, there is an interactive element, providing a space for speakers to see if the persons they are talking to are listening, and for listeners to indicate that they are paying attention and following what is being said, for example, by eye contact, nodding the head, or even responding with another filler like "huh" or "hmmm" or "okay" or "uh huh." Fillers are normal, common, and universal because they are eminently useful to punctuate speech, help speakers compose thoughts and words, and govern interaction among individuals.

Palash Ghosh: Even prominent public figures (including Barack Obama, Hillary Clinton, Caroline Kennedy, movie mogul Harvey Weinstein) frequently use phrases like “you know” in interviews. Does this suggest that even (presumably) well-educated and successful people are not immune to this strange phenomenon? Are verbal skills no longer important even for people who are required to do a lot of public speaking?

Lance Strate: That's right, no one is immune, but there is a difference between, say, teenagers who use fillers frequently in conversation as part of their distinctive mode of talk, people in general in informal conversation, where disfluencies of all sorts are perfectly normal and nothing to be ashamed of or concerned about, and public figures who may utilize fillers for the useful functions they provide. But you are also correct in your implication that there has been a distinct decline in public speaking ability throughout our society, a loss of eloquence, and a loss of the expectation for eloquence, so that someone with as poor a set of verbal skills as George W. Bush could become president, and many even identified with his lack of fluency in the English language (and I am not referring to non-native speakers). While Bush was, in my opinion, unfairly mocked for his deficiencies (that's not something to make fun of, after all), his case does reflect a tragic loss of a vital part of any culture, and there is a relationship between the decline of oratory and spoken language ability in general, and the reduction of so much of our public discourse to infantile babble.


So, I think you can see now how there are connections that can be made to subjects more central to media ecology, such as orality and literacy, and the present condition of public discourse. 

And now this, the actual text of the article as published online:


On the evening of Jan. 17, Hollywood film mogul Harvey Weinstein appeared as a guest on Piers Morgan Live on CNN to discuss his plan to make a movie that will attack the National Rifle Association and to respond to accusations that his films portray the Catholic Church negatively. While the majority of the viewing audience likely focused on the content of Weinstein's replies, a smaller segment of the audience might have been alarmed (or annoyed or amused) by the movie producer's penchant for using the meaningless phrase “you know” in his discourse. Indeed, Weinstein used that term a whopping 84 times during the broadcast.

Linguists call interjections like “you know” and “like” and “um” and “I mean” and a multitude of others “filler” or “discourse particles”–that is, an unconscious device that serves as a pause in the middle of a sentence as the speaker gathers his or her thoughts but wants to maintain the listener’s attention. However, it would appear that such fillers–which have minimal grammatical or lexical value–have infiltrated daily conversations to such an extent that they threaten to further damage the beauty, power and effectiveness of verbal communication.

“Fillers can be overused, making the speaker sound nervous or otherwise unprepared,” wrote Heather Froehlich in the Examiner. “Someone who uses fillers comes off as more informal than intended, creating a dissonance.” Generally, younger people–whose mastery of their own languages is still evolving–tend to use fillers more than the older set, without much recrimination. But among adults, the excessive use of fillers can sometimes indicate personality quirks.

I think you can see from the article's first few paragraphs, as well as from the kinds of questions I was asked in the interview, that there was a negative view of fillers from the beginning, one that I sympathize with, but don't entirely share. So now, here is the first quote excerpted from my interview:


“There is no one reason for [the use of fillers], but nervousness is certainly one reason, which goes hand in hand with lack of confidence,” said Dr. Lance Strate, professor of communication and media studies and associate chair for undergraduate studies at Fordham University in New York. “Indecision can be a different reason, not just as an expression of hesitancy but as a means of filling the ‘dead air’ while providing the individuals with a moment to think about what they are going to say next. For that reason, fillers are sometimes an indication that the person is lying, but only sometimes, and that can only be evaluated as part of a larger context, and in conjunction with other nonverbal cues.”

And now it's on to someone else for another opinion, one that's more on the negative side than mine:


Dr. Stephen Croucher, currently professor of Intercultural Communication at the University of Jyväskylä in Finland, who has studied such speech behaviors, estimates that the use of filler has increased over the past 30 years, with media proliferation and images of what is commonly called “Valley-talk” and “California-speak.”

But J. Mark Fox, a communications professor at Elon University in Elon, N.C., believes that speaking skills are in serious decline in this country and elsewhere. As people develop a speech pattern over time – and unless they make a concentrated effort to avoid them – the filler words become normal, to the point that they do not even know they are using them. “I have asked students many times, ‘Do you know that you said ‘umm’ at the beginning of every sentence?’” Fox said. “Almost always, they admit that they did not know that. Until I point that out to them, they are not conscious of it at all.”

Of course, in a society increasingly dominated by social media and texting, brevity (i.e., terms like “lol” and “omg”) is popular and even valued as a quick and easy method of instant communications. For them, “you know” has become an accepted part of daily speech.

Now, it's back to one of the points that I made in my interview:

Even prominent public figures use fillers quite often, sometimes excessively. Former President Ronald Reagan (nicknamed The Great Communicator) was widely mocked for frequently beginning replies to questions with the ever-popular filler “Well...”

“[Reagan] was able to speak very fluently when he was reading from a teleprompter, but when he held a press conference and it came time for questions and answers, his uses of fillers skyrocketed,” said Strate. “Quite often he would respond to a question by beginning with ‘well’ uttered in a long, drawn-out manner, which again gave him time to think about what he was going to say.”

But back now to the theme of how we've been going to hell in a handbasket, filler-wise:

On the other end of the political spectrum, Caroline Kennedy, the daughter of President John F. Kennedy and the current U.S. ambassador to Japan, may be the all-time “filler champion.” In an interview with the New York Times in December 2008 (while she was pondering running for the Senate), Caroline Kennedy used the filler "you know" an astounding 142 times in what was essentially a 20- to 30-minute interview.

That infamous interview, with the Times' Nicholas Confessore and David M. Halbfinger, included the following incomprehensible piece of verbosity from Caroline: “So I think in many ways, you know, we want to have all kinds of different voices, you know, representing us, and I think what I bring to it is, you know, my experience as a mother, as a woman, as a lawyer, you know, I've been an education activist for the last six years here, and, you know, I've written seven books–two on the Constitution, two on American politics. So obviously, you know, we have different strengths and weaknesses." Caroline’s hopes for a Senate seat never came to fruition.

The current occupant of the White House, Barack Obama, is also enamored of fillers, though not to the same extent as Caroline Kennedy. Obama and Kennedy are both highly educated people who achieved great success–yet, their public speaking skills leave much to be desired. In this two-part interview with Chris Cuomo of CNN in August 2013, Obama used the filler “you know” (both in the beginning of a sentence and elsewhere) 29 times.

In another interview with George Stephanopoulos of ABC in March 2013, Obama used “you know” no fewer than 43 times, including four times in one paragraph.
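
As an aside, if you're wondering where tallies like 84, 142, 29, and 43 come from, counting a filler phrase in a transcript is a purely mechanical task. Here's a minimal sketch in Python; the transcript file name is just a stand-in, and the count in the comment is the article's figure, not anything I've verified:

```python
import re

def count_filler(transcript: str, filler: str = "you know") -> int:
    """Count occurrences of a filler phrase, ignoring case and
    matching whole words, so 'like' won't match inside 'likely'."""
    pattern = re.compile(r"\b" + re.escape(filler) + r"\b", re.IGNORECASE)
    return len(pattern.findall(transcript))

# Hypothetical usage, assuming a saved interview transcript:
with open("interview_transcript.txt", encoding="utf-8") as f:
    text = f.read()
print(count_filler(text))            # e.g., 142 for the Kennedy interview
print(count_filler(text, "well"))    # any other filler works the same way
```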

Okay, so in all fairness now, my view that the use of fillers, however undesirable, does not represent the decline and fall of western civilization actually gets some attention at this point:

So, why has this practice of using meaningless interjections and verbal pauses–even among well-educated and powerful people–become so widespread? And is it really even a “problem”?

Strate doesn’t think the use of fillers necessarily spells a death knell for language and communication skills. He explained that fillers are a form of “paralanguage,” the nonverbal dimension of speech, and that every spoken language has its own accompanying paralanguage–indeed, the two are inseparable. Moreover, the “use of fillers in speech is perfectly normal and quite common, and the degree to which they are used today is probably no different than the extent that people used them in the past,” he said in an interview. “It is not the frequency that changes so much as the actual fillers themselves, so that ‘you know’ and ‘like’ became much more common over the past several decades than they were in the past.”

One must also consider the context–especially for teachers, professors and lecturers, whose jobs demand they speak in public, often for long durations, leading them to “often make extensive use of fillers to provide space for thinking about what they're going to say, and sometimes that can be a habit as well, but the point is that it in no way is a sign of a poor vocabulary.” Strate posits that the expanded use of fillers reflects not a decay in education but a decline in the emphasis on public speaking. “And that is unfortunate, and ought to be rectified,” he said. “It's not just the use of fillers, but proper pronunciation, the sort of thing they sang about in ‘My Fair Lady.’”

Strate further noted that speech used to be taught in the public schools, which combined lessons in proper pronunciation and enunciation, fluency of language, avoiding fillers, and also speaking with lucidity and logic. “But today most people focus on putting together a good PowerPoint presentation, and not on being able to speak well, and that is truly unfortunate,” he added.

I think my view is the more nuanced and balanced, but of course I admit to being entirely biased on the matter. The article now returns to the more entirely negative orientation it began with:

Fox at Elon takes a dimmer view of the widespread use of fillers in daily speech, noting that it reflects a decline in Western education. “I tell my students that one of the reasons they want to learn to speak without fillers is that they [excessive filler words] give the impression to the [listener] that the speaker is not very intelligent, even though they may be extremely bright,” he said. As for high-profile public figures like Obama, Kennedy and Weinstein speaking with so many fillers, Fox lamented that no one is immune from this behavior. “All of them are products of today’s educational system, which, let’s face it, is not anywhere near what it used to be, as standards and requirements have slid,” he said. “Verbal skills are just as important today; that means that those who have them will rise to the top of almost any profession.”

And let's get a little bit more from me on the role of the electronic media in making this an issue in the first place:


To be fair, there is something else to be considered here: As the private lives of public figures are increasingly out in the open, so, too, are their words increasingly uncensored, for better or worse. In the past, print media edited out any fillers that an interviewee might have uttered, Strate noted. “The continued expansion of broadcasting and other forms of audiovisual recording and transmission make it harder to filter out the fillers, making any disfluencies of public figures much more apparent and well known,” he explained. “The electronic media are biased toward more informal formats than public speaking. They prefer more conversational formats such as interviews, and public figures by necessity need to rely on fillers in those kinds of formats. And reflecting the new electronic media environment, print media sometimes include the fillers too, no longer covering for the speaker in this sense, and perhaps as a form of criticism or mockery, but also with the effect of presenting fillers as a normal and accepted element of communication.”

In any case, for the record, ECG, a strategic communications consultancy, made the following admonition about fillers: “Fillers distract. They drown your message. They impair your delivery by diminishing your ability to align pacing, pauses and vocal variation to content. They make you seem uncertain, unprepared and unknowledgeable. They take up time and add no value.”


And so we end, once again, on an entirely negative note. And maybe it's because I use more than a few fillers myself when I'm teaching or talking on a panel where I'm not reading from a prepared speech, but I think you can see that there are some differences of opinion on the topic of fillers. So I guess you could say that it's a bit of an issue. Not much of one, to be sure; it may just be a very slightly controversial subject, but it is an interesting topic to talk about, after all. Ummm, ahhh, you know, okay?

Wednesday, May 21, 2014

Amazing Ourselves to Death

So, the major milestone of the past few months has been the publication of my new book, Amazing Ourselves to Death: Neil Postman's Brave New World Revisited. Have you ordered your copy yet?

So, here is the rather overblown write-up from the back of the book:

Neil Postman’s most popular work, Amusing Ourselves to Death (1985), provided an insightful critique of the effects of television on public discourse in America, arguing that television’s bias towards entertaining content trivializes serious issues and undermines the basis of democratic culture.

Lance Strate, who earned his doctorate under Neil Postman and is one of the leading media ecology scholars of our time, re-examines Postman’s arguments, updating his analysis and critique for the twenty-first-century media environment that includes the expansion of television programming via cable and satellite as well as the Internet, the web, social media, and mobile technologies.

Integrating Postman’s arguments about television with his critique of technology in general, Strate considers the current state of journalism, politics, religion, and education in American culture. Strate also contextualizes Amusing Ourselves to Death through an examination of Postman’s life and career and the field of media ecology that Postman introduced.

This is a book about our prospects for the future, which can only be based on the ways in which we think and talk about the present.


And here are the two blurbs that accompany it, for which I am truly grateful, and humbled:

"When Neil Postman's Amusing Ourselves to Death is brought into the classroom, or given as a gift, or handed from one reader to another, a problem is created: into what frame should we place this book? For that’s how unique it is. Lance Strate has solved that problem by writing a graceful and learned companion to Postman’s original. It doubles as a biographical sketch of a great man and his intellectual times. It is also an act of love. And if you love the book it’s about, you will be grateful for Strate’s Amazing Ourselves to Death. I am. And I highly recommend it." —Jay Rosen, Professor of Journalism, New York University

"Lance Strate masterfully brings to a new generation, and a new century, Neil Postman’s enlightening and essential insights into the ways that our uses of media reflect and reshape our society. He further shows how we can reclaim control, so we can use the ever-evolving media rather than letting them use us." —Deborah Tannen, University Professor and Professor of Linguistics, Georgetown University


And finally, here's the About the Author bit:

Lance Strate studied with Neil Postman at New York University, where he earned his Ph.D., and is currently Professor of Communication and Media Studies at Fordham University. The author of Echoes and Reflections and On the Binding Biases of Time, he is a recipient of the Media Ecology Association's Walter Ong Award for Career Achievement in Scholarship.


And for a bit of background on how the book came to be, it all started when I was contacted by David Park, who edits a series called "A Critical Introduction to Media and Communication Theory" for Peter Lang, an academic publisher. Working with Peter Lang's Acquisitions Editor Mary Savigar, David asked me if I knew of anyone who might be willing and able to re-examine Postman's arguments in light of the changes to our media environment over the last three decades.

It was an offer I couldn't refuse, especially since I had the perfect title, one that I had used for a public lecture I gave at Medaille College in 2007, at the invitation of philosophy professor Gerald Erion. Amazing Ourselves to Death struck me as a good way to combine Postman's arguments about television in Amusing Ourselves to Death with his critique of our love affair with technology, and especially information technology, in Technopoly. And given that Postman began Amusing Ourselves to Death by arguing that Huxley's dystopia better fit late 20th century American culture than Orwell's, and that Huxley had followed up on his 1932 novel, Brave New World, with a set of essays entitled Brave New World Revisited in 1958, the subtitle for my book—Neil Postman's Brave New World Revisited—was easy enough to come up with.

And now this, the Table of Contents from the book:


Acknowledgments

Foreword

Part I

Chapter 1: Fatal Amusements
Chapter 2: Building a Bridge to Neil Postman
Chapter 3: Media Ecology as a Scholarly Activity
Chapter 4: The Evolving American Media Environment

Part II

Chapter 5: Breaking the News
Chapter 6: The Tribe Has Spoken
Chapter 7: Neon Gods
Chapter 8: Grand Theft Education
Chapter 9: The Tempest

References
Index

And here are the first two paragraphs from the Foreword:

I imagine there are two kinds of readers of this book, those who have already read Neil Postman's Amusing Ourselves to Death: Public Discourse in the Age of Show Business (1985), and those who have not. For those who have not, my goal is to provide you with a summary of Postman's arguments concerning the negative effects of the television medium, and technology more generally, on public discourse and social institutions, along with a demonstration of their continued relevance to our contemporary culture and media environment. I know there are some who inevitably question the value and validity of a book that is, as of this writing, almost thirty years old, and not getting any younger, and would perhaps remain unmoved by a reminder that we still study Plato's writings from the 4th century BCE. And there is no denying the fact that Amusing Ourselves to Death does not take into account the Internet, web, social media, and mobile technology, let alone the explosive growth of programming options made available via cable and satellite television, while the Reagan-era culture that Postman critiques continues to recede into the past. In presenting you with an updated analysis, I realize that the passage of time will render my references increasingly less relevant as well. For this reason, my intent is also to present Postman's overall approach, grounded in the field of media ecology, and show how it can continue to be applied in the future. Of course, if you have not read Amusing Ourselves to Death yet, I hope that this book will convince you to do so, and will enhance your reading when you do.

Readers already familiar with Postman are aware of his exceptional eloquence, a standard that I make no claims of approaching. Postman wrote for a general readership, addressing major issues and concerns of his time, and like many of his other books, Amusing Ourselves to Death is best understood as an extended essay, meant to stand on its own. In taking a scholarly approach to Postman's work, I have endeavored to relate Amusing Ourselves to Death to Postman's other books, especially Technopoly (1992). This is also essential to the task of updating Postman's arguments to take into consideration computers, information technology, and new media, and the proliferation of technology in general. I have chosen the title Amazing Ourselves to Death to reflect this larger scope, and the fact that it is ultimately our innovations in media and technology that are the cause for considerable concern. The subtitle, Neil Postman's Brave New World Revisited, alludes to Aldous Huxley's set of essays, Brave New World Revisited (1958), reflections on his novel, Brave New World (1932), which Postman highlights as prescient in its warnings of a future in which freedom is sacrificed for the sake of fun. In addition to situating Amusing Ourselves to Death within Postman's entire body of work, I have further endeavored to contextualize his arguments through a biographical sketch, and a general discussion of the field of media ecology with which he was associated. All of these subjects require much fuller treatment than can be accorded here, but I hope that what I have provided will be sufficient as a starting point for further investigation.


And we'll leave it at that, at least for now, but I'll be posting more on the topic in the near future.

Sunday, May 18, 2014

Hiatus, Discontinuity, and Change

Well, it has been a while since I last posted here on Blog Time Passing. I suppose it goes without saying that I've been busy, but, hey, excuses, excuses. So there's a lot that I have to catch up on, and in thinking about how to get started on that, it seemed altogether appropriate to begin with my last guest post written for the Hannah Arendt Center blog's Quote of the Week feature, Hiatus, Discontinuity, and Change. It was originally posted on April 14th, and once more I am grateful to Bridget Hollenback for selecting the illustrations, which I've included here.

"The end of the old is not necessarily the beginning of the new."

Hannah Arendt, The Life of the Mind

This is a simple enough statement, and yet it masks a profound truth, one that we often overlook out of the very human tendency to seek consistency and connection, to make order out of the chaos of reality, and to ignore the anomalous nature of that which lies in between whatever phenomena we are attending to.

Perhaps the clearest example of this has been what proved to be the unfounded optimism that greeted the overthrow of autocratic regimes through American intervention in Afghanistan and Iraq, and the native-born movements known collectively as the Arab Spring. It is one thing to disrupt the status quo, to overthrow an unpopular and undemocratic regime. But that end does not necessarily lead to the establishment of a new, beneficent and participatory political structure. We see this time and time again, now in Putin's Russia, a century ago with the Russian Revolution, and over two centuries ago with the French Revolution.

Of course, it has long been understood that oftentimes, to begin something new, we first have to put an end to something old. The popular saying that you can't make an omelet without breaking a few eggs reflects this understanding, although it is certainly not the case that breaking eggs will inevitably and automatically lead to the creation of an omelet. Breaking eggs is a necessary but not sufficient cause of omelets, and while this is not an example of the classic chicken and egg problem, I think we can imagine that the chicken might have something to say on the matter of breaking eggs. Certainly, the chicken would have a different view on what is signified or ought to be signified by the end of the old, meaning the end of the egg shell, insofar as you can't make a chicken without it first breaking out of the egg that it took form within.

[image: eggs]

So, whether you take the chicken's point of view, or adopt the perspective of the omelet, looking backwards, reverse engineering the current situation, it is only natural to view the beginning of the new as an effect brought into being by the end of the old, to assume or make an inference based on sequencing in time, to posit a causal relationship and commit the logical fallacy of post hoc ergo propter hoc, if for no other reason than the force of narrative logic that compels us to create a coherent storyline. In this respect, Arendt points to the foundation tales of ancient Israel and Rome:


We have the Biblical story of the exodus of Israeli tribes from Egypt, which preceded the Mosaic legislation constituting the Hebrew people, and Virgil's story of the wanderings of Aeneas, which led to the foundation of Rome—"dum conderet urbem," as Virgil defines the content of his great poem even in its first lines. Both legends begin with an act of liberation, the flight from oppression and slavery in Egypt and the flight from burning Troy (that is, from annihilation); and in both instances this act is told from the perspective of a new freedom, the conquest of a new "promised land" that offers more than Egypt's fleshpots and the foundation of a new City that is prepared for by a war destined to undo the Trojan war, so that the order of events as laid down by Homer could be reversed.

Fast forward to the American Revolution, and we find that the founders of the republic, mindful of the uniqueness of their undertaking, searched for archetypes in the ancient world. And what they found in the narratives of Exodus and the Aeneid was that the act of liberation and the establishment of a new freedom are two events, not one, and in effect subject to Alfred Korzybski's non-Aristotelian Principle of Non-Identity. The success of the formation of the American republic can be attributed to the founders' awareness of the chasm that exists between the closing of one era and the opening of a new age, of their separation in time and space:


No doubt if we read these legends as tales, there is a world of difference between the aimless desperate wanderings of the Israeli tribes in the desert after the Exodus and the marvelously colorful tales of the adventures of Aeneas and his fellow Trojans; but to the men of action of later generations who ransacked the archives of antiquity for paradigms to guide their own intentions, this was not decisive. What was decisive was that there was a hiatus between disaster and salvation, between liberation from the old order and the new freedom, embodied in a novus ordo saeclorum, a "new world order of the ages" with whose rise the world had structurally changed.

I find Arendt's use of the term hiatus interesting, given that in contemporary American culture it has largely been appropriated by the television industry to refer to a series that has been taken off the air for a period of time, but not cancelled. The typical phrase is on hiatus, meaning on a break or on vacation. But Arendt reminds us that such connotations only scratch the surface of the word's broader meanings. The Latin word hiatus refers to an opening or rupture, a physical break or missing part or link in a concrete material object. As such, it becomes a spatial metaphor when applied to an interruption or break in time, a usage introduced in the 17th century. Interestingly, this coincides with the period in English history known as the Interregnum, which began in 1649 with the execution of King Charles I, led to Oliver Cromwell's installation as Lord Protector, and ended after Cromwell's death with the Restoration of the monarchy under Charles II, son of Charles I. While in some ways anticipating the American Revolution, the English Civil War followed an older pattern, one that Mircea Eliade referred to as the myth of eternal return, a circular movement rather than the linear progression of history and cause-effect relations.

The idea of moving forward, of progress, requires a future-orientation that only comes into being in the modern age, by which I mean the era that followed the printing revolution associated with Johannes Gutenberg (I discuss this in my book, On the Binding Biases of Time and Other Essays on General Semantics and Media Ecology). But that same print culture also gave rise to modern science, and with it the monopoly granted to efficient causality, cause-effect relations, to the exclusion in particular of final and formal cause (see Marshall and Eric McLuhan's Media and Formal Cause). This is the basis of the Newtonian universe in which every action has an equal and opposite reaction, and every effect can be linked back in a causal chain to another event that preceded it and brought it into being. The view of time as continuous and connected can be traced back to the introduction of the mechanical clock in the 13th century, but was solidified through the printing of calendars and time lines, and the same effect was created in spatial terms by the reproduction of maps, and the use of spatial grids, e.g., the Mercator projection.

And while the invention of history, as a written narrative of linear progression over time, can be traced back to the ancient Israelites and the story of the exodus, that story itself incorporates the idea of a hiatus in overlapping structures:

A1.  Joseph is the golden boy, the son favored by his father Jacob, earning him the enmity of his brothers
A2.  he is sold into slavery by them, winds up in Egypt as a slave and then is falsely accused and imprisoned
A3.  by virtue of his ability to interpret dreams he gains his freedom and rises to the position of Pharaoh's prime minister

B1.  Joseph welcomes his brothers and father, and the House of Israel goes down to Egypt to sojourn due to famine in the land of Canaan
B2.  their descendants are enslaved, oppressed, and persecuted
B3.  Moses is chosen to confront Pharaoh, liberate the Israelites, and lead them on their journey through the desert

C1.  the Israelites are freed from bondage and escape from Egypt
C2.  the revelation at Sinai fully establishes their covenant with God
C3.  after many trials, they return to the Promised Land

It can be clearly seen in these narrative structures that the role of the hiatus, in ritual terms, is that of the rite of passage, the initiation period that marks, in symbolic fashion, the change in status, the transformation from one social role or state of being to another (e.g., child to adult, outsider to member of the group). This is not to discount the role that actual trials, tests, and other hardships may play in the transition, as they serve to establish or reinforce, psychologically and sometimes physically, the value and reality of the transformation.

In mythic terms, this structure has become known as the hero's journey or hero's adventure, made famous by Joseph Campbell in The Hero with a Thousand Faces, and also known as the monomyth, because he claimed that the same basic structure is universal to all cultures. The basic structure he identified consists of three main elements: separation (e.g., the hero leaves home), initiation (e.g., the hero enters another realm, experiences tests and trials, leading to the bestowing of gifts, abilities, and/or a new status), and return (the hero returns to utilize what he has gained from the initiation and save the day, restoring the status quo or establishing a new status quo).

Understanding the mythic, non-rational element of initiation is the key to recognizing the role of the hiatus, and in the modern era this meant using rationality to realize the limits of rationality. With this in mind, let me return to the quote I began this essay with, but now provide the larger context of the entire paragraph:


The legendary hiatus between a no-more and a not-yet clearly indicated that freedom would not be the automatic result of liberation, that the end of the old is not necessarily the beginning of the new, that the notion of an all-powerful time continuum is an illusion. Tales of a transitory period—from bondage to freedom, from disaster to salvation—were all the more appealing because the legends chiefly concerned the deeds of great leaders, persons of world-historic significance who appeared on the stage of history precisely during such gaps of historical time. All those who, pressed by exterior circumstances or motivated by radical utopian thought-trains, were not satisfied to change the world by the gradual reform of an old order (and this rejection of the gradual was precisely what transformed the men of action of the eighteenth century, the first century of a fully secularized intellectual elite, into the men of the revolutions) were almost logically forced to accept the possibility of a hiatus in the continuous flow of temporal sequence.

Note that concept of gaps in historical time, which brings to mind Eliade's distinction between the sacred and the profane. Historical time is a form of profane time, and sacred time represents a gap or break in that linear progression, one that takes us outside of history, connecting us instead in an eternal return to the time associated with a moment of creation or foundation. The revelation at Sinai is an example of such a time, and accordingly Deuteronomy states that all of the members of the House of Israel were present at that event, not only those alive at the time, but also those not yet born, the generations of the future. This statement is included in the liturgy of the Passover Seder, which is a ritual reenactment of the exodus and revelation, which in turn becomes part of the reenactment of the Passion in Christianity, one of the primary examples of Campbell's monomyth.

Arendt's hiatus, then, represents a rupture between two different states or stages, an interruption, a disruption linked to an eruption. In the parlance of chaos and complexity theory, it is a bifurcation point. Arendt's contemporary, Peter Drucker, a philosopher who pioneered the scholarly study of business and management, characterized the contemporary zeitgeist in the title of his 1969 book: The Age of Discontinuity. It is an age in which Newtonian physics was replaced by Einstein's relativity and Heisenberg's uncertainty, the phrase quantum leap becoming a metaphor drawn from subatomic physics for all forms of discontinuity. It is an age in which the fixed point of view, which had given us perspective in art and the essay and the novel in literature, yielded to Cubism and subsequent forms of modern art, and to stream of consciousness in writing.

Beginning in the 19th century, photography gave us the frozen, discontinuous moment, and the technique of montage in the motion picture gave us a series of shots and scenes whose connections have to be filled in by the audience. Telegraphy gave us the instantaneous transmission of messages that took them out of their natural context, the subject of the famous comment by Henry David Thoreau that connecting Maine and Texas to one another will not guarantee that they have anything sensible to share with each other. The wire services gave us the nonlinear, inverted pyramid style of newspaper reporting, which also was associated with the nonlinear look of the newspaper front page, a form that Marshall McLuhan referred to as a mosaic. Neil Postman criticized television's role in decontextualizing public discourse in Amusing Ourselves to Death, where he used the phrase, "in the context of no context," and I discuss this as well in my recently published follow-up to his work, Amazing Ourselves to Death.

The concept of the hiatus comes naturally to the premodern mind, schooled by myth and ritual within the context of oral culture. That same concept is repressed, in turn, by the modern mind, shaped by the linearity and rationality of literacy and typography. 

As the modern mind yields to a new, postmodern alternative, one that emerges out of the electronic media environment, we see the return of the repressed in the idea of the jump cut writ large.

There is psychological satisfaction in the deterministic view of history as the inevitable result of cause-effect relations in the Newtonian sense, as this provides a sense of closure and coherence consistent with the typographic mindset. And there is similar satisfaction in the view of history as entirely consisting of human decisions that are the product of free will, of human agency unfettered by outside constraints, which is also consistent with the individualism that emerges out of the literate mindset and print culture, and with a social rather than physical version of efficient causality. What we are only beginning to come to terms with is the understanding of formal causality, as discussed by Marshall and Eric McLuhan in Media and Formal Cause. What formal causality suggests is that history has a tendency to follow certain patterns, patterns that connect one state or stage to another, patterns that repeat again and again over time. This is the notion that history repeats itself, meaning that historical events tend to fall into certain patterns (repetition being the precondition for the existence of patterns), and that the goal, as McLuhan articulated in Understanding Media, is pattern recognition. This helps to clarify the famous remark by George Santayana, "those who cannot remember the past are condemned to repeat it." In other words, those who are blind to patterns will find it difficult to break out of them.

Campbell engages in pattern recognition in his identification of the heroic monomyth, as Arendt does in her discussion of the historical hiatus. Recognizing the patterns is the first step in escaping them, and may even allow for the possibility of taking control and influencing them. This also means understanding that the tendency for phenomena to fall into patterns is a powerful one. It is a force akin to entropy, and perhaps a result of that very statistical tendency that is expressed by the Second Law of Thermodynamics, as Terrence Deacon argues in Incomplete Nature. It follows that there are only certain points in history, certain moments, certain bifurcation points, when it is possible to make a difference, or to make a difference that makes a difference, to use Gregory Bateson's formulation, and change the course of history. The moment of transition, of initiation, the hiatus, represents such a moment.

McLuhan's concept of medium goes far beyond the ordinary sense of the word, as he relates it to the idea of gaps and intervals, the ground that surrounds the figure, and explains that his philosophy of media is not about transportation (of information), but transformation. The medium is the hiatus.

The particular pattern that has come to the fore in our time is that of the network, whether it's the decentralized computer network and the internet as the network of networks, or the highly centralized and hierarchical broadcast network, or the interpersonal network associated with Stanley Milgram's research (popularly known as six degrees of separation), or the neural networks that define brain structure and function, or social networking sites such as Facebook and Twitter, etc. And it is not the nodes, which may be considered the content of the network, that define the network, but the links that connect them, which function as the network medium, and which, in the systems view favored by Bateson, provide the structure for the network system, the interaction or relationship between the nodes. What matters is not the nodes, it's the modes.

Hiatus and link may seem like polar opposites, the break and the bridge, but they are two sides of the same coin, the medium that goes between, simultaneously separating and connecting. The boundary divides the system from its environment, allowing the system to maintain its identity as separate and distinct from the environment, keeping it from being absorbed by the environment. But the boundary, like a membrane, also serves as a filter, engaged in the process of abstracting, to use Korzybski's favored term, letting through or bringing in material, energy, and information from the environment so that the system can maintain itself and survive. The boundary keeps the system in touch with its situation, keeps it contextualized within its environment.

The systems view emphasizes space over time, as does ecology, but the concept of the hiatus as a temporal interruption suggests an association with evolution as well. Darwin's view of evolution as continuous was consistent with Newtonian physics. The more recent modification of evolutionary theory put forth by Stephen Jay Gould (together with Niles Eldredge), known as punctuated equilibrium, suggests that evolution occurs in fits and starts, in relatively rare and isolated periods of major change, surrounded by long periods of relative stability and stasis. Not surprisingly, this particular conception of discontinuity was introduced during the television era, in the early 1970s, just a few years after the publication of Peter Drucker's The Age of Discontinuity.

When you consider the extraordinary changes that we are experiencing in our time, technologically and ecologically, the latter underlined by the recent news concerning the United Nations' latest report on global warming, what we need is an understanding of the concept of change: a way to study the patterns of change that exist and persist across different levels, the micro and the macro, the physical, chemical, biological, psychological, and social, what Bateson referred to as metapatterns, the subject of further elaboration by the biologist Tyler Volk in his book on the subject. Paul Watzlawick argued for the need to study change in and of itself in a little book co-authored with John H. Weakland and Richard Fisch, entitled Change: Principles of Problem Formation and Problem Resolution, which considers the problem from the point of view of psychotherapy. Arendt gives us a philosophical entrée into the problem by introducing the pattern of the hiatus, the moment of discontinuity that leads to change, and possibly a moment in which we, as human agents, can have an influence on the direction of that change.

To have such an influence, we do need to have that break, to find a space and more importantly a time to pause and reflect, to evaluate and formulate. Arendt famously emphasizes the importance of thinking in and of itself, the importance not of the content of thought alone, but of the act of thinking, the medium of thinking, which requires an opening, a time out, a respite from the onslaught of 24/7/365. This underscores the value of sacred time, and it is no accident that during the period of initiation in the story of the exodus, there is the revelation at Sinai and the gift of divine law, the Torah; chief among its laws are the Ten Commandments, including the fourth and the one presented in greatest detail: to observe the Sabbath day. This premodern ritual requires us to make the hiatus a regular part of our lives, to break the continuity of profane time on a weekly basis. From that foundation, other commandments establish the idea of the sabbatical year, and the sabbatical of sabbaticals, or jubilee year. Whether it's a Sabbath mandated by religious observance, or a new movement to engage in a Technology Sabbath, the hiatus functions as the response to the homogenization of time that was associated with efficient causality and literate linearity, and that continues to intensify in conjunction with the technological imperative of efficiency über alles.

To return one last time to the quote that I began with, the end of the old is not necessarily the beginning of the new because there may not be a new beginning at all; there may not be anything new to take the place of the old. The end of the old may be just that, the end, period, the end of it all. The presence of a hiatus following the end of the old serves as a promise that something new will begin to take its place after the hiatus is over. And the presence of a hiatus in our lives, individually and collectively, may also serve as a promise that we will not inevitably rush towards an end of the old that will also be an end of it all, that we will be able to find the opening to begin something new, that we will be able to make the transition to something better, that both survival and progress are possible, through an understanding of the processes of continuity and change.