Sunday, August 31, 2014

Time Warped

So, time being a theme that I come back to here from time to time, at least in passing, I thought I'd share this video that came to my attention via our friends at Brain Pickings, in a post entitled The Science of Our Warped Perception of Time, Animated. The video, posted by BrainCraft, and entitled Your Warped Perception of Time, is quite well done indeed, and in my opinion, worth a few minutes of our time:

So, there is a psychological parallel to the relativity of time in physics. This no doubt is related to the sense that people have that time slows down during an accident. What is left unsaid is what accounts for this phenomenon.

Perhaps it has to do with the time lag that inevitably exists between events occurring in the world and our perception of them. After all, it takes time for the light rays and ripples of sound to hit our eyes and ears, a little more time for those stimuli to cause nerve endings to fire, a little more time for those signals to reach the brain, and a little more time for the brain to process them. The delay may be minuscule, but it is a delay.

So, perhaps there is an ordinary delay that is sufficient for most events occurring in everyday life, and then an expedited process of perception that occurs when there's potential danger, i.e., something is coming at us, passing through our various distance zones from public to social to personal to intimate space. So the nervous system speeds up in response to a threat, which in turn provides the illusion of time slowing down.

Of course, this doesn't account for time seeming to slow down when we're bored and speed up when we're having fun. Could it be that our nervous systems register boredom as a threat? You would think so from the way that we have developed so many forms of entertainment and technologies with the purpose of keeping ourselves amused, especially apparent in the way we all whip out our smart phones these days when encountering the slightest bit of down time. We may be amusing ourselves to death, but apparently our brains find that preferable to the threat of being bored to death, perhaps working under the false presumption that it is either one or the other.

Well, my time's up, so I'll be taking my leave. I don't want you to think I'm a danger to you, after all, even if, maybe I am?

Thursday, August 28, 2014

General Semantics in India

So, I was pleased to receive a new volume in the mail, all the way from Delhi, an anthology edited by Deepa Mishra, a professor of English at the University of Mumbai, entitled General Semantics: A Critical Companion. I'd give you the link for Amazon, but unfortunately the book doesn't seem to be available there; you can, however, order it from the Kaveri Book Service, where it seems to be very reasonably priced ($13.95 list, discounted to $12.56, as of today, but I don't know what they're charging for shipping and handling).

So, I had the sudden urge to use the "Add a Photo" function on Twitter. Yeah, I know it's old hat, but I've generally avoided that sort of thing. But since there wasn't an Amazon page with the book cover, I figured, let me do one of those twitpics for once.  So here it is:

So like, yeah, ummm, what's wrong with this picture? I took it via Mac's Photo Booth application, and I should have remembered from the last time I used it that it produces mirror images:

Well, just another reminder that the map is not the territory, and maybe I'll have better luck (or memory) next time. In any event, with a little help from Mac's Preview application, let's straighten things out:

Yes, that's much better, don't you think? So let me share a little more, but now via my iPhone's TurboScan app. First, here's the front inside cover book flap, or whatever you call it:

And let's get the back cover inside book flap thing as well:

Not too shabby, eh? And let's include the title page while we're at it:

And let's flip over from the recto to the verso side:

Now that you have the basic information, I want to share the fact that I was very touched by the dedication page:

I met Balvant back in 2009, when I was his guest, and gave an address and a workshop at the Balvant Parekh Centre in Baroda (see my previous posts, A Workshop in Baroda and Just a Few More Images From India). Here's a photo from then, with Balvant on the left, and his brother NK on the right:

I was saddened to learn of his passing last year. One of the amazing things I learned when I met with him was that he first learned about general semantics and became interested in the subject by reading the journal ETC: A Review of General Semantics back when Neil Postman was editor (which ties us back in to my earlier reverso photo of Amazing Ourselves to Death: Neil Postman's Brave New World Revisited). Anyway, there is a nice write-up about the book over on the Balvant Parekh Centre website, where it says,

As a tribute to Late Balvant K. Parekh who believed in sharing the wealth of knowledge, Deepa Mishra, Associate Professor of English at CHM College has collected and edited articles from scholars who have been studying and doing general semantics for many decades, significant academicians from humanities and social sciences, students, researchers and new entrants to the discipline/ method of general semantics.

There's a picture of Dr. Mishra on the site as well:

And the page also included her own write-up about the book:

General Semantics should not be confused as any particular philosophy or discipline; it can be best regarded as a way of life, which if practiced, can truly open up the latent, but enormous human potential. Breaking away from the Aristotelian tradition which is believed to have casted human thinking in a particular mould, General Semantics does not ground itself in any metaphysical or religious contemplation. It rather draws its conclusions from the rich human experience over the last few thousand years and defines a unique way of engaging with the world around us. This engagement is not only non-conflicting, but also complementary in nature, promising to bind time in an inclusive and forward looking manner; generation after generation. These concepts were initiated by Alfred Korzybski in Manhood of Humanity and subsequently elaborated in much greater detail in his magnum opus, Science and Sanity. However, over the last 90 years, more particularly in the 64 years after Korzybski’s death, his ideas have got limited scholarly and empirical attention, restricted, by and large, only to the United States of America.

The efforts of Balvant Parekh Center for General Semantics and Other Human Sciences can be best regarded as “work-in-progress,” as it is trying to create a new kind of intensity around this early twentieth century thought-innovation. This book is an attempt to capture and showcase some of the recent discussions. While a few of the essays provide a short and precise detour, many of them try to see these thoughts through the prism of the contemporary times. In summary, it is an effective grouping of the old and the new, refreshing to the reader and more importantly, it attempts to create a familiarizing orientation for any new reader.

Of course, no doubt what you would really want to know is, what's in the book, and who the contributors are. Well, depending on your background, you should find at least one familiar name, and maybe a few others. Here are the Contents pages:

So, all in all a very nice volume indeed! Congratulations, Professor Mishra, on a great contribution to humanity's time-binding efforts, and a job well done!

Wednesday, August 20, 2014

The Rule Against Drinking on TV

Did you know that little Doogie Howser, MD is old enough to drink? Well, of course, it's been many years since that sitcom first aired, back in September of 1989, and over two decades since it went off the air, in July of 1993. Since then, the star of the show, Neil Patrick Harris, has gone on to bigger and better things, and emerged as a major comedic talent.

But, it seems, he's not too big to shill for the advertising industry, and in particular for beer commercials. Perhaps you've seen his recent starring role as a pitchman for Heineken Light beer?

Hey, there's no question that beer commercials have a long history of being some of the most amusing forms of advertising you can find on American television. But that doesn't change the fact that the product they're selling, alcoholic beverages, is associated with negative effects that are far from funny or entertaining. 

And as you may know, back in 1987 I co-authored a research report based on an analysis of the myths and cultural meanings of beer commercials: Myths, Men & Beer: An Analysis of Beer Commercials on Broadcast Television, 1987, by Neil Postman, Christine Nystrom, Lance Strate, Charles Weingartner. You can download a scanned PDF of the publication from ERIC, just click here.  I'm not sure exactly how many print copies were distributed and sold by the research sponsor, the American Automobile Association Foundation for Traffic Safety, but I know it numbers in the tens of thousands, at least. I also published several articles and chapters on the subject as a follow-up, one of the lesser known ones can be found online here.

Anyway, getting back to this particular commercial, let me put aside the way that the ad associates beer with a disregard for rules, and how that relates to the American cultural myth of masculinity, with rules seen as a challenge to be overcome rather than a structure to honor and work within. The romantic notion of being a rule-breaker may be a staple of hero narratives, but it is particularly problematic when it comes to concerns such as underage drinking, and drinking and driving. Instead, I want to focus on the fact that this ad raises the question of why you never see anyone actually drinking beer in television commercials. The ad seems to suggest that it is due to regulations, which implies federal legislation passed by Congress, or policy adopted by the Federal Communications Commission or the Federal Trade Commission. Many people out there seem to believe this is the case, and see it as another instance of unwarranted and unwanted government intrusion on the private sector.

And they're wrong! The government has nothing to do with it, neither the executive nor the legislative branch. As it turns out, the rule originates with the broadcasting industry. Now, let me note that this question was brought to my attention by Jon Greenberg, a staff writer for PunditFact, which according to their website,

is a project of the Tampa Bay Times and the Poynter Institute, dedicated to checking the accuracy of claims by pundits, columnists, bloggers, political analysts, the hosts and guests of talk shows, and other members of the media.

We define a pundit as someone who offers analysis or opinions on the news, particularly politics and public policy. One can engage in punditry by writing, blogging or appearing on radio or TV. A pundit is not an elected official, not a declared candidate nor anyone in an official capacity with a political party, campaign or government.

PunditFact is funded in part by $625,000 in grants over two years from the Ford Foundation and the Democracy Fund. Seed money for the project was provided by craigconnects.

So, anyway, I was one of many sources that Jon contacted to check up on the claim made in the beer ad, which was evaluated as being "mostly true" (meaning not entirely): A "regulatory thing" means you can’t show someone drinking beer on camera.  You can click on the link to see the article in its entirety (no, I'm not quoted in it, but it's still worth a look). It begins with a discussion of the Heineken ad, and then poses the question:

But we wondered about the director’s claim that a "regulatory thing" stops people from drinking beer in commercials. We’ve seen plenty of beer commercials and just always assumed that someone was drinking at some point.

The fact is, however, ad makers successfully are getting us to see more than is on screen.

In case you were wondering, it’s not the long arm of government that’s stopping people from a sip of sudsy brew. A press officer at the Federal Communications Commission, the body in charge of decency and other rules for broadcasters, said FCC rules are silent on drinking on camera.

"Congress has not enacted any law prohibiting broadcast advertising of any kind of alcoholic beverage, and the FCC does not have a rule or policy regulating such advertisements," she said, citing the agency’s website.

If there’s an iron fist, it belongs to the broadcasters.

Tara Rush, senior director of corporate communications at Heineken USA, said the rules come from TV networks.

"This is a regulation with the actual TV networks," Rush said. "It’s a long-standing rule."

The broadcasters’ trade group, the National Association of Broadcasters, has no policy itself, but a spokesman sent us articles that describe how each network is free to set its own standards and, as it stands, when it comes to beer, they frown on public displays of ingestion.

The Heineken ad alludes to this. Near the end, the director talks about network execs getting in a room to agree on a set of rules.

Now, speaking of the NAB, what I did find in response to Jon's query was an article by David Griner, which appeared in Adweek magazine, entitled Ad of the Day: Neil Patrick Harris Doesn't Get Why He Can't Drink Heineken Light on TV. And Griner provides a somewhat different explanation of the NAB's role in the matter:

While it's not really explained in the ad, there's no law keeping Harris—or anyone else—from drinking a beer on camera. The United States government doesn't actually limit alcohol marketing at all, or as the FCC notes, "Congress has not enacted any law prohibiting broadcast advertising of any kind of alcoholic beverage, and the FCC does not have a rule or policy regulating such advertisements."

The brewing industry's Beer Institute has its own voluntary guidelines, and they're generally OK with showing beer drinking, too: "Although beer advertising and marketing materials may show beer being consumed (where permitted by media standards), advertising and marketing materials should not depict situations where beer is being consumed rapidly, excessively, involuntarily, as part of a drinking game, or as a result of a dare."

However, several broadcast networks continue to stick to a long-expired portion of the Television Code that prohibited showing alcohol being consumed. (Thus the ad's reference to "network execs in a room somewhere.")

Also, Canada has a bevy of beverage restrictions, including a rule against showing "scenes in which any such product is consumed, or that give the impression, visually or in sound, that it is being or has been consumed." As you can imagine, other countries have their own rules, too, making a beer ad with global reach a truly hamstrung affair.

So in short, yeah, it's complicated. And it's not too likely to change anytime soon.

So, there is a historical connection to the NAB, and specifically to its Code of Practices for Television Broadcasters, one that continues to influence industry policy. Unfortunately, the link provided by the Adweek article is to a Wikipedia entry that does not specifically reference the policy on showing people consuming alcohol on camera, and I was not able to find a copy of the Code itself through a cursory search online. But I do find Griner's explanation to be reasonable and persuasive, and kudos as well to Canada for its contribution towards keeping the advertisers in check. 

I should add that Griner also suggests that, "we can probably expect a similar gag to come around every few decades," reminding us that back in the 80s a similar commercial aired, featuring Paul Hogan, aka Crocodile Dundee, hawking Foster's:

Returning to the PunditFact piece, here's the response from the beer industry's spokesperson: 

A spokeswoman for The Beer Institute, the voice of brewers and distributors, told us their members are loath to take chances with network policy.

"If you’re putting an ad together, you will be as conservative as possible so you know it will get past all the networks," said Megan Kirkpatrick, director of communications at the Institute.

Kirkpatrick said the brewers have no desire to stir things up and risk stirring a cry for a new law.

"The fact that it is self-regulated now, that’s not something brewers would want to put in jeopardy," Kirkpatrick said. "It’s the way they have operated for decades. You show a lot of people enjoying a football game or enjoying a baseball game but you don’t show any consumption. I don't think you’re going to see that change."


Rush left us with this tantalizing thought about the long-standing rule.

"Some networks are now beginning to change it," Rush said.

Note that Kirkpatrick's assurances are not backed up by the official guidelines of the Beer Institute, as noted in the Adweek article. And while the PunditFact piece ends on a lighthearted note—"We doubt Heineken is hoping for a quick shift. If commercials start showing people sipping away, that Heineken ad will be about as enticing as, well, old beer."—based on history I think we can assume that the beer industry would love to see the broadcasting industry's policy altered, if not eradicated.

And maybe you're saying, what's the big deal, anyway? My response is that alcohol is a special kind of product, which is why the United States Department of Justice has a special Bureau of Alcohol, Tobacco, and Firearms, now the Bureau of Alcohol, Tobacco, Firearms and Explosives. There are laws regulating drinking age and prohibiting drunk driving. We expect individuals to refrain from drinking while on the job in most occupations, and in a variety of other situations that require a measure of seriousness and decorum, not to mention concentration and coordination.

And we impose few limits on communication in the United States, in keeping with our First Amendment, but we do impose some on commercial speech, such as truth in advertising, and the ban on tobacco commercials. There are very few limits imposed on alcohol advertising, however, too few in my opinion. This isn't about bringing back Prohibition, it's simply about asking for a reasonable amount of restraint. In holding to this one truly modest rule that says you can't show someone drinking on camera, broadcasters are acknowledging the fact that there is a significant difference between alcohol and toothpaste, between alcohol and smart phones, between alcohol and bottled water. I for one hope that our broadcasters will be able to not only hold their liquor, but also hold the line.

And I don't know about Neil Patrick Harris, but I am pretty confident that a certain Doogie Howser, MD, would agree...

Monday, August 18, 2014

Reflections on Reflection

So, my third op-ed for the Jewish Standard was published in their August 1st issue, and entitled Reflecting on Reflection, with the subtitle, "Jewish life needs—and provides—an opportunity to slow down, think, and soul-search."  And you may notice that I make use of some of the reading I mentioned in my previous post, Some More Reading for the Summer. Anyway, here it is:

One of the personal challenges we all face here and now, in 21st century America, is finding a time and a place for reflection. 
In the last century, it was said that no one has had a complete thought since the invention of the telephone, a device that we brought into our homes so that we could be interrupted by the outside world at any hour of the day or night. How quaint—and how naïve—that seems today, now that we carry our phones around with us wherever we go, and are continually bombarded by a variety of email and text messages, alerts, apps to play with, and yes, even actual phone calls. There seems to be no room in our busy schedules to simply sit and think, no escape from the deluge of information, interaction, and entertainment made available at our fingertips, the habitually twitching digits of this digital age.
Thinking, in and of itself, is not unique to our species, but human beings have developed a unique set of tools for thought that sets us apart from other forms of life.
First and foremost is language. Much of what we call thinking consists of talking to ourselves silently, carrying on an inner dialogue or monologue. Notice that for the most part, we do not think by somehow imagining that we are writing or typing, or reading our own words on a page or screen. Language is a set of sounds that convey meaning, and for tens of thousands of years—which is to say for most of our history as a species—human beings survived without the aid of the written word. And somewhere along the line, we learned how to internalize speech in the form of thought.
Compared to the spoken word, writing is a relatively recent development, dating back only about 5,500 years. Its purpose was to record speech in a durable form. Before writing, both speech and thought were fleeting, ephemeral, subject to the vagaries of memory. And while we should not discount the power of collective memory, writing gave language a permanence that we had never known before. Writing also made it possible to step back from our words, to see them as fixed signs, available for study.
In other words, writing gave us new tools for thought, allowing us to fix language in place, allowing our words to become the object of prolonged contemplation. Writing recorded the speech and the thoughts of others, allowing readers to view and review their statements and arguments. And writing gave us a way to step outside of our own thinking processes, to observe our thoughts from the outside.
Simply put, writing gave us a mirror for the mind. And in doing so, the written word made possible our capacity for reflection.
That capacity is the subject of an extended essay by Ellen Rose, a professor of education at the University of New Brunswick, which was published in book form, titled On Reflection: An Essay on Technology, Education, and the Status of Thought in the Twenty-First Century. In considering the meaning of the word “reflection,” Dr. Rose relates, “when I close my eyes and try to picture reflection, I immediately envision someone sitting in a book-lined room, reading or pondering silently.” She concludes that the essence of reflection is “deep, sustained thought for which the necessary pre-conditions are solitude and slowness.” Dr. Rose rightly argues that reflection is in decline—has been for some time now—because of our many technological innovations, particularly electronic media.

The decline of reflection is a cause for concern among thoughtful people everywhere, but it ought to be viewed as particularly alarming in regard to the future of the Jewish people. Our religion, tradition, and culture are based on the written word, on the Hebrew aleph-bet and the study of sacred texts, on Torah, Tanach, Talmud. Our rite of passage from childhood to adulthood, the bar or bat mitzvah, is a literacy test. Our houses of worship also are houses of learning, our synagogues also are schools.
It is worth recalling that one of the goals of Nazism was to wipe out the capacity for reflection, and not simply in the service of totalitarian domination. Consider the following observation on the part of historian Elizabeth Eisenstein in Divine Art, Infernal Machine: The Reception of Printing in the West from First Impressions to the Sense of an Ending:
"Anti-Semitic stereotypes attributed a soft, flabby, and sedentary lifestyle to the bookish Jew, in contrast to the masculine, muscular Aryan. Observers in 1933 witnessed the book-burnings of works by Jews and other “decadent” authors, along with the elimination of the same works from libraries and bookshops. The elimination of Jewish books served as a prelude to measures in the next decade aimed at eliminating the Jews themselves."

The problem we face today is not the elimination of books, but their growing irrelevance to our lives. Could the disappearance of the quiet time we need for reading and for thinking, for the solitude and slowness that forms the basis of deep, sustained thought, possibly be a prelude for a more serious threat to Jewish survival, as a culture or even as a people?
For Dr. Rose, the best hope for the future lies with education. But we also can turn to another opportunity to claim a time and space for reflection, in Jewish worship services of any stream, Orthodox or Conservative, Reform or Reconstructionist. Prayer is a form of thought, an exercise in ways of thinking that differ from our everyday thought patterns. And prayer provides an opportunity for profound forms of soul searching, serious introspection, contemplation, and meditation. If we are to reclaim our capacity for reflection, and in doing so safeguard what is essential to our tradition and culture, we will need both our schools and our shuls.

As compared to my two previous op-eds (see my previous posts, On Jewish Characters in American Television Series and Jewish Movie Marvels), this one is less popular culture-oriented, and more media ecological in its approach. And while it is specifically addressed to a Jewish readership, I hope it is clear that the ideas apply to other religions as well, and to other non-religious practices that encourage meditation and mindfulness.

Friday, August 15, 2014

Some Comments on Defining "Medium" in Media Ecology Scholarship

I recently chimed in a couple of times over on the Media Ecology Association's discussion list, during a prolonged, and indeed overly long exchange that included an attempt to educate a graduate student about both media ecology and scholarly method in general. While my part in the interaction was limited, and I salute John Walter, a Fellow at the Walter J. Ong Center for Language, Media, and Culture at Saint Louis University, for his heroic efforts in that cause, I did want to share some of my comments here, for whatever they may be worth. I have modified the comments somewhat to take them out of the context of addressing particular individuals, and otherwise removed anything I thought to be irrelevant or not right for this context.

The first comment was on the question of how we define the key term of medium, and in it I try to make a serious point in a lighthearted way:

The sky is the medium for birds, hot air balloons, airplanes, and missiles. It's the medium of flight, and projectiles.

A stone can be a medium of conflict, just ask Goliath, or a medium for statues and monuments.

The book is a medium for storing writing. Writing is a medium for recording the spoken word in a visual, non-ephemeral form. Speech is a medium for expression, interaction, influence, transmission, and thought based on the code of language. Language is a medium for transforming the chaos of the perceived world into a verbal map that provides the illusion of stability, coherence, predictability, and abstraction.

This discussion list is a medium for the interchange of comments that range from the ridiculous to the sublime.

The following, more extended commentary was a specific response that draws on general semantics, in particular the need for operational definitions, in discussing the way that scholars and researchers proceed to set up their studies:

You should understand that definitions of key terms ought to be presented as operational definitions, meaning that they are put forth as definitions "for the purposes of this study." This doesn't mean that you can attach any random definition to any particular term, as there still has to be support for it, but it does mean that the definitions you do wind up using cannot and should not be applied to uses of the terms outside of your study. That is to say, you cannot define your terms a certain way, and then act as if others using the term are using the same definitions as you are, ignoring the differences. And you cannot simply act as if others who do not use your definitions are using the wrong definitions. To ignore or forget that the definitions you put forth are your own creation, and not some natural or essential quality of the phenomenon in question, makes you guilty of reification, which is exactly what you are doing: reifying your definition of "medium" (Neil Postman would call this either crazy talk or stupid talk, or maybe both).

Moreover, the propaganda technique of persuasive definition involves putting forth your own definition as the only possible definition. While I don't think your intention is to propagandize in the typical sense, you are trying to force your definition on the rest of us by asserting and insisting that it is the only correct way to understand the term. In this sense, you are acting as a propagandist for your views. This goes against the norms of acceptable scholarship.

Just to give a few counterexamples, let's take Harold Innis, who distinguishes between heavy and light media. In his work, the clay tablets used to record Sumerian cuneiform are a heavy medium. The papyrus sheets and scrolls used to record Egyptian hieroglyphics are a light medium. Or let's consider how artists refer to the materials they use as their medium, e.g., oil paints on canvas, charcoal on paper, carvings from wood, busts chiseled from marble, etc. Or how Edward Sapir referred to language as the medium for literature, and how different languages constitute different media. Or how Edmund Carpenter likewise stated that every language is a mass medium. Or how McLuhan argued that a medium is any extension of the human body, thereby including all forms of technology and human invention and innovation. Or how Postman, in his keynote address to the first MEA convention, said that a medium is a technology within which a culture grows. Indeed, Joshua Meyrowitz has written several essays about the differences between definitions of "medium" that view the concept as a form of transportation or conduit, as a language, and as an environment.

As for your distinction between philosophical and practical discussions, I think there are many who would take issue with the idea that practical discussions are devoid of any philosophical basis, and maybe even that philosophical discussions are devoid of any practical value. In any event, to just assert that your definition is more practical is a meaningless statement, because the question is, practical for what? For what purpose? To what end? This brings us back to operationalism. Your definition may be more narrow, more limited, than other definitions, but that alone does not prove that it has greater utility. By that logic, defining "medium" as a device powered by electricity that enables two people to speak to one another would be even narrower and more specific, and therefore more "practical" in your sense of the word, but it would not be a definition that would be considered particularly well formed or useful within the community of media ecology scholars, and by most outside of our community as well.

And I provided one more set of comments in response to some discussion on the distinction between "natural" and "artificial" in regard to media:

I understand the "natural" tendency to make this distinction, which seems clear enough at first glance. But, as you no doubt know, the artificial-natural dichotomy is often invoked in criticisms of technology and innovation, usually unfairly.

For my part, I find the distinction breaks down when we start to look at it closely within the field of media ecology. We know that McLuhan argued that the concept of "nature" was a product of the literate visualism of ancient Greece, and involved placing ourselves as human beings outside of nature, and in opposition to it. Oral cultures did not share this view, and McLuhan pointed to the emergence of the ecology movement as evidence that we were returning to a worldview closer to that of acoustic space in the electronic media environment.

The distinction particularly breaks down in regard to speech and language, which we understand to be "natural" and not a product of deliberate human invention, but which fall within the category of media, at least for most media ecologists. To use another example, if I pick up a stone and throw it at you, I have not created or altered the "device" in any way, but I am using it as a medium, and arguably one that is very much interactive.

Then there is the phenomenon of animal technology that Lewis Mumford and Edward T. Hall point to. We view nests and hives as "natural," but they certainly are "devices" created as means to certain ends by the birds and the bees, and even if you restrict your definition to media of communication, the point would hold, because communication is one of the functions of nests and hives.

We may call animal behavior instinctive, but our capacity for symbolic communication is also inborn, and has been referred to as the language instinct. And I would suggest that we also have a technology instinct.

We also distinguish between natural and artificial selection in discussions of evolution, but doesn't that place us outside of nature, rather than seeing our own impact on the environment as part of the system, not separate from it? Isn't evolution one of the "natural" processes of the universe, whether it occurs within the formation of galaxies and stars, or species, or societies and cultures, or languages and technologies?

Of course, these last questions are rhetorical, the answer, at least in my mind, being yes. And you may not agree with what I have to say here, but whether you agree with me or not comes down to a matter of definition, does it not?


Monday, August 11, 2014

Space Oddities and Eventualities

With the relationship between the United States and Russia growing increasingly strained over the crisis in Ukraine, there has been a great deal of concern over our reliance on Russian rockets. Not only have we been depending on Russian Soyuz rockets launched from their base in Kazakhstan to get our astronauts to the International Space Station ever since the end of NASA's space shuttle program in 2011, but we also import their rockets into the US to get our own satellites into orbit, and this includes our military and spy satellites.

Of course, our use of Russian space technology was only supposed to tide us over while private, commercial companies moved in to fill our government's needs. That's the American way, after all, to turn things over to the private sector whenever possible. And it has worked well for us in many instances, but there are some things that the private sector is just not equipped to handle, and some things that just ought not to be privatized. Like roads and highways, prisons, police forces, and fire departments, for example. Moreover, Neil Postman pointed out that western nations that did not privatize broadcasting in the way that the US did were able to mitigate some of the negative effects of the television medium.

On the other hand, we have a long history of commercial transportation by land, waterways, and in the 20th century by air as well. Stanley Kubrick's 2001: A Space Odyssey famously portrayed a future where Pan Am, ironically enough given that the airline has been defunct since 1991, provided shuttle service from the Earth to an orbiting space station still under construction, and from there to our lunar colony.

Now, I want to acknowledge the very powerful argument made by Lewis Mumford, among others, that much of our space program has amounted to an enormous waste of resources that are sorely needed in so many other places. Having grown up during the Space Age, cheering on as we won the Space Race with the Soviet Union, I still have an emotional connection to the idea of space exploration, and I am more than a little disappointed that we are not living in the future that Kubrick depicted during my childhood, not to mention the kind of future Gene Roddenberry gave us in the original Star Trek TV series. And while I was not in favor of Ronald Reagan's Strategic Defense Initiative, popularly referred to as Star Wars, I did note that many science fiction fans actually came out in support of the proposal, not so much out of political conviction or military necessity, but because it provided a reason to expand our involvement in space.

So, from a hawkish perspective, there are legitimate national security concerns, given our present dependence on Russia, and China's expansion of its space program—they have their sights set on the moon now. From a more centrist position, there still are reasons to want to see space dominated by western democracies rather than nations with authoritarian regimes. And apart from these more practical considerations, space exploration does represent the intangible value of inspiration, not just in lifting our morale, but in giving us a new perspective on ourselves and our world, summed up by Buckminster Fuller's famous phrase, spaceship earth, and in granting us the basis of a utopian vision of humanity united and looking outward, instead of consumed by internal conflicts. Now wouldn't that be something?

And maybe space is the means by which we can transcend our current travails, but we have a long way to go, and the question of whether our current activities are worth the price tag remains open. Be that as it may, the current concern over our capabilities of getting into space on our own reminded me of the video sensation that was a product of happier times, just a little over a year ago, in May of 2013. As you may recall, Canadian astronaut Chris Hadfield, the first Canadian to walk in space and the first Canadian to serve as commander of the International Space Station, caused quite a stir before returning to earth from his final mission in Earth orbit, when he posted a YouTube video consisting of footage videotaped aboard the International Space Station. It was a music video in which he sang the David Bowie song, "Space Oddity," but with a somewhat different set of lyrics. In case you missed it, here it is:

Now, for those of us of a certain age, "Space Oddity" was one of the best known and most popular songs of the progressive rock era, and it was David Bowie's first big hit back in 1969. For those of us who were into that kind of music, it was a song we loved to sing along to, or sing by ourselves. This was marvelously illustrated in a scene from the 2002 Adam Sandler film, Mr. Deeds (which was, of course, a remake of Frank Capra's 1936 classic Mr. Deeds Goes to Town):

The song also comes up in Ben Stiller's 2013 film, The Secret Life of Walter Mitty (which of course was a remake of the 1947 film of the same name starring Danny Kaye, both being adaptations of the 1939 short story by the brilliant humorist James Thurber). First there's the spoken word reference that comes up towards the end of this clip:

And then there's this wonderful scene, which has the performance of the song occurring in Mitty's imagination motivating him to actually continue on his adventure:

The scene mixes Kristen Wiig's imaginary barroom performance together with David Bowie's original version. And I suppose I really ought to include the original recording here too, while I'm at it:

Now, when you consider how Chris Hadfield changed the lyrics, it makes perfect sense for him not only to make the song more descriptive of his experience as an actual astronaut, but to shift the sense away from Major Tom's essentially suicidal space walk. The original sensibility of the song was not at all the positive spin that Hadfield gives it. In some ways, it's more in line with the 2013 film Gravity, which actual astronauts have hailed as the most realistic cinematic depiction to date of what it's like to be up in space, albeit one that portrays a kind of disaster that has not yet occurred: a snowball effect in which a destroyed satellite starts to take out other satellites, resulting in a large debris field orbiting the Earth and destroying all in its path. It's a scenario that is not at all impossible, and sets up the main conflict in the film:

By the way, Sandra Bullock absolutely deserved the Academy Award for Best Actress for that film, no question about it. She was entirely brilliant in her performance, and about 80% of the film was her alone. But holding that aside, the film served as a reminder that space is not all fun and games and singing our favorite classic rock hits while floating in zero-g. In fact, it's an environment that is completely hostile to any form of terrestrial life, and space travel, at least as it exists now and will exist in the near future, is risky, dangerous, claustrophobic, disorienting, and absolutely inhuman. This all connects back to Mumford's criticism of the space program.

Getting back to "Space Oddity" by David Bowie, the song is a product of the sixties counterculture, and as such, runs counter to the depiction of astronauts as heroic types, blessed with what Tom Wolfe famously referred to as the right stuff. In Bowie's song, that other Tom, Major Tom, had become a media celebrity, was feeling vulnerable and impotent circling the globe in his tin can, and was apparently suffering from depression. The song is also clearly inspired by Kubrick's 2001: A Space Odyssey, in which Dave Bowman likewise leaves his capsule at the end, but there it occurs as part of a transcendent encounter with an alien intelligence. So perhaps Major Tom is also seeking the transcendence of becoming one with the universe, but the point is that he is no longer functioning in the efficient and predictable manner of an astronaut.

I would also venture to guess that science fiction writer Ray Bradbury's short story "The Rocket Man" (included in his 1951 anthology The Illustrated Man) was an influence on Bowie as well. The story depicts a near future in which astronauts are no longer the best and brightest, cream of the crop types, but rather more like workers, albeit ones working in extreme conditions, and therefore relatively well paid for their labor. Written a decade before Soviet cosmonaut Yuri Gagarin became the first man to orbit the Earth in 1961, the story gave us a vision of the future in which space travel had become somewhat routine, and outer space a place of work, like, say, an oil rig or coal mine.

Bradbury's story gives us a future in which going to outer space is neither glamorous nor heroic, and that's what Bowie's song does as well. "The Rocket Man" is also the inspiration for the Elton John hit from 1972, "Rocket Man" (included on his Honky Château record album):

Interestingly, the Wikipedia entry on "Rocket Man" states that, "the song echoes the theme of David Bowie's 1969 song 'Space Oddity' (both recordings were produced by Gus Dudgeon)." A third song I would group together with these two is neither as spacey as Bowie's nor as wistful as the lyrics Bernie Taupin wrote for Elton John, but rather one that exhibits a bit of humor about it all: Harry Nilsson's "Spaceman":

While owing much to Bradbury, I think Nilsson's version is the most prescient of the three, and it's certainly the most fun! By the way, the view of space as a working environment involving a great deal of drudgery was the basis of the little known 1974 dark science fiction comedy directed by John Carpenter, Dark Star. The script for this relatively low budget, independent film, written by Carpenter and Dan O'Bannon, borrowed heavily from Bradbury's short story. Here's the trailer:

Dan O'Bannon, it is worth noting, went on to become the lead writer on the 1979 Ridley Scott horror-science fiction hybrid Alien, which further elaborated on the vision of a future in which astronauts are nothing more than employees of a corporation, in this case divided between white and blue collar types, and all considered expendable in an effort to obtain the incredibly dangerous and deadly alien for the company's weapons division. 

But maybe the way to bring this post to a close is with the Grateful Dead song, "Standing on the Moon" from their final album, Built to Last, released in 1989:

Although Robert Hunter's lyrics were written almost two decades after "Space Oddity" was released, the sensibility is in keeping with the sixties, perhaps less dramatic, more, dare I say it, down to earth? "Standing on the moon, with nothing left to do, a lovely view of heaven, but I'd rather be with you." Mumford would approve—what really matters, in the end, is not our technological prowess, but our human relationships.

Wednesday, August 6, 2014

Future Sh(er)(l)ock

So, ok, maybe I overdid it with the poststructuralist style title for this post, with all those parentheses and all, but hey, I did publish an article about two decades ago with the title, Post(modern)man, about Neil Postman, a revised version of which appears in my book, On the Binding Biases of Time—and speaking of time, it's time for another plug:

So, now that we got that over with, what I want to relate is that while I was working on Amazing Ourselves to Death—oops, I think I feel another plug coming on...

So, as I was saying, while I was writing the book, I naturally went back over many of Postman's publications, including the often overlooked collection entitled Conscientious Objections—uh oh, here we go again:

Originally published in 1988, Conscientious Objections included one of Postman's most pointed critiques of the social and behavioral sciences and scientism, coupled together with an eloquent statement on the importance of media ecology, in the lead essay, entitled "Social Science as Moral Theology" (which alone is worth the price of the book). The subtitle, Stirring Up Trouble About Language, Technology, and Education provides a sense of the range of subject matter covered, including several essays that exhibit a general semantics orientation, including one on Alfred Korzybski.

Conscientious Objections also includes two essays that sum up the argument he makes in his two best known works critiquing the impact of television, The Disappearance of Childhood and Amusing Ourselves to Death (wait for it, wait for it, ok, here it comes)...

The first of the two essays in Conscientious Objections is entitled "The Disappearance of Childhood" which appropriately enough summarizes the book of the same title. But the other essay, the one that provides the gist of Amusing Ourselves to Death, is entitled "Future Shlock" (which may come as a bit of a shock to you, I know).  And of course, it is a play on the title of Alvin Toffler's popular book of 1970, Future Shock. You know what's coming next here, now, don't you?

If you're not familiar with the book, it's a bit of popular media ecology, one that struck me as very profound when I was in high school and an undergraduate in college. The phrase future shock was itself a play on the established notion of culture shock, the idea being that the rate of change had accelerated so drastically in the postwar era that we could easily fall victim to a form of culture shock without leaving home, a kind of temporal culture shock, as we are unprepared for the pace of progress we have been undergoing, and have no defenses or means of coping with all of the change that we are experiencing.

Of course, you no doubt recall my post here on Blog Time Passing back in 2009, entitled, Shockingly, The Future Ain't What It Used To Be, which included a bit more discussion of future shock than I am including here. The reason I wrote that post was that I had discovered that a documentary that was made back in 1972, based on Future Shock, and featuring Orson Welles, had been uploaded to YouTube. The film provides an interesting window on what we were experiencing at that time, and where we thought we might be headed. It's a bit of nostalgia for those of us of a certain age, and certainly a period piece, but not without its relevance for the present day.

So, in doing this post, I went back to YouTube, which has changed its policies since 2009 regarding the length of videos it allows (certainly a significant contribution to information overload), and I was not terribly shocked to find the movie now available in one piece, instead of chopped up into five segments as it was back in 2009, which is how it appears on my earlier blog post. So, let me take this opportunity to embed the full film here and now, for your viewing pleasure:

But all of this is a digression, so let me also explain that I sometimes teach a course entitled Writing for Online Media for Fordham University's School of Professional and Continuing Studies, a course that counts towards the Professional Studies in New Media major, offered by the Professional Studies in New Media program that I am director of. In fact I'm just finishing up a summer session section of the course, taught as an online class. And I also teach a graduate version of the class, Writing for the Internet, for Fairleigh Dickinson University's MA Program in Media and Professional Communication (being offered this Fall semester). And one of the assignments I give my students is to make an edit to a Wikipedia entry, and then blog about it.

So, after rereading Postman's essay "Future Shlock," which begins with him making the claim to have coined the phrase future shock prior to Toffler's use of it (without, I hasten to add, making any judgment as to whether Toffler took it from him or came up with it independently), I decided to take a look at the Wikipedia entry on Future Shock. It includes a section with the heading "Term" which read as follows:

Toffler argued that society is undergoing an enormous structural change, a revolution from an industrial society to a "super-industrial society". This change overwhelms people. He believed the accelerated rate of technological and social change left people disconnected and suffering from "shattering stress and disorientation"—future shocked. Toffler stated that the majority of social problems are symptoms of future shock. In his discussion of the components of such shock, he popularized the term "information overload."
His analysis of the phenomenon of information overload is continued in his later publications, especially The Third Wave and Powershift.

Nothing wrong with that. But I decided to add the following immediately after:

In the introduction to an essay entitled "Future Shlock" in his book, Conscientious Objections, Neil Postman wrote: "Sometime about the middle of 1963, my colleague Charles Weingartner and I delivered in tandem an address to the National Council of Teachers of English. In that address we used the phrase "future shock" as a way of describing the social paralysis induced by rapid technological change. To my knowledge, Weingartner and I were the first people ever to use it in a public forum. Of course, neither Weingartner nor I had the brains to write a book called Future Shock, and all due credit goes to Alvin Toffler for having recognized a good phrase when one came along" (p. 162).
I actually made this change on March 13, 2013, and checking on the entry now, I am pleased to report to you that the addition remains unchanged, with the sole exception that the quote from Conscientious Objections was separated out and turned into a block quote, a reasonable enough modification.

So, that's my bit of detective work, which I hope justifies my inclusion of Sherlock in the title, along with shock and shlock, and maybe it is a bit of shlocky sleuthing on my part, but I don't think there's any issue here that might require the services of some Future Shylock, do you?