Saturday, April 29, 2017

On Being Weary and Wary of ‘Awareness’

Before the month ends, I think I better share my latest op-ed published in the April 28th issue of the Jewish Standard, and posted online on their website hosted by the Times of Israel. The title of the piece is On Being Weary and Wary of ‘Awareness’ and I think I'll let it speak for itself:

April is Autism Awareness Month. As we are close to the end of the month, chances are that you’ve already seen or heard that statement.

So let me ask you: Are you more aware of autism now than you were at the beginning of the month? And what do we mean by this vague thing we call “awareness” anyway?

I looked online and found a “Cause/Awareness Monthly Calendar,” which confirmed my suspicions that almost every month of the year has multiple causes assigned to it. April has six listings, including Parkinson’s Disease Awareness Month and Sexual Assault Awareness Month. If there’s a cause out there that does not emphasize the goal of awareness, I have yet to come across it.

And yet I don’t see much in the way of assessment of this goal. How is awareness measured? Who measures it? How are the results distributed? I believe that awareness actually refers to attention, which is the basic currency of our electronically mediated environment. The primary question is: Is the cause in question getting enough attention from the news media, the entertainment media, and our social media? And secondarily, are the audiences and participants paying enough attention to these messages?

My daughter turned 21 this winter. When she was 2½ years old, she was diagnosed with autism. Looking back some 18 years ago, I know that what we call autism awareness was not very widespread, not even here in northern New Jersey, where there are the largest numbers and the greatest concentration of children with autism in the United States.

Back then, most estimates ranged from 1 in 1,000 to 1 in 500 children with autism nationwide. Increased awareness coincided with increased incidence, and now the estimates range between 1 in 45 and 1 in 68. And given the higher numbers in our region, this means that chances are you know someone with autism, or someone with a family member who has autism.

As the numbers grew, autism advocates began to call it an epidemic. Specifically, they referred to the epidemic of childhood autism. And it was an epidemic that affected families from all walks of life, from every income bracket and socioeconomic status, as well as every race, ethnicity, and religion.

A major turning point in autism awareness came when a grandson of Bob Wright was diagnosed with autism. Wright was the CEO of NBC at the time, and he and his wife, the late Suzanne Wright, founded Autism Speaks in 2005. Through his influence, autism suddenly received much more attention in the news and entertainment media than it ever had before.

It is worth asking ourselves why social problems only receive attention when the rich, the famous, and the powerful are touched by them, when the problem is experienced by someone close to a media professional or politician. Of course we are grateful when someone with a public platform finally speaks out. But why do awareness and attention have to depend on a contemporary variation on noblesse oblige?

And again, what is “awareness” all about? It is certainly a far cry from understanding.

I recently spoke with a friend and colleague whose son, about 10 years older than my daughter, also has autism. And we talked about the fact that our children will never really grow up, be able to live independently, have their own place, hold a normal job, marry, or raise children. About how much they depend on us and continue to depend on us. And about how uncertain their future is as we grow older, grow less and less able to care for them, and eventually will become unable to provide them with a home and necessary supervision.

We talked about what will happen to them when we’re gone.

It is so very hard for us to watch the parents of typical children celebrate the usual rites of passage and talk with mixed feelings about becoming empty nesters, knowing that fate has something else in store for us. Our special needs children require so much more of their parents than typical children as they’re growing up, and their special needs do not magically disappear when they become adults. The pressure never lets up, and it never goes away.

Awareness? Feh! Let’s face it, if you don’t live it, you just don’t understand, just can’t understand, not really. Not fully. So forgive me if I find all this talk about awareness to be awfully shallow, promoting the illusion that something real is happening merely by calling attention to causes on our news, entertainment, and social media.

I remember when Ronald Reagan was elected president, budgets were cut, policies were changed, and all of a sudden we saw schizophrenic individuals who previously had been institutionalized winding up on the streets, homeless and helpless, unable to take care of themselves. It was a shonda, a national disgrace.

Now think this through with me. For the past two decades, we’ve been made aware that there is an epidemic of childhood autism, with numbers steadily increasing. And be aware that there is no cure for autism. So now, be aware that we are facing an epidemic of adults with autism. And let me ask you, are you aware of what is being done to deal with this ticking social time bomb?


Local school districts are required to provide people with autism with an appropriate education until they age out after their 21st birthdays. After that, services are limited, if any exist at all. And for all but the most severe and violent individuals, we parents will try our best to take care of our children for as long as we are physically and psychically able.

How much longer do you think that will be?

We could have begun to prepare for the problem when Barack Obama was elected president. He had the right outlook. But the economy had just crashed under George W. Bush, Obama understandably was preoccupied with recovery from recession and with affordable healthcare, and he was faced with an obstructionist Congress for most of his tenure. Now that we have a Republican president, House and Senate, our government is back to cutting social services, so I doubt we can expect any proactive measures in the near future.

No, in all probability nothing will happen until the time when the parents of adults with autism no longer are able to provide them with a home, and the streets again are flooded with homeless people helpless to take care of themselves. When that happens, in the not too distant future, awareness will become more than a matter of news reports, feel-good films and TV programs, and social media memes. Awareness will become a face-to-face reality, an embarrassment, a source of guilt for the more enlightened, a source of fear for others. And only then will the public demand action, and public officials respond in kind. That’s what happened with the schizophrenics on the streets back in the 1980s.

So what does awareness mean to you? I guess it means that you’re aware that it’s Autism Awareness Month. I guess that amounts to awareness of awareness. And maybe, maybe, if you’re really made aware, that can lead to being informed. Maybe. My guess is that how well informed you are about autism depends on how close you are to an actual person with autism. And even then, after all, being informed is a far cry from actual action.

So please forgive me for being weary and wary of awareness. But please be aware of what’s coming down the pike, and when it happens, be aware that you were warned about it. And be aware that it was a failure of understanding, compassion, and foresight, and above all political will, that caused the problem.

That is the kind of awareness that we need to get across right now, in this month of April.

Saturday, March 4, 2017

Farce, Tragedy, and Hope

So, I suppose I shouldn't wait too long to share my latest op-ed for the Jewish Standard, which has already been published online on my blog for their Times of Israel site. This one was published in their February 24th issue, and "Farce, Tragedy, and Hope" was my original title for the piece, but the editor changed it to "Seasons of Scenarios" (you can let me know which one you think is better).

As I try to stress in this piece, I am not hoping or wishing or looking or calling for any of these outcomes. I am simply assessing the situation and giving my opinion on what might happen. The odds are long against any one of the four main scenarios happening, but taken together, I think the chances are pretty good that one of them will occur.

So anyway, without further ado, my op-ed:

Judging by the topsy-turvy nature of the Trump administration’s first few weeks in office, you’d think that Purim has come early this year. Except for the fact that the story of Purim is something of a farce, albeit one that involves narrowly avoiding a tragedy, while the Trump presidency, many of us fear, is a farce that may, or will, or already has become a tragedy the likes of which even Shakespeare could not have imagined.

Before looking ahead to what may come to pass, let me begin by noting that as far back as the autumn of 2015 I started saying that Trump was going to be our next president. This was not an act of prophecy, I hasten to add, but rather an exercise in the sort of futurism that Alvin Toffler made popular with the publication of Future Shock back in 1970. What this requires is a careful review of history and attention to patterns and trends of the past.


In this instance, I noticed the parallels between the reality-TV-star-turned-candidate and our first (and so far only) movie star president, Ronald Reagan. As different as their demeanors and even their messages may have been, both were masters of the electronic media. For Reagan it was radio and television; for Trump it is TV and Twitter. And both exhibited that Teflon quality, whereby scandals and accusations that would sink anyone else’s political career seemed to bounce right off them. I was sure enough of the outcome that I bet a colleague $100 that Trump would be our next president, and did so at a time when it didn’t even seem likely that he would gain the Republican nomination.

When I was making my prediction, some thought it meant that I wanted Trump to win. I most certainly did not. For me, the point was to analyze the facts objectively and draw a logical conclusion. I stress this because now I want to make it clear that what I think may come next is based on the same kind of analysis. I am not absolutely certain about this, but I do believe there is a better than average, maybe even a good chance, that Trump will not finish his term.

I want to emphasize that I am not predicting that this will happen. I simply am noting that there are four distinct ways in which Donald Trump could be the first president since Nixon to serve less than the full four years to which he was elected.

The first possibility, and the one on everyone’s minds, is impeachment. It is nothing short of astounding that the possibility was being discussed even before the election took place. I won’t bother to list the many reasons why the House of Representatives might vote to bring articles of impeachment against Trump, and a trial leading to conviction and removal from office might take place in the Senate. I only want to note that the possibility exists now, even with Republican majorities in both chambers, and would become even more likely if midterm elections gave the Democrats full control of Congress.

A second possibility is resignation. Recall that Nixon was the last (and only) president to resign, and he did so to avoid impeachment. Trump might follow the same course if impeachment seems likely, or he faces some other legal action regarding his finances. And while many believe he has the kind of personality that would lead him to hold on and fight, everything about him as a politician has been characterized as unprecedented, so is it really unimaginable that he might decide that being president isn’t worth it to him, that walking out would be just like declaring a bankruptcy, and that he could do so while pinning the blame on the media, his political opponents, and anyone else he deems an enemy?

A third possibility is based on the fact that at the age of 70, he is the oldest person to move into the Oval Office, which means that his future life expectancy is limited. It follows that there is a chance he might die in office, or be otherwise unable to fulfill his responsibilities due to medical disability. Despite claims of good health, little about his medical history has been released to the public. Even if he has no pre-existing conditions, there is no getting around the fact that he was born in 1946; as any insurance agency would explain, it’s all a matter of statistical probabilities. (I’m not including the possibility of incapacity due to psychological issues here, because all but the most extreme forms of mental incapacity are difficult to prove.)

No doubt, even if it was clear that disability or death were due to natural causes, conspiracy theories about assassination attempts would abound. And given the friction that seems to exist between Trump and the intelligence community, the possibility of some form of poisoning, a time-honored staple for monarchies, dictatorships, and film and TV melodramas, undoubtedly would come to mind. The more straightforward forms of assassination would also constitute a fourth possibility. The last president to get shot was Reagan, a little more than two months into his first term. Gerald Ford was the victim of two assassination attempts; both times the shooters missed.

It would be only natural to assume that any attempt on Trump’s life would come from someone on the left, or perhaps an angry Muslim or Mexican. But I think it might well come from one of those alt-right types or Second Amendment people that Trump has been courting throughout his campaign and first weeks in office. If he doesn’t come through on the promises he made to them, or that they think he made to them, we can only imagine the kind of anger that a sense of betrayal would produce in extremists of that sort. As the prophet Hosea observed, “they that sow the wind shall reap the whirlwind.”

I want to stress that I am not wishing for any of these outcomes, and certainly not advocating for them. Any one of them would constitute a national trauma, and leave the United States even more divided into hostile camps than ever before. And after all, wouldn’t it be better still if Mr. Trump had a change of heart, and mind, and became the kind of president we all would hope for?

For this reason, let me outline a fifth scenario, and let’s call it a Purim scenario, with Trump in the role of the foolish king, Ahashverosh. We have some good candidates for the part of Haman in his administration, most notably in his senior counselor, Steve Bannon. Melania Trump has pulled a Vashti by not joining her husband at the White House. To select a replacement, Ahashverosh held what is sometimes considered the very first beauty pageant—Trump has had a long history with such events— but if anyone can play the role of Esther in this scenario, it would be his daughter Ivanka, a Jew by choice, who has been acting as a de facto first lady. Trump actually has said that he would want to date Ivanka if she wasn’t his daughter, and some find these and other comments he’s made about her creepy, but then again the traditional Purim story does not quite fit modern standards of propriety when it comes to attitudes toward women.

The important point is that Ivanka is known to be a moderating, even progressive influence on her father, and she is in the perfect position to play the role of savior in the manner of Queen Esther. All we need now is a Mordecai to help to motivate her. With Purim almost upon us, hope (and hopefully humor) springs eternal.

Saturday, February 25, 2017

Swimming Up Mainstream

So, I had an interesting exchange with Andrew Hoskins, a professor at the University of Glasgow, based on my quotes in the New York Times, as discussed in my recent blog post, How Netflix Is Deepening Our Cultural Echo Chambers. 

Andrew is currently working on a book about news and the concept of the "mainstream" and how that ideal or myth or sociological reality (take your pick, or view it as some combination of all three) might relate to changes in the media environment. As he put it, "You are spot on when you say that broadcast TV at its height served very significant social, cultural and political roles, but I wonder then to what extent its absence/demise today has shaped the current crisis in faith in the ‘mainstream’?"

Here now is my response, with a bit of editing to make it suitable for Blog Time Passing readers:

I think it might be fruitful to trace the idea of the mainstream back to that of the public. At the start of The Gutenberg Galaxy, McLuhan states that the public was a product of printing. And I think that when you look at Elizabeth Eisenstein's study of typography and its effects, the argument that the printing revolution formed the basis of the public sphere as outlined by Jürgen Habermas, among others, makes a lot of sense. 

This is the basis of Jay Rosen's notion of public journalism. Like me, Jay was a student of Neil Postman's, and his idea parallels Postman's in Teaching as a Conserving Activity in looking at print-based institutions as needing to work against the biases of the electronic media environment. That's why Jay argues that journalists need to create a public, and not only try to reach one. 

Of course, the problem is that the public is no more in an electronic environment, the effects of which include the blurring of public and private, as McLuhan, Joshua Meyrowitz in No Sense of Place, and others have noted (much more has been said about the decline and disappearance of privacy, but the fate of the private and the public are intertwined).

I would also note that Jacques Ellul, in his book Propaganda, explains how individualism, in breaking down ties based on tradition, locality, tribe, etc., leads to the mass, which consists of large numbers of individuals without any organic ties. Perhaps we can break this process down, so that the first stage of individualism, which McLuhan, Walter Ong, and others connect to the isolating effect of literacy, results in the formation of the public. 

Detribalized, able to free themselves from the need, in the absence of any external storage medium, to preserve knowledge through collective memory, able to view and review their thoughts and engage in critical evaluation, to think independently and to think novel thoughts, a group of readers becomes a public. As individual members of a public, they share a common literate culture, but one that also depends on orality in the form of public speaking, discussion, debate, deliberation, etc. We associate this type of speech with the agora and other gathering places, from Eisenstein's printers' shops to Habermas's coffee houses, but again it is an orality produced by literate mentalities, as are the dialogues Plato attributes to Socrates. 

Media environments are always built on and incorporate the environments that came before, so the ideal of the Enlightenment is based on a balance between literacy and orality, as Postman has suggested. And maybe there is an inverse relationship between the amount of dialogue and speech that mediates between print media and readers, and the shift from a public to the mass. 

The shift goes along with new technologies: steam-powered printing shifting the orality-literacy balance away from hearing and toward reading, the mechanical reproduction of images and photography as antagonistic to the word in all its modes (spoken, written, and printed), and telegraphy and further developments in telecommunications increasing the potential for mass communication. It would follow that what Daniel Boorstin in The Image describes as the graphic revolution, based on these and other innovations, results in a shift from the public to the mass.

Anyway, what I would say is that electronic technology amplified the effects of print, at first, for example in the way that telegraphic messages took the form of telegrams and wire service reports in newspapers. With radio and then television, print became the content of broadcasting, as McLuhan would put it, as programming was often scripted, including news reporting, while programming following a schedule is also very much a typographic type of structure. 

So typographic biases were initially amplified, but it is important to keep in mind that amplification often turns into distortion. 

It was the internet that fully unleashed the potential of the electronic media, bringing back in a new way a kind of neo-tribalism. This relates to McLuhan's laws of media, specifically the law of reversal, as the mass, as an effect of the first stage of electronic telecommunications, flips into siloing, a reversal from the anonymous heterogeneity of the mass into groups based on affinity and shared identity. And/or, maybe the mass in and of itself is ultimately unsustainable, certainly going against the grain of human nature? 

Certainly, printing was associated with homogenizing culture and society, and electronic media always had the potential and the actuality of undoing that effect, that potential muted as long as print remained the content of broadcasting, but now unleashed as broadcasting and telecommunications become the content of online media.

It follows then, that the crisis of the mainstream, or its actual disappearance, is an effect of the electronic media, and quite possibly an irrevocable one at that. 

So, those are my thoughts on the matter, more or less, at least for now. Where do we go from here? That is a hard question to answer.

Thursday, February 23, 2017

La Comprensión de los Medios en la Era Digital

So, what's up with the Spanish title for this post, you may be asking? Or maybe you're just saying, ¿Qué pasa?

Well here's the story. A while back, my friends and colleagues from Mexico, Fernando Gutiérrez and Octavio Islas, asked me if I would co-edit an anthology with them on the theme of the 50th anniversary of the publication of Marshall McLuhan's Understanding Media: The Extensions of Man. That book was published back in 1964, and our volume came out last year, just a couple of years past the anniversary.

Now, let me make it clear that I do not speak Spanish, so my role as editor was somewhat different from what it would normally be. Aside from contributing my own chapter, I solicited seven English-language contributions, which I was responsible for editing prior to their translation. In case you were curious, those chapter authors are Corey Anton, Paul Levinson, Paul Lippert, Robert K. Logan, Eric McLuhan, James Morrison, and Michael Plugh.

I also solicited one other contribution, written in Portuguese by the Brazilian scholars Adriana Braga and Adriano Rodrigues, which I didn't edit, seeing as I don't speak Portuguese either. The rest of the contributors were Carlos Fernández Collado, Jesús Galindo, Jorge Hidalgo, and Claudia Benassini, along with Fernando and Octavio.

So anyway, the book was published by a Mexico City-based publisher, Alfaomega, and here's the cover:

And you can read all about the book on the publisher's website, if you can read Spanish, that is. And here is the Table of Contents:


Prefacio James Morrison

Capítulo 1 Eric McLuhan
50 años después...Retrospección y perspectiva de la obra de Marshall McLuhan

Capítulo 2 Fernando Gutiérrez
La contribución de Marshall McLuhan para la comprensión de los ambientes mediáticos en la nueva era digital

Capítulo 3 Octavio Islas
Apuntes esenciales para una mejor lectura de La Comprensión de los medios como las extensiones del hombre

Capítulo 4 Lance Strate
El mensaje en La comprensión de los medios

Capítulo 5 Carlos Fernández Collado
Topoguía descriptiva para La comprensión de los medios

Capítulo 6 Jesús Galindo
La ingeniería en comunicación social y el pensamiento de Marshall McLuhan. Diálogo sobre constructivismo tecnológico de lo social

Capítulo 7 Corey Anton
Cinco formas para entender... Los Medios como las Extensiones del Hombre

Capítulo 8 Paul Lippert
McLuhan como una forma de arte

Capítulo 9 Robert K. Logan
McLuhan y su comprensión de los medios

Capítulo 10 Adriana Braga y Adriano Rodrigues
El pensamiento sistémico y el sujeto en la obra de McLuhan

Capítulo 11 Michael Plugh
Un mundo inteligente: La extensión del proyecto de automatización de McLuhan

Capítulo 12 Jorge Hidalgo
La comprensión del hombre como una extensión de los medios

Capítulo 13 Paul Levinson
McLuhan en la era de los medios sociales

Capítulo 14 Claudia Benassini
La nueva aldea global: el caso Facebook

And in case you want to order a copy, here's the Amazon link (and ordering through this portal does help support these here blogging efforts):

✾ ✾ ✾ ✾ ✾ ✾ ✾ ✾ ✾ ✾ ✾  ✾ ✾ ✾ ✾ ✾ ✾ ✾ ✾ ✾ ✾ ✾

And I do want to add, given the current political climate, that I am especially proud to have a longstanding connection to my colleagues in Mexico, and that I have great respect and affection for them, and for their nation. 

I should mention that Octavio Islas is now on the faculty of the Universidad de los Hemisferios (University of the Hemispheres), in Quito, Ecuador. So this anthology is truly a pan-American product!

And, I guess that all that's left to say is, adios, until next time.

Monday, February 13, 2017

Bob Dylan, Nobel Laureate

So, I've shared in some previous posts the programs that I've run as president of the New York Society for General Semantics, and hey, just click on the old link to check out the website I set up for the NYSGS, and while you're there, you can subscribe for updates (you don't have to be local to do so), and avail yourself of some of the resources I made available.

And over here on Blog Time Passing, I also shared Political Talk & Political Drama Part 1: Election 2016 and Political Talk & Political Drama Part 2, and My Language Poetry. Well, it's time for the next installment.

On November 30, we held a panel discussion and debate on the topic of Bob Dylan being awarded the 2016 Nobel Prize in Literature. The idea for the panel came from my friend and colleague, Thom Gencarelli. You see, back during some down time at the 2016 New York State Communication Association conference (Thom and I both being past presidents of that organization), we got into a discussion and a bit of an argument (which is to say a difference of opinion, nothing at all heated) about whether Dylan deserved the Nobel Prize or not. My view was, shall we say, skeptical; his view was much more positive. And I went so far as to say that, from a literary standpoint, I believe that a century from now, Leonard Cohen will be better remembered than Dylan.

I hasten to add that I would certainly cede the high ground to Thom when it comes to music, as he's a gifted singer, songwriter, guitar player, and band leader, the name of his band being Blue Race, check them out on iHeartRadio, SoundCloud, and wherever music is sold online, I highly recommend them.

🔵 🔴 🔵 🔴 🔵 🔴 🔵 🔴 🔵 🔴 🔵 🔴 🔵 🔴 🔵 🔴 🔵 🔴

Anyway, I wouldn't question Dylan's significance for popular music and popular culture, but this is the Nobel Prize for literature that we're talking about, and that's a horse of another color. So, our discussion and disagreement became the basis of the last NYSGS program for 2016, and here is the write up for it:


A Conversation about Bob Dylan

and his 2016 Nobel Prize in Literature

On Thursday, October 13, 2016, the Swedish Academy announced that it had awarded Bob Dylan its Nobel Prize in Literature “for having created new poetic expressions within the great American song tradition.” While Dylan’s lack of acknowledgment and acceptance of the award until two weeks later raised controversy, this paled in comparison to the controversy raised right away as pundits in the professional media and across social media weighed in: He deserves it. He doesn’t deserve it. Popular songs aren’t literature. Lyrics aren’t poetry. If the Academy’s prize for literature is expanded to include popular song, is Dylan the only deserving songwriter? Is he the most deserving? Et cetera.

This roundtable discussion seeks to address, make sense of, and try to come to some conclusions with respect to all of this ruckus. The participants will consider questions including: What is the relationship of lyrics to poetry? What is the symbiotic relationship between lyrics and music in popular song? Is poetry literature? Are popular songs literature? What is the meaning and significance of the Nobel Prize, or any award for that matter? What is the significance of Bob Dylan? What is the literary value of his lyrics? What is so new and distinctive about his “poetic expressions” and use of language? And is everything important about Dylan and his contribution simply a matter of language?

Finally… does he deserve it?

Panel participants:

Thom Gencarelli, Professor of Communication, Manhattan College
Callie Gallo, English Department Teaching Fellow, Fordham University
Sal Fallica, Professor of Media Ecology, New York University
Lance Strate, NYSGS President & Professor of Communication & Media Studies, Fordham University

Thom served as moderator as well as panelist for the session, which featured a wide-ranging discussion that included multiple intersections with the discipline of general semantics. Thom is also the co-editor, with Brian Cogan, of an anthology entitled Baby Boomers and Popular Culture, and interestingly enough, Sal Fallica wrote one of the chapters, focusing on Dylan and awards ceremonies! (I also have a chapter in the volume; mine is on science fiction film and TV.)

🔴 🔵 🔴 🔵 🔴 🔵 🔴 🔵  🔴 🔵 🔴 🔵 🔴 🔵 🔴 🔵

Callie Gallo, who is working on her doctorate in English literature at Fordham University and has an interest in media ecology, helped to provide a fresh perspective to the program. And she wrote a very nice guest blog post on her experience for Hook & Eye, subtitled Fast Feminism, Slow Academe, whose About the Blog blurb says the following:

Hook & Eye is an intervention and an invitation: we write about the realities of being women working in the Canadian university system. We muse about everything from gender inequities and how tenure works, to finding unfrumpy winter boots, decent childcare, and managing life’s minutiae. Ambitious? Obviously. We’re women in the academy.

Anyway, Callie's post is entitled, The Perks of Saying Yes in Grad School, and it's worth a read, so why don't you click on the old link, open up a new window, and check it out. It's okay, I'll wait until you're done, and you can meet me back here.

You made it back! Well done! So now, let me just note that our program was written up in an NYSGS blog post, and is also available on an NYSGS resource page, but of course, it's all right here as well, including the video recording which was uploaded to YouTube under the title of Music Lyrics Poetry Language: A Conversation About Bob Dylan & His Nobel Prize. And you know, you can watch it on YouTube, via the NYSGS channel, but yes yes yes, you can also watch it right here.

I should add that, unfortunately, I didn't have a volunteer to hold the iPad this was recorded on, to keep faces in the frame. It wasn't too much of a problem as long as we were seated, as we were for most of the session. But the video does begin with my introduction, followed by Thom's, both delivered while we were standing, so the first few minutes are not the most flattering, not to mention less than professional. But the sound quality is good, and once we were done with the intros and sat down, everything looked fine, aside from a little bit of the shelf the iPad was sitting on being visible in the bottom left part of the frame. But anyway, for better or worse, here it is:

So, what do you think? Click here for a list of all of the Nobel Prizes in Literature awarded since 1901. Does Bob Dylan belong on this list? Or is this, in its own way, a weird example of celebrity logic that parallels having a reality television star as president?

Saturday, February 11, 2017

A Look Back

So, as you may recall, I spent the Fall 2015 semester at Villanova University, with a visiting appointment as their Margaret E. and Paul F. Harron Endowed Chair in Communication, aka the Harron Family Chair. You can check out some of the blog posts I wrote at that time: My Villanova Adventure, Fatal Amusements Talk, Villanova Grad School Interview, VCAN Connected, and Fatal Amusements. And, as I was looking at some blog posts that remain incomplete and unpublished here on Blog Time Passing (saved as drafts), I noticed one leftover from that time that just had a single photograph, taken for publicity purposes. Here it is:

This was in my Villanova office, and since I happen to have some photographs I took on my iPhone, I can round out this post with some of those. First, a closer look at the shelves behind me:

I have a couple more books to add now, but I'll save that for later posts. 

Instead, let me provide a view from the outside of the office:

And another one below with the door closed:

And I also wanted to add a few shots of the wall opposite my office. The images stretched too far for me to capture in one snapshot, but here are a few pieces of the mosaic:

The lack of color and contrast made it challenging to get a good photo of the quotes, but here you can see one by McLuhan that made me feel right at home. 

Another quote from Walter Ong, how can you go wrong with that?

Not as well known as the others, but Lee Thayer certainly qualifies as a media ecologist, or at the very least a fellow traveler.

And good old John Dewey, that's pretty cool!

I'd also categorize Barnett Pearce as a fellow traveler. Some great choices all the way around.

Now, if you're looking for something a little more exotic than the Philadelphia suburbs, I'll be sharing pictures from my more recent visit to Henan University in Kaifeng, China, in upcoming blog posts. Hope you like them!

Thursday, February 9, 2017

How Netflix Is Deepening Our Cultural Echo Chambers

So, back on January 12th, I was quoted in The New York Times, which does not publish, as some would have it, all the fake news that's fit to print. The article first appeared online the day before, written by Farhad Manjoo, the title being How Netflix Is Deepening Our Cultural Echo Chambers. And yeah, that's a link to the item over on their website, in case you want to read it in its original and unadulterated state.

As for the original print version, when I got my copy of the paper, I figured it wouldn't be in the first section, which is devoted to hard news and the op-ed pages. So I looked for it in the third section, Arts, and it wasn't there. Then I looked in the fourth section, Thursday Styles, and it wasn't there either. That left the second section:

That's right, the article made the front page of the Business Day section. Interesting image they got to go with it, don't you think? And maybe it's kind of ironic, in a media ecological sense, that the image takes up more space than the text, as if the paper is somehow trying to compete with television on television's own terms, rather than emphasizing what newspapers do best.

Be that as it may, the bulk of the article was continued on page five:

And of course, the text is a bit hard to read off of the images, which are in fact images after all, included via Blogger's insert image function, courtesy of my iPhone's camera. So, not to worry, here's the text in easy-to-read form:

When “One Day at a Time” started its run on CBS in December 1975, it became an instant hit and remained so for almost a decade.

In its first year, “One Day at a Time,” a sitcom about working-class families produced by the TV impresario Norman Lear, regularly attracted 17 million viewers every week, according to Nielsen. Mr. Lear’s other comedies were even bigger hits: One out of every three households with a television watched “All in the Family,” for instance.

Last week, a new version of “One Day at a Time” started on Netflix. Critics praised the remake for its explorations of single parenthood and class struggle, a theme that has faded from TV since Mr. Lear’s heyday.

So, a seemingly innocuous topic, at least to begin with: another in a seemingly endless run of remakes appearing on film and video, this time via Netflix. And this was the topic that Mr. Manjoo wanted to discuss when he contacted me. I should add that I have previously been interviewed regarding other Norman Lear TV programs, notably All in the Family and Barney Miller, as I related in my previous posts here on Blog Time Passing, All in for All in the Family and A Sitcom to Remember (not to mention my WNYC interview on the similarities between Archie Bunker and Donald Trump, discussed in last year's post, From Bunker to Trump (via Reagan), as you no doubt recall).

Anyway, let's get back to the article, and the difference that four decades can make:

Yet, well intentioned and charming as the new streaming version may be, there’s a crucial aspect of the old “One Day at a Time” that it will almost certainly fail to replicate: broad cultural reach.

The two versions of “One Day at a Time” are noteworthy bookends in the history of television, and, by extension, the history of mass culture in America. The shows are separated by 40 years of technological advances—a progression from the over-the-air broadcast era in which Mr. Lear made it big, to the cable age of MTV and CNN and HBO, to, finally, the modern era of streaming services like Netflix. Each new technology allowed a leap forward in choice, flexibility and quality; the “Golden Age of TV” offers so much choice that some critics wonder if it’s become overwhelming.

It’s not just TV, either. Across the entertainment business, from music to movies to video games, technology has flooded us with a profusion of cultural choice.

This is all well and fine in regard to the changing dynamics of the media industries and entertainment providers, but Manjoo has a deeper concern, one shared by many cultural commentators, and certainly amenable to media ecological analysis:

More good stuff to watch and listen to isn’t bad. But the new “One Day at a Time” offers a chance to reflect on what we have lost in embracing tech-abetted abundance. Last year’s presidential election and its aftermath were dominated by discussions of echo chambers and polarization; as I’ve argued before, we’re all splitting into our own self-constructed bubbles of reality.

What’s less discussed is the polarization of culture, and the new echo chambers within which we hear about and experience today’s cultural hits. There will never again be a show like “One Day at a Time” or “All in the Family”—shows that derived their power not solely from their content, which might not hold up to today’s more high-minded affairs, but also from their ubiquity. There’s just about nothing as popular today as old sitcoms were; the only bits of shared culture that come close are periodic sporting events, viral videos, memes and occasional paroxysms of political outrage (see Meryl Streep’s Golden Globes speech and the aftermath).

Instead, we’re returning to the cultural era that predated radio and TV, an era in which entertainment was fragmented and bespoke, and satisfying a niche was a greater economic imperative than entertaining the mainstream.

So, now, how about some historical context? That's where I come in, at least as far as this article is concerned:

“We’re back to normal, in a way, because before there was broadcasting, there wasn’t much of a shared culture,” said Lance Strate, a professor of communication at Fordham University. “For most of the history of civilization, there was nothing like TV. It was a really odd moment in history to have so many people watching the same thing at the same time.”

That’s not to romanticize the TV era. At its peak, broadcast TV was derided for its shallowness, for its crass commercialism, for the way it celebrated conformity and rejected heterodoxy, and mostly for often not being very creative or entertaining. Neil Postman wrote that we were using TV to “amuse ourselves to death,” and Newton N. Minow, chairman of the Federal Communications Commission under President John F. Kennedy, famously called it a “vast wasteland.”

As you may have guessed, I did mention Postman and Amusing Ourselves to Death in our conversation, and my point here is not to contradict the argument he made. After all, I did bring that very same analysis into the 21st century with my own book, Amazing Ourselves to Death: Neil Postman's Brave New World Revisited. And the argument still stands that on the whole, television has done more harm than good, but the fact remains that the medium did offer many benefits as well, which is what I emphasized in this context:

Yet for a brief while, from the 1950s to the late 1980s, broadcast television served cultural, social and political roles far greater than the banality of its content would suggest. Because it featured little choice, TV offered something else: the raw material for a shared culture. Television was the thing just about everyone else was watching at the same time as you. In its enforced similitude, it became a kind of social glue, stitching together a new national identity across a vast, growing and otherwise diverse nation.

“What we gained was a shared identity and shared experience,” Mr. Strate said. “The famous example was Kennedy’s funeral, where the nation mourned together in a way that had never happened before. But it was also our experience watching ‘I Love Lucy’ and ‘All in the Family’ that created a shared set of references that everyone knew.”

As the broadcast era changed into one of cable and then streaming, TV was transformed from a wasteland into a bubbling sea of creativity. But it has become a sea in which everyone swims in smaller schools.

Only around 12 percent of television households, or about 14 million to 15 million people, regularly tuned into “NCIS” and “The Big Bang Theory,” the two most popular network shows of the 2015-16 season, according to Nielsen. Before 2000, those ratings would not even have qualified them as Top 10 shows. HBO’s “Game of Thrones” is the biggest prestige drama on cable, but its record-breaking finale drew only around nine million viewers.

Clearly, we inhabit a different media environment in 2017 from the one that existed during the second half of the 20th century. And that is just in reference to broadcast and cable programming. What about streaming services such as Netflix?

Netflix does not release viewership numbers, but a few independent measurement companies have come up with ways to estimate them. One such company, Symphony Advanced Media, said Netflix’s biggest original drama last year, “Stranger Things,” was seen by about 14 million adults in the month after it first aired. “Fuller House,” Netflix’s reboot of the broadcast sitcom “Full House,” attracted an audience of nearly 16 million. On Wednesday, Symphony said that about 300,000 viewers watched the new “One Day at a Time” in its first three days on Netflix. (These numbers are for the entire season, not for single episodes.)

For perspective, during much of the 1980s, a broadcast show that attracted 14 million to 16 million viewers would have been in danger of cancellation.

That was a point that I made in our conversation. The threshold for a television show to be considered unpopular during that time was nothing short of extraordinary: numbers that would have been considered wildly successful for any other medium. It really gave new meaning to the very concept of popularity. Which again points to the fact that we are now getting back to a more normal situation, when you look at the big picture historically. Although even the estimates of Netflix viewing, programming watched by millions, are still pretty impressive. Anyway, let's get back to Manjoo's article:

We are not yet at the nadir of the broadcast era; cord-cutting is accelerating but has still not become a mainstream practice, and streaming services only just surpassed majority penetration. So these trends have a ways more to go. As people pull back from broadcast and cable TV and jump deeper into streaming, we’re bound to see more shows with smaller audiences.

“This is just generally true with how blockbusters across the media are going,” said James G. Webster, a professor of the School of Communication at Northwestern. “Some big ones could get bigger than ever, but generally the audience for everything else is just peanuts.”

A spokesman for Netflix pointed out that even if audiences were smaller than in the past, its shows still had impact. “Making a Murderer” set off a re-examination of a widely criticized murder trial, for instance, while “Orange Is the New Black” was one of the first shows to feature a transgender actor, Laverne Cox.

But let's return to that underlying point about audiences and a shared culture:

I buy this argument; obviously, powerful cultural products can produce an impact even if they’re not seen by everyone.

But I suspect the impacts, like the viewership, tend to be restricted along the same social and cultural echo chambers into which we’ve split ourselves in the first place. Those effects do not approach the vast ways that TV once remade the culture: how everyone of a certain age knows the idioms of “Seinfeld” (“It shrinks?”), or followed the “Cheers” romance of Diane and Sam, or how a show like “All in the Family” inspired a national conversation about the Vietnam War and the civil rights movement.

It’s possible we’re not at the end of the story. Some youngsters might argue that the internet has produced its own kind of culture, one that will become a fount of shared references for years to come. What if “Chewbacca Mom” and the blue and black/white and gold dress that broke the internet one day become part of our library of globally recognized references, like the corniest catchphrases of television’s past, whether from “Seinfeld” or “Diff’rent Strokes”?

That could happen. At the risk of alienating the youngsters, though, I’ll offer this rejoinder: “What you talkin’ about, Willis?”

Maybe so, but Chewbacca Mom is a Facebook video featuring a woman putting on a mask and laughing hysterically about it, while the blue and black/white and gold dress is just an image, albeit one that is a fascinating illustration of how perception differs among individuals (I used it on the About page for the new New York Society for General Semantics website, and between you and me, I don't care what anyone says, it's white and gold). Compared to the combination of comedy and social commentary contained in the 209 episodes of One Day at a Time that aired between 1975 and 1984, these examples certainly seem like a degraded form of discourse.

And yet, One Day at a Time was hardly the most sophisticated form of television programming, in contrast to All in the Family, M*A*S*H, or The Mary Tyler Moore Show, and even then, these shows were only oases when placed in contrast to the rest of the vast television wasteland. They hardly compare to the shared culture created by print media, notably the role of the periodical press as the basis of open political discourse of the sort lamented by Postman, and longed for by Jürgen Habermas, as well as the role of literature in creating a unified national culture.

What television in its heyday was able to do, however, was capture the lowest common denominator in a way that no other mass medium had hitherto been able to, and that no other medium has been able to since the expansion of electronic communications via cable, satellite, and the internet. This created a shared culture that was, in many respects, a very low form of culture, but then again, one that unified our population in ways that may never again be possible.

One final note on the interview: we had a very stimulating conversation that lasted about 45 minutes, far more wide-ranging than the couple of quotes that appear in the article can possibly reflect. But that's par for the course. I think it is a little interesting that I was able to do the interview for The New York Times via Skype from China (more on that in another blog post), with Farhad Manjoo in California. This too is a function of our new media environment, as is the fact that I can share all this in this very blog post that I am writing right now (and that you are reading some time afterwards).