Thursday, February 23, 2017

La Comprensión de los Medios en la Era Digital

So, what's up with the Spanish title for this post, you may be asking? Or maybe you're just saying, ¿Qué pasa?

Well here's the story. A while back, my friends and colleagues from Mexico, Fernando Gutiérrez and Octavio Islas, asked me if I would co-edit an anthology with them on the theme of the 50th anniversary of the publication of Marshall McLuhan's Understanding Media: The Extensions of Man. That book was published back in 1964, and our volume came out last year, just a couple of years past the anniversary.

Now, let me make it clear that I do not speak Spanish, so my role as editor was somewhat different from what it would normally be. Aside from contributing my own chapter, I solicited seven English-language contributions, which I was responsible for editing prior to their translation. In case you were curious, those chapter authors are Corey Anton, Paul Levinson, Paul Lippert, Robert K. Logan, Eric McLuhan, James Morrison, and Michael Plugh.

I also solicited one other contribution, written in Portuguese by Brazilian scholars Adriana Braga and Adriano Rodrigues, which I didn't edit, seeing as I don't speak Portuguese either. The rest of the contributors were Carlos Fernández Collado, Jesús Galindo, Jorge Hidalgo, and Claudia Benassini, along with Fernando and Octavio.

So anyway, the book was published by Alfaomega, a Mexico City-based publisher, and here's the cover:

And you can read all about the book on the publisher's website, if you can read Spanish, that is. And here is the Table of Contents:


Prefacio James Morrison

Capítulo 1 Eric McLuhan
50 años después...Retrospección y perspectiva de la obra de Marshall McLuhan

Capítulo 2 Fernando Gutiérrez
La contribución de Marshall McLuhan para la comprensión de los ambientes mediáticos en la nueva era digital

Capítulo 3 Octavio Islas
Apuntes esenciales para una mejor lectura de La Comprensión de los medios como las extensiones del hombre

Capítulo 4 Lance Strate
El mensaje en La comprensión de los medios

Capítulo 5 Carlos Fernández Collado
Topoguía descriptiva para La comprensión de los medios

Capítulo 6 Jesús Galindo
La ingeniería en comunicación social y el pensamiento de Marshall McLuhan. Diálogo sobre constructivismo tecnológico de lo social

Capítulo 7 Corey Anton
Cinco formas para entender... Los Medios como las Extensiones del Hombre

Capítulo 8 Paul Lippert
McLuhan como una forma de arte

Capítulo 9 Robert K. Logan
McLuhan y su comprensión de los medios

Capítulo 10 Adriana Braga y Adriano Rodrigues
El pensamiento sistémico y el sujeto en la obra de McLuhan

Capítulo 11 Michael Plugh
Un mundo inteligente: La extensión del proyecto de automatización de McLuhan

Capítulo 12 Jorge Hidalgo
La comprensión del hombre como una extensión de los medios

Capítulo 13 Paul Levinson
McLuhan en la era de los medios sociales

Capítulo 14 Claudia Benassini
La nueva aldea global: el caso Facebook

And in case you want to order a copy, here's the Amazon link (and ordering through this portal does help support these here blogging efforts):

✾ ✾ ✾ ✾ ✾ ✾ ✾ ✾ ✾ ✾ ✾  ✾ ✾ ✾ ✾ ✾ ✾ ✾ ✾ ✾ ✾ ✾

And I do want to add, given the current political climate, that I am especially proud to have a longstanding connection to my colleagues in Mexico, and that I have great respect and affection for them, and for their nation. 

I should mention that Octavio Islas is now on the faculty of the Universidad de los Hemisferios (University of the Hemispheres), in Quito, Ecuador. So this anthology is truly a pan-American product!

And, I guess that all that's left to say is, adios, until next time.

Monday, February 13, 2017

Bob Dylan, Nobel Laureate

So, I've shared in some previous posts the programs that I've run as president of the New York Society for General Semantics, and hey, just click on the old link to check out the website I set up for the NYSGS. And while you're there, you can subscribe for updates (you don't have to be local to do so), and take advantage of some of the resources I've made available.

And over here on Blog Time Passing, I also shared Political Talk & Political Drama Part 1: Election 2016 and Political Talk & Political Drama Part 2, and My Language Poetry. Well, it's time for the next installment.

On November 30, we held a panel discussion and debate on the topic of Bob Dylan being awarded the 2016 Nobel Prize in Literature. The idea for the panel came from my friend and colleague, Thom Gencarelli. You see, back during some down time at the 2016 New York State Communication Association conference (Thom and I both being past presidents of that organization), we got into a discussion and a bit of an argument (which is to say a difference of opinion, nothing at all heated) about whether Dylan deserved the Nobel Prize or not. My view was, shall we say, skeptical; his view was much more positive. And I went so far as to say that, from a literary standpoint, I believe that a century from now, Leonard Cohen will be better remembered than Dylan.

I hasten to add that I would certainly cede the high ground to Thom when it comes to music, as he's a gifted singer, songwriter, guitar player, and band leader. His band is Blue Race; check them out on iHeartRadio, SoundCloud, and wherever music is sold online. I highly recommend them.

🔵 🔴 🔵 🔴 🔵 🔴 🔵 🔴 🔵 🔴 🔵 🔴 🔵 🔴 🔵 🔴 🔵 🔴

Anyway, I wouldn't question Dylan's significance for popular music and popular culture, but this is the Nobel Prize for literature that we're talking about, and that's a horse of another color. So, our discussion and disagreement became the basis of the last NYSGS program for 2016, and here is the write up for it:


A Conversation about Bob Dylan

and his 2016 Nobel Prize in Literature

On Thursday, October 13, 2016, the Swedish Academy announced that it had awarded Bob Dylan its Nobel Prize in Literature “for having created new poetic expressions within the great American song tradition.” While Dylan’s lack of acknowledgment and acceptance of the award until two weeks later raised controversy, this paled in comparison to the controversy raised right away as pundits in the professional media and across social media weighed in: He deserves it. He doesn’t deserve it. Popular songs aren’t literature. Lyrics aren’t poetry. If the Academy’s prize for literature is expanded to include popular song, is Dylan the only deserving songwriter? Is he the most deserving? Et cetera.

This roundtable discussion seeks to address, make sense of, and try to come to some conclusions with respect to all of this ruckus. The participants will consider questions including: What is the relationship of lyrics to poetry? What is the symbiotic relationship between lyrics and music in popular song? Is poetry literature? Are popular songs literature? What is the meaning and significance of the Nobel Prize, or any award for that matter? What is the significance of Bob Dylan? What is the literary value of his lyrics? What is so new and distinctive about his “poetic expressions” and use of language? And is everything important about Dylan and his contribution simply a matter of language?

Finally… does he deserve it?

Panel participants:

Thom Gencarelli, Professor of Communication, Manhattan College
Callie Gallo, English Department Teaching Fellow, Fordham University
Sal Fallica, Professor of Media Ecology, New York University
Lance Strate, NYSGS President & Professor of Communication & Media Studies, Fordham University

Thom served as moderator as well as panelist for the session, which featured a wide-ranging discussion that included multiple intersections with the discipline of general semantics. Thom is also the co-editor, with Brian Cogan, of an anthology entitled Baby Boomers and Popular Culture, and interestingly enough, Sal Fallica wrote one of the chapters, focusing on Dylan and awards ceremonies! (I also have a chapter in the volume, mine is on science fiction film and TV).

🔴 🔵 🔴 🔵 🔴 🔵 🔴 🔵  🔴 🔵 🔴 🔵 🔴 🔵 🔴 🔵

Callie Gallo, who is working on her doctorate in English literature at Fordham University, and has an interest in media ecology, helped to provide a fresh perspective to the program. And she wrote a very nice guest blog post on her experience for Hook & Eye (subtitled Fast Feminism, Slow Academe), whose About the Blog blurb says the following:

Hook & Eye is an intervention and an invitation: we write about the realities of being women working in the Canadian university system. We muse about everything from gender inequities and how tenure works, to finding unfrumpy winter boots, decent childcare, and managing life’s minutiae. Ambitious? Obviously. We’re women in the academy.

Anyway, Callie's post is entitled, The Perks of Saying Yes in Grad School, and it's worth a read, so why don't you click on the old link, open up a new window, and check it out. It's okay, I'll wait until you're done, and you can meet me back here.

You made it back! Well done! So now, let me just note that our program was written up in an NYSGS blog post, and is also available on an NYSGS resource page, but of course, it's all right here as well, including the video recording which was uploaded to YouTube under the title of Music Lyrics Poetry Language: A Conversation About Bob Dylan & His Nobel Prize. And you know, you can watch it on YouTube, via the NYSGS channel, but yes yes yes, you can also watch it right here.

I should add that, unfortunately, I didn't have a volunteer to hold the iPad this was recorded on, so there was no one to keep faces in the frame. That wasn't too much of a problem as long as we were seated, as we were for most of the session. But the video does begin with my introduction, followed by Thom's, both delivered while we were standing, so the first few minutes are not the most flattering, let alone professional. But the sound quality is good, and once we were done with the intros and sat down, everything looked fine, aside from the fact that a little bit of the shelf the iPad was sitting on is visible in the bottom left part of the frame. But anyway, for better or worse, here it is:

So, what do you think? Click here for a list of all of the Nobel Prizes in Literature awarded since 1901. Does Bob Dylan belong on this list? Or is this, in its own way, a weird example of celebrity logic that parallels having a reality television star as president?

Saturday, February 11, 2017

A Look Back

So, as you may recall, I spent the Fall 2015 semester at Villanova University, with a visiting appointment as their Margaret E. and Paul F. Harron Endowed Chair in Communication, aka the Harron Family Chair. You can check out some of the blog posts I wrote at that time: My Villanova Adventure, Fatal Amusements Talk, Villanova Grad School Interview, VCAN Connected, and Fatal Amusements. And, as I was looking at some blog posts that remain incomplete and unpublished here on Blog Time Passing (saved as drafts), I noticed one leftover from that time that had just a single photograph, taken for publicity purposes. Here it is:

This was in my Villanova office, and since I happen to have some photographs I took on my iPhone, I can round out this post with some of those. First, a closer look at the shelves behind me:

I have a couple more books to add now, but I'll save that for later posts. 

Instead, let me provide a view from the outside of the office:

And another one below with the door closed:

And I also wanted to add a few shots of the wall opposite my office. The images stretched too far for me to capture in one snapshot, but here are a few pieces of the mosaic:

The lack of color and contrast made it challenging to get a good photo of the quotes, but here you can see one by McLuhan that made me feel right at home. 

Another quote from Walter Ong, how can you go wrong with that?

Not as well known as the others, but Lee Thayer certainly qualifies as a media ecologist, or at the very least a fellow traveler.

And good old John Dewey, that's pretty cool!

I'd also categorize Barnett Pearce as a fellow traveler. Some great choices all the way around.

Now, if you're looking for something a little more exotic than the Philadelphia suburbs, I'll be sharing pictures from my more recent visit to Henan University in Kaifeng, China, in upcoming blog posts. Hope you like them!

Thursday, February 9, 2017

How Netflix Is Deepening Our Cultural Echo Chambers

So, back on January 12th, I was quoted in The New York Times, which does not publish, as some would have it, all the fake news that's fit to print. The article first appeared online the day before, written by Farhad Manjoo, the title being How Netflix Is Deepening Our Cultural Echo Chambers. And yeah, that's a link to the item over on their website, in case you want to read it in its original and unadulterated state.

As for the original print version, when I got my copy of the paper, I figured it wouldn't be in the first section, which is devoted to hard news and the op-ed pages. So I looked for it in the third section, Arts, and it wasn't there. Then I looked in the fourth section, Thursday Styles, and it wasn't there either. That left the second section:

That's right, the article made the front page of the Business Day section. Interesting image they got to go with it, don't you think? And maybe it's kind of ironic, in a media ecological sense, that the image takes up more space than the text, as if the paper is somehow trying to compete with television on television's own terms, rather than emphasizing what newspapers do best.

Be that as it may, the bulk of the article was continued on page five:

And of course, it's a bit hard to read off of the images, which were included via Blogger's insert image function, courtesy of my iPhone's camera. So, not to worry, here's the text in easy-to-read form:

When “One Day at a Time” started its run on CBS in December 1975, it became an instant hit and remained so for almost a decade.

In its first year, “One Day at a Time,” a sitcom about working-class families produced by the TV impresario Norman Lear, regularly attracted 17 million viewers every week, according to Nielsen. Mr. Lear’s other comedies were even bigger hits: One out of every three households with a television watched “All in the Family,” for instance.

Last week, a new version of “One Day at a Time” started on Netflix. Critics praised the remake for its explorations of single parenthood and class struggle, a theme that has faded from TV since Mr. Lear’s heyday.

So, a seemingly innocuous topic, at least to begin with: another in a seemingly endless run of remakes appearing on film and video, this time via Netflix. And this was the topic that Mr. Manjoo wanted to discuss when he contacted me. I should add that I have previously been interviewed regarding other Norman Lear TV programs, notably All in the Family and Barney Miller, as I related in my previous posts here on Blog Time Passing, All in for All in the Family and A Sitcom to Remember (not to mention my WNYC interview on the similarities between Archie Bunker and Donald Trump, discussed in last year's post, From Bunker to Trump (via Reagan), as you no doubt recall).

Anyway, let's get back to the article, and the difference that four decades can make:

Yet, well intentioned and charming as the new streaming version may be, there’s a crucial aspect of the old “One Day at a Time” that it will almost certainly fail to replicate: broad cultural reach.

The two versions of “One Day at a Time” are noteworthy bookends in the history of television, and, by extension, the history of mass culture in America. The shows are separated by 40 years of technological advances—a progression from the over-the-air broadcast era in which Mr. Lear made it big, to the cable age of MTV and CNN and HBO, to, finally, the modern era of streaming services like Netflix. Each new technology allowed a leap forward in choice, flexibility and quality; the “Golden Age of TV” offers so much choice that some critics wonder if it’s become overwhelming.

It’s not just TV, either. Across the entertainment business, from music to movies to video games, technology has flooded us with a profusion of cultural choice.

This is all well and fine in regard to the changing dynamics of the media industries and entertainment providers, but Manjoo has a deeper concern, one shared by many cultural commentators, and certainly amenable to media ecological analysis:

More good stuff to watch and listen to isn’t bad. But the new “One Day at a Time” offers a chance to reflect on what we have lost in embracing tech-abetted abundance. Last year’s presidential election and its aftermath were dominated by discussions of echo chambers and polarization; as I’ve argued before, we’re all splitting into our own self-constructed bubbles of reality.

What’s less discussed is the polarization of culture, and the new echo chambers within which we hear about and experience today’s cultural hits. There will never again be a show like “One Day at a Time” or “All in the Family”—shows that derived their power not solely from their content, which might not hold up to today’s more high-minded affairs, but also from their ubiquity. There’s just about nothing as popular today as old sitcoms were; the only bits of shared culture that come close are periodic sporting events, viral videos, memes and occasional paroxysms of political outrage (see Meryl Streep’s Golden Globes speech and the aftermath).

Instead, we’re returning to the cultural era that predated radio and TV, an era in which entertainment was fragmented and bespoke, and satisfying a niche was a greater economic imperative than entertaining the mainstream.

So, now, how about some historical context? That's where I come in, at least as far as this article is concerned:

“We’re back to normal, in a way, because before there was broadcasting, there wasn’t much of a shared culture,” said Lance Strate, a professor of communication at Fordham University. “For most of the history of civilization, there was nothing like TV. It was a really odd moment in history to have so many people watching the same thing at the same time.”

That’s not to romanticize the TV era. At its peak, broadcast TV was derided for its shallowness, for its crass commercialism, for the way it celebrated conformity and rejected heterodoxy, and mostly for often not being very creative or entertaining. Neil Postman wrote that we were using TV to “amuse ourselves to death,” and Newton N. Minow, chairman of the Federal Communications Commission under President John F. Kennedy, famously called it a “vast wasteland.”

As you may have guessed, I did mention Postman and Amusing Ourselves to Death in our conversation, and my point here is not to contradict the argument he made. After all, I did bring that very same analysis into the 21st century with my own book, Amazing Ourselves to Death: Neil Postman's Brave New World Revisited. And the argument still stands that on the whole, television has done more harm than good, but the fact remains that the medium did offer many benefits as well, which is what I emphasized in this context:

Yet for a brief while, from the 1950s to the late 1980s, broadcast television served cultural, social and political roles far greater than the banality of its content would suggest. Because it featured little choice, TV offered something else: the raw material for a shared culture. Television was the thing just about everyone else was watching at the same time as you. In its enforced similitude, it became a kind of social glue, stitching together a new national identity across a vast, growing and otherwise diverse nation.

“What we gained was a shared identity and shared experience,” Mr. Strate said. “The famous example was Kennedy’s funeral, where the nation mourned together in a way that had never happened before. But it was also our experience watching ‘I Love Lucy’ and ‘All in the Family’ that created a shared set of references that everyone knew.”

As the broadcast era changed into one of cable and then streaming, TV was transformed from a wasteland into a bubbling sea of creativity. But it has become a sea in which everyone swims in smaller schools.

Only around 12 percent of television households, or about 14 million to 15 million people, regularly tuned into “NCIS” and “The Big Bang Theory,” the two most popular network shows of the 2015-16 season, according to Nielsen. Before 2000, those ratings would not even have qualified them as Top 10 shows. HBO’s “Game of Thrones” is the biggest prestige drama on cable, but its record-breaking finale drew only around nine million viewers.

Clearly, we inhabit a different media environment in 2017 from the one that existed during the second half of the 20th century. And that is just in reference to broadcast and cable programming. What about streaming services such as Netflix?

Netflix does not release viewership numbers, but a few independent measurement companies have come up with ways to estimate them. One such company, Symphony Advanced Media, said Netflix’s biggest original drama last year, “Stranger Things,” was seen by about 14 million adults in the month after it first aired. “Fuller House,” Netflix’s reboot of the broadcast sitcom “Full House,” attracted an audience of nearly 16 million. On Wednesday, Symphony said that about 300,000 viewers watched the new “One Day at a Time” in its first three days on Netflix. (These numbers are for the entire season, not for single episodes.)

For perspective, during much of the 1980s, a broadcast show that attracted 14 million to 16 million viewers would have been in danger of cancellation.

That was a point that I made in our conversation. The threshold for a television show being considered unpopular during that time was nothing short of extraordinary, with numbers that would have been considered wildly successful for any other medium. It really gave new meaning to the very concept of popularity. Which again points to the fact that we are now getting back to a more normal situation, when you look at the big picture historically. Although even the estimates of Netflix viewing, with programming viewed by millions, are still pretty impressive. Anyway, let's get back to Manjoo's article:

We are not yet at the nadir of the broadcast era; cord-cutting is accelerating but has still not become a mainstream practice, and streaming services only just surpassed majority penetration. So these trends have a ways more to go. As people pull back from broadcast and cable TV and jump deeper into streaming, we’re bound to see more shows with smaller audiences.

“This is just generally true with how blockbusters across the media are going,” said James G. Webster, a professor of the School of Communication at Northwestern. “Some big ones could get bigger than ever, but generally the audience for everything else is just peanuts.”

A spokesman for Netflix pointed out that even if audiences were smaller than in the past, its shows still had impact. “Making a Murderer” set off a re-examination of a widely criticized murder trial, for instance, while “Orange Is the New Black” was one of the first shows to feature a transgender actor, Laverne Cox.

But let's return to that underlying point about audiences and a shared culture:

I buy this argument; obviously, powerful cultural products can produce an impact even if they’re not seen by everyone.

But I suspect the impacts, like the viewership, tend to be restricted along the same social and cultural echo chambers into which we’ve split ourselves in the first place. Those effects do not approach the vast ways that TV once remade the culture: how everyone of a certain age knows the idioms of “Seinfeld” (“It shrinks?”), or followed the “Cheers” romance of Diane and Sam, or how a show like “All in the Family” inspired a national conversation about the Vietnam War and the civil rights movement.

It’s possible we’re not at the end of the story. Some youngsters might argue that the internet has produced its own kind of culture, one that will become a fount of shared references for years to come. What if “Chewbacca Mom” and the blue and black/white and gold dress that broke the internet one day become part of our library of globally recognized references, like the corniest catchphrases of television’s past, whether from “Seinfeld” or “Diff’rent Strokes”?

That could happen. At the risk of alienating the youngsters, though, I’ll offer this rejoinder: “What you talkin’ about, Willis?”

Maybe so, but Chewbacca Mom is a Facebook video featuring a woman putting on a mask and laughing hysterically about it, while the blue and black/white and gold dress is just an image, albeit one that is a fascinating illustration of how perception differs among different individuals (I used it in my About page for the new New York Society for General Semantics website, and between you and me, I don't care what anyone says, it's white and gold). Compared to the combination of comedy and social commentary contained in 209 episodes of One Day at a Time that aired between 1975 and 1984, these examples certainly seem like a degraded form of discourse. 

And yet, One Day at a Time was hardly the most sophisticated form of television programming, in contrast to All in the Family, M*A*S*H, or The Mary Tyler Moore Show, and even then, these shows were only oases when placed in contrast to the rest of the vast television wasteland. They hardly compare to the shared culture created by print media, notably the role of the periodical press as the basis of open political discourse of the sort lamented by Postman, and longed for by Jürgen Habermas, as well as the role of literature in creating a unified national culture.

What television in its heyday was able to do, however, was capture the lowest common denominator in a way that no other mass medium had hitherto been able to, and that no other medium has been able to since the expansion of electronic communications via cable, satellite, and the internet. This created a shared culture that was, in many respects, a very low form of culture, but then again, one that unified our population in ways that may never again be possible.

One final note on the interview: we had a very stimulating conversation that lasted about 45 minutes, which was much more wide-ranging than the couple of quotes that appear in the article can possibly reflect. But that's par for the course. I think it is a little interesting that I was able to do the interview for The New York Times via Skype while I was in China (more on that in another blog post), while Farhad Manjoo was in California. This too is a function of our new media environment, as is the fact that I can share all this in this very blog post that I am writing right now (and that you are reading some time afterwards).


Tuesday, January 31, 2017

My Language Poetry

So, maybe you remember how I've been serving as president of the New York Society for General Semantics for the past year, and maybe you even subscribed for updates over on the NYSGS website. And if not, please feel free, you don't have to be in or anywhere near New York to get the latest news on our events and resources added to the website (which I put together, and it's not too shabby, if I do say so myself).

And maybe you saw my previous posts on the subject, New York Society for General Semantics, my initial announcement, and the two posts on panels I put together prior to the 2016 election, Political Talk & Political Drama Part 1: Election 2016 and Political Talk & Political Drama Part 2? Pretty interesting stuff, wouldn't you say?

So, now for something completely different, or maybe somewhat different: another of the sessions that I organized for the NYSGS was a series of poetry readings. You can see, and hear, all of them over on the NYSGS site via the following link: The Language of Poetry (Video Recorded September 28, 2016). And they're also available on the NYSGS YouTube Channel.

As far as this post here on Blog Time Passing is concerned, I just want to share my own performance of several original, unpublished poems. I think you'll find there's still some connection to politics and other issues and controversies, as well as to the theme of language, and to general semantics. Well, for better or verse, here it is:

In case you were wondering about where this reading took place, it was at The Players, a pretty cool site where all of our events have been held over the past year. And for this one in particular, I connected the NYSGS event to Poetry at the Players, a group that meets periodically at The Players to engage in readings of poetry (which is all about performance, dramatic readings, the rule being that you cannot read poems that you yourself have written). I have taken part in this group's readings a number of times, whenever I've been able to make it, but less often since September, because I'm teaching and holding NYSGS sessions on Wednesday evenings, the same evening that this group meets.

Anyway, the evening began with Poetry at the Players for the first hour, and that was followed by a second hour of readings and performances of original poetry. And that helps to explain some of what I'm saying in my introduction to the session, which I include here mainly for the reference to general semantics and poetry:

And I should add that the place was packed, with something like 100 people in attendance. So it was quite a night, all things considered! And maybe we'll do it again sometime... Subscribe, and you'll know!

Friday, January 27, 2017

Whither Obama?

So, I know many of us are missing Barack Obama more and more every day, and he did say he was taking an extended, and well-earned, vacation. And he also said that when he gets back to work, he wants to help the Democrats do better in local and statewide politics, a badly needed effort, to be sure.

But this post looks back to an item that appeared right before the 2016 presidential election, courtesy of the UK's Independent. Dated November 4, 2016, the title of the article is, How much money could Barack Obama earn after leaving the White House? And it is followed by a subtitle that says, Mr Obama will receive an annual pension of $203,700. And it is always important to acknowledge the author, which in this case is Matt Payton. So, you know the drill: you can click on the title of the article to read it on the newspaper's website, or stick around and read it here.

The article is more or less informational, starting off with the following:

Barack Obama leaves the White House, the third President in a row to have spent two full terms as commander-in-chief.

Before winning the 2008 Presidential election, he served three years in the US Senate (2005 to 2008) and seven years in the Illinois State Senate (1997-2004).

Following nearly 20 years in public office, there has been much speculation in regards to his post-Presidential career.

Regarding the next few years, Mr Obama has stated he will remain in Washington DC until his youngest daughter, Sasha, finishes high school.

As standard, every former US President since 1958 receive a pension, with Mr Obama set to receive $203,700 (£162,798) per annum.

Other than his repeated intention to play more golf, the 55-year-old leader of the free world has a number of options:

At this point, the article moves into a speculative mode, listing six possibilities, starting with the following:

1. The political memoir

A traditional first project of former Presidents looking to sculpt their own legacy.

Bill Clinton reportedly received a $10 million (£7.9 million) advance for his presidential memoirs with George W. Bush allegedly receiving $7 million (£5.6 million) for his memoir, Decision Points.

Mr Obama already made millions with his two previous memoirs Dreams of My Father (1995) and The Audacity of Hope (2006).

Publishers have described his presidential memoirs as the most hotly anticipated with advances estimated between $25 million (£19.9 million) and $45 million (£35.9 million), reports The New York Times.

I think we can pretty much count on this one, given Obama's intellectual acuity, track record in publishing, and communication skills. Next up we have a topic familiar to longtime readers of this blog:

2. Lecture circuit

Another popular post-Presidency side-line, former White House residents can net millions making paid speeches at universities and corporate venues across the world.

While his father, President George H.W. Bush, reportedly earns $10,000 (£7,990) per speech, George W. Bush earns between $100,000 (£79,900) and $175,000 (£139,840) per appearance.

Bill Clinton was reportedly paid $225,000 (£179,795) for an appearance in February 2014, reports Fortune. Communications professor at Fordham University, Lance Strate said: "The speech is kind of secondary to… just being able to have a big name at your event.

"It might get reported on some form of TV or cable news, which further adds to the prestige and the publicity of the event."
So, there I am, being quoted, and actually this is another case of being re-quoted, a quote in an older article being used in a new article. In fact, I have a whole history on this subject, and you can see it unfold via my previous blog posts, first Giant Speaking Fees-Fi-Fo-Fum, then Of Fees, Futility, and Mike Huckabee, and A Fortune in Speakers' Fees, and then Long-Shot Candidates in the Marketplace, and its follow-up, Why Run & Other Answers to Political Questions. Funny how something small like that just keeps echoing and re-echoing around the digital canyon.

All right then, that's how this article came to my attention, as you might have guessed. But while we're at it, let's see what else comes up on the list, shall we? And the next item demonstrates, if nothing else, that someone doesn't know how to count, as it's mislabeled number 2, and far be it from me to change the quote and correct it:

2. Buy a sports team

President Obama has mentioned his dream of part-owning an NBA basketball franchise, his first sporting love. The advance for his memoirs could make this a realistic proposition.

He told GQ last year: "I have fantasized about being able to put together a team and how much fun that would be. I think it’d be terrific."

His predecessor, George W. Bush, had owned a stake in the Major League Baseball team, the Texas Rangers, before selling up in 1998 for a cool $14.9 million.

Considering he made an initial $606,000 investment in 1989, that's a decent level of profit.
Maybe the mistake in numbering was due to the farfetched quality of this item? Whatever the reason, the misnumbering continues, as we move on to the fourth item:

3. College professorship

The hottest contender for his post-Presidential career, Mr Obama has spoken frequently about returning to teach law at College.

In an interview with The New Yorker, Mr Obama said: "I love the law, intellectually. I love nutting out these problems, wrestling with these arguments.

"I love teaching. I miss the classroom and engaging with students."

As to where, there are three obvious choices: Columbia, where he was an undergraduate political science major; Harvard, where he graduated from law school; or the University of Chicago, where he taught previously.

Columbia is seen as the front runner after the college's president said at the 2015 convocation he was looking forward to "welcoming back our most famous alumnus... in 2017."

Mr Obama would not be the first politician to return to academia; former Secretary of State Condoleezza Rice returned to Stanford University as a politics professor.

At some of the wealthiest and most prestigious American colleges, top professors can earn six-figure salaries. Tempting enough for a former President?
Also, former Vice-President Al Gore taught at Columbia after losing the 2000 presidential election. Speaking on behalf of my profession, I do have to say that Obama's sentiments are admirable, and I bet he's dynamite in the classroom. Of course, this option seems to be more in line with the intellectualism of recent Democratic presidents, as opposed to the Republicans. Not that it has to be that way, although the ivory tower does lean a bit towards the left, making it a context more conducive to liberals than conservatives. But maybe there's some reluctance on the part of folks on the right to have their views challenged and tested? Anyway, the fifth item is also an option that seems to go with a particular political inclination:

4. Social Activism

Probably the least remunerative option but one that holds attraction to both Barack and Michelle Obama.

After graduating from Columbia, Mr Obama spent three years as a community organiser in Chicago.

Both have stated they are committed to their grassroots initiatives such as My Brother's Keeper, a mentoring programme for young minority men.

And finally, something that probably would not have been considered as an option until the Clintons:

5. Public office

Mr Obama has made it very clear that neither he nor his wife would ever seek public office after leaving the White House.

He recently said: "She will never run for office, she is as talented and brilliant a person as there is, and I could not be prouder of her, but Michelle does not have the patience or the inclination to actually be a candidate herself.

"That’s one y’all can take to the bank.”

The First Lady has categorically backed these sentiments despite groundswell support for her to enter frontline politics.

Mrs Obama said at the South by Southwest festival in March: "I will not run for president. No, nope, not going to do it.

“There is so much that I can do outside of the White House… without the constraints, the lights and the cameras, the partisanship.

"There’s a potential that my voice can be heard by many people that can’t hear me now because I’m Michelle Obama the First Lady, and I want to be able to impact as many people as possible in an unbiased way."
And as much as many of us may admire Michelle Obama, and regard her as having strong potential were she to enter the political arena, dynastic politics is generally not the best thing for democratic government. When I was growing up, there was this longing for another Kennedy, and while Teddy matured into an excellent elder statesman, his directionless flirtations with running for president were not helpful. His brother Bobby might have been a great one; there is no way of knowing. But John Quincy Adams was not one of the good ones. As for Teddy and Franklin Roosevelt, they were fifth cousins, so they don't count. And of course I'd rather have seen Hillary Clinton as president, and for that matter Jeb Bush, but as a basic principle, we are better off without any kind of political aristocracy.

Sunday, January 22, 2017

Post-Truth and Post-Reason

So, I was playing catch-up, posting here on my blog three of my op-eds previously published in the Jewish Standard and online on their Times of Israel site, and was going to hold off on posting my most recent op-ed here on Blog Time Passing. But in light of recent events revolving around the inauguration of Donald Trump, I figure I should post it sooner rather than later.

The piece originally appeared in the December 30th issue, and was written with an end-of-year sensibility. The editor extended the title to read, Post-Truth and Post-Reason—Big Data and Big Dada Fight It Out, which is a bit misleading since it's more like big data and big dada ganging up in an assault on truth and reason. But I am grateful that they were willing to publish what turned out to be a very long item, almost twice as long as the typical op-ed. So, here it is:

As we reach the end of 2016, I find I have mixed feelings about the Word of the Year chosen by Oxford Dictionaries: post-truth.

Reflecting the Brexit vote in the UK as well as the presidential election campaign in the US, the term captures the disillusionment that many of us feel with political discourse in the 21st century, especially as it is conducted via television, the internet, and social media.

But the advent of post-truth leaves open the question, what is truth? In one sense, it is the opposite of a lie, and this year’s election campaign has seen more accusations of lying coming from both sides of the political spectrum than I can recall from past political seasons. A lie is a deliberate attempt to mislead, either by knowingly making a false statement, or by withholding information known to be true.

Over the past half century, two of our presidents have gotten in trouble for lying—Richard Nixon, who was forced to resign, and Bill Clinton, who was impeached. Of course, some of us find that there is a significant difference between Nixon lying to cover up an attempt to undermine the democratic process, and Clinton lying to cover up a personal indiscretion. But both were guilty of failing to live up to the ideal of honesty. Jimmy Carter, on the other hand, campaigned on the promise that “I’ll never lie to you.” Whatever else might be said of him, he tried to tell the American people the truth about the end of postwar prosperity. His message was not well received, to say the least.

The apocryphal story of young George Washington admitting to chopping down a cherry tree with the words “I cannot tell a lie” reflects one type of honesty, honesty in confession of sin, wrongdoing, or error. This kind of honesty is very much a part of Jewish religious and ethical tradition, and the Judeo-Christian foundation of the American republic. It is a practice that our president-elect seems to avoid more often than not, although it has been in general decline throughout our culture, in part due to the litigious nature of our society, but also due to a decay in people’s willingness to take responsibility for their actions.

Abraham Lincoln was known as “Honest Abe,” reportedly long before he entered the political arena, when he was a young store clerk and, notably, when he was a lawyer. In this regard, beyond telling the truth, honesty refers more broadly to integrity and trustworthiness; beyond lying, dishonesty includes a variety of unethical behaviors, such as cheating. Here too, we can trace this ideal back to biblical passages such as can be found in the Holiness Code (Leviticus 17-26), which includes the commandment “You shall not cheat in measuring length, weight, or quantity. You shall have honest balances, honest weights…” (19:35-36). Accusations of cheating also have been a part of 2016 politics, again directed at both major parties and their candidates.

Admittedly, these concepts of honesty are old-fashioned and obsolescent in our contemporary culture of celebrity, where honesty amounts to self-display and self-promotion. It is the honesty of going on a talk show and talking about yourself, or feeding details of your personal life to the gossip outlets. Donald Trump is seen as honest by his followers not because he accurately conveys the truth, but because he says what he thinks, seemingly with little or no filtering. This stands in stark contrast with the typical politician, who sends different messages to different audiences, especially to wealthy backers as opposed to the general public. Not to mention the fact that officeholders often must withhold information from their constituents.

Because Trump seems to say whatever comes into his head and does not care to be diplomatic in his remarks or hold back in concern over anyone’s sensitivities, he is seen as honest in a way that renders any inconsistencies in what he says irrelevant. So what if he contradicts himself from one situation to another, if what he says at any given moment is what he truly is thinking, what he truly believes to be true? In this way, Trump’s vulgar remarks caught on tape before an Access Hollywood appearance serve as more proof of his honesty, and do not conflict with his statements that he loves women and that no one has more respect for women than he does, at least as far as his fans are concerned.

The kind of honesty Trump represents is associated with the ideal of authenticity. For celebrity logic, authenticity means playing yourself, even if you are playing a role. That’s the difference between being an actor, along the lines of Meryl Streep or Dustin Hoffman, or being a star, like Arnold Schwarzenegger, Sylvester Stallone, or Adam Sandler for that matter. What fans often forget is that playing yourself is still playing a role, that authenticity on the part of celebrities is still an act.

Politicians can accuse their opponents of lying as a way of emphasizing their own image of authenticity, but actually proving such claims can be very difficult, because doing so requires some evidence that there was an intent to mislead. The Watergate conspirators avoided charges of perjury by using the phrase “to the best of my recollection” in conjunction with their testimony. Who can prove that a lie is not the result of a faulty memory rather than a deliberate deception?

For similar reasons, journalists rarely accuse anyone of lying, instead identifying statements as false. That leaves open the question of whether the politicians were simply mistaken, or in the neologism used by press secretaries, whether they misspoke. Journalists can, however, report on the accusations of lying made by some other source. While they may not be able to support the claim that candidate A is lying, they can easily show that candidate B said that candidate A is lying.

The important point is that while in one sense lies are the opposite of truth, in another sense it is falsity that is truth’s antonym. The contrast between true and false takes us away from the ideal of honesty, and removes the factor of personal belief. Instead, we are asked to objectively consider the logic of the claim, and the evidence that may support or refute it.

This meaning of truth is closely related to the concept of facticity, hence the Oxford Dictionaries definition of post-truth: “relating to or denoting circumstances in which objective facts are less influential in shaping public opinion than appeals to emotion and personal belief.” This reflects the commonly held view that facts are statements that are true, typically having been verified scientifically. But this is based on a misunderstanding of science.

A scientific fact is a statement that is open to testing. A statement such as “God created the world” cannot be tested empirically by any known method, and therefore cannot be considered a scientific fact; it cannot be tested to see whether it is true or false. A statement such as “The world is approximately 6,000 years old” can be tested via scientific method, and has been shown to be false. But it is still a fact, in the sense of being a statement open to testing. Ronald Reagan was notorious for citing facts that turned out to be false, but no one accused the former actor of lying.

Actually, according to philosopher Karl Popper, scientists can never prove anything to be absolutely true, because to do so would require observing every possible instance of the phenomenon in question, past, present, and future. And it only takes one exception to prove the theory false. In this sense, science advances by falsification alone, by eliminating error and mistaken notions.

Science cannot give us truth, just tentative explanations that conform to the available evidence, and effective means of predicting outcomes. Science is by far the best method we have for making such predictions. But absent claims of absolute truth, science leaves open the door to relativism, a view that is problematic when it is championed by the left in regard to morality, and by the right in regard to reality.

Stephen Colbert introduced the term truthiness to refer to George W. Bush’s reliance on intuition and gut feelings as a guide to truth, rather than logic, evidence, or even thoughtful reflection. The word seems almost quaint now, as it retains at least a bit of a folksy connection to some sense of the truth, something less extreme than post-truth. Perhaps it is nostalgic longing, and disturbance over contemporary public discourse, that accounts for the revival earlier this year of the classic television game show To Tell the Truth, introduced in 1956 by Bob Stewart, né Isidore Steinberg of Brooklyn.

But truth long has been a problematic term, and for many years now we have been rightfully suspicious of anyone who lays claim to the truth. The true tragedy we are witnessing is the decline of rationality. The prophet Isaiah declared, “Come now and let us reason together” (1:18), and it was the Enlightenment, the Age of Reason, that gave birth to the American republic. The democratic basis of our government was predicated on our ability to engage in rational discussion and argumentation, and through competition in the marketplace of ideas, arrive at the truth, or at least negotiate a compromise between opposing opinions.

Rationality has been under attack on two fronts, from the irrationality of an image culture that emphasizes appearance and personality rather than sensible language, and from the hyper-rationality of number-crunching information technologies that leave no room for deliberation or value other than efficiency and productivity. We are caught between emotional appeals that leave no room for thoughtful, impartial consideration, and calculations of quantifiable certainties that do not allow for human evaluation and judgment.

In short, reason is being squeezed out by the extremes of big data and big dada.

The end of rationality has had an adverse effect on the State of Israel as well, and on Jewish culture, with its long tradition of Talmudic scholarship emphasizing reasoned discussion. Israel’s attempts to use logic and evidence fare poorly in the face of its enemies’ use of images and emotional appeals in the international arena.

Liberals have had more difficulty adjusting to a post-rational world than conservatives, given the liberal bias toward intellectualism. One advantage that liberals do enjoy is in the use of humor, so look for comedians to take on leadership positions in the Democratic Party. For this reason, I wouldn’t be surprised if Saturday Night Live alumnus Al Franken, the junior United States senator from Minnesota, were the Democratic nominee in 2020.

But the end of reason is not a problem only for liberals. It is a challenge to liberalism writ large, to our ideals of freedom and equality. And it makes it all but impossible to follow the commandment found in Deuteronomy (16:20): “Justice, justice, you shall pursue.” How can we pursue justice in a post-truth, post-rational world?