Tuesday, December 20, 2016

Strained Relations

I have one more op-ed to catch up on, this one appearing in the October 28th issue of the Jewish Standard, in conjunction with Halloween. The title of the piece was changed to "Less Horrible Horror Series" which, in my view, is not all that much less than horrible itself, but what can you do? I have taken the liberty of using the original title I came up with for this blog post:

Jews and vampires don’t mix. Or at least they don’t mix easily.

As a kid growing up in the 60s, I remember picking up a copy of a magazine, Famous Monsters of Filmland, and reading all about Dracula and the other creature features produced by Universal Studios and their rivals. The original Dracula motion picture was released in 1931, so it wasn’t a movie I could see at the local cinema, but those of us who lived in the New York metropolitan area did have Chiller Theatre, hosted by Zacherley (“The Cool Ghoul”) on Channel 11 WPIX.

When I learned that my father was born in Transylvania, which meant that I was half-Transylvanian, I was delighted. It meant that I was a bit cool myself. I asked my father if he knew Bela Lugosi, the Hungarian actor who played Dracula in the original movies. My father was born before World War I, when Transylvania was a part of Hungary, so it seemed altogether likely to me that he would know the famous film star. My father joked that he knew Bela Lugosi, but Lugosi didn’t know him.

Having a connection to Transylvania gave me a bit of an edge, but I was hardly the only kid who had fun imitating Dracula and the other monsters, on Halloween and all year round. After all, we had comedy programs on TV like The Munsters, which included the character of Grandpa, aka Sam Dracula, played by Al Lewis, aka Albert Meister, not to mention The Addams Family, commercials for Count Chocula cereal, and Sesame Street’s Count von Count. Humor, not horror, led us to imitate an Eastern European accent and utter the immortal words, “I vant to suck your blud.”

The one bit of dissonance that I encountered in reading about the powers and weaknesses of vampires was the idea that they could be warded off by a crucifix or cross. Vampire lore fit into a Christian cosmology that Jews were not a part of. The 1979 comedy film Love at First Bite, starring Richard Benjamin as psychiatrist Jeffrey Rosenberg, plays on this problem in a scene where he tries to combat Dracula by reaching for his necklace, resulting in a moment of suspense as Dracula recoils in anticipation. But much to the vampire’s relief, he pulls out a Star of David instead.

Indeed, in many ways I found it easier to identify with the vampire, given my aversion to Christian symbols. I grew up seeing them as the symbols of people who for close to two millennia had oppressed and persecuted us, culminating in the pogroms and the Holocaust that both of my parents lived through. It helped that I was a bit of a night owl as well. This was long before the emergence of vampire subcultures, an offshoot of the Goth movement, inspired by the vampire novels of Anne Rice, and the Twilight young adult fiction and film series, featuring Bella Swan as the willing paramour of a handsome young vampire.

These and other new takes on the vampire mythos tend to downplay or eliminate its Christian-specific elements, and to sympathize with, sometimes glamorize, and ultimately normalize what once was considered monstrous. The recently concluded HBO series True Blood also falls into this category, and while not featuring any Jewish characters, introduced Lilith as the mother of all vampires in a scene in which contemporary vampires took part in ritual drinking of blood while chanting in Hebrew. This was more than a little disturbing, given the long history of the blood libel used as justification for anti-Semitism.

This summer, I caught up with the Shadowhunters series on the Freeform cable channel. Like the Twilight movies, Shadowhunters is a young adult novel adaptation. The main character, Clary, learns that she is one of a small number of people of mixed ancestry—they’re part angel—and they are called upon to battle demons as shadowhunters. Her best friend, Simon Lewis, is Jewish, and as played by Alberto Rosende he comes across as a typical Jewish nebbish type, of the sort made famous by Woody Allen: neurotic, fearful, and nursing his unrequited love for Clary. Until he is turned into a vampire, at which point he becomes cool, competent, and charismatic. According to synopses of the book series online, when his devout mother eventually learns of his conversion, she throws him out of their house. At a later point in the narrative, he thankfully is restored to human status.

The narrative may be typical of young adult fiction, and the character somewhat stereotypical, but the way that Jewish characters are inserted into the vampire narrative is both original and commendable. It is no coincidence that the author of the bestselling series, The Mortal Instruments, upon which Shadowhunters is based, Judith Rumelt, who uses the pen name Cassandra Clare, is Jewish.

. . . . . . . . . . . . . . . . . . . .

When it comes to quality television, my award for best series in the horror genre goes to The Strain, now in its third season on FX. Created by respected Mexican film director Guillermo del Toro and novelist Chuck Hogan, The Strain is set in New York City, and one of the elements I appreciate about the series is that it takes place in neighborhoods all over the five boroughs and has a real New York sensibility. The plot line is reminiscent of contagion and zombie apocalypse genres, combined with elements drawn from the Alien film series, and science fiction stories about doubles taking the place of their human counterparts, such as Invasion of the Body Snatchers. These elements meld together to form an altogether original vampire narrative, one in which the supernatural elements are absent, and the strigoi (the Romanian word for vampire used in the series) are the result of a parasitical worm that takes over and transforms its host.

The series features a small group of people who fight what appears to be a losing battle against the strigoi as they take over New York and the rest of the country, led to some extent by Professor Abraham Setrakian. Setrakian is identified as an Armenian Jew in the series, conferring on him a double coding as the victim of genocide, and he is shown to be a concentration camp survivor who first saw the strigoi feeding on captives at Treblinka. He had been a professor of mythology and East European literature at the University of Vienna, and after immigrating to the United States, became a New York pawnbroker. (This is an homage to the 1964 Sidney Lumet film The Pawnbroker.)

The leader of the strigoi is known as the Master (an allusion to the Nazi reference to the Aryan “master race”), an ancient strigoi who can exert complete psychic control over his spawn. His second in command, one of the few strigoi granted free will, is Thomas Eichhorst, who was the Nazi commandant at Treblinka, and Setrakian’s torturer. It follows that The Strain can be understood as a metaphor for authoritarianism and fascism, and this wouldn’t be the first time del Toro used fantasy elements in conjunction with this sort of political critique, as can be seen from his 2006 film Pan’s Labyrinth.

. . . . . . . . . . . . . . . . . . . .

All of the heroes in The Strain are flawed characters, and Setrakian is no exception. He is cranky and humorless, obsessed and ruthless as a vampire hunter, but also intelligent, learned, and the one who recognizes and understands the threat most clearly. The victim of great trauma, he is only occasionally shown as deserving of our sympathy. This parallels the way that Jews, and Israel, are seen by many today, both in the United States and abroad.

Don’t get me wrong. I applaud the way in which del Toro has allowed a Jewish character to take a leading role as a vampire hunter in a vampire narrative. Having eliminated the supernatural themes from the vampire mythos, it is not surprising that Setrakian’s Jewish identity relates only to his ethnicity. But del Toro missed an opportunity to fully transcend the stereotype by including something of Setrakian’s religious tradition, by showing that the source of his strength derives from his belief in the value of human life, the pursuit of justice, and a sense of spirituality.

Monday, December 19, 2016

Can You Hear Me Now?

Following up on my last post, here's another one of my op-eds from the Jewish Standard, this one published on September 2nd, just as the new school year was beginning:

As I begin my 33rd year in higher education, I can’t help but notice that my students are getting younger and younger every year—while I myself haven’t changed a bit.

Now, if you’re thinking that maybe I’ve gotten things mixed up a bit, that maybe it only seems that way from my point of view, I invoke in my defense Albert Einstein’s theory of relativity. But rather than continue to argue the point, let me share another observation with you:

Cell phones have caused my students’ bladders to shrink. I know, I know, it may be hard to see the connection, but the correlation is quite clear. It used to be that students could sit through a class of approximately an hour and fifteen minutes without a problem, and it was rare that someone would need to get up in the middle of class to go to the restroom. It would happen, of course—we all are human, after all—but not very often.

But somehow, increasingly in recent years, students have needed to go more and more often. And this coincides with the fact that, just like the rest of us, they have come to carry their mobile devices with them at all times, including to class.

Many of them try to hide their cell phones, keeping them on their laps, which is why I think the devices are having a physiological effect. I do try to point out, by the way, that this maybe isn’t the best place to put your cell phone, at least not if you plan on having children some day. I point out that mobile devices do generate electromagnetic radiation, and that we really don’t know for sure how that affects the body. Do you really want to take the chance?

Of course, I know that the sudden rise in students excusing themselves during class is not due to the effects of cellular signals on their bodies, but rather to the effects of text messages on their minds. The magnetic pull of our mobile devices is altogether extraordinary, and affects all of us, young and old. There even is a new word to describe the compulsion, FOMO—Fear Of Missing Out. The fear is nothing new, but never before has it been so intense and unrelenting.

And while our smartphones may be the cause of it all, it has nothing to do with the fact that they are telephones. Remember the days when everyone had a distinctive ringtone, often a few seconds of a favorite song? When every day we saw ads that urged us to buy special ringtones from a selection of thousands? Remember how we spent a considerable amount of time deciding which one to set as the mark of our own individual identity?

Funny how those days have come and gone. And the upside is that there are fewer instances when cellphones ring at inopportune times because their users forgot to put them on silent (or turn them off, something almost no one does anymore). They don’t interrupt services, or a theatrical performance, or a class, very much any more.

The ringing was more intrusive, but at least we all were embarrassed when it happened, and often enough would not answer it. Texts and status updates are nowhere near as obtrusive as ringing phones, but for that reason they are so much harder to ignore. The desire—for most of us the need—to check the new message, and to respond to it immediately, is all but overwhelming.

And you may think that no one sees the light from your phone shining in the darkened movie theater, but we do. That’s why theaters now ask their patrons to turn them off.

And you may think that no one sees you reading your messages or even responding to them during services, but we do. Back in the day, when a New York team was in the World Series and a game was being played during Rosh Hashanah or Yom Kippur, there might be a congregant who came to services with a transistor radio and earpiece. But he (inevitably it was a he) would step outside the sanctuary or shul to get the update. He wouldn’t listen to the game in the pews, and everyone understood that this was a singular exception.

And my students may think that their professors don’t see what they’re doing, but we do. We can see that they’re looking down and tap tap tapping on something with their fingers. Or for the ones with laptops, we can tell when their eyes are glued to the screen, and they’re furiously typing away far beyond what might be warranted by taking notes in class.

So why do they get up and leave during class? Perhaps it is out of a sense that they’re doing something inappropriate for class, but Sherry Turkle offers a different explanation in her insightful book, Reclaiming Conversation: The Power of Talk in a Digital Age. They are seeking solitude so that they can focus on crafting a response without being distracted by the class. They see it as editing and creating the best possible version of themselves.

. . . . . . . . . . . . . . . . . . . .

Turkle is rightly concerned about the negative effects of our smartphones on all of us—and especially on the young. That we forget or never learn how to deal with boredom, how to let our minds wander, how to daydream, and how to interact with others in a meaningful way. Messaging means never having to apologize, not really, not in a way that forces you to recognize the effect you have had on others, to see it in their faces. Messaging means you never have to stumble through awkward silences, difficult exchanges, never have to go to the effort of really relating to someone else. Conversation among friends, family members, and co-workers is becoming a lost art.

Texting is safe, unless of course we’re driving. Think about how much concern there was about talking on cellphones and driving, and how much worse it is to be texting or looking at updates on Facebook, Twitter, or Instagram! Emotionally, texting is safe, and face-to-face interaction is risky. But without risk, there is no growth. And dialogue is the best way to achieve what Martin Buber called I-You relationships, relating to other people as people, as opposed to the I-It relationships, relating to others as objects.

In many ways, messaging and especially updates give us neither I-You nor I-It relationships. Instead, they simply reflect back our own selves, mirror images that show only the surface: I-I relationships. And this brings to mind the warning given by Echo to Narcissus: Better watch yourself!

In his recent book, Not in God’s Name: Confronting Religious Violence, Rabbi Jonathan Sacks notes that the Hebrew Bible was meant to be heard, not read, and the stories of family conflict in the Torah, which often take unexpected turns, should be understood in this context, one where you cannot see the text in its entirety, only hear the narrative as it unfolds, step by step.

. . . . . . . . . . . . . . . . . . . .

It may be hard to believe, but reading silently was all but unknown until after the invention of the printing press. And this is so very important, because when we listen, we listen together, as one, but when we read silently, even if we read the same text at the same time, we read as isolated individuals.

Dialogue, discussion, debate, and devotion are communal activities, very much so in the tradition of Judaism. Whether it’s learning, praying, conversing, or simply being, we all need to put our mobile devices down and just listen. Listen to others, listen to the world, listen to ourselves.

After all, that still small voice that Elijah heard was not a text message.

Can you hear me now?

Sunday, December 18, 2016

Houdini Whodunit

So, seeing as I'm still playing catch-up, I figured I'd post one of my op-ed pieces from the Jewish Standard, this one published in the June 24th issue. And just so you know that I haven't been a total slacker as far as this sort of thing is concerned, I did post it online on my Jewish Standard Times of Israel blog on June 30th. That post included an update to the original column, and this version is further updated, as you'll see if you read through to the end:

My son was about 8 or 9 when we had our first family outing to Six Flags Great Adventure in Jackson, New Jersey.

As I recall, it was his first time in the amusement park, and my first time as well. And I was pleased to discover, soon after entering, an attraction called Houdini’s Great Escape. It paled in comparison to anything that can be found at one of Disney’s or Universal’s theme parks, but I was happy to have the opportunity to introduce my son to the great Jewish showman Harry Houdini.

Houdini was a household name when I was growing up, immediately recognizable as the world-famous escape artist of a bygone era. The fact that Houdini was Jewish also was well known, especially within the Jewish community.

Houdini’s fame persisted long after his death in 1926, at the age of 52, but it began to fade in the waning years of the 20th century. I wonder how many millennials have heard of him these days. For that reason, I applaud Six Flags for keeping his memory alive. I am particularly grateful to all those who protested when Great Adventure closed the ride in 2008, and convinced Six Flags to bring it back in 2011.

We bought my son a hamster about a month or two after our trip to the amusement park, and I asked him what name he wanted to give to his pet. He answered, “Harry.” I smiled and said, “So you want to name him after Harry Houdini?” “No,” he replied. “After Harry Potter.”

I immediately realized that Houdini’s Great Escape made a much greater impression on me than it did on him, and that there was no competing with the young adult novels by J. K. Rowling, and even more so with the Warner Bros. film adaptations, with their amazing special effects, which made magic seem real. This amounts to a bit of a reversal, as stage magicians produced some of the first special effects to appear in early cinema.

Houdini himself started out as an illusionist performing in vaudeville, before achieving widespread fame by specializing as an escapologist. He also starred in a few silent films between 1906 and 1923, but he did not enjoy the same success on the screen as he did in live performance.

Significantly, Houdini was devoted to stage magic as a profession, and served as president of the Society of American Magicians for almost a decade, his tenure cut short by his untimely death. The society pays for the maintenance and care of Houdini’s grave site, which is in the Machpelah Cemetery in Queens. The monument displays both his stage name, Houdini, and his actual family name, Weiss; he was born Erik Weisz in Budapest, the son of a rabbi, and was only about 4 years old when his family emigrated to the United States. That’s when Erik Weisz was changed to the German version, Erich Weiss.

Though Houdini died almost 90 years ago, his name recently has been resurrected on television with the airing of Houdini & Doyle, a series launched last spring on Fox. It’s based on the actual friendship between the great escapologist and Arthur Conan Doyle, the British author best known as the creator of Sherlock Holmes. While drawing on bits and pieces of historical fact, essentially the series is fictional and full of anachronisms, blurring the line between fiction and nonfiction in ways that have become quite common in recent decades. The central fiction is that Houdini, who is performing in London, teams up with Doyle to solve mysteries that baffle the police.

In this new series, Michael Weston (né Michael Rubinstein, grandson of Arthur Rubinstein) became the most recent of at least a dozen actors to have portrayed Harry Houdini. His predecessors include Tony Curtis, Harvey Keitel, Norman Mailer, and Adrien Brody. In this role, Weston looks Jewish, but not in a way that might be deemed stereotypical or particularly overt. His speech does not feature any obvious form of Jewish (or Hungarian) accent, although it does strike me as very similar to the kinds of voices I hear at my congregation. In short, in this series, the fact that Houdini is Jewish is downplayed significantly—but it is not entirely absent.

Houdini & Doyle is a TV version of the buddy film genre, a type of narrative especially commonplace in American popular culture, no doubt due to the diversity of American society. That’s because it depends on strange bedfellows, or if you prefer Neil Simon to Will Shakespeare, an odd couple team-up. The buddies often contrast opposing qualities—rich and poor, white and black, male and female, young and old, professional and amateur, and so on.

The great French anthropologist, Claude Lévi-Strauss, argues that a culture’s myths are ways of symbolizing significant polar oppositions, and scholars analyzing popular culture, such as Arthur Asa Berger, have applied this approach to film, television, and other media. Looking at Houdini & Doyle through this lens can be quite revealing.

To begin, Houdini is American and Doyle is British, Houdini is ethnic while Doyle is a white Anglo-Saxon Protestant (WASPs are an ethnicity, of course, but traditionally they are presented as non-ethnic in American popular culture), and Houdini is an American immigrant while Doyle is native to Britain. (The show is set in London.) Houdini’s background is not emphasized in the first few episodes, but in the third episode, “In Manus Dei,” he falls ill and his mother, who has accompanied him on his travels and speaks with a noticeable accent, gives him chicken soup as a cure. Her character, Houdini’s own devotion to her, and the insecurity associated with being an immigrant all are featured more prominently in episode 5, “The Curse of Korzha,” and the fact that Houdini is Jewish is discussed briefly in episode 6, “The Monsters of Nethermoor.”

On the one hand, it is quite positive that a Jewish-American immigrant can serve as a symbol of an American in general. On the other hand, Houdini’s Jewishness mainly is reflected in his being a victim of prejudice, as he reveals in episode 6. This also makes him a champion of tolerance, as he defends another character facing discrimination and scapegoating, which is commendable. But in this respect, there is no contrast with Doyle, who is sympathetic, albeit revealed as never having been the victim of bias, while the third main character, Constable Adelaide Stratton, Scotland Yard’s first policewoman (an anachronism), also is subjected to significant prejudice and therefore is in favor of tolerance.

Having viewed seven out of the 10 episodes that comprise the first season of the program, a joint British, Canadian, and American production, I would have wanted to see Houdini’s Jewishness reflect something more than ethnicity and open-mindedness. I would have liked it to reflect as well some aspect of his religious heritage. But of course that would undercut his role as a symbol of Americans in general.

Other contrasts come into play. Houdini is a famous and self-promoting entertainer, while Doyle enjoys the quieter esteem accorded an author, one somewhat embarrassed by the popularity of his Sherlock Holmes stories. Houdini’s success makes him relatively affluent and his brashness marks him as nouveau riche, while Doyle is the model of upper-middle-class propriety, as befits a physician. (That’s his day job.) There is a bit of a contrast between low and high culture, between the sensationalism of the popular performer and the reserve of the man of letters, which also maps onto the egalitarianism of American society and the elitism of the British (Doyle eventually receives a knighthood). It’s also the contrast between the rags-to-riches story of the ethnic immigrant and the conservative narrative of old money. Additionally, there is a contrast between Houdini’s physicality, as an escape artist and also as a fighter, and Doyle’s cerebral quality.

The major opposition on which the program turns, however, is between Houdini as a skeptic and rationalist and Doyle as a believer and spiritualist. While the belief that it is possible to communicate with the spirits of the dead is age-old—King Saul speaks to the ghost of Samuel in the Tanach—the spiritualism movement began in the 19th century. It was inspired in large part by the ethereal (but decidedly earthly) form of communication introduced by the invention of the telegraph, and later by messages sent over the air by radio.

Doyle actually was an ardent believer in spiritualism. He believed in it so strongly that this difference of opinion eventually brought his friendship with Houdini to an end. And Houdini actually was firmly committed to debunking anyone claiming to have psychic powers or the ability to communicate with the dead, invariably revealing them as scam artists using the same methods as stage magicians.

Houdini & Doyle draws on these historic facts to set up the program’s main opposition. It’s similar to The X-Files, except that Gillian Anderson’s Dr. Dana Scully was the skeptic and David Duchovny’s Fox Mulder was the believer. Doyle’s scientific background as a physician does come into play when he solves mysteries, but it does not prevent him from believing in psychic phenomena. Interestingly, Houdini’s and Doyle’s roles are reversed in “The Monsters of Nethermoor,” but only because the unearthly phenomenon being investigated is, in fact, alien beings, and Houdini is willing to believe in the scientific notion that life on other planets is possible.

Houdini, then, comes across as something of a 20th century Spinoza, a modern secular humanist, in contrast to Doyle’s apparent superstition. And the episodes clearly favor science over spiritualism, while portraying both buddies as sympathetic characters. Here too, however, I would wish for something more than rejection of belief on Houdini’s part. I’d have liked some positive expression of Jewish faith, its emphasis on ethics, even a touch of true spirituality.

Still, I applaud the show’s creators for bringing the spirit of Houdini back to life and with renewed vigor. This doesn’t seem like the kind of program that will gain much of an audience, or even make it to a second season. But escaping cancellation may just be Houdini’s greatest trick of all.

First Addendum

Two additional episodes have aired since I wrote this op-ed, one after it was published on June 24th. At the end of episode 8, “Strigoi,” which features their contemporary, Bram Stoker, author of Dracula, Houdini discovers that his mother has passed away. This and other matters prompt a trip across the Atlantic in episode 9, “Necromanteion” (the title referring to an invention of Thomas Edison’s, who appears in the episode, that is supposed to allow communication with the dead via radio waves).

The episode includes a scene of a Jewish funeral. Incredibly, Houdini is shown at the grave site minus any form of head covering, and walks out on the ritual, criticizing the solemnity of the proceedings. While the intent is to show that Houdini is suppressing his feelings of grief, it also resonates with his rejection of superstition in an unfortunate manner. The episode ends with his return to his mother’s grave to recite a Hebrew prayer, alone and therefore not as part of the Jewish community. This no doubt reinforces his connection to Doyle and Stratton, but at the cost of a positive portrayal of Jewish community, and one of the most essential functions of any religious tradition.  

Second Addendum  

In the final episode of the season, "The Pall of LaPier," Houdini receives spiritual advice from a Native American that he finds comforting. This is a common trope in American popular culture, the "noble savage" as a source of wisdom and superior spiritual connection in contrast to us sophisticated moderns, but once again, this appears in the absence of any link to Houdini's own faith, any interaction with a rabbi, and almost no acknowledgment of Jewish mourning rituals. And just to be clear, the problem is not with this one series, but with the fact that this is typical of the way that Jewish characters, whether historical or fictional, are portrayed in our popular culture.

Thursday, December 1, 2016

On Blackboard

So, if you're in academia, you probably know more than a little about Blackboard. No, not the pirate, that's Blackbeard, although there is a connection of sorts, given piracy's association with digital media. And I'm not talking about the old fashioned educational technology of the chalkboard, either.

No, this is about a form of new media and digital technology used exclusively by educational institutions, brought to us by Blackboard, Inc., and its proprietary learning management system.

If you're a student, you've most probably used it for at least some of your classes. If you teach, maybe you use it, maybe you don't. Of those who don't use it, some avoid Blackboard because they don't care for such technologies at all.

Others, including new media mavens such as myself, are critical of it as a system and prefer to use tools that can be used outside of academia, such as those provided by Google. Doing so makes more sense if you're studying new media, and if you want to prepare students for working with new media outside of the ivory tower.

I do admit, though, that for other kinds of classes, I usually don't bother with the system, and opt for good old fashioned face-to-face interaction, and printed documents. It's not that I've never used Blackboard or would never use it in the future. I just don't love it.

Which brings me to a little article that was published last April 13th in Fordham's student newspaper, The Ram. The title of the piece is "Blackboard as a Blight to Fordham Technology" (I believe I gave them the "Blackboard as a Blight" bit, prone as I am to hyperbole, and alliteration). The article was authored by Margarita Artoglou and Kristen Santer, in case you were wondering, and it begins like this:

The use of technology in the classrooms at Fordham can be extremely varied. One class may rely on technology, while another completely disregards it. Although students may bemoan the small bandwidth of Fordham Wi-Fi or the occasional faulty smartboard, most professors find that Fordham’s IT services and technology offerings are average compared to other schools.

Now, as it continues, we come to a relevant point:

Fordham offers several workshops to help get professors accustomed to new technology offerings and IT updates. Some of the workshops include introductions to SMARTBoards, Blackboard and creating and editing video files. Professor Lance Strate also agreed with the general consensus, “It’s a progression for sure, but I have seen schools that are much worse off than we are as far as not having [technological resources].”

Umm, I don't think the quote quite reflects what I was talking about, but let's say there's a spectrum, and maybe Fordham is somewhere in the middle, with our level of technology not as good as it could be, but better than that of a number of other schools. Of course, in some ways it would be better to have no technology at all than to have technology that doesn't quite work, and that leaves everyone feeling frustrated.

Be that as it may, let's turn to my friend, former student, and colleague, now teaching at Manhattan College, Mike Plugh, for a comment:

The continuous problem that professors seem to have with Fordham’s technology is Blackboard. Michael Plugh, a Communication and Media Studies professor, finds it frustrating. “Everything at Fordham is pretty straightforward, except Blackboard,” he said. “I’m sort of unwilling to get invested in Blackboard because I’m not convinced it has a life beyond itself.”

A scholar after my own heart, let me echo Mike's sentiments:

Other dissatisfied professors with Blackboard were not as nice as Plugh. “I hate Blackboard,” Professor Lance Strate said. “I understand why it’s used but I think it’s a really badly designed system.” It seems that discontent may be an understatement of the professors’ feelings about Blackboard. It clearly seems to cause more problems instead of making them simpler and more convenient.

That's right, baby, I tell it like it is. But am I a lone voice crying out in the wilderness? Maybe not:

Strate is not alone in terms of his problems with Blackboard. Many professors dismiss the system completely and use alternative, free software to communicate with their students. Among them is Professor Cornelius Collins, who finds Google Drive to be a much smoother user experience than Blackboard. “There are fewer steps [with Google Drive],” Collins said. “It’s integrated with students email and it suits my purposes. I find that Blackboard has built in so much functionality that it’s hard to do it in a streamlined way.”
And then there's that important tenet of investigative journalism, to follow the money trail:

At the point where professors are shunning paid-for software in favor of free substitutes, it is clear that Blackboard represents a blight on Fordham’s technological progress. Furthermore, if professors are choosing not to use Blackboard, then the university is needlessly wasting money on the program that could instead be put toward more fruitful pursuits.

All right now, let's cut to the chase, get to the nitty gritty, and hear from the students:

In addition, many Fordham students are disillusioned by Blackboard. “It’s completely disorganized. I never know what assignments are posted for what day,” Nicole Cappuccio, FCRH ‘18, said.

Students have also expressed distaste with the way that professors under-utilize the application. “None of my professors ever use the grading system on Blackboard either, and I think that’s a waste when I could be keeping tabs on my grades that way,” said Cappuccio.

And now, bringing the opinion-oriented article to a conclusion, here's what Artoglou and Santer have to say:

It is quite clear that Blackboard is an inferior platform for grading and source materials, especially when free platforms like Google Drive and WordPress are easily accessible. Fordham definitely needs to update Blackboard, either to a better platform or to a more workable interface with fewer bugs. However, the question becomes whether a university-wide platform like Blackboard is even necessary. Professors can just as easily use Google Drive and WordPress, and often prefer to.

Perhaps instead of spending money teaching professors how to use Blackboard, it could offer programs to help teach students basic technology, software and coding skills.  

And there you have it! I certainly second the sentiment regarding teaching students about technology and coding, as a kind of literacy (media, digital, etc.) that would go a long way in contributing to their education, when coupled with a sound liberal arts curriculum that helps them learn how to think, how to think well (and critically), and simply how to think.


Thursday, November 10, 2016

Thoughts on Trump

So, first of all, the good news is, I won my $100 bet with Paul Levinson. Actually, I was saying a year ago that Trump was going to go all the way. Not that I wanted him to, just that I could see the patterns that connected, historically, especially the parallels with our movie star turned president, Ronald Reagan.

So I was saying he was going to be our next president when we were talking at a department meeting at Fordham this past January, and Paul proposed the bet. It looked like a bad one on my part, as it required that Trump win the Republican nomination, something that didn't look at all promising at that moment, and then go on to win the presidency. My only out was that if Sanders became the Democratic nominee, the bet was off. Otherwise, Trump would go on to beat Hillary. That was the bet.

Back in March, I shared my views on a guest blog post for Visible Works Design, Trump By Design, which I recently reposted here on Blog Time Passing. And this also came up during the New York Society for General Semantics panel discussion held on September 9th, about which I posted recently: Political Talk & Political Drama Part 1: Election 2016. At that point I was already acknowledging that it didn't look good for Trump, and doubting my prediction even more by the time of our follow up NYSGS panel on October 26th, which I also posted recently: Political Talk & Political Drama Part 2.

So, it turns out I was right, and in case you missed it, Trump won. It's not the way I wanted things to turn out, but it does support the claim that media ecology provides better insight into contemporary politics than other approaches. Trump won by playing image politics, taking advantage of social media, and by sheer dominance of the news media. The only way he could have been stopped was if the news media had stopped covering him, which was never going to happen. The most mediagenic candidate won, or to use a term coined by Paul Heyer, Trump won on account of his exceptional media sense.

The irony is that Hillary Clinton read Walter Ong, and then Marshall McLuhan, when she was in college. The sad truth is that understanding media and media ecology does not guarantee a successful outcome, as Al Gore can testify to.

So, for us academics, most of whom were not supporters of Trump, as you might imagine, the one thing that we can look forward to is having much more material for critical analysis. This we share with comedians and humorists, who will have four years of Trump jokes to fall back on. It reminds me of 1972, after Nixon won re-election, and I picked up a copy of National Lampoon magazine. On an inside page, there was a photograph of the staff wearing party hats, blowing horns, looking like they were celebrating a birthday party or New Year's Eve. And the headline/caption read something like "Four More Years of Nixon Jokes!"

And I am gratified by the failure of the pollsters, once again, to provide an accurate prediction of the outcome. McLuhan called them galluptians (after the Gallup poll and Jonathan Swift's Lilliputians). And I think of the wonderful documentary about Edmund Carpenter, Oh What a Blow That Phantom Gave Me! One of the points stressed there, and in Carpenter's 1972 book of the same name, is that people change when they are suddenly able to see themselves. This applies to mirrors, and photographs, and the moving image. It also applies to the written word, through which we were able to see speech and thought, and to television. And polls give us another kind of reflection of ourselves, and that changes people's thinking and behavior. There is no question that they affect and distort the democratic process. This is bad. And if we cannot eliminate them altogether, I for one am happy to see them discredited.

I think we also have to acknowledge that this is, in fact, how democracy works. Nothing comes without a cost, and the cost of political freedom, such as it is, is that citizens may make poor choices, may elect incompetent or corrupt officials (hey, I live in New Jersey, know what I mean?), but what counts is the peaceful transition of power from one group to another. This is not to discount the potential for short term harm, but in the long run, it is a viable alternative to oligarchy and technocracy, both representing rule by entrenched elites.

And we are long overdue for political realignment, and this may bring it on. While third, and fourth, parties were not a factor in the outcome, they did receive unprecedented attention, and that is a good thing. Maybe next time around, they will nominate better qualified candidates whom voters can seriously consider and support.

On that score, I would very much like to see a political scientist who is well versed in how American government works provide a clear and detailed discussion of how our political system could function if we had three or four major parties, instead of just two. How would Congress work? Spell out how it could operate under such circumstances. And how would the presidential election work?

So, back to reality, my heart goes out to all the Millennials who are nursing their own broken hearts about the outcome of this election. For most of us, it was a shock because it was so completely unexpected. But I think for us older folks, we know politics is where idealism goes to die, that it's often a dirty game, or at best the art of the possible. I love the Millennials, my students and my children, they are absolutely wonderful, generous, open, accepting, fair-minded. I really believe that they are going to make things better, and this election will stand as an important lesson for them, a kind of tempering if you will, that will prepare them for the future. Their generation is now book-ended by 9/11 and 11/9. The baby boomers were associated with extraordinary social progress; the change that occurred in the second half of the 20th century was nothing short of revolutionary. I believe the Millennials will pick up that torch and carry it forward, much further even than we could conceive.

And so, the response should be clear: Be resolute. Be resilient. Stand up for what you believe. Defend those under siege, care for those in need. Do not stand idly by. For all people of good will, this amounts to a call for action.

And this is not much consolation, I know, so here are a few more thoughts:

It may seem as if Trump can do whatever he wants to with a Republican Congress, but things generally do not work out that way. Now, they have no one to blame, and no one to fight with, except each other. Think about all the conflict that came up during the primaries. You can expect the honeymoon to be a short one, and the Republicans to start tearing themselves apart pretty quickly.

Nothing succeeds like failure, nothing fails like success. The Democrats will emerge much stronger, and hopefully more progressive. 

It is highly unlikely that Trump would win a second term, so the damage should be limited to four years. I had predicted that whoever won this election would be a one-term president. We haven't had one since George H. W. Bush. And look forward to the midterm elections in two years, which should usher in a Democratic Congress.

In fact, I think there is a fair chance that Trump will not finish out his term. He may resign out of frustration or for other reasons. He may be impeached. There is also a chance he will not live out his term, given his age, and those "second amendment people" he referred to who may feel betrayed when he inevitably crosses them, or just some crazy out there. My prediction that Trump would win was based on the similar pattern I recognized between him and Reagan, and Reagan was the last president to get shot. Trump's appeal to extremists and crazies means he has been playing with fire, so he should not be surprised if he gets burned.

Things are usually not as bad as they seem (also usually not as good). There is a tendency to overestimate the power of the presidency and possibilities for change. Much of our social structure cannot be easily altered, for good and for ill. 

Even a broken clock is right twice a day. There can be some positives, talk of more support for infrastructure, for example, more tolerance for LGBTQ (for a Republican), etc. Trump is not a fundamentalist, not an arch-conservative. Once you get past the personal, at the very least the actual policies may be better than if, say, Ted Cruz had won. Trump is, after all, a New Yorker.

A loose cannon leads to friendly fire. Again, the Republicans may suffer more damage than the Democrats.

Unintended consequences are inevitable, so even when things look bad, good things can come out of it. Again, we vastly overestimate the power of individuals to control events.

And finally, here is my mantra:

We survived Nixon.
We survived Reagan.
We survived two Bushes.
We will survive this too.

The big scare during my childhood was the election of Nixon. People talked about leaving the country. And he was someone who was not only evil in various ways, but very effective as a politician. Would it be better to have someone who is effective but without a moral compass or sense of decency, or someone who is inexperienced and largely incompetent?

The big scare of my young adulthood was the election of Reagan. People talked about leaving the country. People thought he would start a nuclear war with the Soviet Union. But we got through his two terms, and one more with his successor, Bush the Elder.

The more modest scare of my middle age was the election of George W. Bush. There was just this sense of him being a joke, childish, incompetent, and reliant on Dick Cheney, the evil power behind the throne. There was talk of a slide into fascism, and some people also talked about leaving the country. And there was the Iraq War, which was terrible, but without trying to minimize the harm that resulted, we got through it.

We got through the Civil War (although some say it never really ended), we got through two world wars, the Great Depression, Korea, Vietnam, Civil Rights, Watergate, the Arab oil embargo, the Iranian hostage crisis. We got through 9/11. We'll get through this.

It will be ok.