Thursday, July 31, 2014

Facebook Follies

My last post was on The Future of Facebook? so I might as well follow up with another post on Mark Zuckerberg's creation, what some might say is the Frankenstein monster of social networks. 

And speaking of electric shocks to the system (in this case not so much it's alive! as he might be dead), there is an eerie resonance between Facebook's experimental manipulation of users' emotions in 2012 and Stanley Milgram's famous obedience to authority experiments, conducted over half a century earlier. Click on the link to read the summary of the study over on Wikipedia.

About a decade after the experiments, Milgram published a book entitled Obedience to Authority, which went over the experiments, and contextualized them in light of the excuse used by many Nazis after World War Two that they were only following orders—Milgram significantly referenced Hannah Arendt's Eichmann in Jerusalem—and controversies concerning our own conduct of the Vietnam War, notably the My Lai Massacre. Milgram's book was required reading in Neil Postman's media ecology doctoral program.

For your convenience, I've included links here to the Milgram book, as well as to Arendt's famous report on the Eichmann trial, where she coined the phrase the banality of evil. And we certainly can make an equation of it:

Obedience to Authority = Banality of Evil

Anyway, back when I was a doctoral student, Postman also showed us the documentary about Milgram's experiments, which is a little hokey, maybe laughable for being so, but also quite disturbing. It occurred to me that I should check on YouTube to see if it was there, and lo and behold:

For Milgram, the moral of the story was how willing so many of us are to obey authority even while disagreeing with what we are being told to do. Most of the subjects who went all the way with the electric shocks were not sadists, and in fact were quite disturbed by what was happening. When asked about it after the experiment, they said they wanted to stop, but the experimenter wouldn't let them, this despite the fact that no force was used, no coercion or persuasion, beyond the experimenter's insistence that the subject continue to give the victim electric shocks. Milgram referred to this as the agentic shift: the subjects ceded their independence as agents, abandoning their sense of responsibility for what was happening, and their freedom of choice.

Put another way, the relationship, in this case an authority relationship, was more powerful than the content of the situation, that an innocent was being harmed, and overwhelmed it. This is consistent with our understanding of relational communication as established by Gregory Bateson and Paul Watzlawick, and parallels McLuhan's famous dictum, the medium is the message.

The results would no doubt be different today than they were back in the day when respect for authority and a desire for conformity were quite powerful. It was the kind of culture we associate with the fifties, but it extended into the early sixties, at least. The point that Milgram missed, however, was that he himself had conducted a cruel set of experiments, inflicting psychological damage on some of his subjects, all in the name of a more abstract higher authority: Science. There are many forms of obedience, after all.

In response to these experiments, and others like them, but especially these, rules were put in place governing the use of human subjects. An experiment like this probably could not be conducted today, at least not at a university, where any study involving human subjects has to be reviewed and approved by an Institutional Review Board. So I really have to wonder: how in the world did my undergraduate alma mater, Cornell University, approve the participation of two of its faculty in an experiment in which the emotional states of Facebook users were manipulated without their knowledge, without even their awareness that they were subjects in an experiment?

The study was carried out in 2012, and the results were recently published in the Proceedings of the National Academy of Sciences of the United States of America. The article is authored by Adam D. I. Kramer of Facebook, co-authored by Cornell faculty Jamie E. Guillory and Jeffrey T. Hancock, and entitled, Experimental evidence of massive-scale emotional contagion through social networks. I'm not sure whether that link will work if you're not at a university that subscribes to the journal, so here is the abstract:


Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. Emotional contagion is well established in laboratory experiments, with people transferring positive and negative emotions to others. Data from a large real-world social network, collected over a 20-y period suggests that longer-lasting moods (e.g., depression, happiness) can be transferred through networks [Fowler JH, Christakis NA (2008) BMJ 337:a2338], although the results are controversial. In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. This work also suggests that, in contrast to prevailing assumptions, in-person interaction and nonverbal cues are not strictly necessary for emotional contagion, and that the observation of others’ positive experiences constitutes a positive experience for people.


There also is a shorter paragraph summarizing the significance of the study:


We show, via a massive (N = 689,003) experiment on Facebook, that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.

Now, if you can access the article, you can see in the comments section a number of individuals expressing concerns about the ethics, and even the legality of the study, as well as some defense of it, stating that it was a minimal risk study that did not require informed consent. There are other criticisms as well, concerning the methodology and reasoning used to interpret the data, but let's hold that aside for now. Instead, let's go to an article published by the Christian Science Monitor on July 3rd, where I was one of the new media scholars asked to comment on the revelations regarding Facebook's questionable research.

The article, written by Mark Guarino, is entitled Facebook experiment on users: An ethical breach or business as usual? and it starts with the following blurb: "Many Internet companies collect user data. But privacy experts and Internet users question whether Facebook's 2012 experiment marked a breach of corporate ethics and public trust." And here's how it begins:


It's not yet clear if Facebook broke any laws when it manipulated the news feed content of nearly 700,000 users without their explicit consent to test whether social networks can produce "emotional contagion."

(It turns out, to a modest extent, they can.)

But the uproar after release of the results of this 2012 study is raising new questions on how pervasive such practices are–and the extent to which they mark a breach of corporate ethics.


While it is generally known that Internet companies such as Facebook, Google, Microsoft, Twitter, and Yahoo, claim the right to collect, store, access, and study data on their users, the Facebook experiment appears to be unique.

Unique is a bit of an understatement. Facebook users do willingly provide the social network with a great deal of personal information, while at the same time making that information accessible to others, to Facebook friends of course, with at least some of it often made open to public view, as well as to any third party whose applications users may be willing to approve. That's understood. It is also no secret that Facebook makes money from advertising, and delivers advertising targeted to users based on the personal information we all provide. Nor does anyone try to disguise the fact that Facebook can track when people click on a link, so that advertisers and marketing professionals can know how many people clicked on their advertisement, and of that group, how many actually made a purchase. We may or may not approve of all or some of this, and some of us may not be aware of the extent to which it all works, but none of it is kept hidden from users. As they used to say on The X-Files, the truth is out there.

But this is a horse of another color, and by this I mean both Facebook and the experiment, as the article proceeds to make clear:


Not only is the company the largest social network in the world, the kind of information it accumulates is highly personal, including user preferences spanning politics, culture, sport, sexuality, as well as location, schooling, employment, medical, marriage, and dating history. The social network algorithms are designed to track user behavior in real time – what they click and when.

The Information Commissioner's Office in the United Kingdom announced the launch of an investigation to determine whether Facebook broke data protection laws governed by the European Union. The Federal Trade Commission in the US has not yet said whether it is launching a similar probe or not. On Thursday, the Electronic Privacy Information Center, a civil liberties advocacy group in Washington, filed a formal complaint with the FTC, urging action.

The experiment, conducted over a week in January 2012, targeted 689,003 users who were not notified that their news feed content was being manipulated to assess their moods in real time. The study determined that an increase in positive content led to users posting more positive status updates; an increase in negative content led to more negative posts.

So now that the facts of the matter have been established, it's time to raise the question of ethical conduct, or lack thereof:


What alarmed many Internet activists wasn't the use of metadata for a massive study, but rather the manipulation of data to produce a reaction among users, without their knowledge or consent, which they see as a violation of corporate ethics.


Just to interrupt again for a moment, why specifically activists? Doesn't this pretty much marginalize the concern? We could instead say that this alarmed citizens' groups, which might sound a little better, but still. Doesn't this alarm Facebook users in general? Or, as we used to refer to them, citizens? Just asking, mind you... Okay, back to the article now:

“It’s one thing for a company to conduct experiments to test how well a product works, but Facebook experiments are testing loneliness and family connections, and all sorts of things that are not really directed toward providing their users a better experience,” says James Grimmelmann, a law professor and director of the Intellectual Property Program at the University of Maryland Institute for Advanced Computer Studies in College Park.

“These are the kinds of things that never felt part of the bargain until it was called to their attention. It doesn’t match the ethical trade we felt we had with Facebook,” Professor Grimmelmann says.

Many academics studying tech and online analytics worry about the ethics involving mass data collection. A September 2013 survey by Revolution Analytics, a commercial software provider in Palo Alto, Calif., found that 80 percent of data scientists believe in the need for an ethical framework governing how big data is collected.


So now it's academics and activists; that's a little better, but academics are not exactly part of the mainstream, or part of what Nixon used to call the Silent Majority. Oh well, let's hear what Facebook had to say in response to all this:


Facebook leaders expressed remorse, but they stopped short of apologizing for the experiment, which reports show reflect just a small portion of the studies that the company regularly conducts on its nearly 1 billion users. On Wednesday, Facebook COO Sheryl Sandberg told The Wall Street Journal the study was merely “poorly communicated.... And for that communication, we apologize. We never meant to upset you.”

In response to its critics, Facebook notes that policy agreements with users say that user data can be used for research. However, the term “research” was added in May 2012, four months after the study took place. Others say the complexities of the tests require stricter oversight, now that it is known the company has been conducting hundreds of similar experiments since 2007 without explicitly notifying the public.


Oh yeah, don't forget to read the fine print, right, and be sure to carefully review every update to the policy agreement that comes out. How about a different point of view, one that reflects a little bit of common sense?


“Burying a clause about research in the terms of use is not in any way informed consent," says Jenny Stromer-Galley, an associate professor who studies social media at the School of Information Studies at Syracuse University in New York.

"The issue is that people don’t read terms of use documents, and ethical principles mandate that people involved in basic research must be informed of their rights as a participant,” she adds.


Some say Facebook could have avoided the controversy simply if it had provided more transparency and allowed its users to opt out.

Transparency would be a start, but if they had been open and clear about the experiment, basically they would not have been able to carry it out. It would have been like Stanley Milgram telling his subjects, I only want to see if you'll do what this guy says, even though no one's forcing you, and by the way, the other guy isn't really getting any electric shocks. No doubt, had he done that, the Obedience to Authority experiments would have been just as effective and elucidating (please note I am being sarcastic here, obviously those experiments would have been useless and pointless).

Well now, we come to the end of the article, and guess who gets the last word?


Lance Strate, professor of communications and media studies at Fordham University in New York City, says that the revelations, which are among many such privacy violations for Facebook, suggest social networks have outlived their purpose because they no longer adhere to the Internet values of “openness, honesty, transparency, and free exchange.”

“With this move, Facebook has violated the essential rules of online culture, and now begins to appear as an outsider much like the mass media industries. It is almost impossible to recover from the breaking of such taboos, and the loss of faith on the part of users. Zuckerberg started out as one of us, but now we find that he is one of them,” Professor Strate says.


What do you think? Too melodramatic, too extreme, too much? I felt the situation called for a strong comment, a strong condemnation, and if you think that quote is harsh, here is the entirety of the comment I provided:


With the revelations concerning Facebook's latest social experiment, the social network is now dead. By dead, I mean that it has squandered its most precious resource, its credibility and the trust of its users, and is now an antisocial network. Facebook has previously run into repeated privacy concerns regarding its users, but most users have significantly reduced expectations for their privacy in an online environment. What individuals do not expect is to be manipulated, and in fact when attempts at psychological manipulation are unveiled, they often have a boomerang effect, resulting in individuals doing the opposite of what they are expected to do, thereby rejecting the manipulation, and the manipulators.

There is nothing new about conducting behavioral research on audiences on the part of mass media organizations and the advertising industry, but such studies usually involve subjects who are willing participants in answering surveys, being a part of focus groups, and consenting to be subjects in psychological experiments. Some of the research was necessary simply because mass media had no way to directly measure the size and characteristics of their audience, hence for example the famous Nielsen ratings. But social media made much of that sort of research unnecessary, as it was possible to track exactly how many people signed in to a particular site, clicked on a given advertisement, and subsequently purchased a particular product. It is well known that Facebook delivers ads tailored to different users based on the information they provide, willingly, in their profiles, and that adding various applications, on Facebook and elsewhere, allows other commercial interests to gain access to that same information, information otherwise freely available online anyway. The point is that all this gathering of data is done with the users' consent, but Facebook's manipulation of the users' news streams was carried out without anyone's permission, or awareness. This is where they crossed the line.

Whatever the corporate ethics may be, and some may go so far as to say that the phrase "corporate ethics" is an oxymoron, there is an ethos that has long dominated the internet that emphasizes the values of openness, honesty, transparency, and free exchange. With this move, Facebook has violated the essential rules of online culture, and now begins to appear as an outsider much like the mass media industries. It is almost impossible to recover from the breaking of such taboos, and the loss of faith on the part of users. Zuckerberg started out as one of us, but now we find that he is one of them.

The result may not be the immediate demise of our leading social network, but it is the beginning of the end for Facebook as the dominant force in the online environment. Facebook may hang on as a kind of online registry for folks to find each other, but its other functions are being usurped by a variety of new services, and even that basic function of connecting with others is no longer a monopoly for Facebook. Younger users will be more inclined to leave Facebook altogether than older users, and it may be that many will continue to maintain profiles on the service, use it for messaging perhaps, but will stop checking their news streams.

In the short run, Google has much to gain from Facebook's loss, as users may turn to the Google+ service as a less objectionable alternative to Facebook. Indeed, Google+ has in effect been waiting in the wings, ready to usurp Facebook's place as the dominant social networking site, waiting in other words for Facebook to stumble and fall. And the decline of Facebook seems all but inevitable, based on past experience of the decline and fall of so many other companies that once dominated the new media landscape, from IBM to AOL to Microsoft to Yahoo to MySpace. Ultimately, though, the door is open for newer services to provide alternatives to Facebook based on entirely new models of connection, perhaps in a more complex mode involving many specialized sub-networks. By the end of the decade, some new service that is only getting started just now may well be the dominant force in online communications.


So, agree or disagree? Either way, you should know that this is not an isolated incident. Just recently, on July 29th, TechNewsWorld ran an article on a similar incident involving an online dating site: OkCupid's Confessed Hijinks Get Thumbs-Down. In it, my colleague and friend Paul Levinson had this to say:


"I think use of customers and users without their consent in experiments, for any reason, is unethical and outrageous," said Paul Levinson, professor of communications and media studies at Fordham University.

"People have a right to know if they're participating in an experiment—if their information is being deliberately controlled and manipulated to test a theory, if their activity is monitored beyond just general statistics of how often they log on, etc. The surest to get people angry and wanting to have nothing to do with social media is to make them unwitting guinea pigs," he told TechNewsWorld.


Paul and I often differ in our views on new media, and technology in general, but on this we are very much in agreement. To borrow a phrase from Jorge Luis Borges, the internet may be a garden of forking paths, but it is not a maze (although it has much to do with our amazing ourselves to death), and we are not lab rats trying to run through it as part of some social media experiment. A warning to all would-be web-experimenters out there: watch out, because these lab rats bite. In a big way, like megabite and gigabite. Do not count on us being obedient to your online authority. Experiments have a funny way of backfiring on their experimenters. Just ask Dr. Frankenstein.

Tuesday, July 29, 2014

The Future of Facebook?

So back in March, I received a request from a Columbia University journalism student, Anne Bompart, to comment on reports of the unexaggerated death of Facebook coming about in the near future, or at least a drastic decline in the social networking site's popularity. The article was published, ironically enough, in one of the last print issues of the Ivy League school's student paper, the Columbia Spectator, under the title Interfacing the Future, with the subtitle, Is the social media giant's monopoly in danger? It was dated March 27th, and here's how the article began:


With the advent of Facebook, other social networking websites have become obsolete. Myspace? So yesterday. Even new ones that have attempted to launch during Facebook’s reign, like Google Plus, don’t measure up. It seems apparent that social networking websites have a limited lifespan—they’re born, are popular for a few years, and then die.

Is Facebook destined to suffer the same fate? A Forbes article by Gene Marks in 2013 discussed Facebook’s diminishing appeal to teenagers. And a January 2014 study by Princeton researchers John Cannarella and Joshua A. Spechler concurred: “Facebook will … lose 80 percent of its peak user base between 2015 and 2017.”

Yet, Facebook’s recent purchases of WhatsApp and Instagram, along with its acquisition of several other popular apps, suggest otherwise.

So what do I think of all this, you may want to know, or even if you don't, here goes:


Lance Strate, professor of communication and media studies at Fordham University, says, “I believe that Facebook would like to be the main platform through which most people access the Internet.”

Clearly, Facebook is hinting at its conquest in monopolizing the social networking industry by manipulating its users to share everything through a single platform. Strate adds, “Facebook’s purchases of Instagram, WhatsApp, etc. is not just about an ambition to rule the online world, so to speak, but also motivated by fear of losing the primary position they hold at this moment in time.”

Should Facebook become the “home page” of the Internet, it would definitely guarantee its success—not only through popularity, but also monetarily. As Strate describes, “By providing a universal interface, Facebook would be guaranteed a massive and reliable amount of advertising revenue.”

Okay, how about some facts to go with my opinion?

According to a 2014 New York Times article, Facebook’s total revenue in its last quarter was $2.59 billion. Mobile advertising generated more than half, while three-quarters of its 757 million users logged on using mobile devices. Facebook’s acquisitions of popular apps, then, are its attempts at garnering interest among teens, whose usage generates the most revenue.

Also interesting is Facebook’s addition of 50 new gender options in February. In addition to “male” and “female,” users can now choose a “custom” option that allows them to choose identifications like “transgender,” “cisgender,” “intersex,” and more.

By increasing the identities it encompasses, Facebook hopes to increase and satisfy its target audience. Its newest addition establishes itself as a progressive venture—its message is that it is the only social networking site relevant to millennials.

Facebook’s appeal to youth may be a response to the 2013 Forbes article’s prediction of its eventual decline. The article claims that the popularity of websites like Tumblr shows that people are moving to networks where they can use pseudonyms and avatars instead of real names and faces.

However, despite Facebook’s lack of anonymity, it has striven to allow self-expression as much as possible. In 2013, it tweaked privacy settings so that users ages 13 to 17 would be able to share posts publicly. Its addition of a mini-newsfeed provided another medium for users to constantly keep up with what their friends are doing.

And back to me for some thoughts on what this all means:

As Strate explains, “Apart from trying to remain cool and trendy, adding gender options also provides Facebook with additional data that can be used for advertising and marketing purposes.” By giving its users another way to identify themselves, Facebook can better provide users with ads that cater to their interests—resulting in a more personalized experience.

And Facebook appears to be taking preemptive steps to ensure its success. It constantly updates its aesthetic, the most notable being its introduction of the timeline in 2011. This constant revolutionizing creates the idea that Facebook is evolving alongside its users. Strate says, “[Mark] Zuckerberg and his colleagues are all too well aware of the volatility of a dynamic, rapidly evolving media environment. They are desperately trying to avoid having some other service do to them what they did to Myspace.”

And at this point, Bompart poses the big question:

But is such an environment sustainable? Online social networking is a relatively new phenomenon; the first online information-sharing board launched in 1978. Social media has developed rapidly and gained much popularity since, and there appears to be a viable future for the industry.

After all, social networking did exist before the Internet. As Laudone says, “The global reach of social networking sites, like Facebook and Twitter, are enabling connections across borders. Social networking sites—and technology more generally—will help facilitate face-to-face social networking.”

But social media’s volatility could mean that Facebook’s eventual demise is inevitable. Despite its attempts to advertise its modernity, a day might come when it cannot keep up with trends.

And let's hear from another scholar now:

Thomas DiPrete, professor of sociology at Columbia, adds, “Facebook’s fear is that some new company might emerge that offers features that lure enough users away from their platform to create a bandwagon effect. So they not only try to improve their own system, but they look to buy out potential competitors.”

More likely, Facebook might one day be unable to purchase most popular apps. DiPrete concurs, “Eventually some new company will avoid being bought out when small, and will emerge as an attractive alternative to Facebook.”

So, what's the conclusion?

Even so, the number of Facebook users increased by 22 percent from 2012 to 2013. Today, there are 1.3 billion monthly active Facebook users, of whom 48 percent log in on any given day. Every 20 minutes on Facebook, there are 1 million links shared, 2 million friends requested, and 3 million messages sent.

With numbers like that, it’s easy to see why Facebook is the social media industry’s behemoth. And much to the dismay of procrastinators pulling all-nighters in Butler, Facebook is not going away anytime soon.

And now, to round things off, contextualize my quotes, and clarify the process by which quotes are extracted from comments, here are my original set of answers to four questions posed to me about the future of Facebook:

1. I believe that Facebook would like to be the main platform through which most people access the internet. By providing a universal interface, they would be guaranteed a massive and reliable amount of advertising revenue. But their purchases of Instagram, WhatsApp, etc., are not just about an ambition to rule the online world, so to speak, but also motivated by fear of losing the primary position they hold at this moment in time. Zuckerberg and his colleagues are all too well aware of the history of new media, and how companies such as AOL, Yahoo, IBM, and Microsoft all appeared to dominate the industry at one time, only to be left behind due to innovation and the volatility of a dynamic, rapidly evolving media environment. In particular, they are aware of how quickly they passed by MySpace, once the major force in social media, and they are desperately trying to stay on top of the changes going on, and avoid having some other service do to them what they did to MySpace.

2. Adding new options to the category of gender, and elsewhere, is an attempt to appeal to users, and particularly necessary because Facebook offers a very structured environment, which means that it constantly has to be updated or become outmoded. But apart from trying to remain cool and trendy, adding options also provides them with additional data that can be used for advertising and marketing purposes.

3. I agree that it will one day be obsolete. As I mentioned, this is a volatile industry, and we've seen it happen to others, such as AOL, Yahoo, and MySpace. In particular, younger individuals want to find a space of their own, separate from their parents, and the more that Facebook becomes known as an advertising medium, and as one that violates privacy, the more that individuals will sign off from the service. But the main thing is that innovation will eventually result in something else that will take Facebook's place, maybe a service that better utilizes mobile technology.

4. Social networking existed before we became technologically interconnected, simply in the form of personal connections, and technology has only intensified the fact. We are a social species, so social networking as a phenomenon will continue via whatever technologies are made available to us. McLuhan famously spoke of a global village, and new media are fully realizing that concept, for good and for ill, in increasingly more intense ways. There is no question that the degrees of separation within the world's population are decreasing dramatically through our technologies, and that the degrees of connection are increasing every day.


And there you have it. I can't say when we all will start forgetting about Facebook, but I can say that there are quite a few folks out there who would say that it can't happen soon enough. Maybe we need to start a pool and take bets on when it'll happen? I call dibs on 2018...


Monday, July 28, 2014

Addiction as Faulty Metaphor

So, a few weeks ago I participated in a discussion over on the Media Ecology Association discussion list on the topic of media addiction. I normally don't get involved in exchanges on this subject, but another participant on the MEA list, Kent Walker, questioned the validity of referring to habitual media use as a form of addiction, so I decided to weigh in with my 2¢ on what might be considered a pet peeve of mine.

I do want to be clear that I understand that some folks are very involved and committed to the idea of media addiction, and if they want to use that sort of language, they are free to do so. I am not condemning it. But I am questioning it. I think some people may have felt threatened by me doing so, but that is the whole point of critical inquiry, isn't it?

Anyway, I think my comments on the discussion list were substantive enough to share here on Blog Time Passing, and I hope you agree, or at least will hear me out on why I think the current broadening of the term addiction is problematic.

Here is my first set of comments:


I think it may have been in a junior high school class in what was called "Hygiene" back circa 1970 that I first learned the medical meaning of "addiction" as referring to a substance that causes a physiological dependency in the user. Drugs that were categorized as addictive included alcohol, tobacco, opium/heroin, and barbiturates, while drugs like marijuana, LSD, mescaline, and amphetamines were categorized as non-addictive, but habit-forming. This came as part of a new effort at drug education, in response to the counterculture's embrace of illicit drugs, and the same distinctions were made when I was an undergraduate later on in the 70s, when I was taking a class in therapy and counseling and did some volunteer work for a drug counseling center.

As a former addict myself, in my case to tobacco, although cigarette smokers only occasionally referred to themselves as nicotine addicts, I can attest to the fact that there is a world of difference between substance addiction and habitual use of non-addictive drugs, or media, or any other sort of activity for that matter. I've known a few alcoholics as well, and that form of physical addiction seems even more intense, and it is well known that heroin addicts who go cold turkey rather than easing off of the drug can endanger their health, and even risk their lives.

This is why I personally do not support the current usage of addiction to apply to anything that is habit-forming. I know there are neurological explanations involving the brain releasing endorphins, but I just don't see that as comparable, and I do think the broader use of the term confuses an important distinction, and condition.

I suppose it could be argued that "media addiction" is a metaphor, like "media ecology" which of course I embrace. But not all metaphors are equally appropriate. Ecology can be understood as being about how organisms relate to their environments, and as such need not be confined to biology. Many of us in media ecology object to the use of literacy as a metaphor in "media literacy" because it ignores the distinction between the written word and other forms of communication. On the other hand, while I would prefer "media education", I can accept the usage of "media literacy" because the metaphor generally does not lead people to confuse television with books. And I don't go around objecting to folks who use the metaphor of "media addiction" because there is value in looking at our media use as habit-forming, creating media dependency, and generating withdrawal symptoms at times when people try to or are forced to go without.

But I don't use the metaphor myself, and I do think there is a problem in placing alcohol abuse in the same category as constantly checking your Facebook and Twitter feeds or playing games on your cellphone. When it comes to physical substance addiction, I think there's a difference there that makes a world of difference.

By the way, another point I should have made is that in addition to being a nicotine addict who has not had a cigarette in two decades, I also have the caffeine habit, to the point where I get a headache if I don't have at least one cup of coffee in the morning. But based on my firsthand experience, it is clear to me that there is a world of difference between the yearning for my morning cup o' joe, however strong it may be, and what I used to experience when going too long without a cig—what we referred to as a nic-fit.


Anyway, my post was troubling to some folks, and one response came from my old friend, Marty Friedman, who noted that there has been research done in this area that led to the changing definitions of addiction among professional therapists, as reflected in the Fifth Edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), released by the American Psychiatric Association in 2013. So here was my response:


I know that psychologists change their views over time, and continue to do so, but that doesn't mean that the current view is correct, and there sometimes are political or social reasons that influence their "scientific" conclusions. The key distinction that they are overlooking, perhaps because they are psychologists rather than physicians, is that physical addiction is not just about psychological dependency or even neurological symptoms, but about actual change to the body, on a cellular level.

Now, people can use the term "addiction" to mean something other than physical addiction, but I am suggesting that that is best understood as a metaphor rather than a variation on the same phenomenon, and that it is an example of what Neil Postman referred to as the great symbol drain and the demeaning of meaning. And I think he would suggest that maybe we need different words for addiction that is physiological in nature, and the psychological sense of feeling as if you were addicted to some activity.

There is also the question of how far do we go in using scientistic terminology to talk about human behavior. We may not always want to frame behavior in terms of morality or ethics, but is every dysfunctional or negative behavior a syndrome or malady of some sort?

And I think there is definitely room for a media ecological critique of the tendency to frame behavioral problems as "sicknesses" in need of "treatment" or "therapy" of some sort. This comes out in some follow-up comments I made:


The value in looking at the broadening of the term "addiction" as being metaphorical is that it leads us to ask what is the purpose of the metaphor, what are the similarities, and the differences?

Referring to a habitual activity, be it gambling, sex, media use, or the use of substances that are not physically addicting as an "addiction" takes the activity outside of the individual's locus of control. This does reduce or eliminate personal responsibility for the behavior, which disallows any evaluation based on morality or ethics. This is important, given the long history of moral condemnation of behaviors that individuals have little or no control over, but leaves no room for any philosophical or spiritual views. It also undercuts the degree to which individuals can exercise control over their own behavior, and defines the problem as a medical condition, which requires the services of a professional specializing in the disorder. Of course this serves the interests of the psychotherapeutic profession, which is not to deny that there are many instances where therapy can be helpful, and at times necessary (and the same is true of pharmaceuticals). But this does fall into a kind of technical thinking, as in Neil Postman's technopoly and Jacques Ellul's la technique.

We know that some individuals exhibit Obsessive-Compulsive Disorder, and that many have this syndrome to a greater or lesser degree. And yet we don't use the metaphor of addiction for behaviors associated with OCD. We don't say, for example, that someone is addicted to washing his or her hands over and over again. OCD is the other extreme where we see the problem lying in the mind or as a neurological disorder, and not in the habitual activity. Does the metaphor of addiction simply point to the tendency that exists in human beings (and other species) to a greater or lesser degree to engage in repetitive behaviors? (Aside from OCD, repetitive behavior is also a characteristic associated with autism.) What is the difference between ritualistic behavior and addiction?

Considering addiction as a metaphor, it can be instructive to consider what habits are not labeled addictions. Are we addicted to showers if we take one every day? To brushing our teeth if we do so after every meal? Is there such a thing as being addicted to reading? If reading is not an addiction, can you look at anything with writing on it, a sign, a newspaper, book, flyer, poster, etc., look right at it, and not read what it says?

What I am trying to point out here is that we need different terms for different phenomena, and that the reification of metaphors can be the cause of confusion.

Addiction, even in the broad sense in which it is defined by the American Psychiatric Association, is an individual condition, psychology being about the individual mind, rather than the collective culture and society. But my friend, Eric McLuhan, got into the discussion to point out that we can also refer to an entire society as being addicted, say to television, or the internet, cell phones, or other technologies such as the automobile. Here is my response:


We mainly speak of addiction on the individual level, whether it's addiction to physical substances or addiction to certain activities. We might speak of addiction in a collective sense to talk about how large numbers of people were forced or encouraged to become physically addicted, for example that the British got China hooked on opium. But we still are talking about individual addiction, just that it's happening on a large scale.

But now, is it apt to say that, as a society, the United States, for example, is addicted to television, computers, cell phones, etc.? I certainly would argue that as complex systems, contemporary societies are dependent on various technologies for their existence, and would not be able to function without them. But to use the term addiction in this regard strikes me as even more of a metaphor than to use it to refer to individuals engaged in habitual or obsessive behaviors.

To give one example, it's been said that we are addicted to petroleum, and that is a powerful way to describe our dependency on that source of energy. But if we suddenly ran out of oil, and gasoline, and had no immediate substitute for it, the result would be more than just withdrawal symptoms, as the loss of trucking would mean that all of us living in major cities would run out of food very quickly. If roads are our arteries, and trucks are the cells carrying nutrients, then aren't they intrinsic to the social system (as a kind of organism), rather than acting as a foreign substance altering us collectively? If language is inherent in our species, then are the new languages that evolve to be considered a foreign substance or a natural development?

If we employ the metaphor, then we might make a distinction between dependencies due to addiction, and dependencies due to necessity, the distinction between say alcoholism and needing water to survive. This is the territory Innis was scouting out.

Anyway, what troubles me is not the use of the metaphor, but the loss of distinction between addictive substances on the one hand, and other forms of dependency, obsessive-compulsive behavior, and ritual and habitual behavior.


Following some further discussion of the subject on the list, I decided to post some additional thoughts:


a few more comments on the subject...

There has been a good amount of criticism about the possibility that children are being over-diagnosed as having ADD and ADHD. While there are cases where there is a genuine neurological problem that can be alleviated through appropriate medication, the concern is that anytime students exhibit any kind of behavioral or learning problems in school, they are given a medical diagnosis and prescribed drugs as treatment. In other words, the problem is that a medical framework is being extended inappropriately to areas where it doesn't belong.

I think it's reasonable to ask whether the same is occurring with addiction, which was earlier understood to be a physiological, and therefore medical problem. This sort of questioning is in the tradition of Neil Postman and especially Ivan Illich, not to mention Thomas Szasz. And again, the big problem has to do with clinical diagnosis, rather than the use of metaphor.

Also, in teaching about new media, I tell students about the famous early case involving a virtual community dealing with unethical behavior, as written up by Julian Dibbell under the title of A Rape in Cyberspace. And one question I ask is whether the term "rape" is appropriate for the kind of virtual act that occurred, or whether this usage discounts the seriousness of the actual, real-world crime. I think the same question can be asked about virtual addiction, given the seriousness of actual physical addiction. Even when used as a metaphor, words have power to shape our understanding and our responses, and overuse and misuse can result in the demeaning of meaning, to use Postman's phrase.

And I will say in all seriousness that I was a heavy smoker for two decades, averaging 2-4 packs a day, and in that time I know I did some damage to my body that was irreversible. I'll also point out that, as cigarette smokers, Neil Postman and Christine Nystrom both died of lung cancer, and James Carey of emphysema. And I myself found it very difficult to quit, impossible to just go cold turkey, and only was able to stop smoking by being weaned off of nicotine via the patch. I have gotten hooked on all kinds of other activities, playing computer games all night, compulsively checking Twitter messages on my phone, etc., but nothing compares to what I went through trying to quit smoking. So from my personal experience, addiction represents a special and distinct category.

I also find it significant that recovered alcoholics continue to say that they are alcoholics, and always will be, and can never go back to having an occasional drink now and then. That need for absolute abstinence is not comparable to what may be termed sex addiction, or gambling addiction, or media addiction.

Now for something on the lighter side:

I am addicted to the English language. I can't help myself, I can't stop myself from using it. I think about it night and day, I can't get it out of my head. It's there even when I sleep. It affects my thinking, my emotions, my behavior, altering my very view of reality. And the addiction has harmful effects, in leading me to expect the world to be relatively static rather than dynamic, filled with things rather than events and processes, filled with isolated phenomena rather than a dense network of relationships, etc. There have been efforts to help people like me break this addiction, from Alfred Korzybski's general semantics to various forms of meditation and mysticism, but time and time again addicts like me find ourselves getting another fix, often without even realizing what we're doing. I know some use a methadone-like treatment, turning to immersion in a different language to break free of the hold that English has on them, but then they just find themselves addicted to that other language. As far as I know, the only known cure for language addiction requires direct action to remove or disable sections of the brain.

I'll stop now, lest someone accuse me of being addicted to this topic...


Now, in response to some criticism arguing for the extended use of addiction, here is the first part of what I had to say:


I don't think that the treatment for sex addiction requires lifelong celibacy, does it? I think there is a distinction to be made between addictions where the only cure or form of recovery involves complete abstinence, and other behavioral problems where moderation is sufficient. Is the solution to "internet addiction" to never go online and never use email? Does a recovering "news junkie" need to avoid newspapers and news broadcasts altogether? Is the answer to media addiction to completely cut media out of the individual's life, whatever that might mean?

I thought I was pretty clear on the fact that I am not denying that problems exist regarding habitual activity, compulsive behavior, and dependencies. These are very real and very serious problems, individually and collectively. I'm just questioning the use of the specific term "addiction" and asking if it's appropriate. I know that some people are especially invested in that metaphor, and I do agree that the metaphor refers to actual psychological and social problems. My concern is over precision in language, and the question of whether to frame the problems in medical terms, which would suggest they require clinical treatment, as opposed to alternate framings that allow the problems to be approached through education, for example.

Before continuing, let me note that a couple of folks on the list pointed to the etymology of the word addiction, which is interesting in that it is based on the root term, diction, implying that it has something to do with language and communication. So here is my response to that:


I'm all for using etymology to understand concepts in instances where we are dealing with commonly used words, words that have vague or fuzzy definitions, etc. But in this case, the issue is not the root meaning of the word, but rather its operational definition. The term "addiction" has very specific clinical and medical definitions, and it is fair to ask whether the definitions being used are appropriate or useful, just as we may ask the same for the clinical definition of "deviance", for example, or "insanity". The etymology of the term "malaria" may be of some interest to historians of science, but it does not help us in understanding what the term refers to in current medical usage, and it would be absurd to argue that, given its root meaning of bad air, it should also be applied to diseases brought on by air pollution, or mustard gas.

I do hope, in raising these questions, I am not coming across as addictatorial...


And that is pretty much the sum of the points I made in the discussion, which I hope have been of some interest and utility to you, dear reader. But as a bit of an epilogue, let me note that there was one more email I sent to the list on the topic, which began with a brief personal response to another list member that isn't relevant here, after which I added the following (true story!):


Now, I just opened a fortune cookie, and the fortune reads: "We first make our habits, and then our habits make us."

Coincidence? I think not...

As it turns out, that fortune is an aphorism that comes to us from a western source, the 17th century English poet, John Dryden, although some mistakenly attribute it to Charles C. Noble. This brings to mind my 2011 post about Neil Postman's quote, Children are the Living Messages We Send to a Time We Will Not See.



Anyway, maybe some folks are addicted to using the term addiction, but as to how the word will be used in the future, far be it from me to venture any prediction.





Sunday, July 27, 2014

Journeys (A Sermon)

Serving once again, on July 25th, as lay leader for Friday evening Sabbath services at my Reform Jewish temple, Congregation Adas Emuno of Leonia, New Jersey, I based my sermon or D'var Torah on the weekly Torah portion or parsha, Massei (Numbers 33:1-36:13), which means Journeys. It followed up, in part, on my D'var from last week, posted here under the title, My Sermon on Torah, Tribes, and Tribalism. And once again, I posted this week's sermon on the Adas Emuno congregational blog, under the title of Journeys, but also want to share it here on my own blog:


Parsha Massei



This week's Torah portion is called Massei, which means Journeys. It's the final parsha in the Book of Numbers, the fourth book of the Torah, whose Hebrew name is Bamidbar, meaning, In the Desert. And the parsha begins by saying, "These are the journeys of the children of Israel who left the land of Egypt in their legions, under the charge of Moses and Aaron" (Numbers 33:1). And much of the portion is devoted to a summary of their journey, from the liberation from slavery and exodus from Egypt, through the long years of traveling through the Sinai desert, to the east bank of the Jordan River, on the border of the Promised Land. This is where the journey ends for Moses, and this is where the journey ends in the Torah. The next and last book, the Book of Deuteronomy, relates the final words of Moses to the Israelites, and ends with the passing of the greatest of our prophets, which occurs before the Israelites cross over into the Promised Land. It is not until the sixth book of our Holy Scriptures, the Book of Joshua, that the Israelites actually enter and take possession of the land, which is where we find the famous story of how the blowing of the shofars brought down the walls of the city of Jericho.

But this week's parsha looks ahead to the return of the Israelites to Canaan, and speaks of how the Promised Land should be divided up, detailing the different areas that will be given to each of the twelve tribes, and what their boundaries will be. And it lists the names of the chieftains of each of the twelve tribes, along with Joshua as the successor to Moses, and Eleazar the priest as the successor to Aaron. In last week's D'var Torah, I talked about the tribal roots of the Jewish people, and how the Torah and Tanach tell the story of the difficult transition from tribalism to civilization. And I talked about how the Semitic aleph-bet and literacy were central to this transition, in establishing the Torah as a sacred text, in providing the first written history to take the place of myth and legend, and in providing the first true system of codified law, ethics, and human rights.

Parsha Massei concludes with two examples of this transition, in both cases providing progressive responses to tribal realities. One of them follows up on an earlier passage in the Book of Numbers (27: 1-11) that tells the story of how Zelophehad, of the tribe of Manasseh, died leaving behind five daughters, but no sons. His daughters argued that, in the absence of a male heir, they should have the right to inherit their father's property. They made their case before Moses, the high priest Eleazar, the twelve chieftains, and the entire assembly gathered in the Tent of Meeting. And God tells Moses that their plea is just, and establishes a new ruling that daughters can inherit property when there are no sons. It was a small step for women's rights, but it was progress, without a doubt. And it also demonstrated a willingness to break from established tribal traditions, to replace adherence to longstanding customs with a legal system where cases can be decided on rational grounds, and traditions can be reviewed objectively, criticized, and modified, or even abandoned.

In this week's Torah portion, the decision in favor of the daughters of Zelophehad is appealed by the chieftain of the tribe of Manasseh, who argues that if the daughters marry men who are members of other Israelite tribes, then their lands would go to the other tribes, and no longer be a part of the region allotted to the Manasseh tribe. Here we see the continued force of tribalism, and the lack of complete unity among the Israelite tribes. Again, Moses consults with God, and what is especially significant here is that the verdict that was made was not to reverse the ruling regarding inheritance, not to revert to the old ways, but to find a new compromise within the realities of tribal life. And that compromise was that the daughters of Zelophehad could marry whomever they please, itself a progressive notion, but they can only marry members of their father's tribe. And Moses goes on to say,


Thus, the inheritance of the children of Israel will not be transferred from tribe to tribe, for each person from the children of Israel will remain attached to the inheritance of his father's tribe. Every daughter from the tribes of the children of Israel who inherits property, shall marry a member of her father's tribe, so each one of the children of Israel shall inherit the property of his forefathers. And no inheritance will be transferred from one tribe to another tribe, for each person of the tribes of the children of Israel shall remain attached to his own inheritance. (Numbers 36: 7-9)


In this way, Moses establishes a new, general rule, based on this one specific case, moving from the concrete to the abstract. As for the daughters of Zelophehad, they found this to be a perfectly acceptable resolution. In all likelihood, they would have married members of their own tribe anyway.

The other example of the transition from tribalism to civilization in Parsha Massei is God's directive that the children of Israel establish six cities of refuge in the Promised Land. And it is important to recall that at this time, there are no police officers, no criminal justice system, no courts as we understand them. It was accepted as common sense that, if one person kills another, then relatives of the victim are justified in seeking vengeance. Therefore, the killer may be pursued by what the Torah refers to as a blood avenger. This is what the Italians refer to as a vendetta, a word that was adopted in the English language in the 19th century. A vendetta can refer to the single act of vengeance, but also to the blood feud that ensues when one act of vengeance is followed by another act of retaliation in a series of exchanges that can go on indefinitely, and may escalate in intensity. In the United States, the most famous example of this is the 19th century feud between the Hatfields and the McCoys in West Virginia and Kentucky, following the Civil War.

In an attempt to avoid this kind of destructive behavior, the Torah establishes a clear distinction between killing someone intentionally and killing someone by accident, the distinction that today we refer to as the difference between murder and manslaughter. If the victim was killed intentionally, or otherwise out of malice, the Torah says that the blood avenger is permitted to kill the murderer. If the avenger is not a firsthand witness to the murder, he can still exact his vengeance based on the testimony of witnesses, and the use of the plural here is significant, because the Torah also insists that, "a single witness may not testify against a person so that he should die" (Numbers 35:30). This does not meet contemporary standards, of course, but for its time, it is progressive in establishing that there is a burden of proof that must be met before someone is condemned as a murderer. But the same portion also insists that a murderer's life cannot be ransomed, that the murderer cannot buy his way out of the death penalty, a harsh rule, but one that ensures equality before the law, for rich and poor alike.

A blood avenger does not necessarily distinguish between murder and manslaughter, and it is understood that acts of vengeance are driven by emotion. And following the old traditions of tribal life, a blood avenger may still pursue someone who has killed someone unintentionally, perhaps not believing it was an accident, or maybe not caring about the killer's motivation. We recognize today that manslaughter is in fact a crime, that someone who is guilty of manslaughter may be innocent of murder, but is not entirely innocent altogether. Likewise, in our tribal tradition, the blood avenger is still permitted to seek vengeance. But the killer can flee to one of the six cities of refuge, and ask for asylum. It is then up to the community to judge between the blood avenger and the killer, and if they decide that the death was accidental, then the culprit can be granted sanctuary within the city of refuge. If he steps outside of the city limits, the blood avenger is permitted to exact his vengeance, but as long as he stays inside of the city, he is safe. This amounts to a form of exile and imprisonment, although it is not necessarily a life sentence, as the Torah stipulates that after the High Priest dies,
killers guilty of manslaughter are free to leave and return home, and acts of revenge against them are no longer permitted.

We therefore have a new set of laws that break with tradition, and are therefore progressive. They are a new set of laws that establish a clear concept of justice, tempered by mercy. And they are laws that are conveyed as general rules, based on abstract principles, the product of a new kind of mindset based on literacy, as opposed to nonliterate traditions where judgment is based on aphorisms, parables, and other types of storytelling. By way of contrast, rather than using abstract codes of law, traditional, tribal cultures would refer to a story like the account in Genesis of Cain and Abel, and ask whether or not the killer in question is guilty of the same kind of act as Cain was. This is akin to arguing a case based on precedent, a type of legal argument that is used here in the United States, and in other nations that use a common law legal system. Legal systems based on civil law are more prevalent worldwide, however, and in such systems only the written law, the abstract rule, is considered, and not the concrete examples of previous cases and judgments. Civil law is also known as Continental European Law, while our system of common law is based on the British system. And while it allows for the use of precedent, the legal cases are still tried based on an established written code consisting of general rules, that is, codified law.

I think we can find in Parsha Massei an echo of the story of Cain and Abel in Genesis, when God says, to Cain, "What have you done? The voice of your brother's blood cries to me from the ground!" (4:10). And we can see how this is stated in a highly abstract form within the Ten Commandments, the Sixth Commandment stating, "You shall not murder" (Exodus 20:13; Deuteronomy 5:17). The more common translation, "Thou shalt not kill," not only omits the distinction between murder and manslaughter that this week's Torah portion clarifies, but also would be impossible to obey unless we starved to death. Moreover, in the Book of Leviticus, in what is known as the Holiness Code, we have the commandment, "you shall not stand idly by the blood of your neighbor" (19:16), and it also says, "you shall not hate your brother in your heart" (19:17) and "you shall not take vengeance, not bear any grudge against the children of your people, but you shall love your neighbor as yourself" (19:18). And so it is in this week's parsha that God says:


And you shall not corrupt the land in which you live, for the blood corrupts the land, and the blood which is shed in the land cannot be atoned for except through the blood of the one who shed it. And you shall not defile the land where you reside, in which I dwell, for I am the Lord Who dwells among the children of Israel. (Numbers 35: 33-34)


What is striking about this is the way that the Torah breaks away from tribalism, in refusing to glorify violence. Tribal societies often view violence as a routine part of life, as natural and necessary, if not cause for celebration. It is not uncommon to find tribal societies glorifying warfare, physical combat, and hunting. Puberty rites for young males typically involved some form of physical violence, and taking part in fighting and killing had a strong association with masculinity. But the written law delivered to the Israelite tribes commanded them that the spilling of blood was abhorrent, especially in the ritual of human sacrifice practiced by many other tribes in the region. The practice of child sacrifice in particular, and human sacrifice in general, is condemned in the strongest possible terms in our Torah and Holy Scriptures.

There is a difference, of course, between not glorifying violence, and practicing nonviolence. The Torah does not tell us to be pacifists, and recognizes that there are times when violence is necessary, to stand up for our rights, and to protect each other. In the same way, the Torah tells us that vengeance is wrong, but this does not mean that heinous crimes can be or ought to be forgiven. Rather, the call is for justice, tempered with mercy, but justice as a rational evaluation based on rule by law, rather than emotional acts of vendetta. And the justice of the ancient world may seem quite harsh to our contemporary sensibilities, but it was a concept of justice that could be modified over time, changing to meet changing circumstances.

Over time, we would adopt a new kind of rite of passage for young males coming of age, one that replaced violent activity with a literacy test. I'm referring of course to the b'nai mitzvah. With the story of the binding of Isaac, the practice of human sacrifice was replaced by animal sacrifice, and with the destruction of the Temple in Jerusalem, we replaced animal sacrifice with prayer. And in possession of the sacred text of the Torah, we embraced study as a way of life. And especially in exile, living as strangers in strange lands, nonviolence was often the only option. This is not to say that we never fought back in the face of the many forms of tribalism we encountered, but it certainly was not easy being an oppressed and persecuted minority.

I recently read a book by the historian Elizabeth Eisenstein about how the invention of printing was viewed in Europe and America, and I found what she had to say about the Nazis in Germany rather striking:


Antisemitic stereotypes attributed a soft, flabby, and sedentary lifestyle to the bookish Jew in contrast to the masculine, muscular Aryan. Observers in 1933 witnessed the book-burnings of works by Jews and other "decadent" authors, along with the elimination of the same works from libraries and bookshops. The elimination of Jewish books served as a prelude to measures in the next decade aimed at eliminating the Jews themselves.


The bookish stereotype has been dispelled, to a large degree, through the founding of the State of Israel in 1948, and the fact that the Jewish state was able to defend itself, to resist the combined armed forces of several Arab nations, and to organize the Israel Defense Forces as one of the most effective military units in the world. But in taking on the task of building our own modern nation-state, and defending it, we find ourselves once again wrestling with tribalism, both externally and internally. How are we to seek justice, and not give in to the desire for vengeance? How are we to temper the desire for justice with a sense of mercy? How are we to stand up for ourselves without glorifying violence? And how are we to defend ourselves without causing harm to others who are innocent of any wrongdoing? The answers do not come easily, but they will never come at all if we do not begin by posing the questions.

In Parsha Massei, after the summary of the journey through the wilderness, there comes a passage that resonates uncomfortably with current events:


The Lord spoke to Moses in the plains of Moab by the Jordan at Jericho, saying: Speak to the children of Israel and say to them: When you cross the Jordan into the land of Canaan, you shall drive out all the inhabitants of the land from before you, destroy all their temples, destroy their molten idols, and demolish their high places. You shall clear out the Land and settle in it, for I have given you the Land to occupy it. (Numbers 33: 50-53).


And we have to remember that this was common practice throughout the ancient world and the Middle Ages, and continued into modern times. This is the way that the European settlers handled Native Americans, and this is the way that conquest and border changes were handled in Europe and Asia throughout the 20th century. The State of Israel was unique in not, for the most part, driving the Arabs out of the land, not during the War of Independence, and not after occupying Egypt's Sinai Peninsula and Gaza Strip and Jordan's West Bank and East Jerusalem following the Six Day War in 1967. If they had done what just about every other nation has done, and what the Torah says the Israelite tribes did in the ancient world, things would be entirely different today. With that in mind, the passage that comes next in this week's parsha is even more disturbing, as it has God continuing to say to Moses the following:


But if you do not drive out the inhabitants of the Land from before you, then those whom you leave over will be as spikes in your eyes and thorns in your sides, and they will harass you in the land in which you settle. And it will be that what I had intended to do to them, I will do to you. (Numbers 33: 55-56).


Jews all over the world are taking note of these verses in light of the violence and bloodshed in Israel, Gaza, and the West Bank today. And I think we have to understand that in the long journey we have taken from tribalism to civilization, we could no longer follow such a course of action. Over the course of that journey, we have come to be guided by the great sage Hillel, whose most memorable saying can be translated as, do not do to others what you would not have them do to you, or as that which is hateful to you, do not do to others. And no one is claiming that the Jewish people or the State of Israel is perfect, but civilization is not about achieving some form of utopia, it's about establishing a way of life that is not built on violence or vengeance, but on justice and mercy.

Tribes cling to a way of life, and refuse to change. Anthropologists tell the story of the People of the Deer, a small Inuit tribe in the Arctic region of Canada. As their name implies, the People of the Deer survived by hunting caribou. Every year, the herds would migrate through the tribe's territory, and the tribe would hunt them, and obtain enough meat to survive through the winter. This was their way of life from time immemorial. But one year the unthinkable happened. The herds were small, and the tribe did not get enough meat to last through the winter. This story is often told to introductory anthropology classes, and the question is then put to students: What do you think the tribe did at this point? The typical answers include moving to another location, trying to follow the herd after it left the territory, rationing out the supplies, sending the old people off to die or killing or exiling some members of the tribe through some other means of selection, and even trying to signal or search for some form of outside help. The one thing that almost no one ever thinks of is the one thing that the tribe did do. Which is nothing. They did nothing, because they could not conceive of doing things in any way differently from the way that they had always done things. And so, they died.

The lesson can be taken in different ways. For anthropology students, it brings home the fact of our cultural bias as Westerners, that whenever a problem appears, we believe that some sort of action has to be taken. Indeed, we demand that someone do something about it. But sometimes there are no solutions, and all we can do is wait. And in regard to the situation in the Middle East, demands that Israel act unilaterally to resolve the situation may indeed be unrealistic.

But we also know, as people who have made the journey from tribalism to civilization, that things can change, that progress is possible. Just as we have made progress from slavery in Egypt to revelation at Sinai to the return to the Promised Land, just as we have made progress from agriculture to industry to electricity and digital technologies, just as we have made progress from archaic custom to rule by law, freedom, equality, and increasing understanding of human rights, so can we make progress from violence to thoughtfulness, from war to peace, from hostility to friendship. The story of the Jewish people, and the story of the Arab people, begins in the Book of Genesis, when God says to Abraham, "Go forth from your native land and from your father's house to the land that I will show you" (Genesis 12:1). And so, our history begins with a journey, a journey made out of faith, without knowing the final destination, and without knowing the way. Jews and Arabs, both the children of Abraham, will have to follow the example of our patriarch, if we are ever going to make progress, if we are ever going to leave behind the tribalism of our father's house, if we are ever going to arrive at the Promised Land of a permanent and pervasive civilization where, in the words of the prophet Micah, "each one will sit under his vine and under his fig tree, and none shall make them afraid" (4:4). May it be so, in our time, and soon.