Thursday, April 26, 2012

A Not Un-Bearable Temporal Distortion

So, I had to improvise a little bit in my Science Fiction Genre class the other day, because a guest lecturer I had lined up was unable to join us.  So I wound up showing a couple of the anime shorts from The Animatrix, the direct-to-DVD release that adds some of the backstory to the two sequels to The Matrix.  And I showed the class the first regular episode of the Syfy Channel's Battlestar Galactica series, entitled "33," which I think was the best episode in the entire run.

The original series from 1978 was clearly derivative of the first Star Wars film from the previous year, and an attempt to capture some of that sparkle for TV.  It only lasted one season, followed by a short-lived attempt to revive it in 1980.  While the original series has had its fans, simply put, it was not very good.  Not horrible mind you, just not very good.

The scenario was taken from the Book of Exodus, with Lorne Greene, well known to audiences then as the patriarch of the Cartwright clan on the long-running western series Bonanza, leading a ragtag fleet of spaceships on a quest to find the mythical planet Earth.  The fleet was all that was left of humanity, after an attack by the evil Cylon robot empire destroyed the 12 colonies, 12 worlds named after the 12 signs of the zodiac.

The original had an underlying religious sensibility, and in Battlestar Galactica 1980 there even was some interaction with angels, but the relationships were never clear.

Unlike the original, the remake was a high quality production, with a great deal of originality, and it very much captured a post-9/11 sensibility, really emphasizing the fact that humanity was almost entirely exterminated, and that the survivors were just barely hanging on.  The 12 colonies were again associated with the 12 signs of the zodiac, and consistent with this, human culture was depicted as polytheistic—they would say gods where we would say God—and specifically rooted in the ancient Greek pantheon (e.g., Zeus, Apollo, Athena, etc.).

Interestingly enough, there's also a Mormon reference thrown in, a carry-over from the original series, whose creator, Glen Larson, was Mormon.  Both series make reference to Kobol as the home of the human race, and Kolob in the Mormon faith is "the star or planet nearest to the throne of God" (see the Wikipedia entry on Religious and mythological references in Battlestar Galactica).  What impact this may have on the candidacy of Mitt Romney, it is hard to say (a number of people have speculated on the possibility that Romney is a robot, and my friend Paul Levinson thinks he might even be a Cylon!).  Perhaps if he chooses Newt Gingrich as a running mate, Newt having made a moon base one of his campaign promises, he'd have the science fiction vote sewn up.

Sound far-fetched?  You may be surprised to learn that Reagan gained favor in the SF community due to his "Star Wars" Strategic Defense Initiative.  Seems that many SF fans saw the militarizing of space as worthwhile in that it was a surefire way to get us up into space, and keep us there.  Those damn peaceniks are all about feeding the hungry, curing diseases, eradicating poverty, and the like.  Lewis Mumford saw quite clearly that the space program was as massive a waste of labor and resources as the building of the pyramids in the ancient world, noting that both constitute attempts to send a select few into their respective notions of the heavens.  Rationally, I know he's right, although it's hard to shed that emotional attachment to the vision of space travel a la Star Trek.

Anyway, the humans in Battlestar Galactica were essentially as modern as us in most ways, and more so in regard to space travel, and the fact that their religion was based on the Greek gods can be seen as reflecting and symbolizing the fact that western culture is dominated by a secular humanist orientation that is rooted in Greek philosophy. By way of contrast, the Cylons believed in God--they were monotheists, and generally much more religiously oriented than the humans.  And this being a post-9/11 series, this twist clearly reflects our anxiety about Muslims generally, and Islamic fundamentalism and the terrorist initiative that sprang from it more specifically.

I should note that back when the new series was on the air, I wrote several posts about it, and you can find them by clicking on Battlestar Galactica over on the side, down at the Labels gadget, where all of the labels are displayed in cloud formation.  But I'm bringing all this up because I wanted to note that the theme music for the series, which has been widely applauded for its aesthetic appeal, has some Middle Eastern overtones to it.  It also turns out that some elements of the theme come from a bit further to the east.  There's a bit of singing in an unfamiliar (hence alien, in a sense) language, and over on YouTube the person who posted this video noted the following:

Turns out, the singing in BSG's opening is actual lyrics, taken from the Gayatri Mantra: oṃ bhūr bhuvaḥ svaḥ tát savitúr váreniyaṃ bhárgo devásya dhīmahi dhíyo yó naḥ pracodáyāt

The Language is Sanskrit. Basically, it translates into something like: We meditate upon the radiant Divine Light of that adorable Sun of Spiritual Consciousness; May it awaken our intuitional consciousness.
 He then added the lyrics to the video, as you can see below:

So, now, the music for Battlestar Galactica was created by Bear McCreary, who has also done music for the sequel series, Caprica (a major disappointment, the series that is, not the music), The Walking Dead (one of my favorites of current series), Eureka (a series that never lived up to its promise, and one that I gave up on), and Terminator: The Sarah Connor Chronicles (a great series that inexplicably never caught on and was canceled).  If you're interested in his work, go check out the Bear McCreary Official Site.

And this brings me finally to the point of this post, which is to share with you a short video, one that is, appropriately enough for Blog Time Passing, entitled Temporal Distortion, and which features a soundtrack by, you guessed it, Bear McCreary.  So, here goes:

And, according to the write up over on Vimeo:

What you see is real, but you can't see it this way with the naked eye. It is the result of thousands of 20-30 second exposures, edited together to produce the timelapse. This allows you to see the Milky Way, aurora and other phenomena, in a way you wouldn't normally see them.

In the opening "Dakotalapse" title shot, you see bands of red and green moving across the sky. After asking several astronomers, the best answer I got is that they are possibly noctilucent clouds, airglow or faint aurora; I never got a definite answer as to what they are. You can also see the red and green bands in other shots.

At :53 and 2:17 into the video you see a meteor with a persistent train, ionized gas that lasted over half an hour in the camera's frame. Phil Plait wrote an article about the phenomenon here.
There is a second meteor with a much shorter persistent train at 2:51 in the video. This one wasn't backlit by the moon like the first, and moves out of the frame quickly.

The Aurora were shot in central South Dakota in September 2011 and near Madison, Wisconsin on October 25, 2011.

Watch for two deer at 1:27.

Most of the video was shot near the White River in central South Dakota during September and October 2011; there are other shots from Arches National Park in Utah, and the Canyon of the Ancients area of Colorado, during June 2011.
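Out of curiosity, the arithmetic of that kind of timelapse is easy to sketch. In this toy calculation, the exposure length and playback frame rate are my own assumptions (the description above only says 20-30 second exposures), not the filmmaker's actual settings; the point is just how drastically the technique compresses time:

```python
# Rough timelapse arithmetic: how much real time gets compressed into
# each second of finished video. The exposure length and frame rate
# below are assumptions (the description says 20-30 second exposures;
# 24 fps is a common playback rate), not the filmmaker's settings.

EXPOSURE_SECONDS = 25  # assumed real time captured per frame
PLAYBACK_FPS = 24      # assumed frames shown per second of video

def compression_factor(exposure_s, fps):
    """Seconds of real time represented by one second of video."""
    return exposure_s * fps

def video_seconds_for(real_seconds, exposure_s, fps):
    """How long a real-world event lasts on screen."""
    return real_seconds / compression_factor(exposure_s, fps)

print(compression_factor(EXPOSURE_SECONDS, PLAYBACK_FPS))       # 600
# A meteor train persisting half an hour (1800 s) of real time:
print(video_seconds_for(1800, EXPOSURE_SECONDS, PLAYBACK_FPS))  # 3.0
```

At those assumed settings, each second of video stands for ten minutes of real time, which is why a persistent train that lingered for over half an hour flashes by in just a few seconds on screen.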

Is this video science fiction?  Not exactly, not in a purist sense, but it does use science to create an alien and alienating, and in a sense fictional, experience of space and time.  Perhaps we can speak, in an abstract sense, of a science fiction aesthetic that exists independently of any science fiction scenario?  Or perhaps we need to create an entirely new category for this sort of experimental video?

Monday, April 23, 2012

Heating and Googlighting

 So, it seems as if Microsoft has taken a lesson from all of the negative political advertising that we've been subjected to as part of the Republican primary campaign over the past year.  We don't usually see commercial interests go on the attack in the same way as political candidates in this great democracy of ours, but take a look at this video by Microsoft that provides a humorous but pointed assault on Google's cloud services, notably Google Docs:

Ironically, perhaps, this Googlighting video appears on Google's own YouTube website (heck, Plato's criticisms of writing were put into writing in the Phaedrus). No doubt, Google's not happy about it, but they are set up as a neutral carrier, censoring only for reasons of decency and copyright.  Their little write-up there goes like this:

What happens when the world's largest advertising business tries to sell productivity software on the side? Beware the Googlighting Stranger. Learn more at

If you follow the link, it will take you to a somewhat more rational comparison of services.  But it's the attack ad that really fascinates me, and while some might liken it to the famous Apple advertising campaign ("Hi, I'm a Mac. And I'm a PC."), Microsoft's new approach lacks the humor of the Mac ads, or the David vs. Goliath appeal that they had.  After all, Mac had less than 10% of the home computer market in comparison to PCs, whereas Microsoft dominates the market for word processing, spreadsheets, and presentation software (i.e., PowerPoint), and is just trying to crush Google's upstart alternative.  And it's the downright nastiness of the ad that amazes me.

Looking into the matter further, I came across this video from Newsy Tech about another attack Microsoft launched against Google earlier this year:

Certainly, full page newspaper ads, like their webpage on productivity applications cited above, require more of a reasoned approach than videos, a point that reinforces the arguments made by Neil Postman in Amusing Ourselves to Death back in 1985.

But reasoned or no, it seems that Microsoft has been asserting itself after what seemed to be a prolonged period of resting on its laurels.  Whether or not it would be accurate to refer to Microsoft in recent years as a slumbering giant, it is certainly true that Apple and Google have been producing one innovation after another, while Microsoft has largely been doing the same old thing year after year.  And sticking with the status quo is an almost certain way to get left behind in the new media world. Just ask IBM about the IBM PC, or AOL about dial-up, or MySpace about social networking.

That's not to discount Microsoft's entry into the search engine market with Bing. Maybe you remember the days when there actually was a choice of search engines?  I mean a serious choice.  When you would choose among such options as Yahoo, Lycos, WebCrawler, AltaVista, etc.?  Before Google wiped them all out with its amazing algorithms. So, maybe Bing has made a little ting in Google's near-monopoly. Maybe. Funny how we don't see Google sweating it, or ridiculing Microsoft's attempt to grab a piece of the search engine action.

And Microsoft has also moved into the mobile market with its Windows Phone software.  Now this represents much more of a threat to Google, which has had the run of the playing field for alternatives to Apple's iPhone with its Android operating system.  It pretty much looked like Android was positioning itself as the MS-DOS/Windows-type system for mobile devices, with an operating system that, while of lesser quality than Apple's, could be used across all of the different hardware platforms.  In the mobile arena, Apple has been pursuing the same strategy as it had for computers, creating exclusive combinations of software and hardware as high-end products, maintaining quality over quantity.  Its major departure from this strategy was iTunes, software that works on PCs as well as Macs, which is what turned the company around (hard to believe it almost went under at one point), but of course this was only a partial departure, as iTunes only works with Apple mobile devices, e.g., iPods and iPhones.

So, it's Android vs. Windows Phone in a war of mobile operating systems.  And I think it worth noting that Android is based on Linux, the third major operating system for personal computers, after Windows and Macs.  And while Linux has gained significantly in popularity as a free, open source alternative to Windows since the introduction of the Ubuntu version in 2004, their share of the market is not significant (around 1%).  Android, then, is by far the most successful version of Linux in existence, but can it survive a concerted effort on the part of Microsoft, the company that has dominated personal computer operating systems ever since poor, shortsighted IBM asked Bill Gates to write a Disk Operating System for the microcomputers they were introducing in the early 80s?

Or will Android wind up being deactivated? Or retired, to use the old Blade Runner euphemism.

And perhaps you've noticed the new commercials on TV for Microsoft's web browser, Internet Explorer?

You can read and see more about it on their website.  Since IE9 is only for Windows, I can't give you any firsthand feedback on it, but I really have to wonder, why is this browser different from any other browser?  I suspect there are few if any differences that make a difference, except for the fact that all of our web browsers are now working towards the convergence of personal computers and mobile devices.

Now, you may remember a time back in the 90s when Netscape was the browser of choice. This was back in the early days of the Web.  And as I recall, Netscape was starting to talk about how all you really need is a browser, that a browser is, in fact, an operating system.  After all, what we're talking about are interfaces to computer hardware.  Back in those early days, Netscape was pointing to future possibilities, but no doubt Microsoft took notice, as operating systems (i.e., MS-DOS and Windows) are the foundation of its business.

So Microsoft integrated its Internet Explorer browser into its Windows operating system, making it the only readily accessible alternative, and this pushed Netscape aside, so that Internet Explorer became the dominant browser.  Microsoft became the subject of an antitrust suit filed by the US government (see Wikipedia's entry on United States v. Microsoft), which it lost, being declared a monopoly; Microsoft appealed, and finally reached a settlement with the Department of Justice under which its operating system would allow a choice of browsers.

But the damage had been done. Netscape never recovered.  Even Mac users were forced to turn to Internet Explorer because Mac's Safari browser sometimes did not agree with websites designed with Internet Explorer in mind.  And this is where Microsoft went soft, as Mozilla, a non-profit, developed the Firefox browser, which was able to achieve the same functionality as Internet Explorer, and eventually improve on Microsoft's browser in regard to its efficiency and its features.  Introduced in 2004, within a few years, Firefox seemed to be well on its way to displacing Internet Explorer as the browser of choice.

Mozilla received significant support from Google, which is why the Firefox homepage featured a modified version of the Google homepage, at least up until recently.  Google released its own browser, Chrome, in 2008, developed by several programmers it hired away from Mozilla, but Google was relatively low-key in promoting Chrome until the past year or so.  By the end of last year, Chrome had overtaken Firefox as Internet Explorer's main competition, and I believe that in about a year or so Firefox will be all but forgotten.

Chrome for Android was released this year as well.  But what is really key, I believe, is that Chrome has also been developed as an operating system, making good on Netscape's original vision.  It is a very limited operating system, to be sure, one that essentially only works with online apps, such as Google Docs.  But this does mean that the devices running the Chrome operating system require much less start-up time, and much less memory, not just for applications, but also for storage of files, as both can be accessed online, that is, stored in the cloud.  Of course, you have to have WiFi or a wireless data connection to make it all work.  To date, the Chrome operating system has only been used on a limited number of laptop/netbook type of devices, dubbed Chromebooks.

When I checked the Wikipedia entry on Chromebooks, I found the following statement:  "Some analysts viewed Google's web-centric operating system packaged with hardware as a direct attack on the market dominance of Microsoft." Um, yeah, you got it!  No wonder, then, that Microsoft is hitting back, and hitting back hard against Google.  If people start to accept and get comfortable with Google applications, especially the productivity software bundled under Google Docs, this would not only threaten Microsoft's near monopoly over those programs, but its dominance over the operating systems that represent Microsoft's very lifeblood (vampire that it is!).

So, it's Windows vs. Chrome, as well as Windows Phone vs. Android.  In other words, it's Microsoft vs. Google.  And what if Facebook, having been threatened by Google's launch of Google+ as an alternative social network, comes to the aid of Microsoft?  And what if Amazon makes good, belatedly, on the Epic predictions of a union with Google?  Would Apple remain neutral, or take one side or the other?

Albert Einstein famously said, "I know not with what weapons World War III will be fought, but World War IV will be fought with sticks and stones."  But maybe the Third World War won't be a nuclear holocaust, or even a conventional land war, but a conflict in cyberspace, a series of new media battles fought with bits and bytes, memes and viruses, networks and programs.  Cry havoc, and let slip the apps of war!  Call it what you will, cybergeddon, or e-pocalypse now, I do think we're in for some interesting times.

Thursday, April 19, 2012

Pandora's Boxed In

So, last month I was asked to comment on an interesting phenomenon concerning the internet radio site (actually a streaming music service) Pandora.  This was in conjunction with an article written by Erika Morphy and published online in the E-Commerce Times on March 8, 2012, entitled Pandora's Popularity - Too Much of a Good Thing?

It seems that the service ran into a problem that wouldn't seem to be a problem at first glance, one of growing in popularity too quickly.  As Morphy explains at the start of the article:

Pandora's popularity may not be enough to keep the company going in the long term. With growth in its user base come increases in royalty payments. "The company is trying to find the balance between free users and paid users," noted business professor David Tomczyk. Most people will choose the free accounts, and Pandora is attempting to monetize those as best as possible with short ads -- it may need to ratchet up the advertising.

She goes on to get into the specifics of the situation:

Pandora reported a 62 percent increase year-over-year in active users for its fiscal fourth-quarter, which ended Jan. 31, along with a 74 percent increase in advertising revenue and a 99 percent rise in listening hours.

However, those gains were accompanied by increases in royalty payments the online radio provider had to pay to content owners, and its overall performance fell short of analyst expectations. The stock price dropped by some 24 percent after Pandora released the figures.

Pandora reported a loss of 3 US cents a share and revenue of $81 million for the quarter. Wall Street analysts had expected a loss of two cents a share and revenue of $83 million. Content-acquisition costs reached $48.2 million in the quarter, compared with $23.9 million a year earlier.

The company also revised its first-quarter revenue guidance downward, to $72-$75 million.

Here's where I come in, as the article raises the question, A Flawed Model?:

The stock downturn reflects investor fears that Pandora is not just experiencing new company growing pains, but that its model may be inherently flawed. The more popular it becomes, the more it has to pay in royalty fees.

"The service is employing the old, broadcasting model, where they pay for content, which they use to attract an audience, which they then sell to advertisers," Lance Strate, professor of communication and media studies at Fordham University, told the E-Commerce Times.

"Expect them to radically change their model, or disappear from the online environment," he said.

It is indeed quite amazing to see such examples of what McLuhan referred to as rearview mirror thinking, trying to do the same old job with the new tools.  Even the very fact that they think of what they're doing as radio shows how they're employing an old media map to try to navigate the new media territory.  And Pandora is not alone in this respect, as the article goes on to note:

Concern over the business model is not limited to Pandora, Lee Simmons, IPO industry specialist at Dun & Bradstreet, told the E-Commerce Times.

"New Internet music-streaming companies all face these costs, and they don't always have the cash to pay for the royalties," he said.

If a company doesn't keep a sharp eye on costs and cash flow, it could easily be sunk, Simmons noted.

That doesn't mean he's ready to write off Pandora, though, which is proving to be among the top providers in this space.

"Pandora is in high-growth mode," said Simmons. "[The] company is expanding rapidly, and it has a great opportunity to realize its upside potential." That potential includes its brand name and a user base growing at a rate that shows little sign of abating.
The next section, entitled Monthly Figures, gets into the bottom line:

Indeed, perhaps to quell questions about its future, Pandora announced that it will begin releasing its user stats on a monthly basis.

Its numbers for February indicated an increase in its share of the total U.S. radio listening market -- to 5.74 percent from 2.90 percent at the same time last year. Total listener hours reached 975 million, an increase of 101 percent from 483 million during the same time period last year. Active listeners for the month reached 49 million, an increase of 57.5 percent from 31 million during the same time period last year. 

 Somehow, though, this doesn't address the problem of the flawed model.  Anyway, the final section is called Get the Scaling Right, chalking it all up to a question of balance:

Scaling is one problem in particular Pandora must tackle if it wants to right its business model, David Tomczyk, assistant professor of management at the Quinnipiac University School of Business, told the E-Commerce Times.

"The company is trying to find the balance between free users and paid users -- the freemium model popular with many Web-based businesses. The challenge is offering a free model to attract customers and a premium service that's interesting enough people would be willing to pay for it," Tomczyk explained.

Most people will choose the free accounts, and Pandora is attempting to monetize those as best as possible with short ads every few songs, he noted. Now the company has to decide whether to increase the number of ads in order to cover growing expenses for its bandwidth and server needs, he said. Getting that balance right could mean the difference between success and failure.

 Ah yes, we all want the freebies!  And on the internet, it seems like just about everything is or ought to be free.  As my friend Richard Barbrook put it, it's cybercommunism.  Of course, free content is an old story, as it's the basic strategy of broadcasting, where the profit is in using the content to create audiences, and then selling access to the audiences to advertisers.  But that, as we see, can be a tricky business when you have to pay for content, especially when you do not enjoy a situation of scarcity and near-monopoly conditions, as broadcasters did, given the limited spectrum for transmission over the airwaves. 
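To make that broadcast-model bind concrete, here is a toy calculation. All of the per-hour rates below are my own illustrative assumptions, not Pandora's actual figures; the point is simply that when royalties are charged per listener hour, and the royalty rate exceeds the ad revenue rate, a bigger audience just means a bigger loss:

```python
# Toy model of ad-supported streaming economics. Royalties scale with
# listening hours, so if the royalty cost per hour exceeds ad revenue
# per hour, growth only deepens the loss. All rates here are
# illustrative assumptions, not Pandora's actual figures.

def quarterly_profit(listener_hours, ad_revenue_per_hour,
                     royalty_per_hour, fixed_costs):
    """Profit for one quarter under per-hour ad revenue and royalties."""
    revenue = listener_hours * ad_revenue_per_hour
    royalties = listener_hours * royalty_per_hour
    return revenue - royalties - fixed_costs

# Assumed rates: $0.06/hour in ad revenue, $0.07/hour in royalties,
# $5 million per quarter in fixed costs.
for hours in (500e6, 1000e6, 2000e6):  # the audience keeps doubling...
    profit = quarterly_profit(hours, 0.06, 0.07, 5e6)
    print(f"{hours / 1e6:5.0f}M listener hours -> profit ${profit / 1e6:.0f}M")
```

Under the old broadcasting model the same arithmetic worked out, because scarcity of spectrum kept advertising rates high relative to content costs; without that scarcity, the only ways out are raising the revenue per hour (more ads, more paid subscribers) or lowering the royalty per hour.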

It's no wonder that the big new media success stories are the sites where the content is obtained for free, whether it is through Google's use of search engines, or Facebook's user-generated content.  The other alternative would be micropayments, along the lines of Apple's iTunes.  If I might be willing to pay a penny for your thoughts, maybe I'd do the same for a song?

Any way you look at it, Pandora's boxed in, and there's no way out for it or similar services, except to create a new strategy better suited to the new media environment.

Sunday, April 15, 2012

Going the Distance for Autism

Dear Friends,

If you can help, even just a little bit, it would mean a whole lot, it would make a great difference.

We know you understand that the diagnosis of autism is devastating, and that the treatment is terribly demanding. Our children struggle just to master the basics of everyday living.  Their teachers work with tireless dedication to forge a future for our children where they can at least live safely, take care of themselves (even if it is with assistance from others), and spend their days productively contributing to our society.

As you know, it's not easy, and it's not inexpensive.  We need donations to cover many basic costs of our daughter Sarah's school. The EPIC School in Paramus, New Jersey, is dedicated to helping children with autism between the ages of 3 and 21, and we are hoping that they will also be able to start a day program for young adults who "age-out" of the school system after turning 21.  If you've heard about the epidemic of childhood autism that has been occurring in recent years (and who hasn't heard about it?), just imagine the epidemic of adult autism that is just around the corner.  We have to start preparing now, before it's too late.

With so many fundraisers for so many good causes, we hesitate to trouble you with one more, but once in a while we need to…  Donations as little as $10 are welcome, and will make a difference in the lives of Sarah and her classmates, now and for their future.

EPIC is one of four schools collaborating in Go The Distance for Autism on May 6.  We hope you will support Sarah.  Here is the link to her page where you can make a donation.

Thank you for all of your help and support.

The Strate Family

Saturday, April 14, 2012

From Judge Napolitano to Jacques Ellul

Did you catch this YouTube video when it went viral a couple of months ago?  It's called Judge Napolitano. How to get fired in under 5 mins.  It's how to get fired from Fox News, to be specific.  You may recall my view on Fox News from my post last October, All Foxxed Up!  Judge Napolitano always struck me as a bit out of place on that channel, especially when he sat in on Fox and Friends, sitting between the leggy, blonde Republican chicks and the mesomorphic mooks with sarcastic smirks who typically hosted the program.

So, our Foxy friends, owned by and beholden to old Murderin' Murdoch, pursued their "fair and balanced" agenda against the "lame-stream" media by departing from the longstanding tradition of journalistic objectivity (an ideal, mind you, something to aspire to and strive for), and mixing opinion liberally in with facts, giving us POV (point of view) programming, driven by conservative, Republican ideology.  So, the judge's rant is not only a criticism of Republican primary politics, but also, implicitly, of the Fox News enterprise itself.

Now that the Republican primary season is essentially over with, with the suspension of Rick Santorum's campaign, it's worth returning to this screed.  Ron Paul's candidacy is just a bit outside of the mainstream, just enough to open things up to a more radical point of view.  And it is for that reason that he has gained support from some unexpected intellectual quarters, including my colleague, Paul Levinson.

But while the judge may have crossed the line as far as Fox is concerned, many of us would note that it amounted to not much more than sticking a toe just across the boundary, and I'm sure many viewers of this video with more radical views found themselves asking whether Judge Napolitano went far enough.  You might say, in for a penny, in for a pound, but it's hard to imagine the judge turning to the leftist views of folks like Noam Chomsky, even though there is some substantial common ground with radical views informed by the theoretical context associated with Karl Marx and Marxian theorists (e.g., Theodor Adorno, Louis Althusser, Antonio Gramsci, Raymond Williams, Stuart Hall).

But a better alternative, for the judge, and for me for that matter, comes from the media ecology canon, specifically from the work of the French social critic Jacques Ellul (1912-1994).

Ellul is perhaps the most radical of media ecology scholars, but rather than arguing that we are being dominated, oppressed, and manipulated by the rich and powerful via the economics of capitalism, Ellul went beyond Marx to argue that it is the technological imperative, what he referred to as la technique, that has been the driving force in the contemporary world.  He put forth a form of technological determinism, but not in the sense that all of history and human affairs are determined by specific technological developments; rather, he argued that we now have societies in which human beings have surrendered all autonomy to technological approaches.

Ellul's view is not born out of leftist politics, I should add.  It is radical, but has a conservative tinge (are you listening, Judge?) in that Ellul was grounded in Christian theology, based on the French Reformed Church, a Protestant denomination. A lay theologian, Ellul was ecumenical in his approach, and championed faith and religion in general as a form of resistance to the technological society.

It's important to note that it's not just about machines, not even primarily about machines.  The technological imperative is all about efficiency, finding the most efficient means to any given end.  No other factor can seriously be considered anymore.  You cannot effectively argue for a course of action on moral or ethical grounds anymore, only in regard to the efficiency of means in obtaining desired outcomes. 

This forms the basis of Neil Postman's argument in Technopoly, where he specifically identifies Taylorism (out of which came the idea of scientific management and efficiency experts) as the turning point in the transition from a relatively balanced technocracy to the complete dominance of technopoly.

In a technopoly, or technological society, government is no longer subject to the will of the people.  Most vital matters (economics, diplomacy, war) are of such enormous complexity that they can only be managed by technical experts.  So elected officials turn to their advisers to guide them on technical matters. Politics and ideology no longer drive government, but instead must be bent and distorted to justify the decisions of technical experts, based on the sole criterion of efficiency.

Consent of the governed is still the sole source of legitimacy, but now has to be manipulated in order to be made to support the decisions of the technical experts.  And elected officials themselves are technical experts, but mainly in campaigning, which they do to get elected, and now do constantly even after they're in office.

This quick précis doesn't begin to do justice to the work of Jacques Ellul in this, the centenary year of his birth, but I would recommend reading his work, if you haven't already, including his early trilogy, The Technological Society, Propaganda, and The Political Illusion; his two follow-ups to The Technological Society, The Technological System and The Technological Bluff; and his moral critique of image culture, The Humiliation of the Word.

Judge Napolitano, now that you have some time on your hands, what if you read some Jacques Ellul? I think you'd find that he asks the kinds of questions that you'll never hear voiced on Fox News, the kinds of questions that you started to ask, the kinds of questions that are vital for an understanding of our contemporary media environment. Be warned that they're the kinds of questions that may get you fired, that is, that may get you fired up!

Monday, April 9, 2012

Keeping Virtual Watch

So, I must admit that I've been slacking off a bit in the blogging department, having been experiencing something of a time famine of late.  I didn't make up that phrase, by the way; time famine has been out there for some time now, at least a few decades according to the Double-Tongued Dictionary.

Funny how the concept of time famine seems to coincide, to some degree, with the new media revolution that began with the popularization of personal computing, followed by the widespread use of the internet.  You might also say that it correlates as well to information overload.  Add spatial dislocation to the mix, and we have 3 out of 4 horsemen.  As for the 4th, well, reports of the death of civilization may be premature, but by how much?  Time is always of the essence.

Certainly, suggestions of the demise of the wristwatch carry a certain weight, as that item of jewelry has gone from an almost absolute must-have for any fully functioning member of a modern, technological society, to an optional bit of ornamentation, and this has occurred in a relatively short period of time.  The watch has been obsolesced by the cell phone, and as McLuhan long ago observed, a technology that becomes obsolescent is often repurposed as an art form.  In this instance, the watch, whose main purpose was utilitarian (what good is a watch that doesn't work? who would wear a watch that's stopped, even if, as the joke goes, it would still be correct twice a day?), is now becoming purely cosmetic, a piece of jewelry no more functional than a necklace or earring.

True enough, watches have a long history of association with precious metals, sometimes gems, and yes, they have long been considered a fashion accessory.  But not necessarily so; after all, watches might better be compared to shoes, shirts, and pants, viewed nearly as much as necessities as options.  Their obsolescence, then, eliminates the cheap, purely functional variety, and leaves only the more pricey version as a status object.

All this was inspired by another example of augmented reality technology, in this instance applied by Tissot this past holiday season as a means of trying on watches without having to make the trip to a retail outlet.  Here, take a look:

Here's an on-site demonstration, outside of a store in London:

And this one is outside of Harrods:

And you can read more about it, and find out how to try it out for yourself in the comfort of your own home, over on the Tissot Reality webpage.

Of course, the irony is that the augmented reality technology is what is making everyone go wow, not the watches themselves.  While it's a great attention-getter, this use of AR ultimately sets one of the most intriguing recent manifestations of the new media environment in sharp contrast with one of the signature technologies of the old media environment, the mechanical timepiece.

Back in the 1930s, Lewis Mumford noted that the invention of the mechanical clock in the medieval monasteries of 13th-century Europe constituted a giant step on the road to mechanization, as the true function of the clock is to control and coordinate human activity.  As such, the clock was an early example of cybernetic technology (not to mention, the first mechanical device that produced nothing physical, just pure information, and the first form of automatic machinery, the first form of robotics and automation).  Jay David Bolter likewise points to the clock as a defining technology that was a vital forerunner to the modern computer.

Personally, I always used to say that the watch was not about strapping time to our wrists, it was us in literal bondage to time.  I resisted wearing a watch in my youth, but then I grew up.  As Bob Dylan put it, ya gotta serve somebody, and Mumford noted that it was a natural progression from keeping time to serving time.  But in recent years there have been times when I didn't wear a watch because the battery was run down or the mechanism was broken, or I lost it, or just forgot to put it on, and frankly I hardly noticed its absence.  And there are many times now that I go out without my watch, and do not miss it one bit.

The mechanical clock has been obsolesced by digital displays run by computer chips and wired technologies, such as cable boxes and computer screens.  I don't know about you, but I find that the shift from Daylight Saving Time to Standard, and back again, no longer requires the kind of effort it once did in changing the time on clocks in every room—more and more they've been replaced by displays that automatically make the switch.

It follows that the wristwatch was obsolesced at the point that the computer became so miniaturized and mobile that we could carry it around with us wherever we go, that is, at the point that our cell phones evolved from being souped-up walkie-talkies into tiny PCs, with entirely accurate time displays.  But from the start, the purpose of cell phones, and beepers before them, was very much the control and coordination of human activity.  As they evolved to carry out that function in an increasingly effective manner, the wristwatch became increasingly redundant.

Tissot, then, may be trying to sell real watches through virtual imaging, but with the watch itself becoming virtual (virtualized?), my guess is, they'd better just watch out.