Sunday, May 29, 2011

R.I.P. Jeff Conaway and Fuck TMZ


I have a couple little things to say:

1. Jeff Conaway passed away this week, and that is an unfortunate thing. I am not one who tends to get broken up over the deaths of celebrities, but it would take a tremendously callous person not to recognize the sadness of seeing someone’s demons have the final say.

The headlines came out that the Grease star had died, and I believe that movie has a special place in a lot of people’s hearts. A lot of people my age associate it with their childhoods, and I imagine the movie has renewed resonance in the post-“High School Musical,” “American Idol,” and “Glee” world in which we live. In fact, now that I think about it, Grease became a “reality” TV phenomenon of its own when the casting process for the Broadway revival was opened up to create the show Grease: You’re the One that I Want! (When I went to look up the title of the show, I found that there was a similar British show, Grease is the Word, that did the same thing for a West End revival.) It seems the American musical is alive and well, if jukebox musicals, revivals, and the Disney Channel are your idea of inventive theatre, but that is a whole other post.

I don’t think I’ve ever sat down and watched Grease start to finish, but I am pretty sure I have seen the whole thing in installments from frequent airings on cable TV when I was young. My main familiarity with Jeff Conaway, however, comes from his being part of the great ensemble cast of the wonderful late-’70s, early-’80s sitcom Taxi. More current TV audiences may know him from Celebrity Rehab, on which he showed audiences the behavior that would lead to his early death. While Conaway was in the coma from which he would never awaken, Dr. Drew Pinsky reported that an overdose had not occurred, but he asserted that Conaway’s state was directly related to years of bodily abuse. Then, on Friday, Pinsky reported via Twitter: “I'm saddened to report he has succumbed to his addiciton [sic].”
“Jeff was like a brother to me,” fellow Taxi cast member Marilu Henner said in a statement to E! News. John Travolta, who starred alongside Conaway in Grease, released a statement to TMZ calling Conaway a “wonderful and decent man,” adding, “we will miss him.”

This leads me to…

2. When I die, don’t report it via Twitter (and if you do, take a little time to ensure proper spelling and punctuation). Also, don’t send your condolences via TMZ (this, of course, presupposes that I will have sufficient notoriety when my time comes for this to be considered a viable option). Seriously, TMZ is a site that would have published deathbed photos of Conaway, complete with tubes up the nose and a colostomy bag, if they could have gotten their hands on them. Why anyone would give a scoop to these pricks, who exploit their subjects and viewers alike, is completely beyond me. I have occasionally tuned into TMZ’s spin-off TV show (generally by accident, though occasionally I will tune in to find out what sponsors I should be boycotting) and have been appalled by the smug manner in which they routinely pass off the worst kind of trivialities as news and seem to think that invading privacy and forsaking propriety constitutes journalism. The only reason I could see Travolta giving his statement to TMZ would be if he thought a brain-dead public wouldn’t get the message if he gave it to a reputable news source. Still, Marilu Henner gave her statement to E! News, which generally deals in the same trivialities as TMZ but isn’t a dick about it, and the message got out just fine.

Seriously, though, if my death is treated like this, I will come back and haunt your ass, and not Sixth Sense style, but full-on Amityville Horror shit.

That’s about it. In summation, I guess all I wanted to say was R.I.P. Jeff Conaway, fuck TMZ, and you can keep your Twitter.

Thursday, May 12, 2011

Digital Navel Lint

The other night I did something that I more than occasionally do: I had a thought that was burning a hole in my cerebral cortex that I felt desperate to share, but decided that I was too tired, drunk, and emotional to attempt to put it down into words that would convey the depth of my feeling, the confusion of my thoughts, and the subtle fluctuations of my state of mind. Perhaps it was laziness, or perhaps it was the only wise thing that I did that night, but I am glad that I showed restraint.

I have frequently thought that the internet should have a breathalyzer to prevent unwise Facebook posts or irresponsible Amazon.com purchases from occurring late at night. That night, I am pleased to say, all it took to halt me in my tracks was the recognition of the vast reach of social networks and the internet in general. I believe I read somewhere that if the pen is mightier than the sword, then the printing press is like an atomic bomb. That being the case, what is the internet? A tool with an effectively greater reach than print, unrestrained by circulation numbers and printing and shipping costs, available to nearly everyone (in developed countries) regardless of intelligence, ability, or sobriety.

I suppose that if one stopped and thought about this, it would be easy to get alarmed, and indeed it is much easier now for insidious notions to be shared with like-minded psychopaths than in times when lone nuts tended to be, well, lone, and isolated geographically. However, I am simply more concerned with the rest of the population, who will be more inclined to use this technology in ways that are more indulgent than outwardly destructive (and I think that is most of us).

While it can be argued that the internet is a convergence medium, presenting audio, video, still images, and text with equal importance (and I must add that it is quite remarkable that video and unaccompanied audio can coexist on one medium), I would say that the main thrust of the internet is text: e-mail, status updates, tweets, blogs, etc. I don’t think this is a controversial opinion or that I am saying anything that isn’t widely known. But while the internet is assumed to be a Read-Write (RW) medium (to borrow Lawrence Lessig’s use of disc-drive terminology), as opposed to traditional publishing, which would be considered Read-Only (RO), I propose that it lends itself to being a Write-Only (WO) medium, in which the conversation is entered less in the spirit of give and take than to indulge in the novelty of airing one’s opinions, feelings, and breakfast choices to a wide audience that pays the minimum amount of attention to this information, simply to find an opening to drop in its own ten cents.

Of course, we all read how Facebook was used to organize protests in Egypt and, don’t get me wrong, I believe that this extraordinary circumstance was probably the best use to which social networking can be put. In fact, I would say that it makes nearly all of what I post seem silly and irrelevant. That said, what happened in Egypt was indicative of the breadth of Facebook, not its depth. I would suppose, however, that most Facebook users (at least in America) do not have the overthrow of governments as their main intention (yes, I have read a tremendous number of politically charged posts, but they were of course sandwiched between posts of cute kitten videos and movie quotes).

I hesitate to say unequivocally that the medium completely dictates the content, but I will argue that 420 characters is not enough to expound on anything in any detail, and with Twitter that number is even smaller. All that space allows for is platitudes, slogans, and trivia, usually in horribly truncated and corrupted language. So now we have the ability to publish poorly considered, hastily written, fragmented thoughts far and wide. While text seems to have a greater presence in our communication than it has had since the advent of the telegraph, telephone, and television, it seems to have been done great injury by those other media, which favor the image or the sound bite (can you tell I am re-reading Neil Postman right now?). Thus we inundate each other with messages that mean little now, in the grand scheme of things, and will mean even less later.

Come to think of it, perhaps I need not have bothered censoring myself. I do think it was a moment I should have kept for myself anyway, without feeling the need to validate it by sharing it with unseen people. But if I had written something and posted it, I don’t think there would have been much harm. The post would have easily sunk like a rock, perhaps after a few people posted an emoticon or two before going back to talking about their breakfasts and weekends, and I wouldn’t have blamed them. The posts that I find have the longest life spans are the ones that enable people to share their own perspectives. I don’t think that this is entirely narcissism, and I do believe there is value to interactivity, particularly when it comes to discussion of news and current events. However, there is a considerable difference between posts that encourage debate and force us to clarify and defend our opinions and those which attempt to make other people gaze at our navels.

There has been quite a bit said about being cautious about what we post about ourselves online, the argument being that nothing on the internet truly disappears (if you don’t believe me, try out the “Wayback Machine” on archive.org), so we are supposed to be careful of posting things that will come back to haunt us later. But perhaps we should also contemplate why we wish to air intimate details about ourselves and, on the other hand, be concerned about how much time is spent posting things that will prove to be unworthy of remembering.



For further reading:
Lanier, J. (2010). You Are Not a Gadget: A Manifesto. New York: Knopf.

Postman, N. (1985). Amusing Ourselves to Death: Public Discourse in the Age of Show Business. New York: Penguin.

Lessig, L. (2008). Remix: Making Art and Commerce Thrive in the Hybrid Economy. New York: Penguin.

Monday, April 25, 2011

The Obsolete Savant

A few nights ago I was at my local when the song “Fool in the Rain” began to play, which, of course, led to a conversation with a slightly drunk patron about Led Zeppelin. It was one of the more mundane conversations I have had on the subject; neither of us had any controversial statements to make (the most famous being the age-old proclamation that “Led Zeppelin is overrated”). Both of us liked the song. Both of us liked the band. Both of us believed that the album from which the song came (In Through the Out Door) was decent enough, while certainly not as strong as their earlier albums.

“There were some good songs on that album,” he said. “’D’yer Mak’er’ is one of my favorites.”

“That’s off Houses of the Holy,” I told him.

“No. It’s off of In Through the Out Door,” he asserted.

“I’m 99.44 percent certain that it’s off of Houses of the Holy,” I said in a tone which attempted to be authoritative but non-combative and non-condescending.

“No. I’m pretty sure it was In Through the Out Door,” he stubbornly added.

As we had just met, he had no reason to believe me and doubt his own memory. He didn’t know about my collection of classic records, my memory for facts, my obsessive attention to the minutest details. He didn’t know of the late-night calls I frequently received from drunken friends to settle bar bets about who played what on what and so forth. I’m not bragging. In fact, I’m not sure this is something worth bragging about.

Forgoing a long, drawn-out debate, I pulled out my smart phone, pulled up the Wikipedia page for Houses of the Holy, and showed it to him. He conceded, and the conversation casually meandered elsewhere. No hours of bickering. No bringing up dozens of tangentially related facts to prove our expertise to each other, with neither side willing to back down until finally agreeing to disagree, but not really, because it would still be gnawing at us, and every newcomer to the bar would be asked to weigh in to support one side or the other. No, we solved it in a matter of seconds, and that was it. After so brief a battle, the victory was hollow.

What was the victory, anyway? To have recalled information that anyone with a computer or smart phone would be able to instantly access? The guy didn’t even seem impressed.

Today one does not need knowledge; one simply needs the ability to access it. With the internet available nearly everywhere on all sorts of devices, most people have that ability at any given time. One simply needs to plug into this collective consciousness and pull whatever morsel of data is needed out of the digital ether.

I’m not saying that this makes us a nation of idiots. And it is important to note that this kind of paradigm brings issues of its own and a whole different skill set. The internet, that wonderful repository for all discovery, philosophy, epistemology, and pornography, also includes vast amounts of hearsay, illogical arguments, uncited quotes, and straight-up stupidity. It requires diligence to sift through this mess to find verifiable information; in fact, it’s the same kind of diligence required of any kind of research. Cross-referencing and citation checking may seem a bit tedious to the modern internet surfer, but as concerned citizens of the digital frontier, it is imperative that we use these new digital stockpiles of information fully and responsibly. That said, in a pinch Wikipedia is generally all one needs to quickly, easily, and reliably determine what album “D’yer Mak’er” was on (while nothing online is infallible, Wikipedia is pretty good for basic facts, and it does require citations for the material that gets posted).

I guess what makes me sad is the fact that people like me are simply not needed anymore. In the years before the wisdom of the ages could be instantly accessed from the cloud, experts served a very particular function: to maintain the knowledge that others did not have the time, energy, or desire to acquire, either because they had more pressing concerns or perhaps because they actually had social lives. Today, they don’t need some geek like me.

Worse, we are no longer simply unneeded; we are largely undesirable. Who wants to have some guy around who slavishly dissects liner notes, pores over musicians’ biographies, and spends hours researching obscure session men? These loner geek-savants tend to make rather trying and creepy company. No longer kept on retainer to settle the odd bar bet, we mostly engage one another, finding weaknesses and exploiting them in games of geeky one-upmanship.

I still maintain that, as good as smart phones are, the geek savant will be able to supply more information, and faster. Not only will the geek savant tell you what album the song was off of, but also the name of the session guy who played violin on the track and what obscure band he used to be in with a guy who later played drums in Wings. However, while beating the machine in such a decisive way would be a point of pride for the geek savant, most normal people would probably rather take a few extra seconds to Google something than be inundated with all that extraneous information.

Most people would simply be glad to know that such information is accessible should they want to find it at some point, and also comfortable knowing that they most likely never will.

But I argue that there are hazards involved in relying on the information simply being in the cloud, even beyond the apocalyptic fear that the whole system may someday collapse, plunging us all into the dark ages before e-mail, Facebook, and internet porn. The main concern should be the fact that the ability to remember vast quantities of facts tends to involve mind mapping and contextualization, which not only act as mnemonic devices but also give weight to the information. Simply accessing facts may lead to a less critical eye when assessing information and determining its validity and importance. The bigger picture could be lost.

I am not writing this to say that everyone should start poring over tomes about musicians, songwriters, and producers (hell, I’ve even heard that there are things to learn about beyond music). This way of thinking is not for everyone. In fact, we are aware of being different, and our undeserved feeling of superiority comes from the fact that we know more than others (care to) know. But please don’t throw us under the bus because of this. We still serve a function. We’ll still tell you what you want to know, and a little bit more of what you were unaware that you needed to know. Don’t abandon us for the new toy.

Friday, April 8, 2011

A Steaming Cup of "Friday"

I feel like the subject of Rebecca Black, the new teen-phenom-in-training, and her opus “Friday” has been exhausted by now. Younger and more in-touch people weighed in over a month ago, and I expect that I am probably the last to offer my two cents. In fact, I really hope that I am. Originally, I watched the video out of curiosity. After all, it was being described as the worst song of all time, and I guess I just had to see it. After I watched it, I debated for a while about whether I would even bother to put down an opinion, which would be an open admission that I had actually sat through four minutes of adolescent treacle instead of pursuing loftier things. I decided to let it rest.

That was until I saw that Stephen Colbert was to perform the song on Jimmy Fallon’s show. Clearly this phenomenon was now bigger than the adolescent girl who made the song and the pathetic teenage boys who took the time to write online comments urging Black to “get an eating disorder” or “die.” (Frankly, I believe those comments should instead be directed at Ark Music Factory, the company that wrote the song and sold it to Black’s parents for two grand.) To be sure, I don’t like the song. I hardly think that even needs to be said. To be honest, I would be wary of any thirty-something male who does. Which is why I was so surprised and dismayed that Colbert, backed by the Roots, performed a special arrangement of the song on Fallon’s Friday night show, potentially giving it more cultural weight (ironic as their performance was) and greater longevity.

I feel about “Friday” the same way I feel about “2 Girls, 1 Cup”: if it’s not your thing, don’t watch it. If you aren’t a preteen girl with a taste for sugary pop, don’t watch “Friday.” If you aren’t a far-out fetishist fecalphiliac, don’t watch “2 Girls, 1 Cup.” These are two internet sensations that became popular because of the horror they inspired in an unintended audience.

But in spite of the fact that I belong to neither of the aforementioned demographics, I have seen both. They simply blew up in such a way that I felt I would somehow be disconnected from the reigning popular culture if I didn’t experience them. In fact, I probably didn’t have to, and probably shouldn’t have. I don’t feel scarred by either experience (though if I had to choose, I would probably say I found Ms. Black’s video more disturbing), but I found them both tremendously unnecessary. Both of those videos should have remained mercifully in obscurity.

The view count of “Friday” on YouTube is now pushing 90 million, and even though views do not count as votes of approval (there are nearly 2 million “dislikes” and only a quarter million “likes,” while the rest didn’t bother to vote), Rebecca Black has become famous. I feel slightly guilty knowing that I am 1/90,000,000th responsible for that. I added, in a small way, to its popularity, or at least its infamy. I even hesitated to write this piece, because if I thought for a second that anyone was actually reading this, I would worry that it would further contribute to her notoriety.

YouTube provides a way to satisfy our curiosity about the latest viral internet phenomena anonymously and without any monetary investment. This is especially convenient in the cases of videos such as “2 Girls, 1 Cup” and “Friday,” when we know in advance that what we are going to see is a lot of shit.

For the most part, it was a combination of ridicule, indignation, and curiosity that made the song and video popular. Consequently, whether we like it or not, we will be hearing more from Rebecca Black. What direction her career will take is anyone’s guess. Will she go the Ashlee Simpson route and find a degree of unearned success through sheer marketing, or will she actually prove to have some artistic voice of her own (unlikely)? Or will she simply record an album that will end up in the bargain bins next to William Hung’s CD by the end of the year? (See how old I am? I’m talking about CDs, for shit’s sake!)

Seeing Colbert perform it made me realize just how big it has become and how inescapable it is. Even we old farts can’t ignore it. Maybe if we had ignored it, not let our morbid curiosity get the best of us, it would have remained what it was supposed to be: a frivolous video for a thirteen-year-old girl to show to her friends. As if its internet success weren’t enough, in Colbert’s and the Roots’ hands the song has been given a new treatment and an even bigger platform than it had before. But who knows? Maybe, hopefully, this will be the last word.

I will give Colbert and the Roots credit: they polished that turd to a shine, and as far as viral internet videos are concerned, I’m glad they recreated “Friday” instead of “2 Girls, 1 Cup.”

Wednesday, March 2, 2011

Charlie Sheen "News"

It is said that the English word “news” came from the pluralizing of the word “new” in the late 14th century. I think that this is important to remember because most of the things that are exposed to the public as “news” are really just “goings on,” events that do not represent any deviation from the norm or expected outcome and have no direct relevance to the reader (or viewer). Any look at the daily stories of any “news” source will generally result in some sort of feeling of déjà vu. If it’s not surprising, if it’s not new, it’s not news.

Charlie Sheen is going crazy… or at least he’s acting crazy and saying crazy things, and as the old expression goes, if he walks like a duck and talks like a duck and he’s not actually a duck, he’s probably fucking crazy. I don’t believe I have to dwell on his antics, and although the story seems to just get sadder (I just read online that his kids were taken away by police), I feel it’s important to say that this cannot be seen as unexpected. He has a history of drug use, running around with porn stars, frequenting prostitutes, and violent acts against romantic partners (don’t get me wrong, I have nothing against most of those things). This is merely a logical extension of behavior he has exhibited over the last twenty years or so, which has been allowed to continue and escalate because, well, apparently we like him that way.

Remember that after Sheen had done quite a bit of damage to his image in the late nineties, he was given a second chance on a show called Spin City, replacing Michael J. Fox as the central character. Sheen’s character, unlike the cheeky but earnest workaholic played by Fox, was a charming, philandering semi-lowlife named… Charlie. He won a Golden Globe for that role. Clearly, he was back.

After Spin City was cancelled, Sheen ended up reprising the character, which is to say he played another loveable reprobate named Charlie, on the CBS sitcom Two and a Half Men. America loved him so much that he ended up being the highest-paid actor on television, earning nearly two million dollars per episode. America told him that not only did we forgive him his transgressions, but that we would reward him with heaps of money (or, more accurately, the ratings that would justify those heaps of money).

Television actors commanding large salaries is nothing new. The cast of Friends, for example, made record-breaking salaries at the time, but we expected them to act to some degree, or at least to learn to respond to different names. I will not compare most sitcom acting to Ben Kingsley transforming himself into Mahatma Gandhi, and I don’t know the cast of Friends personally, so I can’t say the extent to which their characters differed from their actual personalities. However, it is not hard to see the past decade of Charlie Sheen’s career as him simply “being himself” (okay, minus perhaps the spousal abuse), and receiving huge amounts of money and adoration for it.

And now he’s going crazy. Or crazier. But this should come as no surprise. When he began exhibiting his outlandish behavior, we gave him a biscuit… and a TV show. We gave him another TV show, his behavior escalated, and we gave him another biscuit… even more money and fame. He has been given positive reinforcement for all his negative behavior over the past decade. The escalation of his antics should be expected.

Therefore, it is not news.

It’s entertainment.

If the producers of Two and a Half Men were really smart, they would fill out the DVD of this aborted season with his clips from TMZ, The Today Show, 20/20, and the rest of the interviews he did this past week. After all, it’s the same character. It’s the same old Charlie, just on at a different time. And clearly we love him just as he is.

Thursday, March 5, 2009

Steal Your Face: Branding The Grateful Dead


In 1975, the San Francisco Bay Area band Hot Tuna released their fifth album, “America’s Choice.” The cover was designed to resemble a box of laundry detergent, complete with suds overflowing the box. Was this a jab at American consumer culture from members of the quickly disappearing counterculture, or an acknowledgement that, indeed, their record album was a product like any other and they were trying to sell it? I imagine it was a bit of both.
At the time, one could say the band was in a bit of an identity crisis. It wasn’t sure what it was or how it was going to present itself. Looking back on the previous four albums, the band’s identity was based around members Jorma Kaukonen and Jack Casady’s mutual love of old blues, combined with the free approach exemplified by their previous band, Jefferson Airplane. Now they were beginning to go for a more traditional hard rock format. Was this evolution, or trying to sell records? Did this shift have something to do with the deliberately generic album cover design?
When people discuss branding, more often than not they think of a product as something artless and utilitarian. Talk to any amateur musician about branding their work and usually the words “sell out,” “philistine,” and “Michael Bolton” will enter the argument. Yet they will spend hours discussing “their sound” or having a designer friend come up with a “bitchin’ logo.” Why is this not referred to as branding? It seems that applying any business model or language takes away from the artistry, so people simply call it other things. When successful bands form their own record labels, as the Grateful Dead did in the seventies, and actually take control over production and manufacturing, this is done in the name of “artistic control,” never referred to as “vertical integration.”
I am not going to say that Hot Tuna broke up because of a branding crisis. However, it is notable that as the seventies went on, Casady and Kaukonen seemed more geared towards finding new audiences than holding on to the one they had, exemplified by the fact that they both formed New Wave bands after the Tuna dissolved in 1978.
The Grateful Dead, on the other hand… well, say what you will about them, but they knew how to hold on to their audience. Alright, so they may have dabbled in some disco beats in the late seventies, but they never lost their core, even when their music was most at odds with pop music trends. While their peers faded from the scene, they persisted, monopolizing their audience of aging (and neo) hippies. Am I saying that they did all of this because of a “bitchin’ logo”? Of course not.

It certainly didn’t hurt, though. The logo, derived from an old stencil used to identify the band’s gear at venues and festivals where numerous other bands were on the bill, ultimately ended up adorning the entire cover of their 1976 live album “Steal Your Face,” which gave the logo its unofficial name. Since then, the logo has adorned tickets, posters, t-shirts, archival releases, websites, and all manner of merchandise. The image is so well known, and so associated with Dead Head culture, that law enforcement officials in some states even treated the logo on a bumper sticker as probable cause to search for narcotics.
At first glance, the logo would appear to evoke a more heavy metal sound, with its image of the skull and the lightning, but the same could be said about the name of the group. Personally, as a fan who doesn’t necessarily appreciate being lumped in with the majority of Dead Heads, I appreciate the simplicity of the logo, the design of which also counters the more “hippyish” aspects of the music and culture (the dancing bears seem more appropriate there). The fact that most Dead Heads would disagree that the design is somehow inconsistent with the music they make is evidence of how inextricable the image has become from what it is intended to represent.
The design itself is very effective. Essentially a logo within a logo, the inner circle, containing the lightning bolt dividing a circle into red and blue halves, is often used by itself in band images. The red, white, and blue obviously speaks to the colors of the American flag, as well it should: the music of the Dead, hugely inspired by old folk, bluegrass, and blues, is distinctly American. The lightning bolt signifies that the music is vital, a living creation emerging from the traditions of the past. In the outer circle, the skull (skull = Dead, no-brainer) frames the inner logo, indicating that this mixture takes place within the head space. Again, the outer circle is almost as recognizable on its own, without the lightning bolt, and is also incorporated into many images and packaging designs.

All in all, I would say that the logo is incredibly potent and effective for a number of reasons. Firstly, it is ubiquitous. People know what it means, and whenever any part of the logo shows up anywhere, the viewer knows that the “product” will somehow be associated with the Grateful Dead or Dead Head culture. Secondly, though the colors are striking and clearly have symbolic value, the image is completely recognizable in black and white, and it still has power when altered or shown in part. Lastly, it is aesthetically appealing: a powerful image that, artistically, is even a bit more “badass” than the band portrays itself.

(On a side note: Hot Tuna finally got their “bitchin’ logo” when they reformed years later. Not as ubiquitous or versatile, but it looks awesome on a t-shirt.)

Friday, September 26, 2008

Digital Media, Ritual, and Pink Floyd

I guess before I go into the “how did I get here from there” bit, at which I sometimes marvel myself, I should come out with my basic problematic, and then explain how I arrived at it, along with all the thrilling moments (Ha!) and startling revelations that came along the way.

My problematic is this: How does the shift to digital media (primarily referring to information rendered on one’s computer or personal media player, formless and easily copied) devalue information and art, and how can this shift work to the advantage of artists and media professionals who seek to actually make a living off of their efforts?

My background and training had been in theatre. From the time I was in elementary school, I believed that my future had been mapped out for me, and that future would be on the stage. It was only after I graduated from college that I decided that the vitality of theatre as a changing force in mass consciousness was a myth, or at the very least a highly outdated notion. People were no longer getting their ideas from theatre. Theatre was no longer affecting anyone’s daily lives. I must have known that. I had grown up more on movies than plays, and as a child I watched more MTV than live music. Why and how did I end up with this unshakable reverence for the direct, immediate live experience?

The MP3 started to gain in popularity around this time. I knew, as everyone did, that digital media was the way of the future. CDs had put vinyl in a pauper’s grave (the resurgence of vinyl was still a few years away), and DVDs were threatening to do the same to VHS (which then, as now, nobody looked at as any big loss). Information was getting put in smaller and smaller (and more portable) packages, but in my stubbornness, I refused to believe that the package would go away altogether. I had a debate with my best friend, himself a former actor turned musician, about whether the MP3 would really catch on. He believed that it would; I said otherwise. I made reference to an article about McLuhan (“What If He Is Right?” by Tom Wolfe, which I had read way back in junior high school and which represented the whole of what I knew about McLuhan at the time), in which McLuhan said that in the future people would want tactile experiences (though he also said, in the same breath, that packaging would become obsolete). I also cited the autobiography of Frank Zappa, in which he said that “People like to own stuff.” I thought that by quoting two dead guys I would automatically win the argument and, in doing so, provide an accurate forecast of the future. I don’t think I need to say how wrong I was.

Years later, my friend called me and reminded me of the talk we had had. I, of course, conceded to him, and he told me that even though he had been right, he wished he hadn’t been. He mentioned that he had been listening to the album “Wish You Were Here” by Pink Floyd, and he recalled the first time his father played it for him as a child. He remembered the care with which his father pulled the LP out of the jacket and placed it on the turntable, gently setting the needle on the record, then the warm crackle of the old vinyl as the music started. He listened to the whole album staring at the image of the burning man on the jacket. How would his children experience that now? He felt as if the future had been cheated of something, that a beautiful bit of ritual would be missing from their lives.

Of course, now the beautiful ritual is the least of everyone’s concerns. Given that digital media can be copied easily and exactly, and given the move away from a physical carrier (such as a packaged CD), the idea of authenticity is becoming a hazy notion. Fewer people are paying for media, and the industries are losing income (boo hoo), as are the artists who create the material. How can the artist benefit from this shift instead of being shortchanged by it? How do we get that sense of beauty and ritual back, and if we can’t, what are we getting in return? What’s going to happen to album cover art?