WohlWorld

Critiquing the Major Film Studio Logos


About 90% of the money paid to see a movie in North America goes toward one produced or distributed by one of the six “major” film studios. These are mega-profitable, highly visible corporations that nevertheless usually don’t have a lot of brand distinction between them. A movie is a movie is a movie, for the most part, and I don’t know anyone in today’s world who associates any one major with a particular type or quality of product (with the one exception probably being Disney). Yet each company has a rich history and an identity they try to convey to indifferent viewers, probably more for the sake of tradition and corporate pride than for advertising or branding purposes. The way they do that is through their animated logos which they slap in front of their releases. I decided to examine these little-scrutinized industry totems and review those of the six majors. Here are my conclusions:

Columbia Pictures

“Columbia” dresses up as one of the Men in Black.

Columbia’s logos over the years have varied as to whether they zoom in to or out of the torch’s light, but all have featured “Columbia,” who looks exactly like Annette Bening and is supposed to be a female personification of the United States. (A little ironic for a studio owned today by Japanese conglomerate Sony.) The latest iteration, which has been in place since the early ’90s, contains some rich detail and color, especially in the clouds. I think it would be cool if those clouds were animated to move in a more noticeable way, but it’s still an impressive feat to make clouds so visually interesting. This logo also features a lens flare big enough to embarrass J.J. Abrams, which was certainly not the case in the logo’s early years, when lens flare was thought of as an unacceptable error rather than the cinematographic weapon it is today. Although the scene is presented about as attractively as it can be, this logo feels a little off to me. Its patriotism, even after being toned down—the blue drape Columbia holds used to be an American flag—doesn’t seem all that relevant to the idea of movies; the symbol isn’t evocative of much other than silver dollars and World War I propaganda. I really like, however, how the studio lets their mascot have fun: She wears sunglasses and holds a neuralyzer for Men in Black, and in a bizarre meta-joke, had her face replaced with that of the real Annette Bening for her 2000 film What Planet Are You From?

Paramount Pictures

Like most of the logos in this list (movie studios rarely go with a full re-brand, I’ve discovered), the Paramount logo has changed from one variation to another of the same theme for the entirety of the studio’s existence. I had always assumed the mountain depicted was supposed to be the Matterhorn, but it’s a bit too misshapen, and possibly a bit too associated with another company, to be it. The logo’s original designer was a Utah native, and many have suggested it could have been inspired by any number of peaks in the Wasatch Range. But if there’s one mountain in the world that most resembles the Paramount peak’s current form, it has to be Peru’s Artesonraju. Anyway, the identity of the mountain is irrelevant; the question is, how does it function as a logo? I would say this logo does a lot with relatively little, and the stars that ring the mountain are the unsung heroes here. For decades, their appearance came spontaneously. Starting in 1986, they approached from downstage right. And after 2002, they descended from the heavens. As you can see in the logo above, the new 100th anniversary one adds some new wrinkles, including the stars taking a meandering, leisurely skip across a lake and a bright sun behind the mountain. I give Paramount credit for making such an abstract, formerly static symbol interesting. It doesn’t have much to do with movies or anything else really, but something about it just feels right.

Twentieth Century Fox

The monument topped with a huge “20th” predates the company itself, as it was originally used by Twentieth Century Pictures before its merger with Fox Films in 1935. The gleaming Art Deco colossus is a splendid thing to center a movie studio logo around, because it belongs to a certain design era that everyone associates with Classical Hollywood. Until 1994, the image was stationary; only then was the twirling view of a CGI Los Angeles cityscape introduced. The first iteration included a number of Easter eggs to be found in the signage on buildings, including, for instance, “Murdoch’s Department Store.” The version seen above was updated within the last few years but is largely unchanged, as the most visible difference from the old CGI one is the addition of palm trees surrounding the spotlights. I’m wholly in support of this. The more classic Hollywood tropes they can cram in there, the better. I like this logo a lot because it’s sort of like a slightly more reality-bound version of the new Disney one. In this case the fantasy world isn’t one of medieval legend but an idealized, romantic Hollywood, and it feels alive, with individual buildings and moving cars and the Hollywood sign. I think they could probably do more with the idea—how about throwing in a majestic old movie palace marquee or two?—but I still think this is one of the best studio logos there is.

Universal Studios

This is one logo that has benefited a lot from the development of CGI in my opinion. Its extraterrestrial vantage point was ambitious from the studio’s founding in 1912, which makes it the oldest still-operating American film studio (beating Paramount by a few months); early renditions of the logo were obviously of the scale model-on-a-matte-background variety, and it stayed that way pretty much until the ’90s. It was only then that the magic of 0s and 1s could provide the vivid detail we see today. The 100th anniversary logo seen above takes, aside from the huge interstellar cloud it has the Earth residing in front of, a very realistic approach to the globe. This includes depicting the luminescent signs of human inhabitance, a decision that I’m not sure how I feel about. For sheer aesthetics, I think I preferred the late ’90s and ’00s version with an Earth that was shimmering with white light and composed of the brilliant rainbow of hues normally reserved for topographical maps. To its credit though, the new one is a tiny bit less Americentric, as it doesn’t conclude with North America perfectly centered. (The Earthcentrism of the company’s name referencing the universe but only focusing on our planet is another issue.) All in all, this is definitely a logo I can get behind. I like anything in space, the globe is an instantly recognizable icon, and the subtextual message—that through movies you can have a world of experiences—is a comforting one.

Walt Disney Pictures

The Disney logo of my youth holds a good deal of nostalgia but it was looking aged by the mid-2000s. To someone who knew nothing but the two-tone, two-dimensional representation of Disneyland’s Sleeping Beauty Castle, the 3D, flag-fluttering version that preceded Toy Story (and a few other Disney/Pixar films after it) was a bit of a shock. But nothing prepared us for the staggering orgy of color, sound and visual information contained within the current Disney logo. It starts in space, focusing on the star to be wished upon from Pinocchio, along with the “second star to the right” from Peter Pan. The camera swoops over an ocean, multiple rivers, an expansive network of small villages, a large sailing ship and a steam locomotive before settling in front of an amalgam of Sleeping Beauty Castle and Walt Disney World’s Cinderella Castle. I love this logo. If you look really closely, you can see numerous individual buildings and trees, a network of roads, docks extending into the rivers, another faraway castle-like structure, and an entirely different landmass beyond the sea. On its bad days, Disney can feel sterile and like a soulless simplification of everything; that feeling is not to be found in this logo. It presents a breathing universe that represents Disney’s position as a modern custodian of timeworn myths and implies that that universe can be explored and lived in rather than only known of at the basest level. Essentially, I want to play an open-world video game taking place inside the Disney logo, and I’m impressed when any 30-second clip of anything gives me that feeling.

Warner Bros.

Unfortunately, I think there is a clear last place among the six majors’ logos, and it is occupied by Warner Bros. Although they’ve gone through some interesting minimalist periods, the gold shield as we see it today has been pretty much the same for as long as it’s been their logo. I have a number of issues with it, and the fact that it’s hard not to think of Looney Tunes when looking at it is the least of them. Biggest of all is the opening view of the sequence, in which we see a warped, gold-stained aerial shot of what I presume are Warner’s studios. Like I mentioned with the Fox logo, I applaud attempts to showcase imagery that evokes the essence of Hollywood. But this one fails because there’s nothing romantic, exciting or emotionally resonant about looking at row upon row of warehouse-like soundstages, even if they are where the proverbial magic is made. And it doesn’t help that the image is probably too distant to look like anything identifiable anyway unless you are really concentrating, which you are not, unless you are scrutinizing it on YouTube in order to blog about it. Another problem with this logo is the sky background. Columbia proved that with CGI you can make clouds look amazing. So why is Warner still rolling with what looks like a matte painting from the ’40s? It’s jarring, especially since the shield itself has had a 3D, CGI sheen since 1998. I don’t think this logo concept provides very much to work with compared to some of the others, but I still think WB is really dropping the ball here.

Well, that ends my review of the Big Six, though there are dozens more logos like them in the wide world of film production and distribution, including a few real gems. I think my personal favorite might be that of Skydance Productions, which, for some beautifully inexplicable reason, shows the letters of its name being released into zero-gravity space by elaborate vice-crane contraptions in the vicinity of the Sun. I’m also a big fan of those of Marvel and DC, which in my estimation express a similar idea—the mythmaking power of the comics medium—in admirably different ways: Marvel by overwhelming the screen with the vastness of its canon, and DC by showing rays of weightiness emanating from those iconic building blocks of comics, the Ben-Day dots. Hollywood has owed much to these companies lately, and I think it could learn a lot from their movie logos too.

The End of Geography (Except, Not)


When I was young I loved maps. (Not that I don’t now; I still spend a lot of time browsing Google Sightseeing, the book of transit maps from around the world my girlfriend and I bought, and reddit’s /r/MapPorn.) As a child I had a puzzle of the United States that I would regularly trace, writing in information of limited importance such as the order of the states’ admittance to the Union. I remember being enthralled at my friend’s CD-ROM street map that covered the entire country.

The intersection of Internet technologies and the traditionally geographically diverse nature of sports has created some interesting issues, and for that matter, maps. This one shows what teams are blacked out in what locations on MLB.tv, baseball's out-of-market TV service.

Of course, today we don’t think twice about being able to pull up a comprehensive street-level atlas of the entire Earth on a portable device the size of a Game Boy Light. Anyone could come up with countless examples of how the world has shrunk even more as the Internet has pervaded our lives more and more.

One might think this would mean that physical location doesn’t matter as much as it once did. In some ways, this is definitely true. It’s easy to get into Singaporean politics, Namibian cricket or Greenlandic hip hop while you lie comfortably in your American bed. This huge expansion of the information anyone can find with a little bit of effort is one of the things, if not the main thing, that makes me happy to be alive when I am.

But when it comes to the mass media, I’ve noticed an odd thing: It seems today that location means destiny more than ever.

Take national politics, for instance. Presidential races are now overwhelmingly covered by 24-hour cable news networks that deliver the same content whether you’re in Hawaii or Maine. Yet political parties worry about the optics of geography more than ever.

Look at where national nominating conventions have been held. From 1992 to 2004 Democratic conventions were held only in liberal metropolises: New York, Chicago, Los Angeles, Boston. Since then the conventions for both parties have been swing-state-only affairs: Denver and Charlotte for the Democrats, St. Paul and Tampa for the GOP.

Another area in which geography is playing a larger role is television. Scripted TV shows of the past may have had real settings, but the Milwaukee of Happy Days or the Seattle of Frasier was little more than an abstract concept flavoring the L.A. soundstage environments. While there are still some spectacular examples of this—until you’ve seen the Disney Channel’s Wizards of Waverly Place, you have no idea how much Greenwich Village looks like Disney World’s Main Street, U.S.A.—the phenomenon has lessened as single-camera, on-location shows become the norm. And it’s even more pronounced when it comes to reality TV.

A somewhat-less-than-true-to-life New York City street set from Wizards of Waverly Place.

The Real World is often cited as a reality pioneer, and that holds true in its featuring a different city every season, because today it seems like most big reality franchises are somehow location-dependent. You’ve got Pawn Stars in Las Vegas; Swamp People, which inspired a bevy of other shows set in the Louisiana bayou; Jersey Shore, which did the same for New Jersey; and The Real Housewives of most conceivable upper-class locales, a format that is now being exported internationally.

The area in which location seems to have taken on the most added significance recently, though, is sports. ESPN and other national media outlets once acted genuinely national; now their inordinate focus on teams from the very few largest media markets is unmistakable.

Take the Jeremy Lin phenomenon. I think his is an incredible story and it certainly deserves media attention. But you’ve got people like Mark Cuban—the owner of a team in a big, but not huge, non-coastal market, the Dallas Mavericks—saying that it wouldn’t be much of a story if it wasn’t happening in New York. I’m sure Cuban was exaggerating when he said “no one would know” about Lin if he was playing elsewhere. But it seems clear to me that it wouldn’t have developed into the same obsession for the Worldwide Leader.

Interest in consolidating the NBA’s elite talent into the biggest markets seems to be at an all-time high, if the recent demands of players like LeBron James, Carmelo Anthony and Dwight Howard to be traded only to alpha cities are any indication. It’s hard to blame young men with boundless fame and disposable income for wanting to live in New York, Los Angeles or Miami. But I do blame certain media outlets for encouraging them by focusing so disproportionately on teams in places like that.

North American celebrity culture in the realms of film, TV, music, comedy, and so forth has always been centered around an extremely narrow group of cities. I don’t have a problem with that—after all, that’s part of what gives those cities their unique attractiveness. But I have always thought it was kind of cool how sports, by the nature of its organization, is a glamorous industry that is quite geographically egalitarian. To ply their trade professionally, a baseball player from Miami could go to St. Louis, a football player from Southern California could go to Green Bay, and a hockey player from Stockholm could go to Calgary, and all of these moves would be considered seriously moving up in the world. I don’t like the feeling that we’re losing that idiosyncratic feature of sports with increased emphasis on big-market teams.

In my opinion, this is particularly unfortunate in the NBA, which has a tradition of placing teams in small cities that no other major pro leagues are present in, like Memphis, Oklahoma City, Orlando, Portland, Salt Lake City, and San Antonio. Before securing a new arena deal this week, one such city, Sacramento, came extremely close to losing its team to Anaheim, in the Los Angeles metropolitan area, which has two teams already. Is this idea of stacking huge media markets with ever more teams a sign of things to come? I hope not, but with the media focus on those areas, it would make a perverse kind of sense for the teams.

The way I see it, national media has two choices about how to approach coverage of a nation- or worldwide sports universe. Either they can identify stories from anywhere in the country that they think, if widely publicized, would have the widest appeal; or they can act like a local outlet, focusing on the teams with, statistically, the most fans, and push stories about those teams onto the rest of the country, which might not otherwise care. I think they choose the latter far too often, which frustrates me, because in this day and age, I want to revel in the fact that we’re no longer beholden to local newspapers and TV for our information. And I certainly don’t want to feel like our spectacular modern communication technologies are just feeding me a facsimile of that experience, and about a location I might not even care about in the first place.

Living in the age of cyberspace hasn’t made geography irrelevant—if anything, it’s amplified it. Some of the effects of this are positive and some are negative in my opinion, but either way, I find it pretty counterintuitive at first blush. But as a recent study showing that Twitter connections match up quite tightly with airline routes indicated, the Internet serves as a complement, not a replacement, for the physical world. Methods of communication may shift rapidly, but human nature has proven much more resistant to change.

Written by Dan Wohl

03.02.12 at 1:53 am

Pop Culture Mysteries: The Truth is (Maybe) Out There


The media explosion of the past 50 or 60 years has led to a big increase in the amount of what anyone might call art or entertainment. (Not that everyone would find all of it artistic or entertaining, of course.) The invention of all our familiar mass communication technologies enabled that, and I revel in living in a world that’s completely flooded with pop culture. Come to think of it, that’s sort of a crucial theme to this blog.

What I think might be overlooked is that not only has a great deal of culture been created in this time period, but so has a stupendous amount of information relating to that culture. We’ve formalized things in a way that probably didn’t seem necessary or practical in past eras. It’s interesting to imagine what a medieval lute player would think of Slayer or what the Knickerbocker Club would think of modern baseball. But I also wonder what they’d make of the 117 different Billboard charts or a college football ranking system that takes into account six different computer-generated data sets.

Details play into our understanding of culture more and more, and Wikipedia and the Internet in general put them at our disposal with a minimum of effort. The ease with which facts can be learned has made it possible for netizens to be obsessed with ever more specific minutiae. There are dusty Web backwaters where you can learn all about every conceivable piece of Calvin and Hobbes merchandise. About the lifetime winning percentages of all six Legends of the Hidden Temple teams. About the academic validity of the writing on blackboards in school-set pornographic films.

One side effect of all this, for me anyway, is a heightened fascination with those facts that still manage to fall through the cracks. If you have a basically proficient command of Google and something is still a mystery to you, chances are it is a mystery to society as a whole. Unsolved pop culture mysteries remind me of the vastness of time, in the face of a world where we expect nothing to still be hidden. Here are a few of the ones I find the most intriguing.

What was “Ready ‘n’ Steady,” and what happened to it?

Joel Whitburn is a meticulous record collector and researcher whose mission it is to own and catalog every record that has appeared on the Billboard singles charts since they were invented in 1958. He now has them all, except one.

D.A.'s "Ready 'n' Steady" bubbles under the Hot 100 on June 16, 1979, then bubbles its way into oblivion three weeks later.

For three weeks in June 1979, a song called “Ready ‘n’ Steady” by an artist called D.A. appeared on the “Bubbling Under the Hot 100” chart (which was, at the time, simply positions 101-110). It debuted at #106, went to #103 then #102, then dropped off the chart—and apparently off the face of the Earth.

No one is known to own a copy of the record or know what the song or artist even was. Whitburn, the authority in this area who has tracked down literally everything else, is totally stumped. He now says he isn’t sure whether the record exists at all.

But the evidence is right there. What makes the “Ready ‘n’ Steady” mystery especially confounding is that for a song to come close to the Hot 100, it has to be, you know, popular. That no one would own or have much knowledge of, say, a legendary unreleased track like “Carnival of Light” is unsurprising. But songs get on the Billboard charts by having their records bought and being played on the radio. People have to have bought it. Radio stations have to have copies stashed in their libraries. Someone has got to at least remember the damn thing, right? …Right?

There is one other possibility, one that would be pretty bizarre, though maybe no weirder than a charting song disappearing without a trace. That would be that the song never actually existed in the first place. Who knows why or how a fictitious entry could make it onto the chart (for three weeks, at that). Could it have been a copyright trap to determine if someone was copying Billboard’s information? Mapmakers add fake towns to their maps for this purpose sometimes, but since Billboard wants its proprietary chart information to be repeated as much as possible, I don’t see how this could be. Could it have been an inside joke by a rogue Billboard employee? You can’t rule it out, though if that was it, they really should have thought harder about making up a jokier-sounding title.

What happened to the “Shot Heard ‘Round the World” ball?

On October 3, 1951, a legendary baseball moment occurred. Down 4-2 in the bottom of the ninth inning of a playoff game to determine the National League champion, Giants third baseman Bobby Thomson hit a three-run home run to end the game, defeat the rival Dodgers and propel his team into the World Series. It was a shocking turn of events that enveloped the Giants’ home stadium the Polo Grounds in complete bedlam, with Giants fans rushing onto the field in a state of euphoria.

Lost amid the insanity was the fate of one crucial object: the ball. It landed in the left field bleachers and hasn’t surfaced since.

The bat that Bobby Thomson used to hit the "Shot Heard 'Round the World" is in Cooperstown, but the location of the ball is a mystery.

There have been many conflicting stories. One man claimed the ball had been given to him as a child by a family friend, and the ink inscription identifying it as such was determined to date from the period. Two eyewitnesses who are visible in photos of the event testified that they saw an African-American boy catch the ball in a glove and run away. An obscure book from the ’50s claimed the ball ended up with a woman named Helen.

One of the most intriguing accounts came from filmmaker Brian Biegel, whose film and book “Miracle Ball” present fairly convincing evidence that Helen was, of all people, a nun. This Sister Helen insisted all her possessions (including, presumably, the ball) be thrown into a dump after her death, and Biegel suggests this was because she was breaking the rules of her order to even be at the game in the first place.

That there’s recently been so much interest in finding this “Holy Grail of Sports,” while in 1951 whoever ended up with the ball didn’t think it important to come forward then or possibly ever, shows how much the concept of memorabilia has developed. Antiques Roadshow, eBay, Pawn Stars and other cultural phenomena emphasizing the collectibility of things have transformed the significance we place on historical objects, and probably on contemporary objects as well.

What’s so interesting about that is that this is all happening right as objects would seem to mean less than ever before, in the age of e-books, mp3s and paying at Starbucks via mobile phone. I wonder if all this has alerted people to the increasingly few things that cannot be digitized. Historical items fall under that category of course, and I’d bet that as society grows more and more digital historical objects will grow more and more interesting to people, as the very concept of tangibility becomes more alien.

There are other mysteries in the world of antiques and collectibles, but none about something so crucial to American pop culture. Indeed, Don DeLillo’s magnum opus Underworld immortalizes the whereabouts of the ball as mythic. Huge industries have sprouted up recently dedicated to knowing the truth about old things. But deep down, perhaps even subconsciously, I think people yearn for there always to be a bit of mystery concerning the past. And the chances that the Shot Heard ‘Round the World ball will ever leave that realm seem very slim to me.

What is the origin of the vocal sample in DJ Shadow’s “This Time”?

I’ve discussed music sampling on this blog several times before. And while I’m disturbed by some of the more shallow uses of the technique, I’ve also mentioned some artists whose sampling I think is really artistic and interesting, such as “plunderphonics” groups who construct entire albums solely out of obscure vinyl samples, like the Avalanches. A similar if less dance-oriented artist who preceded the Avalanches is DJ Shadow, whom the Guinness Book of Records credited with creating the world’s first completely sampled album with 1996’s Endtroducing.

Ten years later, DJ Shadow released a tepidly received album called The Outsider that was a good deal less sample-crazy. But its lead track, “This Time (I’m Gonna Try It My Way)” was built around one of the most peculiar samples ever.

Puff Daddy may have been happy to copy Led Zeppelin and the Police, but it’s apparent that more creative DJs try to outdo one another in finding the most obscure samples possible. The vocal sample on “This Time,” however, might be some kind of trump card, because no one, including DJ Shadow himself, knows who it is.

Apparently, it was found on a demo reel in an abandoned studio in the San Francisco Bay Area, labeled only with the name “Joe” and the year 1967. If Joe, or anyone who was aware of him making this recording, is still alive, they either haven’t heard this song or haven’t come forward.

The genesis of DJ Shadow and Joe’s track is a testament to the power and mystery of recorded media. How amazing, and strange, is it that a person can write a song and lay it down one day only for it to be both forgotten yet preserved, awaiting rediscovery without any of its original context, for decades?

Considering how much people know about obscure vinyl recordings—the videos breaking down the samples in Avalanches and DJ Shadow songs linked earlier attest to that—something like “This Time” makes you realize how much culture, in fact, we don’t remember. How much has been lost, forgotten or as of now hidden is, to me, incredible to think about.

What is “It doesn’t DO anything! That’s the beauty of it!” from?

Take a look at that quote. Does it sound familiar? I asked this question of six or seven friends and they all said it did. Apparently people across the country and the world do too. So…what is it from?

Can you place it? Don’t worry. No one can. It has bedeviled the Internet for years, and some truly epic attempts to figure it out have been made to no avail. Willy Wonka, the Simpsons, The Hitchhiker’s Guide to the Galaxy—some guesses are more prevalent than others but nothing has proven correct. Investigators have scoured deep into the annals of pop culture looking for the line, and occasionally something reasonably close, from a source like this that is today mostly obscure, will show up.

Of course, that misses the point. The seeming ubiquity of “It doesn’t DO anything! That’s the beauty of it!” ensures that it can’t be from something most people are not likely to have seen. Chances are that it sounds familiar to everyone because everyone has heard something that sounds similar to at least part of it somewhere. Why this particular phrase has such a strong associative effect like that, and furthermore how and why and by whom it reached the public consciousness for being such, are sub-mysteries of their own.

I think this one is amazing because it’s a mystery so mysterious that it strongly challenges one’s perception of the mystery’s premise in the first place. It’s probably not truly a question seeking an answer at all; it’s more likely a vast sociopsychological phenomenon that creates a mystery without any possible solution. You can try and try to find the answer, but ultimately you really can’t do anything with this one. And you might just say that’s the beauty of it.

What Might ’00s Nostalgia Look Like? An Exercise in Future-Retroism


There’s no way to confirm this empirically, but it feels to me like ’90s nostalgia has exploded in just the last year or so. Nickelodeon announced an upcoming ’90s nostalgia programming block, NBA jerseys of the era are all the rage, and amazingly excellent blog posts that you should read are being devoted to its cultural relics.

Already available at CafePress, despite the incorrect punctuation

Of course, this should not have surprised me. As I mentioned in the kids’ sports movies post and as others have pointed out, nostalgia for a decade begins two decades after it like the clockwork of this watch (see? ’90s nostalgia is inescapable all of a sudden!).

So we know ’90s nostalgia is big right now and will likely only get bigger as the ’10s go on. Trust me when I say I approve of that. But for this post, I want to try to travel into the future. To 2020, to be exact, when, according to the aforementioned ancient Mayan nostalgia calendar that hasn’t been wrong yet, nostalgia for the ’00s will kick into gear.

You might find it hard to imagine, today, what anyone could be nostalgic about, and I’m right there with you. But I have no doubt that in 2001, no one really knew what form ’90s nostalgia would take either. While acknowledging that difficulty, I’m going to attempt to predict, at this very early time, what ’00s nostalgia might look like.

How will I do this? Not very well, is likely to be one answer. But as for my methods, I think there’s enough to be learned from the common themes in the well established ’70s, ’80s and now ’90s nostalgia movements/industries to develop certain nostalgia indicators, if you will.

I think a key thing that characterizes objects of nostalgia is that they are often not the cultural items that are considered actually great. I don’t think anyone says “I love ’70s movies, like, ‘The Godfather.'” That’s not to say they necessarily have no actual merit; it just means they might be things that don’t immediately ensure cultural endurance.

In that way nostalgia is a concept that shares a lot with the tenets of pop art. It’s as much about resurrecting what’s been discarded as it is about determining what had merit. Something from the past that satisfies both these conditions will quite likely be deemed nostalgic. Something discarded that wasn’t all that good has a decent chance too. And even a few things that endured because they had merit might, but only if they are extremely indicative of their time.

So based on those ideas and what I know of contemporary nostalgia, I will attempt to figure out what from several different areas of culture in the ’00s will be viewed as such. You might have heard of retro-futurism, the artistic concept of utilizing motifs based on what the future was thought to possibly be like in the past. This is going to be the opposite. Join me now, as I take a future-retroistic journey to 2020…

The first thing I’d like to discuss is the Internet. I think the few years after 1995 when the Internet was truly a wild frontier, the “Netscape Era” if you will, is pretty fascinating. But I feel sure that in the future the ’00s will be seen as the golden age of the Internet.

The Internet is becoming more and more the media norm every day. If the 2012 presidential election is not the last one to be covered in newsprint, I think the 2016 one definitely will be. Every facet of Internet media has gotten more corporate and slickly produced, from digital editions of newspapers to YouTube to the way music is downloaded.

The ’00s will end up looking like a pretty special time in Internet-land: a time when blogs were vibrant and mostly noncommercial but still well read and commented upon, when social networking became an exciting new youth-driven thing, when a whole new culture that would eventually take over the worldwide mainstream really began to take root.

I imagine this shirt could be a prime example of ’00s Internet nostalgia. So could lolcats or any other meme that seems indicative of the time before the Internet really “grew up.” Hell, I think computers themselves could be items of nostalgia by 2020. I don’t want to give any support to Apple’s suggestion that the iPad is “magical and revolutionary.” But I do think the combination of tablet PCs, ever more powerful smartphones, and the fact that the Internet will probably be ingrained into almost every product might make the concept of hunkering down with your laptop a thing of the past.

I wanted to mention the Internet first also because I think its presence will affect future nostalgic views of many other things too. Which brings me to TV. Similarly to what I mentioned about computers, I feel certain the idea of watching a show at a specific time because that’s when it airs will be extinct pretty soon. You can already see this happening everywhere, what with Hulu and Netflix streaming and so on, and of course we have the Internet to thank for those wonderful things.

As for the actual content of nostalgic ’00s TV, I could imagine reality shows whose popularity is starting to taper off a bit, like Survivor (currently in its 22nd season, by the way), being a big part of it, especially if reality shows are not the networks’ bread and butter anymore by 2020.

But the most fervent nostalgia comes from those who were young during the decade in question. There’s moderate ’90s nostalgia for Friends or The X-Files, but extreme nostalgia for everything Nickelodeon did in the decade. And what is the equivalent in the ’00s? I think the answer is the Disney Channel. That’s So Raven and Phineas and Ferb will be seen then the way Clarissa Explains It All and Doug are now. Of course, future Nickelodeon-style nostalgic shows could also be from…Nickelodeon itself, which has been the most-watched basic cable channel for years now.

And I have to disclose that the inspiration for this post came from when I was thinking up a trivia question about The O.C. and was amazed at how evocative of a different time it felt, despite the fact that it only debuted eight years ago. I couldn’t quite put my finger on exactly why that was, but it probably had something to do with the endless stream of indie rock that was on display.

“Indie” music (I put that in quotes because “indie” became just a genre, not really a reflection on the type of label one was signed to) will, I imagine, become a large part of ’00s music nostalgia. And while a lot of it doesn’t really rock that hard, the truth is that by 2020 we might be long past hearing anything that could be considered “rock” at all in the mainstream.

As for ’00s pop, if you ask me which pop stars will be objects of nostalgia and which won’t, I can’t say confidently except that I feel certain Britney Spears is going to have a huge revival at some point. She has every element of future nostalgia written all over her: enormous initial success, a long descent into being tabloid fodder and not particularly culturally relevant, but a past repertoire of sugary pop gems waiting to be re-appreciated in the future.

To return to indie rock for a second: While Death Cab for Cutie and the Shins may have been featured in the preppy world of The O.C., the indie movement they represented was inextricably linked to hipsterism. Love them or (as almost everyone seems to profess, even if they seem to be one) hate them, I think in the ’20s they could very well be seen as the definitive fashion trend-setters of the ’00s.

A common criticism of hipsterism is that it is a regressive culture, concerned only with recycling stuff from the past, and while that might be true, I don’t think it means it hasn’t produced distinctive fashion elements. Enormous glasses, skinny jeans, beards—I suspect all these will be revived as ’00s style in the future.

Another hipster-begun trend is the aforementioned wearing of ’90s NBA jerseys. I am not immune. And while thinking about this post, I decided to purchase, on eBay for $1.25, what I think will be a future item of ’00s nostalgia. Although recent events make the Sacramento Kings’ imminent relocation look less likely, the point remains: defunct teams ALWAYS become objects of nostalgia, with a fervor that certainly outstrips people’s feelings toward those teams when they were around. (If you don’t believe me, check out the winning bid for this item.) So expect teams that are likely to be moved soon, like the Kings or the Phoenix Coyotes, to be treated the way the Vancouver Grizzlies or the Montreal Expos are now.

I think it’s interesting that the hipster jersey trend focuses mostly on basketball rather than other pro sports; I’m guessing this is because the ’90s were when the NBA really settled in as a global presence after its big gain in popularity in the ’80s. (Space Jam, whose sublimely ’90s website is still online, couldn’t have hurt either.) Will one particular sport dominate ’00s nostalgia?

I don’t think it will be hockey, despite its interesting journey this decade from positioning itself as the game of the future to almost going extinct after the devastating cancellation of an entire season. I also don’t think it will be baseball, which seems like it can survive and thrive through anything, including steroid scandals.

I think if there is one sport that might attract nostalgia in the ’20s, it could be football. And the reason I say that is because football might be poised for a decline in popularity over the next 10 years. The brutal toll the game takes on its players in the form of concussions and other injuries, all too often leading to serious problems later in life, has been coming to light recently. As this column points out, it would not be surprising for football to go the way of boxing, sliding from dominant to fringe as the sport’s dangers become more and more well known. I suspect that if that happens, vintage NFL jerseys will gain a certain nostalgic quality as totems of a cultural element that was once dominant before a fall from grace.

And speaking of people who fell from grace (even if they didn’t deserve that grace in the first place), let’s talk about the defining political figure of the ’00s, George W. Bush. No, I don’t think in the future Bush will become thought of warmly by the general public, but conservatives will surely try to Reaganize him. And in a way that will be appropriate, because I think the political culture of the ’00s will be seen very much the way the ’80s are now.

That is to say, as a nation gripped by a moderately irrational fear of a less-powerful-than-they-seem enemy. Just swap out the Soviet Union for terrorists. I think we’ll look back and laugh at ourselves for having to take our shoes off at the airport, and that this will be reflected in lighthearted movies taking place in the decade. That being said, I think compared to the ’80s this element of ’00s nostalgia will be a little less pronounced, mostly because the terrible events that did transpire were realer than the Cold War was. This shirt and this album cover/band name might be possible today, but I doubt there will ever be a band called “Alan Qaeda and his Qaedets” or something.

Before I finish this post, I want to add a brief coda about one thing. I seriously think there is a possibility that ’00s nostalgia could be less profound than that of the ’70s, ’80s or ’90s, but not because anything about the decade lends itself to being anti-nostalgic (I don’t think that’s possible). No, I think that could happen—I know it sounds silly, but I am being serious here—because there is no agreed-upon way to say “’00s” in speech.

“Aughts,” “Naughts,” “Ohs,” “Zeroes”—all are logical in their own way, but none have the inescapably understood meaning that “nineties” does. Without this universal shorthand, might someone not bother trying to describe something to someone else as being very “’00s”? Of course, that assumes we never settle on a term, and while agreement seems unlikely given that none was reached during the decade itself, it’s not impossible.

Personally, I vote for “two-thousands.” It makes the most sense to me since you begin the spoken form of each year in the decade with those words, and I don’t think ultimately it would be confused with the term for 2000-2099, since if precedent holds the century will be known as the “twenty-hundreds” anyway.

The ’10s don’t have an obvious way to say them either (“teens,” I guess) so write down these three things to look forward to in the ’20s: a clear and obvious way to say the decade (roaring “twenties”!), ’00s nostalgia, and being able to look back on this post to see how wrong I was about it. I can’t wait!

Art, Physics, and Your Place in the Omniverse

with one comment

This surreal moment—in which Michael Scott (of the American version of “The Office”) meets David Brent (of the British version)—took place on the American one recently. I thought it was hilarious and apt that they form a hug-worthy bond within seconds of meeting. This was due, I assumed, to them basically being the same person, thrust together by the most divine of chances.

On the website I first saw this at, Nikki Finke’s Deadline Hollywood, a commenter named cookmeyer1970 took a less breezy view of it. Although they said they ultimately enjoyed it, they began by saying:

I was very much against this (the two series shared practically the same script for an episode as well as duplicate characters making it, in my opinion, two parallel universes that shouldn’t cross)…

I don’t think the issue of “duplicate characters” really has to come into play—most of the American characters adapted from British ones were adapted pretty loosely, and even Michael and David have their differences. And by this point, the American show (which has aired 140 episodes) has created a whole swarm of characters that have no equivalent on the British show (which aired 14). But I admit I hadn’t thought about the fact that the first episodes of the two series are nearly identical line for line.

Does a fact like that firmly establish that the two “Offices” could not possibly exist within the same plane of being? Should Scott and Brent have destroyed each other like matter and anti-matter the instant they ran into each other?

From a practical standpoint, my answer was: I don’t care. It goes without saying that virtually any TV series requires suspension of disbelief, and as far as potential inconsistencies go, I realize this might be one of the most obscure, technical ones ever. But from a theoretical perspective, I think it’s quite interesting to consider.

You sometimes hear about “the universe” of a particular film, TV series, book, or other work of art. Some of these universes are extremely obvious. Clearly the universe of, say, “The Lord of the Rings” is not our own. But the truth is that literally all stories, even those that might appear firmly grounded in reality, create their own universe that necessarily cannot be the same as our own.

For instance, look at “The West Wing.” Despite relying on ripped-from-the-headlines political issues for much of its drama and intellectual substance, not to mention frequently discussing U.S. history and governmental minutiae, the world it depicted was in some ways as alien as Middle-earth. The show established that Nixon was the last real president its universe shared with ours, and despite incorporating real foreign states into storylines all the time, also invented the countries of “Qumar” and “Equatorial Kundu” to stand in for generalized representations of the Middle East and Africa, respectively. (It also presented a universe in which every human is capable of delivering a searingly witty riposte without a second’s hesitation at any given moment and for any given situation, but that’s another issue.)

Does this mean we should consider “The West Wing” a fantasy? Certainly not. Does it mean we can’t appreciate the voluminous amount it has to say about the real issues in our world? Not in the least. But it’s not our universe, and actually it’s not even particularly close to being our universe.

The divergence point for the universe of a movie or TV show can come from an even more elementary source. Consider the fact that in the universe of any work that involves actors playing characters, it would be fair to assume the actors don’t exist within that work’s universe. The most brilliant, succinct explanation of this concept is found in the underrated existential action comedy “Last Action Hero,” when a kid enters the universe of a Schwarzenegger film. He finds that in a world without Arnold, logic dictates that the Terminator could very well have been played by Sylvester Stallone.

If I remember correctly, there was also a joke on “Seinfeld” once in which Frank Costanza reads about Jerry Stiller dying. This would prove that actors on a show could possibly still exist within that show’s universe, as odd as it would be for George Costanza’s perfect doppelganger to exist and for him to be an actor to boot. But the bottom line is, it wouldn’t be a joke if that wasn’t the case—and it’s only the case in the “Seinfeld” universe.

This is where art overlaps with quantum mechanics. The many-worlds interpretation is too complicated, certainly for me, and possibly for anyone, to fully understand, but the basic principle is that every possible outcome of every possible divergence point exists in a universe somewhere within the “multiverse.” In other words, every single thing that could happen does happen, and it creates a new universe when it does. Does this mean that every choice the author of a work of fiction makes determines what real universe, out there amongst countless others, they end up describing? Who’s to say?

(By the way, it’s worth mentioning that the multiverse is, theoretically anyway, not the be-all and end-all. It is “simply” the collection of possible quantum configurations of our universe; it’s conceivable that there are more multiverses, as well as multiverse-type realms in other dimensions, that we don’t understand. The collection of everything, anytime, anywhere, in any dimension, is referred to as the “omniverse.”)

This panel depicting the DC Comics character the Flash existing in different forms in different universes illustrates the "many worlds" concept.

The medium that has undoubtedly explored this concept the most is comic books. Both DC Comics and Marvel Comics have made explicit references to the many-worlds idea and have taken advantage of it in far more ways than just depicting characters with superpowers. They’ve had DC and Marvel characters fighting each other, transported their characters into the Renaissance, and imagined what it would be like if Superman had landed in the Soviet Union rather than America. There was even a storyline in which the Fantastic Four made their way to our Earth, in which Marvel Comics is a company that produces stories about them, to beg “God” (author Jack Kirby) to save a particular character’s life. This all happens within the many-worlds framework, which ends up being an elegant solution to any potential continuity problems—and, incredibly, a scientifically plausible one at that.

I really like the idea that every story that has ever been conceived is part of the actual omniverse. Everything that you have ever imagined has happened. So when it comes to “The Office,” why can’t two office managers, who presided over an episode-long period of strikingly identical events, exist together? It doesn’t make much sense in our universe, but in the “Office” universe? Why not?

The only point I want to make here is this: You can look at the creation of art and fiction as wondrous, magical even, and I certainly do. But you can also look at it in terms of tapping into the most incredible potential realities that science tells us could exist. And I think that’s pretty damn cool.

One of the most amazing stories of “Star Trek” is a Deep Space Nine episode, “Far Beyond the Stars,” in which Captain Sisko has visions of himself as a sci-fi writer struggling to get his stories—about space station Deep Space Nine and its commanding officer, Ben Sisko—published in segregated America. Even if racist editors prevent his work from being published, he insists, his creations still exist because “you can’t destroy an idea.”

It’s the End of MTV as We Know It, and I Feel Fine

with 2 comments

In 1981, MTV was the first to land on the Moon.

The 24th MTV Video Music Awards, held on September 9, 2007, represented some kind of low-water mark in the channel’s history. This edition of the VMAs halved the number of awards, renamed the surviving ones to jokey titles like “Most Earthshattering Collaboration” and “Monster Single of the Year” and solidified MTV’s party-all-the-time image by eschewing classier venues like Radio City for the Palms in Las Vegas.

With so many negative MTV elements coming to a head, it may have seemed appropriate when Justin Timberlake, accepting the award for Best Male Artist, expressed what many think is the reason for MTV’s ills. “Play more damn videos!” he said. “We don’t want to see the Simpsons [these ones] on reality television. Play more videos!”

Today, that metaphorical Moon (of places you can watch music videos) looks like this.

I’ve talked to many people my age who feel similarly. I can see why they do, at least at first blush; I’m as nostalgic about MTV as the next Echo Boomer. But I think when it comes down to it, in today’s world, it’s clear that both we and MTV are better off if they don’t bother too much with videos.

Before MTV launched in 1981, music videos had already been receiving considerable attention in the U.K. on “Top of the Pops” and elsewhere. But in the U.S. the format was mostly unknown, consigned to occasional appearances on variety shows, to between-movie filler on HBO, and to a pioneering program on the USA Network.

Of course MTV changed that, and the music video would go on to alter the direction of film editing and change fans’ sensory understanding of popular music forever. When it began, MTV was basically the only place to see videos and therefore one of the best places around to learn about new music. After copycat shows on broadcast networks fizzled, it remained that way for a long while.

But you don’t need me to tell you that that isn’t the world we live in anymore. The Internet in general, and YouTube in particular, has put you, at this very instant, a few keystrokes away from virtually every music video ever produced.

So to sum up: Then, we had a single video at any given time doled out to us by a massive media conglomerate whose primary objective was to garner ratings and get us to buy stuff; today, we have instant access to a Memory Alpha-like historical archive of the artform at any time or location we choose. I don’t yearn for the way of the past when it comes to this, and I don’t know why anyone would.

The movement of music away from the television and onto the Internet has been, in my view, an enormously positive development. When it was the dominant source of music videos, MTV was inert, a cultural master you could either choose to submit to or not. YouTube, music blogs, and the rest of the musical Internet are nothing like that. They don’t tell you what culture to consume and offer no alternative; they challenge you to seek out the culture that you, individually, will like best. And if you accept the challenge (and to be sure, it is far more challenging than watching MTV), I think the results tend to be more rewarding.

Again, I’m sensitive to the feelings of those persons who fondly remember the days of “Celebrity Deathmatch,” “Say What? Karaoke” and videos being labeled “Buzzworthy” or “Spankin’ New,” because I am one. But, in this and many things, we shouldn’t confuse nostalgia with the way things should be now.

Nor should it be confused with a prudent business plan for MTV, not that MTV itself needs that advice. I can see a new Lady Gaga video on probably millions of different websites, but the only place I’m going to watch a new episode of “Teen Mom” (well, legally, anyway) is on MTV or MTV’s website. It’s clear that MTV foresaw this long ago, and if it hadn’t rededicated itself to original programming, there is no way it would be in existence today.

I don’t want to imply that I love what MTV is today, though. “Jersey Shore” is just the tip of the iceberg when it comes to how its reality programming has devolved into being about partying and little else, and the slow descent of “The Real World” to this territory is especially saddening. Even when MTV consciously tries to go against that impulse, the results sometimes lead to some of the worst scripted programming I’ve ever seen.

That being said, there are positive things to be said about MTV, and I think what it has become—essentially, a channel catering to teenagers and young adults—is a great thing, in theory at least. If “The Real World” no longer has many meaningful things to say about a generation, the prolific “True Life” definitely still does, and this year MTV also received the first “Excellent” rating in the history of GLAAD’s Network Responsibility Index, which measures channels’ presentation of LGBT characters and issues.

One more point I’d like to address is the contention that a channel shouldn’t be acronymically calling itself “Music Television” if it isn’t about music. To that I say, you should only believe that if you believe AMC shouldn’t show “Mad Men” because it isn’t an “American Movie Classic” or that CBS shouldn’t show anything unrelated to Columbia Records music, since that was the origin of the name “Columbia Broadcasting System.”

Both of those channels, like MTV, realized they weren’t serving unique enough niches anymore, so they evolved. Just as, over the years, music consumers have evolved to embrace new technologies. Sometimes it’s messy, but I generally think that, in the end, evolution ends up benefiting everyone. I think even Justin Timberlake would agree with that.

Written by Dan Wohl

12.27.10 at 2:48 am

Shockin’ Through the Decades

with 2 comments

“Shock rock” is sort of a redundant term. Rock and roll has always been about pushing the envelope since…well, since whenever it began, an unanswerable question that is the subject of a particularly detailed and fascinating Wikipedia article.

Since the world has become comfortable with the idea of rock music, it’s taken a little more to qualify as what anyone would call shocking. Really, is anything shocking anymore? I don’t mean this as a joke. Even if you have never heard of nor wish to seek out the band Fartbarf, the album Carnivorous Erection or the song “Fecal Smothered Dildo Punishment,” I’d imagine it doesn’t truly startle you to know they exist. Feel free to Google them if you don’t believe me.

When a shocking band is out of sight and out of mind for all but the tiniest sliver of society, they haven’t really succeeded in shocking. So what of those musicians who are known to relatively many but still retain the reputation for shock? They tend to be iconoclastic, ego-driven individualists; they tend to have a keen sense of what kinds of shock appeal to mass audiences (themes of death and horror, for the most part) and they tend to have at least a tiny smidgen of actual talent (though not always).

These qualities make up a sector of popular music that has developed less as the domain of true shock and more into a semi-defined, not-always-shocking genre called shock rock. In honor of this scariest of months, I present this rundown of shock rockers throughout history.

Screamin’ Jay Hawkins (first record: 1956)

 

Screamin' Jay Hawkins

Most musical genres can’t trace their origins to a single individual (though some can). But the history of shock rock makes it seem reasonably clear that it began with one specific song: Screamin’ Jay Hawkins’ “I Put a Spell on You.” Like most defining songs of original rock and roll, it straddles the border with rhythm & blues, and has been covered over and over, by his obvious musical descendants and others as well.

The song was originally meant to be a straightforward ballad. It only ended up as grunting, shrieking, and animalistic as it did because both Hawkins and his band recorded the final take in a drunken stupor that he didn’t remember the next day. It was only then that Jay Hawkins became Screamin’ Jay Hawkins, and the world was left to wonder once again whether the course of history would have been different had a small group of people not gotten wasted at the precise time they did.

Hawkins also originated key components of shock rock staging: he opened his act by emerging from a coffin, wore a cape, and carried around a skull-on-a-stick sidekick named “Henry.” Hawkins never came close to replicating the success of “I Put a Spell on You,” but because of that one song and the decades-long career it afforded him, everyone on this list owes him a debt of gratitude.

Screaming Lord Sutch (first record: 1961)

 

Screaming Lord Sutch

The parallels between Hawkins and Sutch don’t end with having names implying an inclination to utter loud piercing cries. Both were, in their own unusual ways, on the cusp of a significant movement in the development of rock and roll—Hawkins with the gradual breaking-off from African American rhythm & blues, Sutch with the British Invasion—and both were famous primarily for one, oft-covered song.

For Sutch, that song was “Jack the Ripper,” and the song’s subject matter (not to mention Sutch’s caped, top-hatted image) established the Victorian motif that would inform much future shock rock. This live performance from 1964 is incredibly surreal to me. The screams of British teenage girls are a familiar emblem of the black-and-white rockin’ ’60s, but how often do we see them elicited not by happiness but by something resembling actual revulsion?

Sutch might be the campiest figure on this list, and he has a few other good songs that stray far into Bobby “Boris” Pickett territory. It’s difficult to find any kind of truly dark undertone in his songs, which is sadly ironic considering he hanged himself at age 58.

Arthur Brown (first record: 1968)

 

Arthur Brown

Another shock rock progenitor, another one-hit wonder. For Arthur Brown and his band, the quaintly named Crazy World of Arthur Brown, the song was “Fire,” which was a #1 hit in the U.K. in 1968. Someone might prove me wrong, but as far as I can tell, Brown was the first musician to wear bright white-and-black “corpsepaint”-style makeup, which is really pretty significant, when you consider the wide spectrum of later artists who wore it.

I also get the feeling that Brown was the first shock rocker to introduce a level of true menace to his delivery. Hawkins and Sutch were more in your face, but I get more of a sense of creeping dread from Brown. He conducted himself not like a deranged, creepy outsider, but more like a supernatural presence—the “God of Hellfire” as he put it—which is another hallmark of future shock rock.

The alarmingly D.I.Y.-looking pyrotechnic device attached to Brown’s head in the video led to some extremely predictable incidents. For instance, at one show, the flammable fluid spilled onto his head as he was being lifted to the stage by a crane, and his hair was, as the song puts it, taken to burn. The situation was dealt with in a very rock-and-roll manner when a quick-thinking fan doused the flames with a pint of beer. Not sure if the ordeal took Brown to learn (not to do that anymore). I can’t decide if this is hilarious or an indication of him having been really kind of crazy, but one way or another Brown definitely pushed the genre further.

Alice Cooper (first record: 1969)

 

Alice Cooper, with early bandmates

We’re now moving away from progenitors and toward some people who should be in the Rock and Roll Hall of Fame and actually, finally, might get there. Alice Cooper is one of those celebrities who seems to weirdly get more prominent as he gets older, but he wasn’t always the Wayne-and-Garth-educating, golf-obsessed, evangelical conservative Republican he became. Come to think of it, he probably was, but the world just didn’t know it.

What it did know at Alice’s inception was that Alice Cooper was a band, not an individual. When the erstwhile Vince Furnier assumed the Alice moniker for himself (apparently a Ouija board told him he was the incarnation of a 17th century witch by that name) and wrested the rights to it away from his bandmates, he established two shock rock precedents in one fell swoop: a combination of mild transvestism with hyper-machismo, and an ego too large to accept peers.

(His former bandmates have gone on to other things, and I’d feel remiss if I didn’t mention drummer Neal Smith, whose website describes him as a “Rock N Realtor” and has an intro stating, “Over 25,000,000 albums: SOLD. Over $25,000,000 in real estate: SOLD.” Check it out yourself.)

Cooper’s catalog contains some bona fide non-shocking classics and a bevy of very good ones that tend to combine horror (at varying levels of camp) with intense libidinousness. As he has gotten older and become a sort of scary-music mascot, one might expect his music to have mellowed, but on the contrary, his lyrics seem to have gotten somehow more controversial as he blows past 50 and 60.

I’m not sure how I feel about that. I’d like to give him credit for trying to stay bold and commenting on current events (that song was written in response to the Columbine shootings). But the lyrics are so atrocious that I can’t, and furthermore, the contrast with his new-found family friendly image makes songs like that seem incredibly fake, too. There’s no denying, however, that Alice Cooper was the first shock rock superstar, and I’ll always appreciate him for that, even if he puts out a promotional single about drinking donkey blood with all proceeds going to the Tea Party Express.

Ozzy Osbourne (first record: 1970)

 

Ozzy Osbourne

What’s the opposite of a soft spot in your heart? Whatever it is, I have it for Ozzy. I feel weird admitting that since he was the original frontman for one of the most skull-meltingly awesome and influential bands in the history of heavy metal, Black Sabbath, where his singing was acceptable enough to not get in the way of Tony Iommi and company. But I often have wondered—and I hope I don’t get death threats for saying this—how good the band would have been had a singer of the quality of the late, far-beyond-great Ronnie James Dio been with them from the beginning.

Ozzy didn’t really become a shock rocker until he became a solo act. As you can see, this is something of a pattern. Most of the incidents he’s most notorious for—biting the head off a dove, drunkenly urinating on the Alamo memorial in Texas, biting the head off a bat (that he once claimed he thought was rubber, and that he claimed another time bit him, necessitating rabies shots)—happened after he left Sabbath.

I’ve just never gotten much out of Ozzy other than a party animal with an exceedingly average voice who sometimes likes to wear heavy eyeliner. I see “Crazy Train” as a great 30-second intro (ideal for a baseball player’s walk-up song) followed by four minutes of total stock. I see “Mr. Crowley” (and most of Ozzy’s attempts to delve into the occult) as corniness that wouldn’t sound terribly out of place on a Spinal Tap album.

I actually think one of Ozzy’s best solo songs is “Suicide Solution,” which is a legitimately nuanced and interesting take on alcoholism. The song is a warning that regularly drinking to excess is a form of slow, torturous suicide. But of course, this is the song that more than one dead teenager’s parents have sued him over. Ozzy knew his song wasn’t really at all about suggesting kids kill themselves. But when he titled it as he did, did he have this ensuing controversy in the back of his mind? Knowing his penchant for shock without the substance to back it up, it wouldn’t surprise me.

Kiss (first record: 1973)

 

Kiss

I was planning to only include individuals on this list, since the ego-driven solo artist seems to be the shock rock archetype. But I realized there was no way I could exclude the band that began when an art major named Stanley Eisen met an Israeli-born elementary school teacher named Chaim Witz in Queens.

From a commercial standpoint, Kiss is the perfect shock rock band. They wear strange makeup, they spit blood and fire and play a bass shaped like an axe. But they combine all this with a catalog of songs that is almost painfully inoffensive. Their most famous song’s lyrics could have been sung by Bill Haley. Yes, there was a brief rumor when they came out that KISS stood for “Knights in Satan’s Service,” but there’s no way in hell (pun definitely intended) that anyone could think that today.

What this sterilized version of shock rock has produced is a repugnant merchandising empire that has weirdly overshadowed the band’s music. But when it comes down to it, while they don’t have a deep collection of hits, they have a few that are pretty damn great. Vacuous and musically simpler than most punk songs, but great. I’m happy to take a visualless Kiss mp3 now and again, but personally I’ll let them keep the condom, the coffeehouse, the licensed professional wrestler and the casket. Sorry, “kasket.”

Grace Jones (first record: 1976)

 

Grace Jones

You can’t really call Grace Jones a shock “rocker.” But her brand of shock pop/disco/new wave introduced the important concept that mainstream shock didn’t have to come from themes of horror or death. Take, for example, the cover of her second-highest-charting album, which features a title that would shock Al Sharpton, a flattop that would shock Kenny Walker and a mouth that would shock the ice cream truck guy in “Legion.”

As you can tell in this video when she wears an enormous Keith Haring dress or Andy Warhol declares that “Grace is perfect,” Jones enmeshed herself in the ’80s avant garde like few other musical artists. She appeared on talk shows wearing enormous orange turbans and gold masks. But at the same time, she appeared in “Conan the Destroyer” and played a salacious, steroidal henchwoman in the James Bond film “A View to a Kill.” She wasn’t tied to any cultural movement as much as she was committed to subverting cultural norms, which was a brilliant calculation for someone who came along right as the world was becoming a bit more shockproof.

In my opinion, Jones’ musical output, which is heavy on covers, is a bit inconsistent—I think her version of “Warm Leatherette,” for instance, pales compared to the original, while others are outstanding. But her achievement was not in music or film but in image. She ushered in a new kind of art-and-fashion-based shock that is reflected in pop to this day and has scarcely ever needed to employ a drop of fake blood. (Well, almost.)

GG Allin (first record: 1977)

GG Allin

GG Allin has the unique distinction of being the only person on this list whose birth name was weirder than his stage name: His name at birth was Jesus Christ Allin. He was born and raised in a no-electricity, no-running-water New Hampshire log cabin with a sociopathic religious maniac of a father who regularly abused his wife and children. And this incredibly sad and evil upbringing produced a person who was, unfortunately, plenty evil in his own right.

Grace Jones’ revelation was that you could attain shock value by subverting cultural norms rather than literally shocking people. Allin was the opposite. There were no homages to old horror movies at his shows. There was literal horror. A typical GG Allin show would feature him stripping naked, committing self-harm, producing every kind of human waste (sometimes, proceeding to consume it) and engaging in violent physical or sexual acts with audience members.

Allin repeatedly promised to top all these by eventually committing suicide onstage, but he died of a drug overdose before he could. I have to admit that his funeral was twistedly poetic: As he lay in repose, his friends plastered stickers on his casket, jammed drugs and whiskey down his lifeless throat, took pictures of themselves touching his penis, and generally treated his bloated, fetid corpse with the same disrespect he treated everyone in life. The consensus was that that was how he would have wanted it.

There are those who hold Allin up as a paragon of individualist punk ethos. But by the end of his career, it seems obvious to me that he was really an extreme narcissist, quite deluded about both the extent of his influence and the consistency of his philosophy. It was his need for attention that led him to take shock rock to its logical extreme. He recorded a few songs, but I’m sure even he knew he was never going to be remembered for his musical talent. For better or for worse, he proved that one can be remembered for shock alone.

Rob Zombie (first record: 1987)

 

Rob Zombie

I think it’s possible that Rob Zombie does not get as much respect as he deserves. Paying homage to horror films of the past, campy or otherwise, is a tradition as old as Bauhaus and the Misfits. But I think it was with Zombie, with his band White Zombie and in his later solo career, that it really reached its apogee.

Take, for instance, “Dragula” (named after the Munsters’ car), which I think, in contrast to some of the hard rock my peers and I liked in the late ’90s, holds up exceedingly well today. I feel like this song and video are masterpieces of cinematic schlock-homage. I especially appreciate the fact that Zombie doesn’t seem too self-serious about it, if his “Night at the Roxbury”-style head pump while cruising in the dragula is any indication.

His horror fixation extends beyond exploitation movies; one of my favorite music videos of all time is the one of his that emulates a silent film. And his old band White Zombie had some pretty bangin’ hits of their own (although I’ve always thought that the lyric as I originally misheard it—“more human, that’s what you’ve been”—would have been cooler).

I think Rob Zombie is the first artist on this list I’d call a shock rocker more as a genre label than because he generated any actual shock in his time, which is reflected in how the public views the types of films he derives his image from. He’s also more of a multimedia magnate than anyone on this list, as he has branched out to become an in-demand (if not particularly good) film director. (In that vein, there’s also an odd spinoff-like element at work in the fact that what he is to horror, his younger, less-popular brother tries to be to sci-fi.) Zombie might not be high on shock, but ironically, he seems more enmeshed than anyone in the culture and history of what shocks people.

Kembra Pfahler (first record: 1990)

Kembra Pfahler with bandmates

Kembra Pfahler is probably the least well known artist on this list, but I really wanted to bring her up for two reasons. One, I think the picture at left is the most shocking one I saw in my research for this post. And two, she seems to me to be a remarkable synthesis of different shock rock elements. She combines Zombie’s midnight movie motif with Jones’ avant garde artistry with Allin’s transgressiveness with a mastery of makeup and presentation that outstrips almost anyone.

Most of Pfahler’s music has been recorded with her band, the marvelously named Voluptuous Horror of Karen Black (named for the cult icon), who have an array of camp horror-inspired songs with titles like “Chopsley: Rabid Bikini Model” and “Do You Miss My Head” as well as some quite transformative covers. And her music, image and overall presentation is accepted as avant garde art in a way that few musical artists are; she was featured, for instance, during the 2008 Whitney Biennial.

Most (though not all) of Pfahler’s music actually tends to be pretty straightforward and inoffensive relative to her image, but she more than makes up for it with some of her transgressive performance art pieces that would probably have given pause even to GG Allin (who she recorded with, by the way). She’s known, for instance, for cracking eggs on her vulva. Which really only sounds intense if you don’t know that she once literally sewed her vagina shut. (It would seem clear that this was some sort of protest against objectification, yet she also posed for Penthouse in this condition, so count me as confused.)

Finally, it’s also worth noting that Pfahler is something of a cosmetological celebrity, having appeared at several makeup-industry events. And I think you get a good sense of the concept of shock rock in general when you see a crowd of calm fashion types watch her sing about “suck[ing] the shit out of your ass” while completely naked and covered in red paint (all of those things are in this video, if you care to witness them). I love the idea of someone like her being a cosmetics spokeswoman, even though you’d figure nearly every potential consumer is not looking to get as extreme as her. Sort of like SUV commercials highlighting towing capacity when most suburban buyers will never know the difference. The bottom line is, when it comes down to it, who is makeup more crucial to than shock rockers?

Marilyn Manson (first record: 1994)

 

Marilyn Manson

If such things could be quantified, I think Marilyn Manson would have had one of the highest ratios of name recognition to familiarity with his music in music history. It’s hard to imagine any American who was alive in the ’90s not recognizing his name and his reputation for evil, for being the “Antichrist Superstar,” as he himself put it. In an amazing feat of vocabularic gymnastics, Joe Lieberman managed to convey, at the same time, both the horror conservative parents had for him and the enthusiasm of his youthful fanbase when he declared the band to be “the sickest group ever promoted by a mainstream record company.”

Yet I sense his music remained invisible to most. I blame Manson himself completely for this, as his bible-ripping antics and album covers depicting himself as a hermaphroditic nude extraterrestrial were part of an overwhelming strategy of shock. But what separated him from the older shock rockers some jaded music fans said he was simply rehashing? For me, one answer is a sense of humor. Manson had none of Ozzy’s partying spirit or Alice’s joie de mourir. Manson presented himself as a truly grim figure, and with his rod-straight hair, pasty makeup and ever-present weird contact lens, he looked considerably more alien. I only know of one press photo-type picture of him smiling, and it’s not exactly what you’d call humanizing.

In recent years Manson has become more accessible (sometimes in rather inexplicable ways). But I think it’s too bad that his extreme image turned off so many from his music, which is in some instances quite interesting in my opinion, and also can be quite different from what most people think of when they think of him. The album that launched him to stardom, Antichrist Superstar, is full of the goth-ish, pseudo-sacrilegious stuff everyone remembers like the still-powerful song whose title references the Beatles’ “Baby You’re a Rich Man”. But I think his next album Mechanical Animals, a glam, Warholian meditation on fame and popular culture, is masterful, and I still listen to songs on it like “The Speed of Pain” and the title track.

Until fairly recently, I wondered whether, given how severe his look was, Manson could possibly age gracefully. Then one day I looked up and realized I was already sort of getting an answer. As his look wears off and his days as conservative America’s worst nightmare recede ever farther into the past, I’m sure Manson the public figure will fade more and more into obscurity, which I’m perfectly fine with. But I think his work deserves to be remembered.

Lady Gaga (first record: 2008)

Lady Gaga

What past musical artist is Lady Gaga most like? Lots of people have opinions. Madonna? Seems obvious enough. Bowie? It’s written right on her face. And the name “Lady Gaga” is a reference to a Queen song.

Of course, all of these are right. But I also want to suggest that, much like dinosaurs evolving into birds, she is the seemingly incongruous descendant of everyone on this list. The clearest precedent is Grace Jones, and Jones herself certainly agrees. But she also maintains the pointless religious iconography, the obsession with celebrity and the practice of looking bizarre in public that many shock rockers, and I’d say Manson in particular, share.

She’s actually already done a forced-sounding remix with him (I’d love to know what a real song by the two of them would sound like). And just like Manson graduated from subverting norms to grand, conceptual shock weirdness, Gaga seems to have done the same.

Manson flamed out quickly, and that makes me wonder if Gaga will too. My feeling is that her brand of non-horrific shock will work for a lot longer. And even though it has seemed for decades that the genre is close to the end, I sense that somehow or other shock rock will work for a lot longer too.



Defending the E-Book Reader

with 2 comments

A similar controversy plays out in days of old.

After moving from Brooklyn Heights to Forest Hills—a move that was, glamorously enough, reported on in the New York Times—I found myself facing longer subway rides to Manhattan and Brooklyn. What better way to help pass this extra time, I thought, than to get an e-book reader? A fun new gadget would persuade me to read more, I reasoned, and make it more convenient to read during any subway situation, whether it be sitting, standing, or packed into a rush hour cattle car pressed against a steel-jawed hedge fund manager on one side and a Hare Krishna missionary on the other. You gotta love New York.

An e-book reader would also make buying books cheaper, cut down on the mass of my physical belongings (I have enough heavy books already) and allow me to read knowing no trees were killed because of me. What was not to like?

For me, nothing. I have been pleased in every way with my Barnes & Noble Nook (I chose it over the Kindle because I found it more attractive and I figured I might as well support a former employer). But not everyone is so enamored.

Nick Bilton, writing for the Times’ Bits blog, reported that some coffee shops and other eating or drinking establishments around the city have begun instituting no-computer policies which extend to any device that has a screen and requires electricity, including e-book readers. Bilton expressed frustration that he was immediately asked to put his Kindle away when he planned to read while having his drink.

He challenged the shop’s employee to explain what the difference to the shop was between him reading a Kindle or a physical book, and of course got no answer. As a new e-book reader owner myself, I empathized, but many of the blog’s commenters, such as someone called JJJ, did not:

“I mean what kind of insecure loser gets bent out of shape because a business won’t let him play with his kindle or gameboy for five minutes? And despite being denied one more opportunity to show-off your latest gadget, was it really necessary to make an honest employee feel small for trying to enforce store policy?”

Bilton’s post was later mentioned on the Times’ City Room blog, and there, some comments, like that of a commenter named George, got downright vicious:

“And, no, a Kindle or iPad etc. is not, a never will be a real book.

A book has substance. A book has a beginning, a middle and an end. A book has context — it is it’s own reference. A reader can flip real pages back and forth, dog-ear them if he likes etc. And there’s yet to be a printed book that cannot be read because its battery just died.

I don’t care if you can carry a “library” of a thousand e-books on your Kindle — chances are you haven’t really read them — you’re just enamored with the gadget.

And that’s the biggest problem. You digi-freaks are enamored with the gadget — not with the content.”

Although I wouldn’t be surprised if Apple developers are readying a dog-earing app for iBooks as we speak, I doubt that would quell George’s concerns. And while most are probably not as virulent about it as George, I have heard similar anti-e-book opinions from friends, including highly literary-minded ones like Les Chappell.

I have a fundamental issue with opinions like George’s though. To say that an e-book “will never be” a real book implies that e-books have to go through a sort of trial before we know whether they will be accepted as real books. The coffeeshops in Bilton’s piece certainly seem to support that. But to me, it’s clear that a book is a book, whether it’s printed on wood pulp, read by a voice actor on a CD, or typed entirely on a Japanese cellphone. The content is what makes a book. You can’t judge whether something’s a book by its format.

If someone’s personal preference is for traditional books because they value their weight or smell or whatnot, of course that’s their right. And I understand why the e-book might be a bit more jarring to traditionalists than the mp3 or the Blu-ray, because music and movies have always required some form of machinery to experience, while books have not.

But despite that fact, I think that literature is the art form that is in fact the least likely to be substantively changed by its more high-tech form. It’s clear that mp3 downloading has deemphasized complete albums in favor of individual songs. And one could argue that the spate of deleted scenes, alternate endings and who-knows-what-else that is found on DVDs or Blu-rays has changed how we experience movies.

And while I’d be reluctant to argue that any of these changes to music or film are seriously negative, it’s hard for me to see what, if any, similar changes befall a book when it’s formatted as an e-book.

I hardly think book lovers would choose, if given the option, to download just their favorite chapters and read them over and over. And George’s suggestion that the owners of e-book readers download books just to amass the most impressive digital collection is, to me, the most ridiculous claim of all. I believe it is in fact far, far more likely that one would purchase traditional books for this purpose. Stocking a bookcase with impressive-looking, never-opened covers is a believable act of ostentation; downloading digital files onto your e-book reader where they remain essentially invisible until called up is not.

I’ve also heard it said that e-books threaten the publishing industry as a whole, and I have two thoughts on that. First, I generally believe that the public should never be forced or even urged to use older technology for the sake of preserving the economic status quo. And second, I get the feeling that, much like the battle over mp3 file sharing, the people who will really get hurt by a large scale shift to e-books are publishing corporations who fail to adapt rather than authors. The potential the e-book holds for self-publishing seems to me even greater than that of the mp3 for the musician, since literature is less dependent on marketing and whatever else publishing companies do than the music industry is.

When it comes down to it, I don’t think there is anything different going on here than when Johannes Gutenberg first put movable metal letters down on paper around 1440. As the printing press spread and books became common, governments feared they would contain revolutionary ideas and churches feared they would distort the Bible.

The issues today may be different, but the ethic, in my mind, should stay the same: Written information, in any and all of its forms, should be tolerated, uncensored, and free to be read in coffee shops. That’s my position. Others can continue the debate. But in the meantime, I’ll be over here, happily reading an e-book.

Written by Dan Wohl

09.09.10 at 2:13 pm

A Retrospective of ’90s Kids’ Sports Movies

with 8 comments

Political scientists say the American political landscape shifts on a predictable 30-year cycle. But I’ve found that the bygone decade that each decade is nostalgic for is even more predictable. The ’00s loved the ’80s. The ’90s loved the ’70s (previous post tie-in alert!). Reagan’s ’80s seemed to mostly buck this trend, but you could definitely argue that the ’70s loved the ’50s.

It’s easy to understand why this cycle happens. Two decades ago tends to be when the current tastemakers grew up. And I’m more than ready to contribute to the ’90s nostalgia that will undoubtedly be a huge deal in the ’10s. When I thought about what this meant, one type of film kept coming up in my mind: the kids’ sports movie.

The ’90s in American sports were, if not altogether hopeful times, at least ambitious, as leagues expanded to include a bevy of Sun Belt and Canadian teams that wore teal or purple or both. I’m not sure if the drive to capture more fans that produced this rapid expansion was the reason, but a whole bunch of movies came out trying to get kids excited about sports. All of them are good for serious ’90s nostalgia, but how do they rate upon viewing today? I watched eight of the best remembered ’90s movies involving kids and sports to see. In chronological order, here’s what I found.

“The Mighty Ducks” (1992) dir. Stephen Herek

"I'm sure this will be a real bonding experience. One day, maybe one of you will even write a book about it in jail."

Given the empire this movie spawned, including two sequels (more on those later), an animated series starring anthropomorphic ducks and—this is still unbelievable to me, 17 years and a Stanley Cup championship later—an actual National Hockey League franchise, it’s easy to forget how much heart the original had. I know I did.

My recollection of Emilio Estevez’s Coach Bombay, for instance, having been mostly informed by the sequels, was certainly not that of the hilariously acerbic bastard he is at the beginning. On the way to meet the team he’s been sentenced to coach after a DUI, he tells his driver, “Just look for the sign that says ‘Personal Hell.'”

“I’m sure this will be a real bonding experience,” he tells the kids, who have been reacting negatively to his cold demeanor. “One day, maybe one of you will even write a book about it in jail.”

The kids also seem, if not necessarily three-dimensional, realer than your standard hard-luck preteen street gang. They also routinely use the term “cake eater,” which I had to look up to learn is apparently a pejorative Minnesota term for someone rich; that being said, I polled several of my numerous Minnesotan friends on the term, and none of them professed to have ever heard of it.

Of course, the trashtalking kids from the streets of Minneapolis eventually teach him the meaning of fun, and Bombay, being the cutthroat trial lawyer that he is, teaches the kids about winning. It’s a terrific combination.

Pro Athlete Cameos: Minnesota North Stars players Basil McRae and Mike Modano show up, saying they played with Bombay when they were kids.

Quality of Sports Action: Solid, though not spectacular. At least it seems that everyone can skate. Many wide shots look good, and that’s probably because they used doubles. At one point, though, a particularly hard slapshot breaks the net completely, which takes the film down a few pegs on the realism scale. Also, the vaunted “Flying V” formation is, I’m sorry to say, definitely an interference-and-offsides double helping of illegality.

How Does a Single Parent Play Into the Plot? Charlie (Joshua Jackson), the Ducks’ captain, lives in a spacious art-filled apartment that his single mom can afford working as a waitress at Mickey’s Diner. Bombay acts as a father figure to Charlie, leading to him getting together with Charlie’s mom at the end.

“Rookie of the Year” (1993) dir. Daniel Stern

"If we don't sell out every game for the rest of the year, we're going to have to, uh, forfeit the franchise."

After a miraculous injection of talent, a 12-year-old, Henry Rowengartner, gets to play in the major leagues. What baseball-loving kid wouldn’t love that premise? I certainly did, and I recalled the film vaguely but fondly.

Boy, did my perceptions change on viewing it today. This is one of the stupidest movies I’ve ever seen, in the ’90s kids’ sports movies department or otherwise.

Even if you’re willing to suspend your disbelief that it’s possible for a tendon injury to heal so “tight” that it gives one’s elbow a slingshot-like ability to make 103-mph snap throws—believe me, I am—this film is still riddled with conflicts that have some of the most absurd stakes ever.

If the Cubs don’t sell out every game for the rest of the season, they’ll forfeit the franchise! If Henry doesn’t complete the save in his second major league game, he won’t get an endorsement deal with Pepsi! If the Cubs’ owner doesn’t find out first, the slimy GM will “sell” Henry to the Yankees for $25 million behind his back! Despite now being the only professional baseball player at his school, if Henry doesn’t successfully build a boat with his friends, he won’t get to hang out with his dream girl!

Daniel Stern, who also directed the film, plays the Cubs’ pitching coach, who never displays any indication of knowing a single thing about pitching. At one point, he takes a few hacks during batting practice, and manages to pop three balls in a row directly above him, hitting himself in the head each time. I suppose this is meant to show how hapless he is, but if I saw it in real life, I would probably assume it to be a Harlem Globetrotterian display of skill.

This sums up the film well for me: it seems like a sports film made by people who know zero about sports.

Pro Athlete Cameos: Pedro Guerrero, Bobby Bonilla, and pre-steroidal Pittsburgh-era Barry Bonds are all shown whiffing at Henry’s fastball.

Quality of Sports Action: Atrocious, even amongst the non-child actors. Gary Busey, who is supposed to be an aging star pitcher, has mechanics akin to this guy. There is also a breathtakingly illegal hidden-ball trick in which Henry: one, stands on the rubber without the ball (balk) and two, holds the rosin bag in his glove to make it look like he has the ball (probably grounds for an ejection).

How Does a Single Parent Play Into the Plot? Henry’s single mom has regaled him with stories of a supposedly great baseball-playing absentee father throughout his life. Meanwhile, her current boyfriend becomes Henry’s evil manager, colluding on the Henry-to-the-Yankees conspiracy, leading to Mom dumping him just as she starts to get together with Busey’s character. Finally, in his very last appearance for the Cubs, Henry pulls a patch off his glove to reveal his mother’s name, realizing that all along, the great baseball or softball player was HER, not his father, producing that rare plot element that is sexist, illogical, and detrimental to the film’s premise all at once.

“The Sandlot” (1993) dir. David Mickey Evans

"You bob for apples in the toilet..."

My completely non-scientific sense is that it’s this film that holds the most nostalgia for children of the ’90s of all the films on this list. I could say that’s curious since it’s the only one that doesn’t take place in the ’90s (it’s set in 1962), but the movie’s strength is in capturing the timeless essence of being a preteen on summer vacation.

It’s probably the least sports-dependent of any of these sports films. The journey of Scott Smalls as he discovers good friends and independence for the first time takes place on a baseball field, but it’s relatable in any context. Sure, some things seem stupid now, such as the way Smalls initially gains acceptance among his baseball peers—he stands in the outfield, with arm outstretched and eyes closed, while Benny magically guides the ball right into his glove—but they feel forgivable.

"...and you LIKE IT."

“The Sandlot” shares with “The Mighty Ducks” an excellent sense of the often brilliant, sometimes bizarre way that kids are liable to talk and trashtalk to each other when no adults are around. My favorite line is when a member of a rival team insults a Sandlot kid by saying, “You bob for apples in the toilet... and you like it.” As if grudgingly bobbing for apples in the toilet is a normal and expected state of affairs, but to enjoy it is the real disgrace.

The entire film feels like a hazy memory from years in the future. Witness the beautifully shot slow-motion fireworks-lit game scene, or the comically large size of the dog the kids try to rescue the Babe Ruth-signed ball from. When the dog is finally befriended at the end, he becomes smaller and realer, kind of like the rest of the world does as we grow up too.

Pro Athlete Cameos: No actual pro athletes appear, but Babe Ruth does show up in a dream sequence.

Quality of Sports Action: Truth be told, actually pretty terrible. When the ball comes off Ham’s bat in the first home run of the movie, it is quite obviously heading toward the right field side before a new shot shows it sailing over the left field fence. And everyone that swings and misses does so by about four feet. But it’s hard to be bothered by it when the baseball is so un-central to the plot.

How Does a Single Parent Play Into the Plot? Scotty’s mom is not quite single, but she has apparently remarried very recently to a man Scotty is struggling to connect with. Stealing his Babe Ruth-signed ball and ultimately covering it in mud and dog slobber doesn’t seem to be the way to do it, but everything works out in the end between them.

Connections to Previous Films on the List: Brandon Adams, who plays Kenny DeNunez, also plays Jesse Hall in “The Mighty Ducks.” In both films, he wears some sort of headwear—a black-green-yellow-and-red knit hat in “Mighty Ducks,” and a Kansas City Monarchs cap here—emphasizing his position as one of the one or two African-American kids in the group.

“D2: The Mighty Ducks” (1994) dir. Sam Weisman

"No! It was me!"

I won’t say that this movie is everything bad the first one wasn’t, but it’s close. I actually had a lot more memories of this one since I owned it on VHS, and could quote it even to this day. For example, I established a tradition among the kids I was a counselor for at Camp Tawonga of yelling “Goldberg!” every time a fart was smelled, after which the culprit had to declare with arms raised, “No! It was me!” But does it hold up?

Well, the premise is ludicrous even for a kids’ sports movie. The idea that a coach who had a single court-ordered successful season under his belt would be chosen to coach the national team at the Junior Goodwill Games is silly. The idea that the team itself would be made up primarily of the kids from the Pee Wee team he coached, several of whom possess quite limited talent, is absurd. And the idea that this coach—again, a youth hockey coach—would be able to sign a lucrative endorsement contract (including his own signature shoe model) and become a household name is truly laughable.

All of which could be forgiven if “D2” delivered the same heart and humor that the first does, but it doesn’t. There is considerably more filler and crass humor, nowhere near as many good one-liners, and Bombay’s arc, going from good guy to vapid celeb-coach and back again, is a lot less interesting or relatable.

I did, however, find the choices of the Ducks’ international opponents to be rather fascinating. Had this film come out 10 years earlier, I think that the evil juggernaut European team would likely have been portrayed as the Soviet Union. But with the dust still settling from the dissolution of the USSR, the role is filled here by Iceland, which is not exactly a hockey powerhouse in real life. I assume the writers figured Iceland, given its name, to be a land of powerful dark sorcery concerning anything involving ice. Also, I really want an authentic jersey of the film’s inexplicable Trinidad & Tobago team.

Pro Athlete Cameos: NHL stars Chris Chelios, Cam Neely and Luc Robitaille make appearances, as does NHL GOAT Wayne Gretzky, as do, rather randomly, Kareem Abdul-Jabbar, Greg Louganis, and Kristi Yamaguchi.

Quality of Sports Action: On par with the first for the most part, but with a few extremely cartoonish/unrealistic gags thrown in. Fulton ruptures a net in the first film; in this one, his shot produces an inches-deep indentation on a goalie’s hand. The Ducks deceive Iceland by somehow dressing Kenan Thompson’s character in goalie pads during a 30-second timeout without anyone noticing. And in the deciding shootout shot, suspense is milked incredibly cheaply by the film implying that no one is sure whether Julie “the Cat,” the goalie, stopped the shot or not before she flips it out of her glove. Isn’t that what the goal siren is for?

How Does a Single Parent Play Into the Plot? It’s only mentioned in passing, but apparently Bombay and Charlie’s mom broke up and she married someone else.

Connections to Previous Films on the List: Mike Vitar, who plays Miami speedster Luis Mendoza, also played Scotty’s mentor Benny Rodriguez in “The Sandlot,” and Natalie Portman lookalike Colombe Jacobsen, who plays Julie “the Cat,” plays Henry’s dream girl in “Rookie of the Year.”

“Little Big League” (1994) dir. Andrew Scheinman

"Lou? You can marry her even if you don't hit a homer."

I didn’t remember this film very well, but after watching it, I’m convinced it’s the cream of this eight-movie crop. It shares many similarities with “Rookie of the Year”: a prodigious kid is handed an incredible opportunity involving an MLB team (in this case, inheriting the Minnesota Twins and subsequently naming himself manager); he has two best friends who bemoan his spending less time with them; a hidden-ball trick figures prominently at a crucial moment; his single mother hooks up with someone on the team (although, as you might have noticed by now, that one is not exactly unusual for this type of film).

So what’s the difference? Everything. When “Rookie of the Year” is crass, “Little Big League” is remarkably restrained. This is by far the most intelligent, subdued and mature film on the list. Take, for example, the exchange that the kid owner/manager, Billy, has with his star player (who is dating Billy’s mother) before his at bat that will determine the fate of the season. The player, Lou, says he’s asked Billy’s mom to marry him, and she told him to ask Billy first. Billy tells him, rather melodramatically, that he can—if he hits a home run.

Now, this is the point when 99 out of 100 kids’ sports films would take you into the at bat, fully expecting you to experience the suspense of both the team’s fortunes and Lou’s future married life hanging in the balance. Maybe he’d hit the home run and everything turns out perfectly. Maybe he wouldn’t, and Billy would then sappily tell him that he can marry his mom anyway, because, “I know you’ll always hit a home run for her” or something.

But in this movie, Billy calls out to Lou a second after his first statement, seeming embarrassed for thinking to set up such a scenario, and says, “Lou? You can marry her even if you don’t hit a homer.” Some might think this anticlimactic, but I thought it was a wonderfully organic-feeling exchange to be plopped in the midst of such a crazy situation.

This quality is what I love about the film as a whole. It extends to the entire premise. The concept behind “Rookie of the Year” is effective wish fulfillment, but it isn’t possible without a totally hackneyed medical-marvel plot device. The circumstances which lead Billy to own and manage the Twins are unlikely to say the least, but after accepting them, I honestly find his success almost fully believable. Who hasn’t known a 12-year-old kid who displays a joyfully obsessive devotion to baseball stats, strategy and history? I knew one in particular very well. Maybe that’s one reason this film resonated with me as much as it did.

Pro Athlete Cameos: A whole slew of actual MLB players of the time portray themselves, including Rafael Palmeiro, Tim Raines, Ivan Rodriguez, and in the climactic final game, Randy Johnson and Ken Griffey Jr.

Quality of Sports Action: In another opposite from “Rookie of the Year,” the baseball action in this film is exceptionally realistic. The director seems to know it and shows this off with a ton of slow motion action shots. And the hidden-ball trick in this one is legal and in fact based on an actual play from the 1982 College World Series known as “the Grand Illusion,” which is surely one of the greater baseball moments to be named after a Jean Renoir film.

How Does a Single Parent Play Into the Plot? I basically already covered this, but I will add that Lou and Billy’s mom clearly have known each other for some time before the events of the movie take place, making their romance rather less cringeworthy than those in some of these other films.

Connections to Previous Films on the List: John Beasley, who plays a fieldside security guard who’s complicit in the hidden ball trick, also plays Jesse’s dad in “The Mighty Ducks,” and Brock Pierce, who plays a stickball-playing kid here, plays young Bombay in “The Mighty Ducks” as well.

“Little Giants” (1994) dir. Duwayne Dunham

"I call it...The Annexation of Puerto Rico."

This movie might not be quite as bad as “Rookie of the Year,” but it’s definitely more crass than any film on the list. I’m not sure if five minutes goes by in this film without someone falling over, passing gas or getting hit in the testicles. Yes, “The Mighty Ducks” has a character whose flatulence is an occasional source of humor, but “Little Giants” has one whose farts are actually used as an in-game offensive weapon.

The primary kid protagonist in this film is a girl, Becky “Icebox” O’Shea, which it might deserve a bit of credit for, if her character arc wasn’t played out as dumbly as it is.

Figuring that Junior, the Devon Sawa-played quarterback she has a crush on, will want to date “a girl, not a teammate,” this tomboy succumbs to a sudden attack of femininity and decides to become a cheerleader rather than a player…literally on the day of the climactic game, and without telling any of her teammates first.

Even when a rare moment of genuine sweetness sneaks in, like when the team carries the smallest member of the team on their shoulders after he does something good in practice, the film still can’t resist the call of the slapstick: they drop him.

There is one element of this film that I found interesting though, which is the character of Nubie, the team’s nerdy play-designing mastermind. With his large glasses, sideswept straight blonde hair and ever-present button-down shirt and tie, the filmmakers probably figured he was the epitome of archetypal dweebiness, but in fact, he ends up looking exactly like Andy Warhol. Also, I find the name he gives to the game-changing secret play—”The Annexation of Puerto Rico”—to be the funniest thing in the entire film. I’m not quite sure why. But I’d definitely like to hear what was going through Nubie’s mind when he deemed it as such.

Pro Athlete Cameos: NFL players Tim Brown, Steve Emtman, Bruce Smith and Emmitt Smith, along with coach/broadcaster/Outback Steakhouse pitchman John Madden, show up for one reason or another to inspire the kids.

Quality of Sports Action: Could be worse I suppose, although the relentless sight gags (a barrelled-over defender leaving a full-body imprint in the turf, a receiver whose hands are glued onto his jersey by stickum, etc.) erase any chance of it seeming realistic.

How Does a Single Parent Play Into the Plot? Icebox’s single dad, played by Rick Moranis, apparently hits a raw nerve when he calls her “my little fullback.” Icebox remembers how her mother called her “my little princess” which is part of what ignites her half-day-long girly-girl phase. In the end, it’s left unsaid whether re-tomboyified Icebox and Junior get together, but Icebox’s dad and Junior’s mom, painfully, do.

Connections to Previous Films on the List: Actually, this is the only movie of these eight that does not share at least one actor with another film on the list.

“The Big Green” (1995) dir. Holly Goldberg Sloan

"There's not much to do in Elma."

I don’t know for sure, but I’d bet that this film was greenlit during the 1994 World Cup, the excitement of which also led to the founding of Major League Soccer in 1996. I found it interesting that the movie does not, however, take place in the middle-class suburban setting that came to be associated with soccer in the ’90s. It actually takes place in a tiny dying rural Texas town, which is definitely a more daring choice.

In the first scene of the movie, some of the kids who later become part of the Big Green are shown dumping a bag of cheese puffs onto themselves and waiting for the circling birds to eat them off their bodies. “There’s not much to do in Elma,” they say, and it at first seems that these are the type of futureless screw-ups who will grow up to operate meth labs in their garages.

Luckily, a teacher from England arrives and teaches them about soccer, which the kids have barely ever heard of at first. What follows is mostly a paint-by-numbers rehash of the triumphant-underdog plot. But the way that soccer gives some purpose to the kids’ lives, and especially, their genuinely loving appreciation toward their teacher because of it, is very sweet.

Despite what the unfortunate poster—which features more goats than girls—suggests, this film is also the most gender-equal on the list. And perhaps most interestingly and unexpectedly, it dips a toe into the illegal immigration debate. The Big Green’s best player, Juan, is an American-born citizen, but his mother is undocumented. The cutthroat coach of the team’s ultimate rival tries to get her deported, but the plan fails, the family gets to stay together, the Big Green win, and Juan and the main girl, Kate, maintain their cute flirtation. (See Icebox? Sometimes guys do want to date a teammate.)

Pro Athlete Cameos: I was totally ready for a cameo from some American from the ’94 Cup such as former free-spirit wild man and current unfailingly negative ESPN soccer analyst Alexi Lalas, but none materialized.

Quality of Sports Action: I got the feeling watching this movie that soccer is probably the easiest sport to make look good in a kids’ sports film. It’s clear that, aside from Juan, none of the characters (or the actors who play them) are very good, but the simplicity of basic soccer ensures that nothing looks too bad.

How Does a Single Parent Play Into the Plot? Kate’s single father is an alcoholic deadbeat who doesn’t care about her soccer exploits at first, but they ultimately bond over it. Although the setup is sad, it was kind of refreshing to see his character find redemption by reconnecting with his daughter rather than getting a girlfriend. Unfortunately there is a gratuitous relationship between the teacher and the town sheriff, though.

Connections to Previous Films on the List: Big Green players Chauncey Leopardi and Patrick Renna are also “Sandlot” kids, having played Squints and Ham respectively. And another teammate, Billy L. Sullivan, plays one of Billy’s friends in “Little Big League.”

“D3: The Mighty Ducks” (1996) dir. Robert Lieberman

"It's only a letter, Charlie. Here. I have hundreds of them."

One year, they defeat the world. The next year…they struggle to usurp the varsity team at the private school they’ve become the JV team for. It’s a bit of a step down for the Ducks, but at least the scope of the conflict is a bit more believable this time around.

The feel of this film reminded me of “Harry Potter and the Order of the Phoenix,” and not just because of the academic setting. Loyalties are tested. A mentor character dies. Romantic relationships become important. A mysterious forbidden prophecy is finally revealed. (Okay, maybe not that one.) In short, this is the Ducks’ growing up moment.

On the one hand, there is a bit of potential in this idea. I was struck by one scene in which Charlie (who goes through a similar phase as Bombay did in “D2,” forgetting the game is supposed to be fun) bemoans the team’s new coach stripping him of the C he gets to wear on his jersey for being the Ducks’ captain. Hans, the team’s elderly and soon-to-die equipment supplier/mentor, tells Charlie that if he wants a cloth applique C, he has hundreds—and we, and Charlie, understand the point that being a leader is not about titles or honorifics.

But on the other hand, this film doesn’t have the same sense of fun of the original or even, despite its flaws, “D2.” It has to resort to “Rookie of the Year”-esque conflicts such as the threat of the team’s scholarships being revoked in the midst of their first season just because they aren’t playing well. In a sense I’m glad that this movie exists just to prove that stories of kids’ sports don’t have to end with puberty, but unfortunately it also proves that even strong concepts usually get stretched way too thin by the second sequel.

Pro Athlete Cameos: Paul Kariya, who was the captain of the NHL’s Mighty Ducks at the time, is interviewed for some reason by the school’s hockey announcer.

Quality of Sports Action: On par with the other two, and at least Julie “the Cat” finally takes over from the obviously inferior Goldberg in goal. Goldberg is converted to a defenseman, and in a nice touch, scores the climactic goal of the film in a last-second-of-the-game sequence in which time moves slower than the third dream level in “Inception.”

How Does a Single Parent Play Into the Plot? We assume that Charlie’s mom is still married, although at one point she’s seen talking with Bombay (who is barely in the film, by the way) so who’s to say whether something’s been rekindled there?

Connections to Previous Films on the List: Scott Whyte, who plays the Icelandic team’s best player in “D2,” also plays a snobbish varsity player here.

So there you have it—my survey of ’90s kids’ sports movies. I said at the beginning that ’90s nostalgia is going to be huge in this decade, and I think it’s true not just because of the two-decade-rule I mentioned before. As we mercifully exit the decade from hell that was the ’00s and beseech the gods to let the next one be better, I imagine there will be great interest in a time when everyone first learned about the Internet, the Twin Towers still stood and the government produced a budget surplus. For whatever reason, I think these films capture those times pretty well. So I hope we keep watching them, to remember that even kids that at first have mud on their faces, are big disgraces and have their cans kicked all over the place(s), can still end up as champions—even if it takes a few fart jokes to get there.


Fascinating Art, Unobjectionable Fare, Lazy Garbage and Artistic Terrorism: A Sampling

with one comment

Biz Markie, whose song "Alone Again" was the subject of the lawsuit that fundamentally altered the direction of hip hop.

I remember very well the first time I became aware of sampling in music. “Wild Wild West” had just come out, and I had the CD with Will Smith’s tie-in song. I put it on and my dad said it sounded exactly like “I Wish” by Stevie Wonder. He was right. I Googled it (actually, in those days, I probably Alta Vista’d or Excite’d it) and discovered the truth. The reason the two songs sounded so alike was that “Wild Wild West” sampled “I Wish,” meaning the producers of the former paid the copyright holders of the latter to, basically, rip it off.

And so began my long and fraught relationship with this now-pervasive element of popular music composition. My gut reaction to a song that bluntly samples an older song is intense. I hurt. I feel physically ill. I am dumbfounded by its audacity. And I can’t believe we let them get away with it.

When I say that I can’t believe musical artists get away with it, I mean it in an artistic sense, not a legal sense. Working at ASCAP, I’m well aware that there is an established legal framework for sampling that most producers who do it follow. But it wasn’t always this way. One of the reasons sampling arose in the hip hop scene of the late ’70s was precisely because hip hop began in an environment that didn’t worry at all about copyright issues. Until the Sugarhill Gang set hip hop slowly but surely down the path of commercialization and world domination, the genre was nowhere to be found on records, but exclusive to parties in the Bronx, Harlem and Brooklyn. You had to be there to experience it. Recording hip hop tracks was barely a thought then, because, perhaps uniquely among pop music styles, hip hop music evolved out of something that wasn’t really definable as “music” at all; namely, rapping with existing music happening to play in the background.

Raymond "Gilbert" O'Sullivan, whose song "Alone Again (Naturally)" was sampled in Biz's song.

It was only when DJs began to get more creative with the backing tracks they spun for MCs—scratching, looping, blending songs together—that hip hop began to establish itself as its own musical style. As hip hop records started to be made and gain mainstream popularity throughout the ’80s, copyright issues continued to be ignored (since there was no law on the books covering sampling yet), which allowed DJs to get ever more creative. Late ’80s and early ’90s hip hop, led by producers like Public Enemy’s The Bomb Squad, featured some dizzyingly complex multimedia tornadoes that could sample dozens of songs (or films, comedy routines, news broadcasts, etc.) and, in my opinion, did not give any sense of anything being ripped off, had plenty of artistic merit on their own, and had little in common with the kind of sampling that later became the norm.

What changed?

To make a long story short, copyright law caught up with what was going on, and the legal verdict was not favorable to samplers. In a landmark case concerning a Biz Markie song that sampled a Gilbert O’Sullivan one, a federal court in New York’s ruling set a precedent establishing that any unlicensed sampling could be considered copyright infringement. The case had a seismic effect on hip hop production. Now that producers were facing the prospect of paying royalties to everyone they sampled—not to mention the time-consuming process of securing licenses from all of them—most producers figured songs built on complex layers of many samples were just not worth the time, effort or money.

Some producers didn’t stop sampling, though; they just refocused their efforts onto far fewer samples at a time, making the sampling songs sound much more like the sampled ones. And although this type of song did exist before, this is what led us to the current preponderance of lazy, detestable songs like the ones I highlighted earlier.

So, my gut reaction is to say I hate sampling because of the examples that insult my intelligence as a music consumer, expecting me to either be unfamiliar with, or indifferent toward the repurposing of, classics like “I Wish” or “Tainted Love,” or (for God’s sake!) “Eleanor Rigby.” I shouldn’t emphasize the classic nature of some sampled songs though; I might argue that it’s even worse to heavily sample a more unknown track and hoodwink the listener into thinking it’s original.

But, I can’t condemn all of today’s sampling as bad, because I have to admit I find some of it acceptable to me as a listener and even quite interesting sometimes. Truly fascinating, in the case of “plunderphonics” groups like The Avalanches, who brave the legal briar patch to construct albums made of literally thousands of obscure samples. (Check out this video for a rundown of some of the samples used in that song.) Melora Creager included a chilling vocal sample from a Nazi-era German opera recording in one of the songs I praised in my recent post about Rasputina, “Hunter’s Kiss,” a fact that I finally learned (after wondering for a long time) by asking her about it on FanBridge. And some mainstream rap songs that use only one sample, like, say, this one, I listen to and have no problem with.

I’ve thought long and hard about why some sampling doesn’t bother me and some makes me want to chop my ears off, and I have come up with what I think is a reasonably effective two-pronged test for what, for me, constitutes “acceptable” sampling. Compare “Hate It or Love It” with the sampled song. Listening to the two, you can tell their connection, but it isn’t particularly obvious. Cool & Dre, the production team who created “Hate It or Love It,” clearly didn’t just hear “Rubber Band” and decide they wanted to ape the melody wholesale. They found one section of the song, probably only about 5 seconds long, and carved another hook out of it. But at the same time, it’s obvious it’s a sample; they aren’t trying to pass it off as completely their creation, either.

In my mind, then, this song satisfied the two criteria I think need to be present for sampling to seem acceptable to me. It A, acknowledges that it is a sample, preserving the murky fidelity of the original and using a clear “looped” structure; and B, does something with the sample more than just xeroxing the riff altogether. There are songs that do A but not B, and there are also songs that manage to do B but not A, making it sound as if the rhythm did not come from another song, usually by dint of what’s called an interpolation (which means the original copyright holders are paid for the rights to the song, but the sampling artist doesn’t actually use the master recording; they just play it over however they want it).

After that, though, there are songs that might satisfy both conditions that I still can’t be okay with. I am always disturbed by songs in which the singer/rapper interacts with the vocal element of the sampled song. I highly doubt that when Michael Karoli of Can—who has, by the way, been dead for years—sang the phrase “drunky hot bowls” he felt fine with the idea that years later Kanye West would condition you to hear him say “drunk and hot girls” instead. That makes me feel very weird, and reminds me of something else I am deeply troubled by: commercials featuring dead celebrities through the digital manipulation of old footage.

Along those lines, there is one condition that will make any sampling automatically horrendous in my mind: Any song that insults the sampled song or artist. There are famous examples; that one is insulting, as far as I’m concerned, just because I consider it an act of artistic terrorism to push people’s associations with something like “Annie” to be even a little bit closer to the line “If you with me mama rub on ya’ tits.” And there are other examples that literally make fun of the original artist. “Whatever she said, then I’m that.” To me that sounds like, “Whatever she said, hahaha! Can you believe this dumb woman who I am making money off of sings in a language I don’t know?” Also, if you make it to the end of the video, you can see that Erick babblingly imitates/mocks the Hindi vocals. Predictably, the joke is on Erick: The line, which was taken from here, translates as, “If someone wants to commit suicide, what can you do?”

What I can do is try to avoid this stuff. Hopefully someday, that will be easier to do than it is now.