
When “Great Leaps Forward” Aren’t, or, the Art of Looking Bad

I recently came across a piece by Washington Post critic Ann Hornaday about a screening several weeks ago at CinemaCon of 10 minutes of footage from Peter Jackson’s (and Warner Bros.’) attempt to extend the Lord of the Rings franchise with The Hobbit: An Unexpected Journey.  The film is currently slated for release in December of this year. Some of what she had to say has me wondering if looking crappy might not be the new cool for the silver screen.

CinemaCon – or, more formally, the Official Convention of the National Association of Theatre Owners – is an annual come-together in Las Vegas of exhibitors and other industry professionals gathered to see what the studios have coming down the pipeline.  Exhibitors have been just as hungry as LOTR fans to see if Jackson has pulled off his long buzzed-about four-play.  After all, the original trilogy grossed a combined theatrical box office total of nearly $3 billion worldwide.  If Hobbit plays at the same level, that kind of lobby traffic will move an awful lot of popcorn.

Jackson had startled audiences and woken up the movie industry with how far he’d pushed the use of CGI and motion-capture throughout the LOTR films.  Though they’ve become routine now, remember how fresh those from-here-to-the-horizon hordes of invaders looked at the time?  And no one had, to that point, used motion-capture as deftly as Jackson had with the obsessed, skeletal Gollum.  Match Gollum against CGI-created Jar Jar Binks from George Lucas’ second Star Wars trilogy, and the consensus is Jackson had out-Lucased Lucas.

Exhibitors had reason to be a little drooly about what Jackson might visually have in store with The Hobbit, and they weren’t the only ones.  LOTR fans, sci-fi and fantasy geeks, cinema tech heads and movie hounds have all been sitting up, heads cocked, tongues out, their fingers flying around their keyboards as they’ve blogged away in a lathered frenzy of anticipation because Jackson has been shooting The Hobbit in a new 3-D digital format designed to make previous 3-D processes look like your grandpa’s GAF View-Master in comparison.  The new process shoots at 48 frames per second – twice as fast as the since-anyone-can-remember standard of 24 – providing an unprecedented clarity of image.

From what Hornaday says, the new process did exactly what it was supposed to do…and, evidently and ironically, that may be the problem.

The images screened at CinemaCon were so clear, so vivid, they looked more like video than film.  And while that seems to have given Hobbit’s CGI-rendered critters a unique visual pop, it doesn’t seem to have done as well by the movie’s humans.  According to Variety’s Josh L. Dickey, “…human actors seemed overlit and amplified in a way that many compared to modern sports broadcasts…and daytime television.”

Hornaday also reports, however, that not everyone was put off by the Good Morning America-ish results of the new process.  Let me quote from her story:

“But at least one film-lover in Vegas liked what he saw.  The Hobbit footage, wrote online film columnist Jeffrey Wells on his Web site, Hollywood Elsewhere, was ‘like watching super high-def video, or without that filtered, painterly, brushstroke-y, looking-through-a-window feeling that feature films have delivered since forever.’  The high frame rate, he continued, ‘removed the artistic scrim or membrane that separates the audience from the performers’.”

I thought Wells’ statement was remarkable because I wouldn’t normally consider descriptives like “painterly,” “brushstroke-y,” and “artistic scrim” a bad thing.  It’s ironic I came across this story during the same week we’ve been discussing Ridley Scott’s Prometheus (2012) and Blade Runner (1982) here at Sound on Sight.  “Painterly” – and correct me if I’m wrong – is what Scott usually tries for.

My take on Wells’ comment was that it was a bit like saying the 48 fps process had made a dream seem less, well, dreamy…and that that was a good thing.

But then the longer I thought about what he’d said, the more it made a kind of unhappy sense to me.

*****

Providing Mr. Wells isn’t just some contrarian who likes to stir the pot to get a good argument going, he may be onto a new, developing visual sensibility.

One reason movies have changed over the years is because the sensibility we bring to watching them has changed.  The movies – by and large – of the 1940s look substantially different from the movies of the 1960s and 1970s.  The standardized briskly-paced, master-medium-close-up formula of Old Hollywood gave way to a European-influenced languor in the 1960s – long shots, long takes (think Altman, Coppola, Kubrick) – alternating with an Eisensteinian delight in fracturing space and time (Peckinpah); a veering between dense, naturalistic dialogue (Scorsese and, again, Altman) and a dramatic, almost opaque minimalism (Boorman, Pakula).  Those were stylistic changes which worked for the young, cinema-attuned audiences of the time.

Come the 1980s, another change for another audience sensibility.  Films became faster – more edits, more beats – reflecting a sensibility first cultivated by cruising through the growing cable spectrum, then by videogaming, then by cruising the infinite variety of the Internet.

Jeffrey Wells may have tipped us to yet another evolutionary phase in audience sensibility; something being shaped by the interplay of, principally, two media dynamics.

1.  Speed Freaks

Videogames, the Net, talking to each other in 140-character bits on Twitter, and texting during every waking moment because five minutes without some kind of stimulation is a form of mini-death have all long had their impact on movie storytelling:  hyper-accelerated, action/effects-packed movies which may not make much sense because they don’t have to, populated with broad-stroked characters because that breakneck pace won’t allow for much more.  Think Michael Bay (I try not to).

2.  The (Un)Real World

The boom in reality programming since the Writers Guild strike of 1988, both on the broadcast networks and on cable, is cultivating a generation of viewers growing attuned to the unsophisticated, unpolished, unapologetically raw quality of unscripted TV.

Each demographic cohort following the Baby Boomers has been watching less TV than the generation before, and, not coincidentally, spending more time on alternate, generally non-narrative media (online, videogaming, texting, tweeting, etc.).  When those younger generations do tune in to TV, they’re just as likely to head for cable’s more sensational unscripted offerings as for the broadcast nets.

It’s primarily a young audience fueling cable successes like MTV’s Teen Mom (the series’ 2009 premiere was MTV’s highest-rated launch in over a year), Comedy Central’s Tosh.0 (which outdraws both The Daily Show and The Colbert Report), Bravo’s various The Real Housewives of Wherever, and MTV’s ratings monster, Jersey Shore (drawing approximately nine million viewers at its peak, putting it on par with a number of broadcast network hits and ahead of shows like Glee, House, and Law & Order: SVU).

Throw that in with how much time young users spend on YouTube (which accounts for 43% of the online video market and is the third most visited website behind Google and Facebook) watching amateur video, and it’s not a hard stretch to conceive of a generation of video viewers for whom the badly-lit, badly-framed footage produced by non-professionals has become a new standard.

That might – and I emphasize might since this is nothing more than an instinctive guess – account for the popularity of “found footage” flicks like the Paranormal Activity series (three installments so far with a fourth due in October), Cloverfield (2008 with a sequel in the works), The Last Exorcism (2010), The Devil Inside, Chronicle, Project X (all 2012), and the flick given credit for kicking off the found footage craze, The Blair Witch Project (1999).

You mix those two ingredients together and you get an intriguing paradox:  an appetite for a more “honest” look, something devoid of the usual studio veneer and artifice, something visually pure and true…in service – at least in the case of The Hobbit – of a story that’s pure artifice.  It’s like saying, I want a fairy tale that looks like a hidden camera documentary.

Say what?

Why?

*****

If it turns out, in time, that Wells is onto something, there’s no “why” to it; it’s an almost natural by-product of those two cultivating influences.  And, to be fair, there’s not a “good” or “bad” to it any more than there was a good or bad to the change in how movies looked from the 1940s to the 1960s.  The eye learns to look anew at a new way of looking.

Wells may like the unvarnished quality of this new 3-D, but we’ve been through this kind of thing before:  the Mumblecore movement of the early 2000s, and before that, Lars von Trier and his Dogme 95 disciples, each looking for something more natural, more real, something honest.

Which it never really is, particularly – as is the case with a number of Mumblecore and Dogme 95 films – when it’s in service of stories which are, in their own, faux naturalistic way, as contrived and manipulated as an old-fashioned, high-gloss studio “meller.”  Dancer in the Dark (2000) isn’t any more life-like than Tyler Perry’s Good Deeds (2012).  It just does a better job of looking more life-like.

Being enthused about how Gandalf won’t look like a carefully-crafted, studio-polished, CGI-enhanced envisioning, but, instead, like somebody being quizzed by Matt Lauer isn’t something I can quite plug into.  There’s a part of me that, intellectually, gets what Wells is saying.  But there’s another part of me that keeps saying, “Dude, ya know the guy’s a wizard, right?”

*****

Every step forward usually requires leaving something behind, and if this is, indeed, the way the crowd is walking, I’m going to miss that painterly and – God forbid! – artistic look.

When I was a film student a few million years ago, I remember a discussion comparing film to other forms.  Yeah, you sat in a theater and watched the action play out in the proscenium of the screen, but it wasn’t like theater.  Stories often played out like novels, but film couldn’t go interior the way a novel could; nope, it wasn’t quite like a novel.

The closest we could come was to say a movie was like a dream.

Like dreams, movies range from the brutally real to the utterly fantastic, but always have their own, consistent (when done well) logic.  In the hands of a good director, a movie feels real although it’s intangible.  In fact, it’s that very intangibility – its unreal-ness – which fosters the illusion of reality.  Wells is right; there has always been a separation between the audience and the performance in movies, but it’s that inability by the audience to reach beyond that “artistic scrim” and disrupt the dream which keeps the dream intact and makes it real.

You stare at a painting, you get lost in the painting.  But then you stand too close, close enough to see the blots and brushstrokes, and the illusion dies.  Wells seems to think that’s a good thing…or believes it won’t kill the illusion.

Even at their most earnest, movies have only ever given a creative impression of reality.  The shadowy noirs of the 1950s were more emotionally honest than the glossy melodramas of the 1930s, but, in their own way, they were just as stylized; just as the milestone flicks of the 1960s were stylized in a different way, but with the same intention of reflecting something of the complexity and ambiguity of the real world.

The second a director – even a documentary director – decides what goes in the frame, grants a figure power with an up-angle, lends it mystery by cloaking it in shadow, or fakes authenticity with a handheld camera, he – or she – has manipulated reality, and any talk of visual purity after that is pointless.  The magic of movies – just as with any magic trick – has been in convincing us that what couldn’t possibly be real is real.  Once we see it’s just a trick, it’s not magic anymore.

Bill Mesce
