CGI and the Banality of the Incredible, Part 1
“If it can be written, or thought,” said Stanley Kubrick, “it can be filmed.” Kubrick could very well have been articulating the credo for every cinematic explorer of the fantastic since Georges Méliès.

Ironically, Kubrick – who was second to none in pushing the limits of filmmaking technology – several times found himself unable to turn something written or thought into something filmable. On 2001: A Space Odyssey (1968), his most aesthetically and technologically daring work, Kubrick wanted to create an alien life form for the film’s climax, but after several attempts using various techniques, he abandoned the idea, feeling the results were never acceptable. Twelve years later, on The Shining, Kubrick was forced to abandon his plan to bring the topiary animals in a haunted hotel’s gardens to life when, again, he felt it couldn’t be done credibly.

Just a few years after having made The Shining, Kubrick might easily have managed to pull off both effects and gone on to create even more remarkable images thanks to Computer-Generated Imagery (CGI).  Evolving over the course of three decades, CGI has brought movie-making to that Kubrickian ideal; with CGI, the only limitations on what’s possible on screen are the time and money available, and the imagination and expertise of the creative personnel involved.  CGI does more than make the impossible possible; its photorealistic capabilities make it credible as well, trumping traditional, more transparent effects like miniatures, puppetry, stop-motion animation, superimposition, split screens, and so on.  CGI can make the big bigger, the extravagant ever more spectacular, and the apparently precarious eminently safe.

The critical importance of CGI to commercial moviemaking today is declaratively illustrated by Hollywood’s 2010 slate. Although final tallies are yet to come in on such late-year releases as Gulliver’s Travels, The Chronicles of Narnia: The Voyage of the Dawn Treader, Tron: Legacy, and Yogi Bear, going into December nearly half of the year’s 50 top-earning releases depended, to some degree, on CGI to carry off their storytelling. In fact, it wouldn’t be an overstatement to say that at least one-third of 2010’s top 100 films probably couldn’t have been made at all were it not for CGI.

It was only a few years after The Shining that CGI began to appear in any substantial way in features (although usually in brief sequences), first in Star Trek II: The Wrath of Khan (1982), and then in the original Tron (1982), a movie which, appropriately enough, was set inside a videogame. In 1984, The Last Starfighter employed CGI more extensively, using the process to create the ship-to-ship battles of its interplanetary adventure tale. The following year came Young Sherlock Holmes and the first fully three-dimensional, photorealistic fantasy character generated by CGI: a human figure from a stained glass window come to life.

Every year or so thereafter, it seemed the CGI process became more sophisticated, its repertoire of photorealistic fantasy more extensive and effective, its use more versatile. In 1991’s thriller Backdraft, for example, CGI created controllable flames and convincingly put fire chief Kurt Russell on the collapsing roof of a burning building. Similarly, in The Fugitive (1993), CGI made possible Harrison Ford’s last-minute leap to safety from in front of an onrushing train, while the following year the process credibly inserted Tom Hanks into any number of real-life historical tableaus in Forrest Gump. For the period saga Braveheart (1995), CGI helped director/star Mel Gibson pump up his $53 million budget to epic proportions, multiplying his hundreds of extras into screen-filling thousands for the movie’s grand-scale battle scenes. In 2000’s Gladiator, director Ridley Scott’s effects crew digitally grafted actor Oliver Reed’s face onto another performer’s body to finish out Reed’s performance after the actor died before shooting was completed. In 2004, writer/director Kerry Conran shot all of his sci-fi adventure Sky Captain and the World of Tomorrow in a studio with just his actors and minimal sets, using CGI to create the rest of a retro-styled futureworld hearkening back to 1930s serials like Flash Gordon. The following year, co-directors Robert Rodriguez and Frank Miller did likewise on Sin City, this time using CGI to recreate the visual flavor of the Miller graphic novel on which the movie was based. In one pointed illustration of the extent to which CGI technology grants filmmakers unprecedented and total control over content, director Robert Zemeckis used CGI in the 1997 sci-fi drama Contact to – among other more spectacular effects – adjust the position of one of actress Jodie Foster’s eyebrows.

Any number of the most notable thrillers produced over the last 25 years or so (as well as some of the most forgettable) might not have been possible without CGI. CGI made possible the alternately majestic/terrifying leviathans of Jurassic Park (1993), the destructive storms of Twister (1996), the crushing waves of The Perfect Storm (2000), the alien worlds of the second Star Wars trilogy, and the here-to-horizon besieging armies of fantasy creatures in The Lord of the Rings trilogy, as well as the X-Men, Spider-Man, The Matrix, Pirates of the Caribbean, Batman, and Harry Potter series, to name a few. It’s arguable whether the commercial ascendance, proliferation, and box office dominance of the blockbuster over the last several decades would even have been possible without CGI technology.

Since the 1980s, thrillers have grown faster and their action and effects more spectacular as they have come to mimic the non-stop, escalating, action-driven constructs of videogames as well as the restless pace of flipping through a hundred-plus cable channels and cruising the boundless, ever-changing terrain of the Internet.  CGI allows moviemakers to take the level, pace, and quantity of action beyond what was possible and/or could be credibly portrayed through more traditional, non-digital forms of special effects, with some moviemakers pushing digital technology still further to replicate even the look of videogames.

Sword-and-sandal adventure 300 (2006) – a fanciful telling of the Battle of Thermopylae inspired not by history but by the graphic novel by Frank Miller and Lynn Varley – was largely shot on a soundstage with bare-bones sets, with CGI providing most of the settings and nearly everything else, from the portentous skies overhead to the whirling, suspended slow-motion spurts of blood as scantily clad, heavily muscled ancient warriors hacked away at each other. Director Zack Snyder’s compositions often replicated the illustrations from the Miller/Varley work exactly, while CGI gave 300 a surface texture and color palette so closely resembling videogame imagery that some viewers seeing the film’s first teasing TV ads wondered if they were, in fact, watching a promo for a new videogame rather than an upcoming movie release.

Director Robert Zemeckis went one step further with Beowulf (2007), a liberal interpretation of the sixth-century heroic poem, the entirety of which was shot on a Culver City sound stage. Zemeckis completely replaced not only physical settings but his actors as well with digital replication through “motion-capture” technology. Motion-capture records the movements and even the facial expressions of live actors, using that data as a computerized armature over which is built whatever digitally created form the moviemaker desires. Motion-capture can provide a natural fluidity of movement as well as a sense of heft not always achieved in wholly computer-generated entities. Director Peter Jackson had used actor Andy Serkis to provide the motion-captured movements for the skeletal Gollum in his Lord of the Rings trilogy, and, again, as the basis for the movements of the titular great ape in his 2005 remake of King Kong, but these were still digital creations inserted into a live-action context. Zemeckis had previously used motion-capture on his Christmas fantasy The Polar Express (2004) to turn out a completely CGI-rendered movie, and did the same for Beowulf but with more refined motion-capture technology. Co-screenwriter Neil Gaiman – himself a graphic novelist – caught the photorealist-yet-unreal flavor of the movie’s CGI visuals when he said, “Watching this thing is like walking around in a graphic novel.”

Freed by an escapism-addicted, sensation-seeking young audience from any obligation to be credible or even remotely possible, excess has become the blockbuster’s rule of thumb, and CGI technology puts infinite excess within easy reach. Explaining the blockbuster aesthetic in an article on Beowulf, Rob Moore, Paramount’s president of marketing and worldwide distribution, said, “For a young audience, this is the world they live in.”

But despite the increasing amounts of money, time, effort, and digital technology thrown into creating ever-more-spectacular and first-of-a-kind CGI visuals, there may be a point of diminishing returns to all this eye-drowning, mind-blowing, computer-generated razzle-dazzle. When miracles become commonplace, they no longer seem miraculous, and the gift of CGI is also its curse – the ability to produce miracles on demand for a market constantly demanding ever more amazing miracles.