The big-budget (usually summer) blockbuster is the financial cornerstone of the American motion picture industry, and has been for much of the last 35 years. In all its forms – action/adventure, suspense, Western, war story, horror, science fiction, fantasy, and so on – the big-budget thriller’s earning power is unmatched by any other movie form. Romantic comedies like The Proposal (2009) and slapstick and teen comedies like The Hangover (2009) and Little Fockers (2010) are sometimes capable of blockbuster-caliber domestic earnings, but rarely match those of the thriller, nor can they rival its attraction overseas. The performances of more adult-themed dramas and comedies – even those considered financial successes – are often weaker still. The action-driven plots on which most major thriller releases now rely are a form of cinematic Esperanto, transcending barriers of language and cultural nuance. The blockbuster thriller is as accessible to Asian audiences as it is to Latin American audiences as it is to U.S. ticket-buyers – in some cases, even more so. Consider the comparative domestic/worldwide earnings of the top five live-action thrillers of 2010 vs. the top five live-action non-thrillers:
Title / U.S. / Worldwide (in millions)

Thrillers:
Alice in Wonderland / $334.2 / $1,063.2
The Twilight Saga: Eclipse / $300.5 / $698.5
Iron Man 2 / $312.4 / $622.1
Harry Potter and the Deathly Hallows – Part 1 / $295.0 / $954.5
Inception / $292.6 / $825.5

Non-thrillers:
The Karate Kid / $176.6 / $359.1
Grown Ups / $162.0 / $271.4
Little Fockers / $148.4 / $309.5
The King’s Speech / $135.5 / $398.5
Sex and the City 2 / $95.3 / $288.3
Each year, the blockbuster thriller dominates box office charts here and abroad, tends to comprise the single largest block of the year’s top 20 earners, and normally produces the industry’s single biggest block of box office revenue. In 2009, for example, out of a year’s total output of 522 titles, live-action thrillers comprised six of the year’s top 10 earners, turning in domestic receipts of $2.2 billion out of a total box office for the year of $10.6 billion; in other words, the year’s top six thrillers alone accounted for over 20% of the American movie industry’s total theatrical gross for the year. In 2010, out of 531 releases, five of the top 10 were thrillers collectively earning $1.5 billion of the year’s $10.6 billion take.
The box office dominance of the big-budget thriller is overwhelming. As of the end of 2010, 16 of the twenty all-time box office champions are thrillers, only two of which were released earlier than 1993; of the top 100, almost two-thirds are thrillers, the majority of them released after 1990.
The earning muscle of the thriller doesn’t stop at the box office. The strongest theatrical earners also tend to be the strongest performers in ancillary markets like DVD sales and rentals, sales to television, and in overseas theatrical and ancillary markets. The successful blockbuster is often a platform for launching a (hoped-for) long-running series of movies: a franchise. The franchise is a brand name cutting through the marketing clutter of a crowded theatrical marketplace, with a name value capable of being spun off into related products: film sequels and spin-offs, TV programs, internet attractions, recorded music, publishing, toys, video games, Halloween costumes, collectibles, and so on. Today, Hollywood earns less than 18% of its revenues from the domestic box office, the bulk coming, instead, from overseas, ancillaries, and merchandising. Unsurprisingly, then, this upcoming summer will see the release of sequels Pirates of the Caribbean: On Stranger Tides, Transformers: Dark of the Moon, Harry Potter and the Deathly Hallows – Part 2, Spy Kids 4: All the Time in the World, Final Destination 5, and prequels X-Men: First Class and Rise of the Planet of the Apes, as well as hoped-for franchise launches Thor, Priest, Green Lantern, and Captain America: The First Avenger. The year has already seen the release of sequels Scream 4, Fast Five, and attempted franchise launch The Green Hornet.
Hollywood’s blockbuster mentality has now been in place so long that to a young generation of movie-goers it must seem as if this is how things have always been. Actually, in terms of the industry’s 100+ year history, it is a fairly recent phenomenon.
The Hollywood studio system established during the 1920s-30s arose from the need to feed an insatiable public demand for movie entertainment. Even with a dip in the early years of the Depression, weekly movie attendance throughout the 1930s averaged almost 69 million, and increased during the World War II years, peaking at 84 million in 1943 and 1944.
The studio system, with its salaried rolls of performers, behind-the-camera talent, and craftsmen, and its enormous physical assets (back lots, standing sets, manufacturing shops, recording studios, etc.), gave studios the ability to pump out a constant stream of “A” and “B” features, animated and live-action shorts, newsreels and adventure serials on a timely and cost-efficient basis. While many today tend to think of the studio era as represented by evergreen classics like Casablanca (1942) and Gone with the Wind (1939), such timeless works were the exception rather than the rule. Studio output at the time generally tended toward quantity over quality. A film only had to amuse audiences for a week or so, just long enough for the next feature off the studio production line to take its place. Some genres were more popular than others, but none was particularly critical to the financial health of the studio.
The dynamics of the movie industry began changing with the end of World War II. The business received a financial body blow with the resolution of a long-running anti-monopoly case initiated by the Federal government in the 1930s. Most of the studios had owned their own chains of theaters, or were owned – in fact, had been set up – by exhibition companies to provide their screens with a steady flow of product, a construct the government viewed as monopolistic. The case was resolved in 1948 when the studios agreed to divest themselves of their theaters.
There was, of course, an immediate financial consequence as each affected studio now had to split box office revenues with an outside exhibitor.
Perhaps more worrisome for studio chiefs, however, was that studios no longer had a guaranteed exhibition platform for their product. Each release would now have to compete head to head with every other film in distribution for screen space, turning what had previously been a somewhat predictable business into one marked by tremendous volatility.
Then there was the cannibalizing effect of television. Although only a few hundred thousand U.S. households owned TV sets in the late 1940s, sets were in 90% of American homes by 1962. As TV ownership became more universal, movie theater attendance – already sliding since 1945 – nosedived. TV programming evolved away from the high-brow live dramas considered the hallmark of the so-called Golden Age of TV of the early 1950s toward more widely popular forms, and by the end of the ‘50s, prime time airwaves were glutted with game shows and action-driven fare like Westerns and police shows – the kind of B-type pulp which had once comprised so much of Hollywood’s output. Legendary movie producer Sam Goldwyn looked at the shoot-‘em-ups on TV and the sagging theatrical box office and concluded, “It’s a certainty that people will be unwilling to pay to see poor pictures when they can stay home and see something which is at least no worse.”
By the mid 1960s, with costs soaring and attendance beginning a third decade of steady decline (not bottoming out until the 1970s with a weekly attendance of just 17 million), the industry seemed on the verge of collapse. The financially-bleeding studios had been forced to sell off their physical assets including back lots and props, do away with their salaried pools of talent, and shrink production slates. It was a sign of their growing enfeeblement that, from the 1960s through the 1980s, most of the major studios were absorbed by conglomerates: Paramount by Gulf + Western Industries and Warner Bros. by Kinney National, for example.
Paradoxically, the apparent near-collapse of the industry provided just the right circumstances for its creative resurgence.
A new, younger, more daring breed of production executive began to come to power in the industry in the 1960s and 1970s, some of the more noteworthy being John Calley at Warner Bros. and the legendary Robert Evans at Paramount. Calley, Evans, et al. had an earnest passion for movies and found themselves in tune with a new generation of young movie-goers who loved them, too.
In the 1960s and 1970s, movies became part of the youth culture in a way that went well beyond their role as entertainment. In a proliferating number of college classes and degree programs, young people studied films and filmmaking, learning to appreciate the range of cinema from the classics of the American studio era to the more visually stylish and dramatically opaque films of the European nouvelle vague. This new, young, cinematically-literate audience came to theaters with a hunger for the provocative and unconventional, and for material reflecting one of the most turbulent times in American social history.
Dissatisfaction with the war in Vietnam, and a sense of disillusionment following a series of political assassinations and revelations of misconduct at the highest levels of government were fueling a widespread sense of social dislocation and questioning of the status quo. The magazine Life would go as far as to describe the time as one of the greatest periods of social upheaval in the U.S. since the Civil War.
In the same period came an infusion of fresh storytelling talent, in tune with this rising young audience and artistically equipped to tell brave new kinds of film stories reflecting, in one manner or another, the angsty ethos of the era. Graduating from television were directors like Sidney Lumet (Fail-Safe, 1964), John Frankenheimer (The Manchurian Candidate, 1962), Franklin J. Schaffner (Planet of the Apes, 1968), Robert Altman (M*A*S*H, 1970), William Friedkin (The French Connection, 1971), Robert Mulligan (To Kill a Mockingbird, 1962), Martin Ritt (Hud, 1963), Arthur Penn (Bonnie and Clyde, 1967), Sydney Pollack (They Shoot Horses, Don’t They?, 1969), and Sam Peckinpah (The Wild Bunch, 1969). Another group of directors who had been working at the fringes of the industry – some for quite a few years – were brought into the mainstream, where they would do some of their most notable work: Stanley Kubrick (2001: A Space Odyssey, 1968), Robert Aldrich (The Dirty Dozen, 1967), and Don Siegel (Dirty Harry, 1971). The new, open-minded production chiefs also welcomed a generation of artistically daring European filmmakers, raising them up from the art house circuit and bringing them into the circle of the commercial majors, e.g. Roman Polanski (Chinatown, 1974), John Schlesinger (Midnight Cowboy, 1969), Karel Reisz (The Gambler, 1974), John Boorman (Point Blank, 1967), Ulu Grosbard (Straight Time, 1978), Nicolas Roeg (Don’t Look Now, 1973), and Peter Yates (Bullitt, 1968).
But perhaps the most notable group of Hollywood newcomers were the so-called “film brats,” a generation of directors unlike any which had come before. They were young, had studied film in heralded cinema programs at the likes of UCLA and NYU, and were influenced by both the highly visual styles of the European nouvelle vague and the storytelling of classic American mogul-era Hollywood. Their names would become synonymous with 1960s/1970s cinema, among them: Francis Ford Coppola (The Godfather, 1972), Brian De Palma (Carrie, 1976), Robert Benton (Kramer vs. Kramer, 1979), Peter Bogdanovich (The Last Picture Show, 1971), Terrence Malick (Badlands, 1973), Steven Spielberg (Jaws, 1975), Paul Schrader (American Gigolo, 1980), George Lucas (Star Wars, 1977), and Martin Scorsese (Mean Streets, 1973).
It was a “perfect storm” of circumstances combining to produce one of the most creatively fertile periods in American commercial movie-making: a new breed of production chiefs trying to save their faltering studios by gambling on an incoming generation of artistically ambitious talent, and a receptive audience hungry for the dramatically provocative, thematically relevant, and stylistically daring, all happening within the context of a society gripped in a painful period of self-questioning and re-examination.
It appeared no subject matter was beyond consideration, no approach too daunting, with the best of the crop intelligently and artfully capturing all the moral confusion, ambivalence, and ambiguity of the time: the period’s sense of self-doubt and dislocation, the true-to-life idea that right was not always clearly distinguishable from wrong, and that even when it was, doing the right thing did not necessarily guarantee a happy ending. It was, in retrospect, one of the most explosive and liberated creative periods in commercial movie-making, part of a worldwide binge in expansive cinematic artistry aptly described by the French film journal Cahiers du Cinéma as “the furious springtime of world cinema.”
But just two movies would change the creative direction of Hollywood forever.
In the summer of 1975, Universal rolled out Jaws, the young Steven Spielberg’s movie adaptation of Peter Benchley’s bestselling first novel. Every major aspect of the release ran contrary to then conventional industry wisdom:
Typically, summer was a time for cheap juvenilia the studios served up to summer-idled youngsters, yet Universal rolled out this major release in June;
The standard release protocol at the time called for putting a movie into upscale cinemas in major markets around the country, then touring the title through secondary markets and cycling it down through first-, second-, and third-run venues. Wide releasing was reserved for expected failures, a way to mine some kind of quick return on a title a studio suspected would die as soon as negative word-of-mouth spread. But, looking at the success Paramount had had pioneering wide releases for major titles with Love Story (1970) and The Godfather, Universal chose to roll out Jaws in a nationwide “break” of over 400 theaters;
While national TV ad buys had never made sense during the days of limited release with movies generally supported by local print advertising, Universal promoted its wide release by supplementing its print ads with a national TV promo campaign.
Despite having defied all the major conventions of the day – or rather because of these contraventions – Jaws powerhoused its way through the summer of 1975 to become the biggest – and fastest – earner in Hollywood history, the first movie ever to earn more than $100 million in rentals, the mark that would thereafter define the “blockbuster.” Jaws would ultimately gross $260 million in the U.S. and ring up another $210.6 million overseas, an incredible tally for the time.
Any idea the extraordinary success of Jaws might be what the industry sometimes describes as a “nonrecurring phenomenon” was dispelled two years later when 20th Century Fox released Star Wars, written and directed by another young directorial wunderkind, George Lucas. In its initial release, Star Wars earned a staggering domestic gross of $322.7 million and another $191 million overseas.
Along with out-earning Jaws, Lucas took the blockbuster concept several steps further. For one thing, he had shrewdly retained the merchandising rights for Star Wars and turned his hit film into a wildly successful merchandising platform, with Lucas ultimately earning more from merchandising than the films themselves earned.
For another, he had also retained the sequel rights. Historically, Hollywood had considered sequels a purely mercenary effort. Normally produced quickly and for less than the original, often with lesser talent both in front of and behind the camera, a sequel was deemed a success if it grossed 40% of the original. But when Lucas produced The Empire Strikes Back three years later, he upped the budget from the original’s – $18 million vs. $13 million – to turn out a bigger, more spectacular follow-up, justifying the effort with a domestic gross of $209.4 million and overseas earnings of $247.9 million, making it, at the time, the second highest-grossing movie of all time behind the original Star Wars. He reconfirmed the concept of nurturing a franchise – rather than cheaply exploiting it – in 1983, when Return of the Jedi, made for $32.5 million, returned $263.7 million at the U.S. box office and another $128.1 million overseas.
However, even before Jedi, the idea of the blockbuster thriller as a replicable phenomenon had been taking root. Between Jaws and Jedi had come such near-blockbuster and blockbuster hits as The Omen (1976), Rocky (1976), Close Encounters of the Third Kind (1977), Smokey and the Bandit (1977), Superman: The Movie (1978), Star Trek: The Motion Picture (1979), Alien (1979), The Amityville Horror (1979), The Cannonball Run (1981), Raiders of the Lost Ark (1981), 48 Hrs. (1982), E.T.: The Extra-Terrestrial (1982), Poltergeist (1982), and First Blood (1982). Most of these titles would be followed by sequels, and a number (e.g. The Omen, Rocky, Superman, Star Trek, Alien, Poltergeist, First Blood, Raiders) led to long-running film series and TV spinoffs, some of which continue to this day.
It is not difficult to extrapolate from that point to the current environment, in which Hollywood pumps out a parade of hoped-for blockbusters throughout the year. The competition among them reaches a fever pitch during the summer months, when the studios can best target the audience most critical to the blockbuster: young males, who not only are attracted to the kind of action-driven fare that is the blockbuster’s forte, but are willing to go back to a thriller they like two, three, or more times, as well as buy affiliated merchandise.
In the late 1970s and early 1980s, the blockbuster was still a singular event; Jaws had the summer of 1975 to itself. Today, the summer season sees a major studio picture roll out in wide release nearly every weekend. Consider just the top releases for the upcoming summer:
May 6 – Thor
May 13 – Priest
May 20 – Pirates of the Caribbean: On Stranger Tides
May 26 – The Hangover Part II / Kung Fu Panda 2
June 3 – X-Men: First Class
June 10 – Super 8
June 17 – Green Lantern
June 24 – Cars 2
July 1 – Transformers: Dark of the Moon
July 15 – Harry Potter and the Deathly Hallows – Part 2
July 22 – Captain America: The First Avenger
July 29 – Cowboys & Aliens
August 5 – Rise of the Planet of the Apes
August 19 – Conan the Barbarian / Fright Night / Spy Kids 4: All the Time in the World
August 26 – Final Destination 5
In an attempt to ensure that prerequisite big-earning opening weekend, summer thrillers roll out in wide breaks sometimes numbering thousands of screens, supported by relentlessly cross-promoting multi-media ad campaigns that begin weeks, months, sometimes as much as a year in advance in the hope of stoking audience anticipation. By the end of the 1990s, the cost of promoting a movie in an increasingly crowded marketplace was growing at twice the rate of the cost of actually making one. Marketing expenses for any wide release now run somewhere between $30-40 million, and those for big-budget summer thrillers run even higher.
The competitive pressure among blockbusters has sent production budgets soaring. The cost of the average major release today runs $60-70 million, but those of summer blockbusters are often considerably higher. The thrillers in the slate listed above were produced for a cumulative outlay of almost $2 billion (not counting marketing costs), with budgets ranging from a low of Fright Night’s $17 million to a high of Transformers: Dark of the Moon’s $250-300 million, for a per-picture average of close to $130 million (again, minus marketing).
In the 1960s/1970s, the studios – with nothing to lose – had been willing to gamble modest budgets on the iconoclastic and unconventional, but by the 1980s, having regained a measure of attendance stability, and finding among young ticket-buyers an audience almost predictably attracted to a narrow range of often expensive material, they became increasingly risk-averse. As the potential risks – and rewards – grew to unprecedented heights, creative development among the studios trended conservative. Big budgets became the province of franchise launches and sequels and clones of previous major hits, of remakes and big screen adaptations of TV shows with their presumed name recognition value, of adaptations of comic books and video games which appealed to the same demographic the studios were trying hardest to reach. Projects came to be selected by studio committees which included not only creative executives but marketing and merchandising officers, with one eye on the American audience and another on the overseas box office. The summer of 2011 illustrates the strategy in extremis: of the 18 major releases listed above, ten are sequels, two are remakes, seven are based on comic books, and one is an adaptation of a 1970s TV cartoon series which, in turn, was based on a set of toys. In fact, only one – the relatively modestly budgeted ($45 million) Super 8 – is neither a sequel nor a remake, has no ties to comic books, and is not intended as a franchise launch.
Critics often decried the apparent creative poverty of an industry addicted to remakes, rehashes, and sequels. Even some in the industry bemoaned an era in which marketability seemed to count for more than a project’s dramatic qualities, and some industry gadflies – like Variety editor and one-time production chief Peter Bart – went as far as to declare that studios had an “obligation” to produce movies for a wider variety of audiences than had become the practice. But box office scores regularly made such points academic.
Just as impressive as the box office scores is the fact that so many blockbusters turn in strong earnings despite tepid to strongly negative reviews, e.g. 2010’s Alice in Wonderland ($1.024 billion worldwide), The Twilight Saga: Eclipse ($698 million), Iron Man 2 ($622 million), Clash of the Titans ($493 million), The Chronicles of Narnia: The Voyage of the Dawn Treader ($415 million), Tron: Legacy ($399 million), Prince of Persia: The Sands of Time ($335 million), Robin Hood ($322 million), The Last Airbender ($320 million), and Resident Evil: Afterlife ($296 million). That such undistinguished fare consistently scores at the box office, and that it seems often to be review-proof, says as much about the blockbuster audience as it does about the mercenary attitudes of studio production execs.
As was the case in the 1960s/1970s, young moviegoers remain the major driver of the contemporary box office, but in the case of the blockbuster thriller, the audience skews much younger than it did a generation or two ago, with the under-18 crowd now more important to the success of big budget releases. An R rating can translate into 20-30% fewer potential ticket-buyers, a lethal drop in the case of movies costing $100-200 million or more. As a consequence, producers of costly action/effects-driven fare tamp down – or completely avoid – strong sexual or violent elements as well as adult themes, developing, instead, material incorporating elements with a strong juvenile appeal. By way of measuring the shift: during the first six years after the institution of the Motion Picture Association of America’s (MPAA) rating system in 1968, over a third of theatrical releases were R-rated, including 25% of the top-performing thrillers of the time. In comparison, in the years 2000-2005, only 11 of the top 60 live-action thrillers were R-rated.
More than a simple shifting of the demographic has been at work. The studios now deal with a markedly different audience mindset than faced them in the 1960s/1970s. Today’s young consumers are incredibly technologically adept and immersed, yet, paradoxically, despite their near-addictive use of the internet, studies show them to have limited interest in or knowledge about the world around them.
The generational visual sensibility has changed as well. Today’s youngsters grow up in households with access to dozens of cable TV channels and hundreds of websites, and with an enormous affinity for video games, young males being particularly attracted to fast-paced, action-driven gaming scenarios. Cruising the cable and internet spectrum, along with a regular diet of frenetic videogames, has cultivated in the young audience a penchant for non-stop pacing and over-the-top action which has, not coincidentally, become the standard thriller construct.
On this growing “addiction” to a lifestyle of high-tech multitasking, particularly prominent among the young, psychologist David Greenfield observed that the priority seems to be about “…distraction, numbing oneself… There is no self-reflection, no sitting still” – characteristics common to the style of most of today’s big-budget thrillers.
Today’s mainstream commercial thriller – and especially the big-budget blockbuster – has come full circle; as escapist and disposable as the movies of the 1930s, but overblown with outscale action and special effects. At a time when advances in special effects technology – particularly in the field of Computer Generated Imagery (CGI) – have given moviemakers the ability to put on film anything they can conjure in their imaginations, paradoxically thrillers have become less dramatically imaginative than ever before.
The life-sized, resonant thrillers of the 1960s/1970s have been replaced with a steady output of live-action comic books, drowned in the fantastic if not outright fantasy, and their richly shaded heroes replaced by pure-of-heart superheroes or similarly larger-than-life protagonists. With their breathless pace and non-stop action, there is little room for character, texture, or layered plotting. In fact, such hyper-energized constructs force plotting and characterization toward easily and quickly digestible clichés and predictable forms. Commitment to projects is based not on a passion for the material, but on a calculation of how many toys it might sell, how well it might play in Japan, how easily it can be condensed into a catchy 30-second TV ad. The cinema of ideas, Peter Bart once mourned on his weekly Sunday Morning Shootout TV series, is long dead and gone.
In a review of the 2007 thriller Rendition, Entertainment Weekly reviewer Owen Gleiberman wondered if “…America…has checked out on the promise of movies that delve into the issues of our time,” a proposition seemingly borne out by the across-the-board underperformance of a raft of issue-inspired and/or drama-driven thrillers released in the fall of that year along with Rendition: The Kingdom, Michael Clayton, In the Valley of Elah, The Assassination of Jesse James by the Coward Robert Ford, We Own the Night, and 3:10 to Yuma.
In fact, what had once been the selling points of the 1960s/1970s classics are now looked at as debits: complexity, moral ambiguity, topical relevance, dramatic resonance, a willingness to upset expectations and challenge the audience. Instead, rampant imitation, predictability, an adherence to an extremely narrow range of story forms, and total escapist irrelevance are the order of the day along with a dependence on spectacle and effects gimmickry.
Bigger, costlier, more physically impressive than ever before, the studio thriller of today is also typically forgettable and disposable, with one big budget opus looking much like another. For all their scale, so said director provocateur Oliver Stone in a TV profile, movies today are, in the dramatic sense, “growing smaller,” and all the surfeit of large scale action set-pieces and eye-dazzling effects cannot mask the fact that Hollywood’s movies have “lost (their) magic.”