Warren Beatty, who co-hosted the Oscars in 1976, summed up the event fittingly: “We want to thank all of you for watching us congratulate ourselves tonight.” On March 7, the occasion of the 82nd Academy Awards, America will do it once again.
The Oscars is the oldest media awards ceremony, and the prototype for most of its successors. But it’s an improbable television institution, in terms of both its origins and—to put it simply—just how little it has to do with television.
Emanuel Levy’s book “Oscar Fever” traces the awards back to their inception. The Academy of Motion Picture Arts and Sciences, founded in 1927, was the brainchild of MGM’s eponymous Louis B. Mayer. Its first awards ceremony took place in 1929—the operative logic being that the best way to legitimize the fledgling industry might be to host a highly publicized event in its honor.
The Oscar ceremonies were traditionally funded by contributions from the major studios. But before the 25th Academy Awards, several of its primary financial supporters unexpectedly backed out. The Academy needed to secure another sponsor, or else to cancel the extravaganza they’d planned. Just in time, RCA purchased the rights to broadcast the ceremony, and it was watched on NBC by the largest audience in the history of television at that time.
Yet, in 1953, the relationship between the film and television industries was far from friendly. Still very much a new medium, TV had conquered the country in the first few years of the decade: it constituted a tremendous improvement on radio, and watching “I Love Lucy” cost no ticket price—this correlated, not surprisingly, with a sharp drop in box office revenue. Hollywood responded with the jealous petulance you’d expect from any first-born child. Many studios forbade their contracted stars from appearing on television, and the networks—devoid of their own celebrities—were considered little more than a dumping ground (and a source of licensing fees) for stale feature films.
Televising the Oscars (the ceremonies had been broadcast on radio for some time) represented a convenient symbiosis. But the merger of film and television presented producers with a formidable challenge: how to create a program that would appeal to both the cinephile—deigning for one night to watch, shame of shames, television—and the devout TV viewer whose remote control happened to lead him there.
It is rare that anything, with the possible exception of sleeping, can hold one’s interest for four hours. But to a significant extent, the Academy Awards manages to do so, in a way that reflects the status shift in media that its broadcast entails. For the Oscars, celebrities are quite literally brought down to size—transported from a fifty-foot-wide movie screen to a thirty-two-inch TV screen. The real genius of the Academy Awards broadcast is what it invites us, its viewers, to do: fancy ourselves among the elite, if not somewhere slightly above them.
Year after year, the award categories are populated with consistent archetypes: the underdog, the obvious filler, the perhaps-not-particularly-deserving-this-year-but-boy-is-it-about-time-she-won-already. The winner selection process, intrinsically tainted by Academy politics, is anything but quantitative—statistically, even the most impulsive civilian guesser is likely to make at least one correct prediction. This lends a satisfying, authoritative feel to one’s preferences regarding, for instance, Meryl Streep—who should be given an Oscar every year, by default, just to thank her for being Meryl Streep—versus Sandra Bullock—who, incredibly, is somehow still allowed to make films after appearing in “All About Steve.” The cult popularity of the Golden Raspberry Awards (the “Razzies”), which honor the year’s worst films, thrives on the same instinct for superiority.
Which Oscar moments do people remember? Keep in mind that—as the Academy Awards is a multi-hour ceremony that has aired on television more than fifty times—there are a lot of moments to choose from. In general, it seems, we most enjoy those that make famous people look uncomfortable, stupid, or silly. Take any one of the inevitable bloated, overemotional acceptance speeches—technically constrained by a forty-five-second time limit, but which nevertheless regularly result in millions of dollars of wasted airtime.
In 1998, James Cameron reached new heights of tacky by shouting “I’m the king of the world!” after winning Best Director for “Titanic.” Here’s hoping we won’t find out how he’d react to a win for “Avatar.”
Professionally shrill red carpet hosts encourage us to cackle at the fashion foibles of celebrities, and the same nebulous schadenfreude motivates the compiling of countless Worst Dressed Lists. TV cameras are masterfully positioned to capture deliciously revealing reaction shots; there’s nothing quite like the strained smile of someone who just lost an Oscar.
—Columnist Molly O. Fitzpatrick can be reached at fitzpat@fas.harvard.edu.