Computer Animation Entertainment

SIC CODES: 3571; 7812; 7999

INDUSTRY SNAPSHOT

By the mid-2000s, computer animation was a billion-dollar business. The use of digital technology was becoming more pervasive throughout the motion picture industry, accounting for as much as half of the $150 million typically spent to make a feature film. Competition was heating up as major studios increased production and relied more heavily on computer animation for both animated and regular feature films.

By the 2000s, movie and television audiences could watch celestial collisions pulverize Earth, realistic dinosaurs hunt terrified scientists around an island, and insects and toys live out their own dramatic fantasies in feature-length films. Moreover, all of this looked perfectly normal. Elaborate computer-generated special effects had become the motion picture industry's standard. Developments in computer animation reached a fever pitch by the turn of the twenty-first century, and studios raced to employ the latest techniques in order to showcase the most dazzling, amusing, and terrifying effects Hollywood had to offer.

In their zeal to wow audiences, directors and studios took turns beating each other over the head with their checkbooks. Early on, films such as Twister, The Matrix, Titanic, and a host of others all carried computer animation budgets that alone would have made studios of yesteryear blanch. By the early and mid-2000s, such budgets were par for the blockbuster course. At the same time, however, advances in technology were allowing movie producers to make less expensive animated films.

Animation, the art of producing the illusion of movement from a sequence of two-dimensional drawings or three-dimensional objects, has long been a staple of the entertainment industry. Animation can take on many shapes, ranging from primitive drawings in television cartoons such as The Flintstones to complex dinosaur-sized creatures in Jurassic Park. The 1990s saw an animation renaissance, and during this period the animation studios and production companies that were able to find the right combination of creative talent and technical wizardry found a burgeoning marketplace for their products. Computer animation's entertainment uses were spread far and wide. From the dreamlike landscapes of What Dreams May Come to the intricately detailed talking mouse of Stuart Little to lifelike digital characters and imaginary landscapes in 2005's Star Wars: Episode III—Revenge of the Sith, computer animation continued to transform the face of cinematic reality into the mid-2000s.

ORGANIZATION AND STRUCTURE

Animation has always been a labor-intensive process, and as a consequence, most animation projects are collaborative efforts integrating the talents of animators, technical directors, producers, artists, and engineers. Even though computer animation is technology-driven, the workflow for an animated feature film is still essentially the same as it was in the earliest days of animation. While the computer has replaced some hand drawings, it has not entirely eliminated pen-and-ink sketches. An animation project begins with the creation of a storyboard, a series of sketches outlining the important points of the story and some of the dialogue. When animation was done strictly by hand, the workload was distributed between senior artists, who sketched the key frames where the most action occurred, and junior artists, who filled in the in-between frames. When computers are used, artists use the storyboard sketches to create clay figures that are made into digitized three-dimensional (3D) characters, which are then manipulated by animation artists, who also create the background fill. Altogether, an animated feature is the collective work of many people, including animators, lighting experts, story writers, and sound technicians.
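To make the key-frame and in-between division of labor concrete, the following is a minimal sketch in Python of how a computer can fill in the frames between two key poses. The 24-frames-per-second rate, the (x, y) poses, and the function names are illustrative assumptions, not a description of any particular studio's pipeline.

# A minimal sketch of "in-betweening": key poses exist at a few frames,
# and intermediate frames are interpolated between them.
# All data here is hypothetical and chosen only for illustration.

def interpolate(pose_a, pose_b, t):
    """Linearly blend two poses (here, simple (x, y) positions) by factor t in [0, 1]."""
    return tuple(a + (b - a) * t for a, b in zip(pose_a, pose_b))

def in_between(keyframes):
    """Expand sparse keyframes {frame_number: pose} into one pose per frame."""
    frames = {}
    keys = sorted(keyframes)
    for start, end in zip(keys, keys[1:]):
        span = end - start
        for f in range(start, end + 1):
            t = (f - start) / span
            frames[f] = interpolate(keyframes[start], keyframes[end], t)
    return frames

if __name__ == "__main__":
    # Two key poses one second apart at 24 frames per second;
    # the computer fills in the 23 frames between them.
    keys = {0: (0.0, 0.0), 24: (100.0, 50.0)}
    poses = in_between(keys)
    print(poses[12])  # midpoint pose: (50.0, 25.0)

In hand-drawn animation the junior artists performed exactly this filling-in step by eye; the computer's contribution is doing it mechanically and consistently across thousands of frames.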

BACKGROUND AND DEVELOPMENT

From the very beginning, studios employed animation artists, people who could painstakingly draw quirky animated characters by hand. Before the arrival of computers, in fact, all animation was created in this way. Production teams simulated motion by drawing a series of successive, incrementally altered frames. In order to trick the eye into seeing motion, each second of animated sequence for film required 24 frames. In the earliest cartoons, every one of those frames was hand drawn, creating a tremendous workload just to complete a short cartoon. For instance, the 1914 cartoon Gertie the Trained Dinosaur, which was primitive compared to later animation, required 10,000 separate drawings. In 1915 Earl Hurd streamlined the process by developing a time-saving method known as cel animation, whereby each individual character was drawn on a separate sheet of transparent celluloid (a "cel") while the background was drawn on a piece of opaque paper. When the animation was shot, the transparent cels were overlaid on the opaque background. With this method, the background was drawn once and only the parts that changed had to be redrawn, rather than the entire frame. The animation industry flourished this way from the 1930s to the 1950s, largely through the efforts of pioneer Walt Disney, who produced such full-length animated movies as Dumbo, Bambi, and Snow White and the Seven Dwarfs.
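The labor savings of Hurd's method come from redrawing only what changes. The short Python sketch below illustrates that idea under simple assumptions: a tiny text grid stands in for the artwork, the background is produced once, and each frame overlays a transparent character cel on top of it. All names and data here are hypothetical.

# A minimal sketch of cel-style layering: the background is drawn once and
# reused, while only the moving character's cel changes from frame to frame.

WIDTH, HEIGHT = 20, 5

def make_background():
    """Draw the static background a single time (here, a flat ground line)."""
    grid = [[" "] * WIDTH for _ in range(HEIGHT)]
    grid[HEIGHT - 1] = ["_"] * WIDTH
    return grid

def draw_character(x):
    """Draw only the moving character on its own transparent cel (None = transparent)."""
    cel = [[None] * WIDTH for _ in range(HEIGHT)]
    if 0 <= x < WIDTH:
        cel[HEIGHT - 2][x] = "@"
    return cel

def composite(background, cel):
    """Overlay the transparent cel on the opaque background, as when shooting a cel setup."""
    return [
        [cel[r][c] if cel[r][c] is not None else background[r][c] for c in range(WIDTH)]
        for r in range(HEIGHT)
    ]

if __name__ == "__main__":
    background = make_background()            # drawn once
    for frame in range(3):                    # per frame, only the character cel is redrawn
        image = composite(background, draw_character(frame * 3))
        print("\n".join("".join(row) for row in image), end="\n\n")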

Thus, for many decades, hand-drawn animation was the industry standard. Filmmaker John Whitney began to change that in the late 1950s and early 1960s. Whitney pioneered motion graphics using equipment that he purchased at a government war-surplus auction. These precise instruments allowed Whitney to develop motion control of the camera, zoom, and artwork. These techniques were later used to create the star-gate slit-scan sequence in Stanley Kubrick's 2001: A Space Odyssey (1968). In 1986, Whitney received an Academy of Motion Picture Arts and Sciences "Medal of Commendation for Cinematic Pioneering" in recognition of his contribution.

The 1960s saw another development that led to the eventual rise of computer animation. In 1963 Ken Knowlton, who worked at Bell Laboratories, created a programming language that could generate computer-produced movies. It was not until 1982, though, with the release of Disney's Tron, that computer-generated imaging (CGI) was explored as a serious moviemaking technique. The widespread arrival of computers in the 1980s ushered in teams of new workers (namely scientists, engineers, and programmers) capable of developing complex animation software. Computers thus revolutionized the animation process, supplementing traditional animation methods with hardware and software capable of creating realistic on-screen characters.

Advancements in the computer animation field swept the entertainment industry in the 1990s. Before 1995, conventional wisdom held that CGI was too inexact a science to replace hand-drawn animation, which many felt was the only way to capture...
