August 17, 2022

Animation is a branch of film-making that has been part of every child’s life ever since the first animated films were made. Of course, the first name that comes to mind when you hear the word ‘animation’ is Walt Disney and the timeless ‘Mickey Mouse’. Others might think of ‘The Simpsons’, or you might even think of Studio Ghibli, who have produced, without a doubt, some of the greatest animated films ever made.

One of the first versions of Walt Disney’s Mickey Mouse
A scene from Studio Ghibli’s My Neighbour Totoro


What is CGI?

The word ‘animation’ is almost self-explanatory when you consider its meaning, ‘bringing to life’ or ‘animating’ something, which, in this case, means making a series of images or drawings come together to form a motion picture. People often use the term ‘CGI’, perhaps not synonymously, but definitely in close relation with this technique of film-making.

You hear it when people talk about fantasy films like ‘The Lord of the Rings’ or ‘Avatar’, but video games and action films are the main playgrounds for 21st-century CGI; off the top of my head, Marvel Studios’ ‘Avengers’. We hear the actors speak about ‘CGI’, ‘green screens’, and ‘visual effects’, all of which contribute heavily to the landscapes and stunts we see in the movies. Unless you are someone who is interested in film-making, graphics and technology, the basic assumption you make when you hear this term is that it refers to digital edits, involving all sorts of technology, which create those unrealistic effects in the movies.

A scene from Marvel’s ‘Avengers’ showing us the CGI used for the background and the Hulk

But modern CGI consists of much more than this misconstrued concept of digital additions to live-action movies.

CGI stands for ‘computer-generated imagery’. Often called computer animation, it is a form of animated graphics created with computer technology. CGI has, in many places, replaced both the older ‘stop-motion’ animation of scale-model puppets and the hand-drawn animation of a series of drawings (modern 2D animation continues to use the latter technique).

Why develop this software when simple drawings suffice?

The reason experts developed CGI is quite clear. 2D animation often involves an entire team of artists drawing out every frame of the film, so that when the editors put the drawings together, they create an illusion of motion when, in fact, they are still frames of art being played back at around 24 frames per second. Technology in the digital age has given production houses a less laborious, more cost-effective and frankly much easier option than drawing out each frame: computers can be used at every step to supply the in-between drawings for the animation. However, a team of specialist artists is still needed to create high-quality computer animation.
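The in-between step described above can be sketched as simple interpolation between two artist-drawn key poses. The following is a minimal illustration, not any studio’s actual pipeline; the poses, coordinates and frame count are invented for the example:

```python
# Minimal sketch of keyframe "in-betweening": given two key poses drawn
# by an artist, the computer fills in the intermediate frames by linear
# interpolation. All numbers here are purely illustrative.

def inbetween(start, end, num_frames):
    """Linearly interpolate a 2D point between two key poses."""
    frames = []
    for i in range(num_frames):
        t = i / (num_frames - 1)          # 0.0 at the first key, 1.0 at the last
        x = start[0] + t * (end[0] - start[0])
        y = start[1] + t * (end[1] - start[1])
        frames.append((x, y))
    return frames

# A character's hand moves from (0, 0) to (100, 50) over 25 frames,
# roughly one second of screen time at 24 fps: the artist supplies only
# the two keys, the computer generates the 23 frames in between.
frames = inbetween((0, 0), (100, 50), 25)
print(frames[0])    # (0.0, 0.0)
print(frames[12])   # (50.0, 25.0)  - the midpoint frame
print(frames[-1])   # (100.0, 50.0)
```

Real animation software uses far more sophisticated easing curves than this straight-line version, but the idea, letting the machine fill the gap between key drawings, is the same.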

An animation specialist working on the CGI used in Star Wars.
The last stage of rendering CGI animation

This rise in computerisation also gave filmmakers another dimension in animation they had never considered possible. A three-dimensional motion picture would mean live-action, wouldn’t it?

Another dimension

Until the 1950s, people thought the term ‘animation’ applied only to films which were clearly two-dimensional and made of moving drawings, like the original Disney movies. But computers could also translate figures into software and generate a sequence of images that seemed to move and rotate through space, giving the figure another dimension on the screen. This sort of animation is the type you would see in movies like Toy Story, Zootopia, Encanto or Frozen, to name a few. Specialists in this field called these ‘computer-generated images’ or, in short, ‘CGI’. But to truly appreciate and understand the considerable step that legends in this industry took to create animation as we know it today, we have to go back to the 1940s to witness the birth of computer animation.
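That trick of making a figure “move and rotate through space” on a flat screen boils down to two operations: rotating points in 3D and projecting them onto a 2D plane. Here is a bare-bones sketch of the idea, with made-up numbers, not the method any particular studio used:

```python
import math

# Rotate a 3D point about the vertical (y) axis, then project it onto a
# 2D screen plane -- the basic trick that lets a flat image appear to
# turn in space. The cube corner and camera distance are illustrative.

def rotate_y(point, angle):
    """Rotate (x, y, z) by `angle` radians around the y axis."""
    x, y, z = point
    c, s = math.cos(angle), math.sin(angle)
    return (c * x + s * z, y, c * z - s * x)

def project(point, distance=4.0):
    """Simple perspective projection: farther points shrink toward the centre."""
    x, y, z = point
    scale = distance / (distance + z)
    return (x * scale, y * scale)

# One corner of a cube, rendered over a full turn: each frame rotates
# the point a little further, then flattens it to 2D screen coordinates.
corner = (1.0, 1.0, 1.0)
for frame in range(4):
    angle = frame * math.pi / 2          # quarter turns
    print(project(rotate_y(corner, angle)))
```

Playing such projected frames in sequence is what makes a wireframe or rendered model appear to spin on screen.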

The CGI animation editing behind a scene in Frozen

The birth of computer animation

In the 1940s, people started experimenting with computer graphics to further science and technology. John Whitney Sr., a composer, animator and inventor, built a computing device from a converted Kerrison Predictor, a World War II-era anti-aircraft fire-control system, and used specific mathematical formulas to make it manipulate lines and shapes. Then, in 1958, with the assistance of graphic designer Saul Bass, he used this machine to animate the opening title sequence of Alfred Hitchcock’s Vertigo. It is one of the first live-action films to use computer animation, and after this movie, Whitney earned a well-deserved spot in animation history.

The poster of Alfred Hitchcock’s Vertigo

The arrival of digital computers

The real boom in computer animation came after the arrival of digital computers in the 1960s. The University of Utah and Ohio State University were among the first to establish computer animation departments on campus. At the same time, other institutions, like the National Film Board of Canada, also began experimenting with this new discipline.

This was the beginning of the race to create well-developed, cost-effective and efficient animation software. Researchers mainly directed these early explorations in computer animation toward scientific and engineering purposes. But after Whitney’s groundbreaking work on Vertigo, the creative side of this new technique would never die out.

The first CGI short films

Zajac’s CGI short film: ‘Simulation of a Two Gyro Gravity Gradient Attitude Control System’

Edward E. Zajac of Bell Laboratories, New Jersey, made one of the first-ever computer-generated films in 1963: a simple animated sequence showing a satellite orbiting a sphere representing the Earth, titled ‘Simulation of a Two Gyro Gravity Gradient Attitude Control System’.

By the 1970s, the general public recognised computer animation as an art form. Government funding and growing public interest encouraged the evolution of graphic design tools and software for computers.

In 1972, Ed Catmull produced ‘Hand/Face’, one of the notable computer-generated short films of this period. But his film remained within computer science research circles and laboratories.

Catmull’s CGI animation short film: Hand/Face

CGI reaches the big screen

The 1973 movie ‘Westworld’, written and directed by Michael Crichton, was one of the first films with CGI to reach the big screen.

Crichton asked John Whitney Jr and Gary Demos of Information International, Inc. to create some of the scenes in Westworld representing the point of view of the android in the movie. He wanted to give these scenes a digitally enhanced feel, and Whitney Jr didn’t fail to impress. Using the best of 70s technology, he processed the footage to make it look digitally pixelated and blurred, in keeping with Crichton’s vision for the movie.

The POV of an android in Westworld
Another CGI shot in Westworld

In 1979, George Lucas recruited some of the top talent from the Computer Graphics Laboratory at the New York Institute of Technology and set up a special effects division called the Graphics Group of Lucasfilm, which eventually introduced the first wireframes in CGI movies. Later, in 1986, Apple’s Steve Jobs funded this group to branch off and become the now world-famous animation production house: Pixar.

Epilogue

This post was just your first glimpse of the long journey that CGI technology took before becoming the masterpiece of software it is now. To read about what may be the most significant decades of its development, stay tuned for the next post in this series!
