The Advantages & Disadvantages of CGI
One Film Nerd’s Opinion
Of all the modern, often-overused pieces of movie technology, none has been as heavily debated among film fans as computer-generated imagery (CGI). CGI appeared in a major film as early as Michael Crichton's Westworld (1973), and for well over a decade now it has been Hollywood's go-to for special effects. Audiences have both criticized and praised it. Some see it as a way to push special effects to their fullest limits, creating new worlds and expanding possibilities, while others see it as overused, fake, and never as good as the real thing. Personally, I find the use of CGI in modern Hollywood films a bit bothersome, but I acknowledge that it also has its advantages.
CGI, as we know it, was first developed in the Soviet Union in 1968, when mathematicians and physicists used a groundbreaking computer called the BESM-4 to animate a cat moving across the screen. The BESM-4 printed hundreds of frames that could then be transferred to film. CGI really gained traction in the entertainment business in the 1970s, and the first 2D computer-generated effect in a feature film appeared in Westworld (1973), as the pixelated point of view of Yul Brynner's robot gunslinger. That effect was nothing to write home about, and it doesn't feel very "CGI-ish" now; it was a more primitive version of what we know as CGI today. The lambasted sequel, Futureworld (1976), took it further, showcasing the first 3D CGI in a feature film: a computer-rendered hand and face appearing on screen. Laughable and outdated by today's standards, for sure, but back then that was considered groundbreaking. The movie that really brought CGI to the forefront, and the same film that basically created the blockbuster as we know it today, was, of course, George Lucas's Star Wars (1977). Lucas saw potential in CGI and incorporated into Star Wars the work of Larry Cuba, created at the University of Illinois at Chicago, in the wireframe Death Star briefing shown to the Rebel pilots before their attack.
In Star Wars' wake, several films used CGI in various forms as producers started to see more potential in the format. Superman: The Movie (1978) was the first film to use an all-CGI title sequence, and a very stirring one at that. Films like Alien (1979) and Disney's The Black Hole (1979) pushed the format further with 3D wireframe raster graphics (images built from arrays of pixel values), which made for more detailed effects. Even though CGI was used more and more into the '80s and early '90s, it was reserved for specific moments, such as the Genesis effect (a fractal-generated landscape) in Star Trek II: The Wrath of Khan (1982). The first extensive use of fully rendered CGI came in Tron (1982), followed by the first CGI fully integrated with live action in The Last Starfighter (1984), and the first photo-realistic CGI character, the stained-glass knight, in Young Sherlock Holmes (1985).
Even as Hollywood continued experimenting with CGI, older techniques like practical effects and blue screen remained the preferred approach until the 1990s, when two films changed the industry forever: James Cameron's Terminator 2: Judgment Day (1991), which featured the first partially CGI main character, the T-1000, along with some of the first major 3D effects in a blockbuster; and Steven Spielberg's Jurassic Park (1993), with its photo-realistic CGI dinosaurs. For better or worse, both films transformed CGI as we know it.
With the success of both Terminator 2 and Jurassic Park, filmmakers were willing and able to use CGI more widely, often replacing everyday items and characters that would otherwise have been shot practically. This led to directors being criticized for "abusing" the format. Examples of 1990s overuse include Scorpion, Sub-Zero, and Reptile in Mortal Kombat (1995), and the depiction of Hell in Spawn (1997), which was hyped as the superhero visual spectacle of its time but, in truth, looked laughable and outdated even then. These films and their CGI prompted critics to complain that it all "looks like a videogame."
As CGI became more dominant, filmmakers continued to find ways to push the format. Peter Jackson's The Lord of the Rings trilogy (2001-2003), for example, broke new ground with digital characters developed by Weta Digital out of New Zealand: Gollum was brought to life through a motion-capture performance, while the studio's Massive software used artificial intelligence to animate entire battle armies. After The Lord of the Rings, CGI seemed to officially become Hollywood's primary source of special effects, and practical effects and blue screens seemed on their way out, much to the dismay of film fans such as myself. Probably the biggest recent development in CGI, though, came with James Cameron (always one to push the format) and Avatar (2009), which used performance capture to create photo-realistic 3D characters in a fully CGI, photo-realistic 3D world, setting the stage for many films in the years that followed, like the recent Planet of the Apes trilogy (2011, 2014, 2017).
While CGI definitely has its benefits, it has also created a film world that feels far less real. Sure, fantasy is fun, but when the characters and sets are physically there, watching the film becomes all the more impactful. We may look at the special effects of the past and laugh, but at least they used real models, whereas CGI often feels forced and unnatural.
That being said, as special effects have grown, CGI has accomplished a great deal. Many films could not have been brought to the screen without its advancement, like Doctor Strange (2016), with its astral world, or Gravity (2013). Similarly, CGI has gotten so good in the past few years that it can create creatures far more lifelike than ever before, as evident in Disney's recent remake of The Jungle Book (2016). So, truthfully, CGI does have its perks. It's the overuse of, and overreliance on, it in many films that people such as myself take issue with. The best way to use CGI, in my opinion, is to blend it with a prominent use of practical effects, which filmmakers such as Christopher Nolan have masterfully employed: in The Dark Knight (2008), with a real building blowing up; in Interstellar (2014), during the film's climax; and recently in Dunkirk (2017), for its war scenes. Hopefully, future filmmakers won't forget that real props, set designs, and actors help enormously with the overall experience, and won't overuse CGI. I know I would like that.