With CGI ever more prevalent, should filmmakers be focusing on computer-generated characters over the real deal?
In recent times, filmmakers have taken us on uncomfortable, awkward trips through the ‘uncanny valley’. The “valley” is the dip in a graph plotting viewer comfort against realism: the line climbs from stick figures at one end towards real humans at the other, plunging just before it reaches photo-realism. By getting close, but not close enough, to photo-realism, imagery leaves viewers unsettled by what they’re looking at, unable to fully relate.
We become so aware that we’re watching second rate visual effects that the film world we were engrossed in stops being real
People can be scared by the unfamiliar when it is placed out of context, but in the horror genre, for example, this can be a good thing, usually thanks to bone-chilling prosthetics. Dolls, masks, even advanced robotics have frightened mankind inside and outside the multiplex for generations. When it comes to lathering a film in CGI, however, the effect more often than not jolts the audience out of the diegesis entirely. We become so aware that we are watching second-rate visual effects that the film world we were originally engrossed in stops being real. We become aware.
Speaking of horror, this reliance on visual effects has become a trend, much to the genre’s detriment, with CGI an unwanted replacement for good, old-fashioned make-up. Mama is a recent example. While it excels in the standard tension-building moments and scares (which even the worst horror flicks, however lacking in story, can get right), its lasting effect falls flat by the conclusion, as we yawn at the same contorted computer-generated faces reminiscent of those in I Am Legend.
There is a reason why Chucky, Michael Myers, and even the gremlins scared so many: their aesthetic didn’t look out of place in the real world. Now, with filmmakers presuming they can simply hurl visual effects at a piece in post to make it work (and why shouldn’t they? You can create literally anything, right?), they seem to be skipping the special attention once afforded to the creative process, instead leaning on an oft-abused VFX industry to do the work for them.
It’s said you can tell who the best visual effects artists are by how little you notice their work – Nolan is particularly adept at this
But this is apparent across all genres – even films with the biggest budgets have us rolling our eyes as they try desperately to crawl out of a valley they stumbled into themselves. Just this year, reason went out of the window on the production of 47 Ronin, whose monk characters’ faces were wrought in completely unnecessary visual effects where prosthetics would easily have sufficed. In CGI-heavy movies there are usually at least one or two moments that simply can’t hold up across the film’s duration (even the ‘great’ Avatar included). CGI is sticking out like a sore thumb at the moment, and people are beginning to appreciate special effects (those created ‘in-camera’, on set during production) over visual ones.
If anything, a subtle amalgamation of the two is what’s needed. It’s said you can tell who the best visual effects artists are by how little you notice their work. Nolan was particularly adept at this in his direction of The Dark Knight Rises, where CGI became a tool to aid an already firm backbone of practical effects. Why lose the style points? Crossing platforms for a moment: with the next generation of consoles trying to make their mark, players are becoming obsessed with how well textured a door frame is rather than being immersed in a game’s overall beauty.
More on film: Did we lose Peter Jackson to CGI overload?
Of course, computer games are a whole other story, where actors are reduced to mo-cap puppets and voiceovers for practicality’s sake. Interestingly, though, one of the closest examples of breaking past the uncanny valley through pure animation came in the cut scenes of Halo 4 – spliced together, they were practically of cinematic length, too. If computer games can do it, why can’t live-action film get it right? To put it simply, it shouldn’t have to.
It may be difficult to ever reach past the uncanny valley, not that there would be much point with human characters anyway – why not just use humans?
It may be difficult to ever reach past the uncanny valley, not that there would be much point with human characters anyway – why not just use humans? This is why Gravity succeeded: despite almost everything in the film being a visual effect, the faces themselves weren’t, allowing us to relate to the characters. With today’s technological advancements we’re certainly close to scraping our way out of the valley and rendering audiences incapable of telling the difference, but outside of games, what would be the point? Why risk it? It could be another millennium before that level of viewer immersion becomes a universal reality for film, and we’re bound to face many VFX flops along the way.
Spectacle is forever growing, with the innovators of film wanting to take it to the next level again and again – and that’s a good thing. Visual effects that expand the inorganic, and even the alien, have proven successful time and time again, but those films have to be rooted in human reality, otherwise we can never relate to their heroes. Passing the uncanny valley in that regard is certainly an attainable goal, and has been attained in small doses, but perhaps, by natural instinct, we as a species simply like our humans ‘human’.
More on CGI: Hollywood’s next big fad should be realism
Featured image: Warner Bros
Inset image: Universal; Warner Bros