Years ago, in the early days of CD-ROM, live actor video footage was very common. Today, this is almost completely unheard of. Computer generated graphics are the standard. I recently tried the PS3 demo of Command & Conquer Red Alert 3 and was shocked to see live actor footage in a new game.
During its heyday, live actor footage had a bad reputation among gamers; it was the quick time events of the early ’90s. Gamers generally disliked it for a few reasons:
- Its presence implies games that you watch rather than games that you play.
- It creates a discontinuity between gameplay graphics and cut scene graphics. In other words, playing as a 3D-rendered character and then switching to watch a live actor can kill the immersion of the game.
- Part of the allure of video games is that they are a showcase for flashy technology and programming. CG cut scenes show off the skills of the art and programming teams. In-engine and in-game cut scenes are common bragging points. Live actor video is the complete antithesis of all of that: it avoids programming and technology.
- The quality of the acting is generally horrible. The field of video games may be known for bad stories and writing, but the live acting is an order of magnitude worse.
The Red Alert 3 acting is a case in point: it’s purposefully campy, silly, and outrageously stupid.
However, today is a different time. We’ve seen a few games grow by leaps and bounds in delivering quality dialog, humor, atmosphere, and voice-over work. I’d like to see a more serious dev team give this a shot and try to deliver something genuinely innovative. What do you think: is it a good idea, or is this a technique that should stay buried?
Written by: Darrin
- Contributing Editor