I wholeheartedly agree with Patrick that cut-scenes need some revision -- someone needs to send an editor to the Kojima offices, for instance. I don't particularly like superfluous sequences that break the gameplay's flow; however, I don't think quick-time events are the answer, either (as I've written before). I still think the closest we have to seamless narrative in video games is the Half-Life approach that puts a premium on interactive storytelling (to a degree).
So, what makes a good cut-scene? I'll share a few points that gamers should keep in mind when determining whether a cut-scene has had real time and effort put into it.
The action within the cut-scene matches the gameplay the user experiences, or at least doesn't exceed it. This is, in some ways, the most important point. In cut-scenes from intense action games such as Resident Evil or Crysis, the character often performs actions that simply aren't possible during normal gameplay.
And this completely ruins whatever immersion the gamer has built up. If you've been controlling a character for a solid 10 hours, you feel in sync with that character and know how to operate him or her efficiently.
But too often gamers watch that same character in a cut-scene perform moves that aren't in sync with the way they have been playing. The character might move too quickly, use her hands where she would have relied on weapons, and so on.
The cut-scene at some point provides an action for the user to perform. Some gamers know these as "quick-time events." These play out like a normal cut-scene, but within that scene the game asks the user to perform some action: an attack, a handshake, or whatever it may be.
Sometimes there is a consequence for missing the event, and sometimes there isn't. Many gamers hate quick-time events, but they are at least better than nothing at all, which leaves the gamer to sit back and watch. Mass Effect is a great example here: games need to live up to their actual definition and provide some type of gameplay for users to interact with.
Cut-scenes are fine, but they need to be coupled with action to make them worthwhile.
The cut-scene contributes something to the plot. There's a saying in screenwriting that goes roughly: start a scene as late as possible, and finish it as early as you can. The same applies to cut-scenes. Developers really need to be asking themselves: Is this cut-scene actually necessary?
Can the information in the cut-scene be delivered through gameplay instead? If there is any doubt, gameplay should take precedence over the passive activity.
The Call of Duty series is the absolute worst offender here. The franchise has gamers controlling their characters, and all of a sudden they will enter a quick-time event that sees them slide down a hill or grab on to a cliff. Why do something that completely ruins the gamer's immersion instead of providing her with the tools to interact with the game on a deeper level? If given the choice, allow gamers to hang on to the cliff themselves -- they shouldn't have it done for them.
Cut-scenes need to actually integrate with the story. Every cut-scene -- every single one -- must provide some type of information to the player. That information might be visual or verbal, but it must contain something the player didn't know before.
Simply showing something cool, like an explosion with the character dodging away, might be fun, but it doesn't deliver the player any kind of narrative prompt. For all of its strengths, Prince of Persia: The Sands of Time was pretty bad at this.
After every fight, a short cut-scene would show the Prince sheathing his sword. Who cares? Do we need this? Not at all -- it provides nothing.
Cut-scenes are a tool -- they are not the finished project. They need to assist the player in coming to the conclusion of the story -- not just give her something pretty to look at.