59

When playing games nowadays, there are often story-related cut-scenes during which you cannot interact with the game; you can only watch and listen while they play.

However, in a lot of games these story scenes are rendered live by the game engine rather than played back as pre-rendered videos. I can't understand why the story scenes aren't pre-rendered during game development.

My question is:

Why not pre-render a story cut-scene during game development as a video at the highest settings, so the user doesn't need to render it on their end?

My guess is that game developers don't do this because cut-scenes often need to transition smoothly into real-time gameplay, which is hard to achieve with video.

Vaillancourt
  • 16,325
  • 17
  • 55
  • 61
Thomas Wagenaar
  • 703
  • 1
  • 5
  • 7
  • 28
    There's also asset size. Rendering with the engine, you just store the animation data. Pre-rendering at high res means gigabytes of data. – Almo Nov 11 '15 at 20:33
  • 58
    I can't tell if you already know this, but fyi most games used to do their cutscenes with prerendered movies. They actually moved away from doing it that way. In particular, refer to FF7 and 8 on Playstation. – jhocking Nov 11 '15 at 20:54
  • @jhocking yeah, the good old times of FF7... blocky arms and 4 pixels face features don't make good cut-scenes :P – Vaillancourt Nov 11 '15 at 21:02
  • 4
    and the good older days, where they used actual actors in the cutscenes (like Jedi Knight) instead of rendered ones (like in the expansion of Jedi Knight) – SztupY Nov 11 '15 at 21:29
  • 3
    @jhocking Spot on! Don't forget FF10! – MonkeyZeus Nov 11 '15 at 21:48
  • 4
    Actually, there were entire games consisting only from pre-rendered scenes (some older adventures). It was back then when rendering 3D scenes in decent quality was big problem. – wondra Nov 11 '15 at 22:18
  • Anyone remembers Rise of the Robots or Killer Instinct on Super Nintendo/- Famicom? Looks like I'm Old Hat now. Not sure if good or bad ... – phresnel Nov 12 '15 at 12:45
  • 2
    Because you shouldn't have cut scenes at all. :-) – R.. GitHub STOP HELPING ICE Nov 12 '15 at 17:37
  • 2
    There is a possible compromise between the two approaches. The scene can be rendered live but with more frames buffered than during game play. The additional latency from more buffered frames would not be an issue in a non-interactive scene. – kasperd Nov 13 '15 at 01:34

5 Answers

93

I can think of a couple of reasons:

  • Pre-rendering and recording the video costs a lot more disk space than animating it live with bone animations.
  • Visual aspects. If you pre-rendered half of what you see in the game with "better" quality, the transition could make the game lose its flow. Imagine following the main character of a game rendered with 1,000 polygons during normal gameplay, then suddenly seeing him rendered with 2,000 polygons, nice skin smoothing and 25 shader effects for two minutes, before jumping back to the low-res character... this would look odd.
  • Internal approval/modification process. I would guess that the pipeline for simply modifying a cut-scene animation within a game company is much simpler than modifying the original animation asset, re-rendering it, and having it approved.
  • No need for it (or 'because they can'). Game engines and hardware are getting more sophisticated and powerful, so the quality of what they produce is getting closer to cinema-quality graphics. Since the end user's machine can do it, why bother with pre-rendering?
  • Flow. As you suggested, unless you push your engine to handle it, there could be a small loading time before showing a video, whereas transitioning to an in-engine cut-scene implies loading a smaller amount of data and a small game state transition.
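To put rough numbers on the disk-space point, here is a quick back-of-envelope sketch. Every figure in it (bitrate, bone count, keyframe rate) is an assumption chosen for illustration, not a measurement from any real game:

```python
# Back-of-envelope: pre-rendered video vs. in-engine animation data
# for a single 2-minute cutscene. All constants are illustrative
# assumptions, not figures from a real game.

CUTSCENE_SECONDS = 120

# Pre-rendered video: 1080p at an assumed high-quality bitrate.
VIDEO_BITRATE_MBPS = 20  # megabits per second (assumed)
video_bytes = CUTSCENE_SECONDS * VIDEO_BITRATE_MBPS * 1_000_000 / 8

# In-engine: store only keyframed bone transforms; meshes and
# textures are reused from the game's existing assets.
BONES = 60                # skeleton size (assumed)
KEYFRAMES_PER_SECOND = 30
FLOATS_PER_BONE = 10      # position + rotation quaternion + scale
anim_bytes = (CUTSCENE_SECONDS * KEYFRAMES_PER_SECOND
              * BONES * FLOATS_PER_BONE * 4)  # 4 bytes per float

print(f"video: {video_bytes / 1e6:,.0f} MB")      # video: 300 MB
print(f"animation: {anim_bytes / 1e6:,.1f} MB")   # animation: 8.6 MB
```

Even with generous assumptions for the animation data, the video is roughly 35 times larger, and that is per cutscene, multiplied across a whole game.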

Suggested from the comments:

  • Cost. Third-party software that optimizes pre-rendered video playback (such as Rad Game Tools' "Bink Video") can be costly to license and implement. (Thanks @Honeybunch)

  • Customization. If the player can change their character's appearance, equipment, or party composition, then either these choices are not reflected in the pre-rendered sections (e.g. a canonical character/loadout/party is always shown) or a number of pre-rendered videos are required to cover all possible combinations (which quickly becomes impractical if the player has many choices). (@DMGregory & @Kevin)

Vaillancourt
  • 16,325
  • 17
  • 55
  • 61
  • You beat me to it. The only thing I would add is that there are 3rd party tools to handle video optimized for games, such as Rad Game Tool's "Bink Video". These solutions aren't free though and don't really solve the fact that you'd still be storing a large number of movie files. – Honeybunch Nov 11 '15 at 20:44
  • @Honeybunch You mean that they don't pre-render because the use of a 3rd party software to optimize the rendering would be too costly? – Vaillancourt Nov 11 '15 at 20:55
  • Yes, spending time on creating an optimal system for video playback is costly and there aren't many good, free solutions. The good solutions are very costly and don't really solve the problem of "we have a game engine, it works and this sequence is quick so lets not bother" – Honeybunch Nov 11 '15 at 20:57
  • 9
    The last point actually relates to the other ones in a tricky way. Game developers used to use prerendered cutscenes because the graphics in-engine weren't good enough (ie. they had to), but stopped doing it as soon as they could because of the other reasons. – jhocking Nov 11 '15 at 20:58
  • I agree with your points, but do note that for a good while this was indeed done a lot in games (because visuals were crap so they could only really shine in the cinematics). – Ludwik Nov 11 '15 at 21:39
  • Some speculation leading to an additional, unmentioned point: In-game graphics have become more cutscene-capable. Maybe I am thinking of different games than others in here, but I have a hard time remembering any older game that provided pre-rendered cutscenes that were actually using the game engine. Cutscenes were, for instance, 3D renderings in games that had tile-based 2D graphics, or cartoon-like graphics in otherwise rather pixelated games that did not support different camera angles (think some adventures like "Sam & Max: Hit the Road"). – O. R. Mapper Nov 11 '15 at 21:49
  • 8
    I think, especially during the transition away from pre-rendered cutscenes, a lot of developers were going realtime to prove that they could. It's still not uncommon to see trailers lead with the boastful promise "everything you're about to see is rendered realtime" - it's a selling point in this view, and falling back on something pre-rendered is sometimes felt to be "cheating" or trying to cover up for a game engine that's not up to the task. Also, realtime cutscenes can reflect the player's character/party customization, while pre-rendered would need to include vids for every combination. – DMGregory Nov 11 '15 at 22:29
  • 5
    If you pre-rendered half of what you see in the game with "better" quality, the transition could make the game lose it's flow. Yes, exactly. This effect was very noticeable (and jarring) in Final Fantasy 8, for example, where the in-game graphics were already very good anyway, for a PS 1 at least, but then suddenly you hit a cutscene and everything looks all different because you're in a movie instead of the game engine. – Mason Wheeler Nov 11 '15 at 23:16
  • 43
    Another point: If the player character has customizable anything (even if it's something really simple like outfit color), the player will notice when your prerendered cutscene doesn't match their customizations. Or else you have to prerender a separate cutscene for every possible custom character, which may or may not be viable (e.g. try prerendering every possible Commander Shepard or Dovahkiin). – Kevin Nov 12 '15 at 00:06
  • @DMGregory I did not add that there because Brian Rasmussen had already posted that exact answer before any one else suggested it in the comments. The point is valid and it's been added by the community so I won't edit it out :) – Vaillancourt Nov 12 '15 at 17:48
  • Good point, sorry I noticed that late. I can roll back my edit if you'd like. – DMGregory Nov 12 '15 at 17:50
  • @DMGregory I don't think it should be rolled back, no worries :) – Vaillancourt Nov 12 '15 at 17:51
  • 1
    (Probably unimportant from a dev perspective, but very appreciated by this user:) rendered scenes seem to provide better control than movies. Some (precious few) games allow you to pause while in a cutscene, which can be very useful during a long cutscene with someone banging on the door. I've yet to see a pre-rendered cutscene that I could pause. – muru Nov 13 '15 at 02:40
  • 1
    Also, modularity. English is not my native language, so bear with me on this thought: If you make a pre-render scene to show on certain story points, that scene will always be the same, unless you make different versions of the same ending to meet different criteria (like Mass Effect 3 final push). But if you make it "in-game", you can make small, procedural differences, like in Dark Souls. When that giant crow picks you up to take you to the Asylum? The number of feathers is always different, it's randomized. – Hugo Rocha Nov 13 '15 at 18:33
  • That brings to the game a different perspective each time the player runs the game and reach that point - minor differences can make huge differences - dark souls is full of that. It feels natural, unique. That's only possible with in-game cinematics. – Hugo Rocha Nov 13 '15 at 18:36
  • 9
    You forgot another important one: Real-time cutscenes allow the game to render at exactly the selected display resolution. With a pre-rendered cutscene, upscaling or downscaling and different aspect ratios will look bad. – Keavon Nov 13 '15 at 23:09
  • 1
    @muru: I'll have to check whether I misremember this, but I think the good old Lucas Arts adventures behaved excellently in that the game could be paused with the space bar at any point in the game, cutscene or not. Unfortunately, many newer games do not only not support this pausing at any time, they even make things ambiguous in that it is not intuitively clear whether space (or whichever key you use to pause) will actually pause the cutscene (so you can complete watching it later) or skip the cutscene (so you cannot watch it at all). – O. R. Mapper Nov 16 '15 at 12:18
44

A lot of games allow characters to change appearance during play. Cut scenes rendered in the engine can reflect these changes; with videos you don't have that flexibility.

House
  • 73,224
  • 17
  • 184
  • 273
Brian Rasmussen
  • 543
  • 3
  • 7
  • 31
    This can have hilarious consequences when your character is involved in a serious cutscene while wearing the wackiest costume you can find. – BenM Nov 11 '15 at 22:05
  • 3
    I agree, and I think that's the entire point of games like Saints Row :) – Brian Rasmussen Nov 11 '15 at 22:07
  • 3
    I remember a game that had exactly this problem (Gothic 1 and Gothic 2). It was solved by requiring a special outfit to continue in the game (so the outfit matches the one in the rendered scene). Another point that might be interesting is future computing power. Older games can now be played at higher resolution and detail than when they hit the market. Depending on how the scenes were made, the in-game quality might now be higher than the scene's. – Skalli Nov 12 '15 at 12:59
  • 6
    @BenM. This reminds me of Assassin's Creed 4: Black Flag, in which the player can wear costumes with or without hoods. However, since the official Assassin robe features a hood, we get cutscenes in which Edward tries to cover his face with his hood, even though his current attire doesn't have one. It looks really silly. – Nolonar Nov 12 '15 at 13:38
  • 2
    This is 100% of the reason that we use in-engine cutscenes instead of prerenders. Being able to do our cutscenes in Maya would save us a lot of tool development, and there's enough space on the blu-ray to store the video we'd need. But our game lets players costume their character, so that costume needs to be in the cutscenes. – Crashworks Nov 12 '15 at 21:30
  • 1
    I remember being impressed when the cutscenes in the original GTA3 picked up smoothly from the action, with all the cars, bodies, etc wherever you left them. – AShelly Nov 13 '15 at 18:37
23

Other advantages of rendering cutscenes in-engine include:

  • It scales to the hardware, unlike pre-rendered videos. Some older games have cutscenes rendered in 720p or lower, which simply don't look as nice on a 4k or higher resolution screen.

  • It is more mod-friendly. If you choose to install high-res texture mods, or simply wish to replace your Witcher's horse with a cool-looking giant wolf, those modifications would be nowhere to be seen on a pre-rendered video.

  • Lip synchronization is much easier to deal with in-engine. Nowadays, lip animations depend on audio tracks. That means the voice actor doesn't need to sync his speech to the character's lips' movements. If you ever need to localize your game from one language to the next, your voice actor only needs to focus on his lines and you don't need to re-render the scene.

  • It is more flexible. Whether your character is barely standing upright due to his injuries, or whether he's drunk, his current state can be reflected much more accurately when rendered in-engine, than on a video.

  • The environment is reflected more accurately. Maybe you just burned down the entire forest, or you somehow managed to avoid Aerith's death against all odds. Well, a pre-rendered video can't account for every eventuality.


In the end it all boils down to the fact that cutscenes rendered in-engine are dynamic, and therefore offer flexibility which a pre-rendered video simply cannot.

Nolonar
  • 947
  • 1
  • 7
  • 19
19

The PlayStation 1 era Final Fantasy games became famous for pre-rendered cutscenes that looked far better than the actual game. The protagonist of Final Fantasy VII, for example, looked like this during cutscenes:

(image: Cloud in a Final Fantasy VII pre-rendered cutscene)

And like this during the actual game:

(image: FF VII Cloud in-game)

What is the problem with this? The aesthetics of the cutscenes and the actual gameplay differed so much that it barely looked like the same game. The break between cutscenes and gameplay was so obvious that it interrupted the immersion and challenged the player's suspension of disbelief.

Nowadays games avoid this by rendering narrative cutscenes with the game engine. That way the cut between interactive and non-interactive portions of the game is far more seamless.

That said, the technique is not 100% abandoned. The Witcher 3, for example, integrates a few pre-rendered CGI sequences. The intro cutscene is a famous example, but there is also some subtle use during key cutscenes in the actual game, which can only be noticed by paying attention to video compression artifacts. The consistency problem caused by character customization is solved by using this mostly in scenes where no main characters are on screen.

Philipp
  • 119,250
  • 27
  • 256
  • 336
  • 5
    For me seeing close ups of ingame graphic is far more immersion breaking than a pre-rendered video with better graphics. Especially since faces/hairs aren't handled well in real time, even today. – CodesInChaos Nov 12 '15 at 13:55
  • 3
    @CodesInChaos even games which render cutscenes in realtime frequently use higher-quality models, textures, and facial animation during the cutscene portions (since the rendering budget is easier to control in these scenes as compared to when the player is free to roam the whole world, they can afford more detail here where it's most visible). So even in real-time cutscenes, you're not necessarily looking at just a zoomed-in view of the regular character assets. – DMGregory Nov 12 '15 at 17:48
  • 3
    I think Resident Evil 2 handled this problem beautifully, IMHO. Almost all cutscenes start (or are entirely) in-engine procedure, and gracefully transition to CGI. I think this is also comparable to Myst's use of FMV vs., with rare exception, pretty much any other FMV game of the 1990's. Of course, CGI was a little more impressive in 1996 than FMV was in 1992, so I suppose we were more willing to put up with awkward flow to have our CGI cutscenes. – user1103 Nov 13 '15 at 11:36
2

Another issue with pre-rendered cutscenes is covering all the possibilities: in today's games, your actions change the course of the in-game story.

This results in countless different combinations of outcomes, and thus countless different pre-rendered cutscenes. That has some obvious limitations (hundreds of GB, possibly TB).
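To make the combinatorics concrete, here is a small sketch. All the option counts are hypothetical; they only illustrate how quickly the number of required clips multiplies:

```python
from math import comb

# Hypothetical customization options -- none of these figures come
# from a real game; they only illustrate the combinatorial growth.
hair_styles = 10
outfits = 20
weapons = 15

appearance_variants = hair_styles * outfits * weapons  # 3,000
party_variants = comb(8, 3)  # choose 3 of 8 companions -> 56
total_clips = appearance_variants * party_variants     # 168,000

# At an assumed ~300 MB per pre-rendered 1080p clip:
MB_PER_CLIP = 300
print(f"{total_clips:,} clips, ~{total_clips * MB_PER_CLIP / 1e6:.0f} TB")
```

Even with these modest options, a single cutscene would need 168,000 pre-rendered variants (~50 TB), whereas the engine simply reads the player's current state and renders the one combination that actually applies.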

Having the game engine render it in real time means that you only need to pass the relevant parameters describing what actions the player took up to that point. The game engine then works out the appropriate outcome from those parameters.

This way, from a single engine, you can support as many possible outcomes as you want.

Teio
  • 21
  • 3