
My understanding is that the difference in graphical quality between games for systems like the Xbox One and PS4 and Pixar-style movies stems from the fact that the former are rendered in real time.

But even with real-time rendering, it seems some game makers can achieve rather nice-looking results. The example below is from the new Gears of War game:

[Screenshot: scene from Gears of War]

The above image is from a YouTube video posted by GameSpot, so admittedly it could be from footage (e.g. a promo) that was rendered at higher quality than the actual game.

So what I wanted to ask is: are a game's graphics solely dependent on real-time rendering on the target system, or are there steps developers can take (besides the obvious one of starting with the best possible graphics) to improve the graphics in their game?
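To make the question concrete, here is the kind of step I have in mind: my understanding is that some engines "bake" expensive lighting offline into a texture and only look it up at runtime, so the per-frame cost is a cheap lookup rather than a full lighting computation. Below is a minimal, hypothetical C++ sketch of that idea; all the names (`Lightmap`, `shadeRealtime`, etc.) are my own illustrations, not any engine's actual API.

```cpp
#include <cstdio>
#include <vector>

// Minimal 3-component vector for the example.
struct Vec3 {
    float x, y, z;
};

static float dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Real-time approach: evaluate Lambertian diffuse lighting every frame,
// for every visible point. Cost scales with lights * pixels per frame.
float shadeRealtime(const Vec3& normal, const Vec3& lightDir, float lightIntensity) {
    float nDotL = dot(normal, lightDir);
    return lightIntensity * (nDotL > 0.0f ? nDotL : 0.0f);
}

// Baked approach: the same (or a far more expensive, e.g. path-traced)
// result is computed once, offline, and stored in a lightmap texture.
// At runtime the renderer only does a cheap texture lookup.
struct Lightmap {
    int width = 0, height = 0;
    std::vector<float> texels;  // precomputed lighting per texel

    float sample(float u, float v) const {
        int x = static_cast<int>(u * (width - 1));
        int y = static_cast<int>(v * (height - 1));
        return texels[y * width + x];
    }
};

int main() {
    Vec3 normal   {0.0f, 1.0f, 0.0f};
    Vec3 lightDir {0.0f, 1.0f, 0.0f};

    // Offline "bake" step: fill the lightmap with precomputed lighting.
    Lightmap baked;
    baked.width = baked.height = 4;
    baked.texels.assign(16, shadeRealtime(normal, lightDir, 1.0f));

    // Runtime: the lookup costs the same no matter how expensive
    // the offline computation was.
    std::printf("realtime: %.2f  baked: %.2f\n",
                shadeRealtime(normal, lightDir, 1.0f),
                baked.sample(0.5f, 0.5f));
    return 0;
}
```

Is this sort of offline precomputation what developers actually rely on, and are there other such steps?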

Edit: I am referring strictly to in-game graphics here.

  • Are you talking specifically about in-game graphics or cutscenes? While it's getting much better these days, cinematic scenes are much different from in-game graphics, even if they are real-time/Mecanim. – Fuzzy Logic Aug 23 '15 at 00:36
  • @Fuzzy Logic, I have now edited my question (I was referring only to in-game graphics) –  Aug 23 '15 at 02:30

0 Answers