Image Quality - Xbox One vs. PlayStation 4

This is the big one. We’ve already established that the PS4 has more GPU performance under the hood, but how does that delta manifest in games? My guess is we’re going to see two different situations. The first is what we have here today: for the most part I haven’t noticed huge differences in frame rate between Xbox One and PS4 versions of the same game, but I have noticed appreciable differences in resolution/AA. This could very well be the One’s ROP limitations coming into play. Quality per pixel seems roughly equivalent across consoles; the PS4 just has an easier time delivering more of those pixels.

The second situation could be one where an eager developer puts the PS4’s hardware to use and creates a game that scales not just in resolution, but in other aspects of image quality as well. My guess is the titles that fall into this second category will end up being PS4 exclusives (e.g. Uncharted 4) rather than cross-platform releases. There’s little motivation for a cross-platform developer to spend a substantial amount of time optimizing for one console.

Call of Duty: Ghosts

Let’s start out with Call of Duty: Ghosts. Here I’m going to focus on two scenes: what we’ve been calling internally Let the Dog Drive, and the aliasing test. Once again I wasn’t able to completely normalize black levels across both consoles in Ghosts for some reason.

In motion both consoles look pretty good. You really start to see the PS4’s resolution/AA advantages at the very end of the sequence though (PS4 image sample, Xbox One image sample). The difference between these two obviously isn’t as great as from the 360 to Xbox One, but there is a definite resolution advantage to the PS4. It’s even more obvious if you look at our aliasing test:

Image quality otherwise looks comparable between the two consoles.

NBA 2K14

NBA 2K14 is one cross-platform title where I swear I could sense slight frame rate differences between the two consoles (during high quality replays), but it’s not something I managed to capture on video. Once again we find ourselves in a situation where there is a difference in resolution and/or AA levels between the Xbox One and PS4 versions of the game.

Both versions look great. I’m not sure how much of this is down to the next-gen consoles, since the last time I played an NBA 2K game was back in college, but man have console basketball games improved in their realism over the past decade. On a side note, NBA 2K14 does seem to make good use of the impulse triggers on the Xbox One’s controller.



Battlefield 4

I grabbed a couple of scenes from early on in Battlefield 4. Once again the differences here are almost entirely limited to the amount of aliasing in the scene as far as I can tell; the aliasing in the Xbox One version is definitely more distracting. In practice I notice the difference in resolution, but it’s never enough to force me to pick one platform over another. I’m personally more comfortable with the Xbox One’s controller than the PS4’s, which makes for an interesting set of tradeoffs.

286 Comments

  • A5 - Wednesday, November 20, 2013 - link

    AMD is pretty bad at power consumption. See: Bulldozer, R9 290, etc.
  • JDG1980 - Wednesday, November 20, 2013 - link

    That's not really the best comparison, though. Kabini, which uses the same Jaguar cores as the PS4 and XB1, has very good power consumption figures at both idle and load. AMD's mid-range GPUs like the 7790 and 7850 equal or beat Nvidia's solutions in terms of performance/watt.

    Bulldozer was an inefficient design, no doubt about it. Piledriver was a bit better and Steamroller should be better still. But none of that is being used here.
  • Hubb1e - Wednesday, November 20, 2013 - link

I really think this is a case of MS and Sony failing to add the necessary code to take advantage of the silicon. I think they had so many things to do to get these systems working that idle power consumption fell into the "let's do that later" category, which greatly simplifies everything from the initial coding of the OS to the testing and validation. Anand thought that maybe the silicon for turning off cores wasn't there. I doubt that, and I think it will be coming with a patch in the 3 to 12 month timeframe.
  • mikato - Monday, November 25, 2013 - link

    Agree, and I don't know why Anand thought AMD didn't make that available. No reason to remove it that I know of.
  • kallogan - Wednesday, November 20, 2013 - link

A powerful PC with a quad core i7 and a GTX Titan can idle below 30W. Gosh, these are really prehistoric devices. Not green.
  • kyuu - Wednesday, November 20, 2013 - link

    Source please? I don't doubt it idles lower than either console, but 30W seems pretty low to me.
  • ydeer - Thursday, November 21, 2013 - link

    30W is low, but not out of the realm of possibility.

The HardOCP Haswell test system with 16GB RAM and two SSDs used 32W idle. (http://www.hardocp.com/article/2013/06/01/intel_ha...)
A Titan would add less than 10W to that because the IGPU would be completely disabled. (http://www.techpowerup.com/reviews/nvidia/geforce_...)

    So maybe not "less than 30W", but 35W idle should be absolutely possible for a Haswell/Titan machine.
  • ananduser - Wednesday, November 20, 2013 - link

Sorry for the offtopic, Anand, but since you mentioned cutting the cord a few years ago... care to share with us your avenue of choice (as in streaming services, set top boxes and whatnot)?
  • tipoo - Wednesday, November 20, 2013 - link

    The PS4 browser being twice as fast is a surprise, since the CPUs are so close. Do we know the official PS4 CPU clock yet?
  • bill5 - Wednesday, November 20, 2013 - link

it's 1.6GHz vs. 1.75GHz on the Xbox One.

    anand speculates the ps4 is using more cores for os.
