Balancing The System With Other Hardware Features

The biggest technological advantage consoles have over PCs is that consoles are a fully-integrated fixed platform specified by a single manufacturer. In theory, the manufacturer can ensure that the system is properly balanced for the use case, something PC OEMs are notoriously bad at. Consoles generally don't have the problem of wasting a large chunk of the budget on a single high-end component that the rest of the system cannot keep up with, and consoles can more easily incorporate custom hardware when suitable off-the-shelf components aren't available. (This is why the outgoing console generation didn't use desktop-class CPU cores, but dedicated a huge amount of the silicon budget to the GPUs.)

By now, PC gaming has thoroughly demonstrated that increasing SSD speed alone has little impact on gaming performance. NVMe SSDs are several times faster than SATA SSDs on paper, but for almost all PC games that extra performance goes largely unused, in part because of bottlenecks elsewhere in the system that are exposed once storage is no longer a serious limitation. The upcoming consoles will include a number of hardware features designed to make it easier for games to take advantage of fast storage, and to alleviate bottlenecks that would be troublesome on a standard PC platform. This is where the console storage tech actually gets interesting, since the SSDs themselves are relatively unremarkable.

Compression: Amplifying SSD Performance

The most important specialized hardware feature the consoles will include to complement storage performance is dedicated data decompression hardware. Game assets must be stored on disk in a compressed form to keep storage requirements somewhat reasonable. Games usually rely on multiple compression methods: lossy methods specialized for certain types of data (e.g. audio and images), plus a lossless general-purpose algorithm, and almost everything goes through at least one compression method that is fairly computationally complex. GPU architectures have long included hardware to decode video streams and to support simple but fast lossy texture compression methods like S3TC and its successors, but that still leaves a lot of data to be decompressed by the CPU. Desktop CPUs don't have dedicated decompression engines or instructions, though many instructions in the various SIMD extensions are intended to help with tasks like this. Even so, decompressing a stream of data at several GB/s is not trivial, and special-purpose hardware can do it more efficiently while freeing up CPU time for other tasks. The decompression offload hardware in the upcoming consoles is implemented on the main SoC, so it can unpack data after it traverses the PCIe link from the SSD and lands in the main RAM pool shared by the GPU and CPU cores.
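
Kraken and BCPack are proprietary, but the shape of the work being offloaded is easy to see with an open algorithm. Below is a minimal sketch, using zlib as a stand-in codec, of the CPU-side streaming decompression that the consoles' fixed-function units replace; the zlib calls are real, while the surrounding function is simplified for illustration.

```c
/* Minimal sketch: CPU-side decompression with zlib standing in for the
 * consoles' proprietary codecs. Every output byte passes through the CPU
 * here; this is the work the dedicated units take over. */
#include <string.h>
#include <zlib.h>

/* Decompress a chunk already read from the SSD into `dst`.
 * Returns uncompressed bytes produced, or -1 on error. */
long decompress_chunk(const unsigned char *src, size_t src_len,
                      unsigned char *dst, size_t dst_cap)
{
    z_stream zs;
    memset(&zs, 0, sizeof(zs));
    if (inflateInit(&zs) != Z_OK)
        return -1;

    zs.next_in   = (unsigned char *)src;
    zs.avail_in  = (uInt)src_len;
    zs.next_out  = dst;
    zs.avail_out = (uInt)dst_cap;

    int ret = inflate(&zs, Z_FINISH);  /* the CPU-hungry inner loop */
    long produced = (ret == Z_STREAM_END) ? (long)zs.total_out : -1;
    inflateEnd(&zs);
    return produced;
}
```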

Decompression offload hardware like this isn't found on typical desktop PC platforms, but it's hardly a novel idea. Previous consoles have included decompression hardware, though nothing that could keep pace with NVMe SSDs. Server platforms often include compression accelerators, usually paired with cryptography accelerators: Intel has offered such accelerators both as discrete peripherals and integrated into some server chipsets, and IBM's POWER9 and later CPUs have similar accelerator units. These server accelerators are more comparable to what the new consoles need, with throughput of several GB/s.

Microsoft and Sony have each tuned their decompression units to fit the performance expected from their chosen SSD designs, and they've chosen different proprietary compression algorithms to target. Sony is using RAD's Kraken, a general-purpose algorithm originally designed for the current consoles, with their relatively weak CPUs and vastly lower throughput requirements. Microsoft focused specifically on texture compression, reasoning that textures account for the largest volume of data games need to read and decompress. They developed a new texture compression algorithm dubbed BCPack, a slight departure from their existing DirectX naming conventions for the texture compression methods already supported by GPUs.

Compression Offload Hardware
                              Microsoft              Sony
                              Xbox Series X          Playstation 5
Algorithm                     BCPack                 Kraken (and ZLib?)
Maximum Output Rate           6 GB/s                 22 GB/s
Typical Output Rate           4.8 GB/s               8–9 GB/s
Equivalent Zen 2 CPU Cores    5                      9

Sony states that their Kraken-based decompression hardware can unpack the 5.5 GB/s stream from the SSD into a typical 8–9 GB/s of uncompressed data, and theoretically up to 22 GB/s if the data is redundant enough to be highly compressible. Microsoft states their BCPack decompressor typically outputs 4.8 GB/s from the 2.4 GB/s input, and potentially up to 6 GB/s. So Microsoft is claiming a slightly higher typical compression ratio, but still a slower output stream due to the much slower SSD, and Microsoft's hardware decompression apparently only covers texture data.
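
As a quick sanity check, the compression ratios implied by those figures fall out directly from the numbers (using 8.5 GB/s as the midpoint of Sony's typical range):

```c
/* Back-of-the-envelope check of the compression ratios implied above. */
#include <stdio.h>

int main(void)
{
    /* PS5: 5.5 GB/s from the SSD; 8-9 GB/s typical out, 22 GB/s peak. */
    printf("PS5 typical: %.2fx, peak: %.1fx\n", 8.5 / 5.5, 22.0 / 5.5);
    /* Xbox Series X: 2.4 GB/s in; 4.8 GB/s typical out, 6 GB/s peak. */
    printf("XSX typical: %.2fx, peak: %.1fx\n", 4.8 / 2.4, 6.0 / 2.4);
    /* Prints ~1.55x vs 2.00x typical: Microsoft's claimed ratio is higher,
     * but its absolute output rate is lower because the SSD is slower. */
    return 0;
}
```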

The CPU time saved by these decompression units sounds astounding: the equivalent of about 9 Zen 2 CPU cores for the PS5, and about 5 for the Xbox Series X. Keep in mind these are peak numbers that assume the SSD bandwidth is being fully utilized—real games won't be able to keep these SSDs 100% busy, so they wouldn't need quite so much CPU power for decompression.

The storage acceleration features on the console SoCs aren't limited to compression offload, and Sony in particular has described quite a few more, but this is where the information released so far is vague, unsatisfying, and open to interpretation. Most of this functionality seems intended to reduce overhead: handling some of the more mundane aspects of moving data around without involving the CPU as often, and making sure the hardware decompression process is invisible to the game software.

DMA Engines

Direct Memory Access (DMA) refers to the ability for a peripheral device to read and write to the CPU's RAM without the CPU being involved. All modern high-speed peripherals use DMA for most of their communication with the CPU, but that's not the only use for DMA. A DMA Engine is a peripheral device that exists solely to move data around; it usually doesn't do anything to that data. The CPU can instruct the DMA engine to perform a copy from one region of RAM to another, and the DMA engine does the rote work of copying potentially gigabytes of data without the CPU having to do a mov (or SIMD equivalent) instruction for every piece, and without polluting CPU caches. DMA engines can also often do more than just offload simple copy operations: they commonly support scatter/gather operations to rearrange data somewhat in the process of moving it around. NVMe already has features like scatter/gather lists that can remove the need for a separate DMA engine to provide that feature, but the NVMe commands in these consoles are acting mostly on compressed data.

Even though DMA engines are peripheral devices, you usually won't find them as standalone PCIe cards; it makes the most sense for them to be as close to the memory controller as possible, which means on the chipset or on the CPU die itself. The PS5 SoC includes a DMA engine to handle copying data coming out of the compression unit. As with the compression engines, this isn't a novel invention so much as a feature missing from standard desktop PCs, which means it's something custom Sony had to add to what would otherwise be a fairly straightforward AMD APU configuration.
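
Neither Sony nor AMD has published a programming interface for the PS5's DMA engine, so the following is a purely hypothetical sketch of what a scatter/gather descriptor list might look like, with a software memcpy standing in for the copy the hardware would perform:

```c
/* Hypothetical scatter/gather descriptor list for a DMA engine; the layout
 * and submit function are invented for illustration, not any real console
 * or AMD interface. */
#include <stdint.h>
#include <string.h>

struct sg_entry {
    uint64_t src_addr;   /* source address of this segment */
    uint64_t dst_addr;   /* destination address */
    uint32_t length;     /* bytes to copy */
};

struct sg_list {
    struct sg_entry *entries;
    size_t           count;
};

/* Software stand-in so the sketch is self-contained: a real DMA engine
 * performs these copies itself and interrupts the CPU when finished,
 * instead of burning CPU time and cache space on memcpy. */
int dma_submit(const struct sg_list *list,
               void (*on_complete)(void *), void *ctx)
{
    for (size_t i = 0; i < list->count; i++) {
        const struct sg_entry *e = &list->entries[i];
        memcpy((void *)(uintptr_t)e->dst_addr,
               (void *)(uintptr_t)e->src_addr, e->length);
    }
    if (on_complete)
        on_complete(ctx);  /* hardware would raise an interrupt here */
    return 0;
}
```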

IO Coprocessor

The IO complex in the PS5's SoC also includes a dual-core processor with its own pool of SRAM. Sony has said almost nothing about the internals of this: Mark Cerny describes one core as dedicated to SSD IO, allowing games to "bypass traditional file IO", while the other core is described simply as helping with "memory mapping". For more detail, we have to turn to a patent Sony filed years ago, and hope it reflects what's actually in the PS5.

The IO coprocessor described in Sony's patent offloads portions of what would normally be the operating system's storage drivers. One of its most important duties is to translate between various address spaces. When the game requests a certain range of bytes from one of its files, the game is looking for the uncompressed data. The IO coprocessor figures out which chunks of compressed data are needed and sends NVMe read commands to the SSD. Once the SSD has returned the data, the IO coprocessor sets up the decompression unit to process that data, and the DMA engine to deliver it to the requested locations in the game's memory.
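
Sony's patent doesn't spell out data structures, so the sketch below is a hypothetical reconstruction of that translation step: a per-file chunk table held in the coprocessor's SRAM, and a loop that turns an uncompressed byte-range request into NVMe reads of compressed chunks. All names here (chunk_table, nvme_read, decompress_then_dma) are invented for illustration.

```c
/* Hypothetical reconstruction of the coprocessor's address translation;
 * all structures and helper functions are invented for illustration. */
#include <stdint.h>

struct chunk_entry {
    uint64_t disk_lba;        /* where the compressed chunk sits on the SSD */
    uint32_t compressed_len;  /* bytes to read from disk */
    uint64_t uncomp_offset;   /* chunk's start in the file's uncompressed
                                 address space */
    uint32_t uncomp_len;      /* chunk's uncompressed size */
};

/* One table per open file, held in the coprocessor's SRAM. */
struct chunk_table {
    struct chunk_entry *entries;
    uint32_t count;
};

void nvme_read(uint64_t lba, uint32_t len);             /* hypothetical */
void decompress_then_dma(const struct chunk_entry *c);  /* hypothetical */

/* Service a game's request for uncompressed bytes [offset, offset+len):
 * find every compressed chunk that overlaps the range, read it from the
 * SSD, then hand it to the decompression unit and DMA engine. */
void service_read(const struct chunk_table *t, uint64_t offset, uint64_t len)
{
    for (uint32_t i = 0; i < t->count; i++) {
        const struct chunk_entry *c = &t->entries[i];
        if (c->uncomp_offset + c->uncomp_len <= offset ||
            c->uncomp_offset >= offset + len)
            continue;  /* no overlap with the requested range */
        nvme_read(c->disk_lba, c->compressed_len);
        decompress_then_dma(c);
    }
}
```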

Since the IO coprocessor's two cores are each much less powerful than a Zen 2 CPU core, they cannot be in charge of all interaction with the SSD. The coprocessor handles the most common cases of reading data, and the system falls back to the OS running on the Zen 2 cores for the rest. The coprocessor's SRAM isn't used to buffer the vast amounts of game data flowing through the IO complex; instead this memory holds the various lookup tables used by the IO coprocessor. In this respect, it is similar to an SSD controller with a pool of RAM for its mapping tables, but the job of the IO coprocessor is completely different from what an SSD controller does. This is why it will be useful even with aftermarket third-party SSDs.

Cache Coherency

The last somewhat storage-related hardware feature Sony has disclosed is a set of cache coherency engines. The CPU and GPU on the PS5 SoC share the same 16 GB of RAM, which eliminates the step of copying assets from main RAM to VRAM after they're loaded from the SSD and decompressed. But to get the most benefit from the shared pool of memory, the hardware has to ensure cache coherency not just between the several CPU cores, but also with the GPU's various caches. That's all normal for an APU; what's novel with the PS5 is that the IO complex also participates. When new graphics assets are loaded into memory through the IO complex and overwrite older assets, it sends cache invalidation signals to the relevant caches, discarding only the stale data rather than flushing the GPU caches entirely.
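
A hypothetical sketch of that difference, with placeholder functions standing in for what the coherency engines do in hardware:

```c
/* Placeholder declarations for what the coherency engines do in hardware;
 * both functions are hypothetical. */
#include <stdint.h>

void gpu_cache_flush_all(void);                             /* heavyweight */
void gpu_cache_invalidate_range(uint64_t addr, uint64_t n); /* targeted    */

/* Called when the IO complex overwrites an asset in the shared 16 GB pool. */
void on_asset_overwrite(uint64_t addr, uint64_t num_bytes)
{
    /* Without coherency engines, correctness would require discarding the
     * GPU caches wholesale and refilling them from DRAM:
     *     gpu_cache_flush_all();
     * With them, only the lines covering the overwritten range go stale: */
    gpu_cache_invalidate_range(addr, num_bytes);
}
```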

What about the Xbox Series X?

There's a lot of information above about the Playstation 5's custom IO complex, and it's natural to wonder whether the Xbox Series X will have similar capabilities or is limited to just the decompression hardware. Microsoft has lumped the storage-related technologies in the new Xbox under the heading of "Xbox Velocity Architecture".

Microsoft defines this as having four components: the SSD itself, the compression engine, a new software API for accessing storage (more on this later), and a hardware feature called Sampler Feedback Streaming. That last one is only distantly related to storage; it's a GPU feature that makes partially resident textures more useful by allowing shader programs to keep a record of which portions of a texture are actually being used. This information can be used to decide what data to evict from RAM and what to load next—such as a higher-resolution version of the texture regions that are actually visible at the moment.
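
Microsoft hasn't detailed the exact feedback format, so here's a hypothetical sketch of the CPU-side half of the idea: a per-tile "minimum sampled mip" map written by the GPU is scanned to decide what to stream in from the SSD and what to evict. The tile count, sentinel value, and helper functions are all invented for illustration.

```c
/* Hypothetical CPU-side consumer of a sampler feedback map; tile count,
 * sentinel, and helpers are invented (the real D3D12 feature records
 * feedback through dedicated resource types on the GPU). */
#include <stdint.h>

#define TILE_COUNT 4096
#define MIP_UNUSED 0xFF  /* sentinel: no shader sampled this tile */

void stream_in_tile(uint32_t tile, uint8_t mip);  /* hypothetical loader  */
void evict_tile(uint32_t tile);                   /* hypothetical evictor */

/* min_mip_feedback[t] holds the most detailed mip level any shader touched
 * in tile t this frame; resident_mip[t] is what's currently in memory. */
void update_residency(const uint8_t min_mip_feedback[TILE_COUNT],
                      uint8_t resident_mip[TILE_COUNT])
{
    for (uint32_t t = 0; t < TILE_COUNT; t++) {
        uint8_t wanted = min_mip_feedback[t];
        if (wanted == MIP_UNUSED) {
            evict_tile(t);              /* unused: candidate to reclaim */
        } else if (wanted < resident_mip[t]) {
            stream_in_tile(t, wanted);  /* visible at finer detail than we
                                           have resident: fetch from SSD */
            resident_mip[t] = wanted;
        }
    }
}
```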

Since Microsoft doesn't mention anything like the other PS5 IO complex features, it's reasonable to assume the Xbox Series X lacks those capabilities and its IO is largely managed by the CPU cores. But I wouldn't be too surprised to find out the Series X has a comparable DMA engine, because that kind of feature has historically shown up in many console architectures.

Comments

  • defaultluser - Friday, June 12, 2020 - link

    We already had that. Sandforce had too many bugs, and if your data was already compressed, it did nothing for load times.

    This is the big and powerful PC, where content has been delivered over the Internet for the last twenty years (and thus arrives in highly-compressed RAR or 7zip formats). And our media tends to be heavily compressed too (MP3s, H.265).

    And on top of that, overhead like Denuvo also stops SATA 6Gb/s I/O from being the limiting factor in PC game load times.

    Console disc ISOs contain much lower-compression game assets, so this is an easy fix for Sony.
  • Kurosaki - Friday, June 12, 2020 - link

    Denuvo? I only play cracked games to play it without denuvo. It's the only righteous path!
  • Nozuka - Friday, June 12, 2020 - link

    Most games are likely still developed with backwards compatibility in mind (PS4/XB1), since they will still be in more houses for the next few years. That will likely slow down the SSD optimization process... :(
  • Midwayman - Friday, June 12, 2020 - link

    That's true every console generation. You might get a first-party title or two early that leverages the new hardware, but it takes a while for people to figure out the systems. Too low of an install base early on for 3rd party devs to risk making something they can't easily port.
  • lmcd - Friday, June 12, 2020 - link

    Microsoft has been talking about keeping Xbox One S-level support for longer than just that. At the end of the day, these two console generations have enough in common that Microsoft will likely enable a title to work on older Xbox entirely based on the game's visual complexity even 5 years from now.
  • shelbystripes - Friday, June 12, 2020 - link

    That’s true, but with these specs, a game developed to run on PS4/XB1 will still benefit from the optimized hardware, and you’ll see much better performance (including ridiculously shorter load times).

    You won’t really need software SSD optimizations until you’re developing games that require the full potential of the PS5/XBX hardware anyway.
  • beginner99 - Friday, June 12, 2020 - link

    Given the size of games, these SSDs are rather small. You can probably have maybe 5 games installed at a time on the PS5 max (without SSD expansion). Meaning if you play a lot of games, you will have to do a lot of uninstall/install operations on an already very full drive (= higher write amplification). Add TLC or QLC on top, and it makes me wonder how long these drives will last.
  • tipoo - Friday, June 12, 2020 - link

    Worth noting you shed 10-12GB of duplicated assets, because you don't need them when all systems have SSDs; duplication was a method for speeding up HDDs.

    I expect 9th-gen games to continue to expand, of course, but at first that will blunt the impact.
  • brucethemoose - Friday, June 12, 2020 - link

    Game hoarders are definitely going to burn through their SSDs and create some bad PR a few years from now. I wouldn't be surprised if there are firmware updates for more over-provisioned area.
  • brucethemoose - Friday, June 12, 2020 - link

    *That being said, I don't think SSD wear will be an issue for 99% of customers. They're going to be more reliable than, say, the infamous spinners on the 360.
