A good explanation of VRAM usage in current games for us mere knuckle-draggers. Probably first-day-of-school stuff in comparison for the likes of Andy
@SharkyUK
🤓 Mooar GDDRz. That's my knowledge exhausted. 😛
I think what's tended to shift in recent times, though, is that it's not always resolution that kicks VRAM in the nads.
Most of the time, the rule of 'more res = more VRAM required' definitely holds. But with more fidelity and fancy effects being applied, those can hog memory just as badly - especially if poorly optimised.
I'd like to know what happens at the other end of the hardware spectrum though - how aware are games of the available VRAM, and do they actually utilise it? So, the likes of The Last of Us using circa 12GB of VRAM - if played on a 4090, does it 'see' that it has double that to play with and take advantage of the fact?
It's a bit of a pain in the ass when it comes to memory management, especially on PC where you effectively have to keep two copies of the data: one in host memory (CPU / system RAM) and one that gets uploaded to the device (GPU / VRAM). There's work going on to get around this limitation, but the current architectures we use aren't well-suited to it, and the dream of ultra-fast shared memory pools across an entire system is still some way off. It would be lovely to have a great big block of general [fast] memory that the CPU and GPU can address quickly and independently, without the inherent stalls and slooooow bus transfers we currently have to deal with.
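To put rough numbers on that double-residency problem, here's a toy C++ sketch (the asset names and sizes are completely made up for illustration) showing how everything an engine keeps resident effectively counts twice - once in system RAM and again in VRAM:

```cpp
#include <cstdint>
#include <iostream>
#include <string>
#include <vector>

// Hypothetical assets; names and sizes are invented for illustration.
struct Asset {
    std::string name;
    std::uint64_t bytes;   // size of the raw data
    bool residentOnGpu;    // has it been uploaded to VRAM?
};

int main() {
    std::vector<Asset> assets = {
        {"albedo_atlas", 256ull << 20, true},   // 256 MB
        {"geometry",      96ull << 20, true},   //  96 MB
        {"animation",     48ull << 20, false},  //  48 MB, CPU-side only
    };

    std::uint64_t systemRam = 0, vram = 0;
    for (const auto& a : assets) {
        systemRam += a.bytes;        // the CPU-side copy always exists...
        if (a.residentOnGpu)
            vram += a.bytes;         // ...plus a second copy in VRAM
    }

    std::cout << "System RAM: " << (systemRam >> 20) << " MB\n"
              << "VRAM:       " << (vram >> 20) << " MB\n";
}
```

Run that and you get 400MB of system RAM against 352MB of VRAM for the same three assets - scale it up to a real game's asset set and you can see where those big dual footprints come from.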
Yeah, you're right - it's not just the resolution of the primary display(s) that pushes VRAM requirements up - it's the heavyweight G-buffers that game engines use, especially as the demand for fidelity, detail and realism increases. When the component parts of those G-buffers have to grow in resolution to realise the sort of visuals that demanding consumers/gamers are after... well... you can probably imagine! As discerning gamers and techies, we know that jumping from 1024x1024 to 2048x2048 isn't simply doubling the requirements for that component - it's 4x more. 4x more storage, 4x more data to move around the system, 4x more data to address, and potentially 4x more pixels for the various shaders to chew through. Of course, this is a very basic outline, but memory usage can easily explode.
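Quick back-of-envelope in C++ on that resolution scaling - the G-buffer layout below (albedo, normals, material params, depth) is an assumed example, real engines will differ:

```cpp
#include <cstdint>
#include <iostream>
#include <utility>

// Bytes for one render target at a given resolution.
std::uint64_t targetBytes(std::uint64_t w, std::uint64_t h, std::uint64_t bytesPerPixel) {
    return w * h * bytesPerPixel;
}

int main() {
    // Assumed deferred layout: albedo RGBA8 (4B), normals RGBA16F (8B),
    // material params RGBA8 (4B), depth D32 (4B) - illustrative only.
    const std::uint64_t bytesPerPixel[] = {4, 8, 4, 4};
    const std::pair<std::uint64_t, std::uint64_t> resolutions[] = {
        {1920, 1080}, {2560, 1440}, {3840, 2160}};

    for (const auto& [w, h] : resolutions) {
        std::uint64_t total = 0;
        for (std::uint64_t bpp : bytesPerPixel)
            total += targetBytes(w, h, bpp);
        std::cout << w << "x" << h << ": " << (total >> 20) << " MB of G-buffer\n";
    }
}
```

That prints roughly 39MB at 1080p, 70MB at 1440p and 158MB at 4K - the same 'double the sides, quadruple the memory' rule applies to render targets just as it does to textures.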
Memory management itself is a bit of a double-edged sword - older versions of, say, Direct3D did quite a lot to hide the complexities of memory management from the developer. That used to be a blessing in some ways, yet ultimately we'd end up wishing we had finer control over how and when memory was allocated and used. With DX12 and Vulkan, it's really up to the developers to deal with, and it's both great and a f**king pain. Having control over how your memory is managed is all well and good, but it's so easy to get it wrong. Sounds crazy - how difficult can it be to allocate, dish out and claw back chunks of memory? You'd be surprised...! One slip and suddenly you're in a world of pain: stuttering, stalls, data not being ready when the renderer calls for it, and a million and one other issues. But it's part of the development process, so ultimately it's up to the devs to get it right. Or to get it right 17 patches after release... 😜
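Just to show how deceptively simple the 'happy path' looks, here's a toy linear allocator in C++ of the kind engines often use for per-frame upload data - a sketch of the idea, not production code:

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Toy linear ("bump") allocator: dish out chunks from one big block,
// claw everything back in a single reset. Sketch only.
class LinearAllocator {
public:
    explicit LinearAllocator(std::size_t capacity) : buffer_(capacity) {}

    // Hand out the next aligned chunk; nullptr when the budget is spent.
    void* allocate(std::size_t size, std::size_t alignment = 16) {
        std::size_t aligned = (offset_ + alignment - 1) & ~(alignment - 1);
        if (aligned + size > buffer_.size()) return nullptr;
        offset_ = aligned + size;
        return buffer_.data() + aligned;
    }

    // Classic bug lives here: reset while the GPU is still reading last
    // frame's data and you get stalls/corruption. The usual fix is one
    // allocator per frame-in-flight.
    void reset() { offset_ = 0; }

private:
    std::vector<std::byte> buffer_;
    std::size_t offset_ = 0;
};

int main() {
    LinearAllocator frameMem(64 * 1024);       // 64 KB frame budget
    void* constants = frameMem.allocate(256);  // per-draw constant data
    assert(constants != nullptr);
    frameMem.reset();                          // next frame starts clean
}
```

That reset() is exactly where it bites: call it while the GPU is still chewing on last frame's data and hello stutters. Hence the usual pattern of keeping one of these per frame-in-flight.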
Ray tracing (and path tracing) only serves to increase demand for VRAM. In addition to the game assets, the ray tracing hardware on the GPU needs some form of BVH in VRAM against which it can cast rays and run its hit/miss shaders, etc. The BVH is effectively an optimised form of the world data (level, characters, effects, and so on) against which rays are traced and hit-tested. As that suggests, it's yet another significant chunk of memory needed for another copy (a subset) of the world data - specifically for the RT side of things. Admittedly, that's a somewhat simplified overview, but it would get boring and TL;DR otherwise. LOLgpu.
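For a feel of the scale, a quick back-of-envelope on the BVH alone - the node size here is an assumption, real implementations vary wildly:

```cpp
#include <cstdint>
#include <iostream>

int main() {
    // A binary BVH over N triangles has at most 2N - 1 nodes. The 32-byte
    // node (AABB + child/leaf indices) is an assumed figure for illustration.
    const std::uint64_t triangles = 10'000'000;  // a heavy game scene
    const std::uint64_t nodeBytes = 32;
    const std::uint64_t nodes = 2 * triangles - 1;

    std::cout << "BVH alone: ~" << (nodes * nodeBytes >> 20)
              << " MB of VRAM, on top of the source geometry\n";
}
```

That's around 600MB for a 10-million-triangle scene, before you've even counted the vertex and index data the BVH refers back to.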
Here's a quick example I've just run up on one of my dev PCs... a relatively simple scene but with hidden complexity. It's fully path traced and uses high-quality PBR assets. As a result, it's using up 22GB of VRAM and 30+GB of system RAM in total. FLOLIPOP. That's how you do it, boys, girls and thems. 👌 The CPU is hardly being utilised.
Er, not sure what else to say now. I can make more pretty pics? 😂 The red, green and blue textures for the curtains alone are 125MB...
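If anyone wants to sanity-check numbers like that, uncompressed texture cost is easy to estimate - the 4096x4096 size below is an assumption for illustration, not the actual curtain assets:

```cpp
#include <cstdint>
#include <iostream>

int main() {
    // Uncompressed texture cost; a full mip chain adds roughly 1/3 on top.
    const std::uint64_t w = 4096, h = 4096, bytesPerTexel = 4;  // RGBA8
    const double base = double(w * h * bytesPerTexel);
    const double withMips = base * 4.0 / 3.0;

    std::cout << "One 4K texture: " << base / (1 << 20) << " MB base, ~"
              << withMips / (1 << 20) << " MB with mips\n";
    // A typical PBR material (albedo, normal, roughness/metalness, AO...)
    // multiplies that by 4-5 maps per material.
}
```

One uncompressed 4K RGBA8 texture is 64MB before mips (roughly 85MB with), and a full PBR material set multiplies that by the number of maps - which is why block compression and streaming matter so much.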
Ok, so this example is a bit extreme and not really indicative of what a current game would use... BUT... with the evolution of Unreal Engine 5 (and similar), this is what we're moving towards. The gap between "game" assets and "movie"-quality assets is closing, but the requirements to facilitate that closing of the gap are ballooning. Thank goodness for AI, DLSS, detail inference, ML, and all the other stuff that's going on.