ClioSport.net


The Power PC Thread [f*ck off consoles]



Darren S

ClioSport Club Member
Fixed it. I must have nudged the 8-pin power at some point last night; reseated it this morning and it powered straight up.

Friendship with liquid cooler is over. Now air cooler is my best friend.
View attachment 1700068

Ran FurMark and Cinebench simultaneously for 30 minutes, before and after the swap, to get some comparison data, and took the temps while everything was still under 100% load.

The CPU temps predictably increased, mostly due to having the intake sucking hot air directly from the hole in the backplate of the GPU, but going from 52°C to 61°C is more than acceptable.

The GPU on the other hand dropped from 74°C to 65°C, causing a slight increase in core clock speed and promising more headroom for fine tuning later 😎

I think it's pretty safe to say that having a 280mm radiator as an intake was a bit restrictive for the graphics card!
The main reason I've always had my rad top-mounted.

Would it fit in there with the rad and two fans attached to it, up top?
 

Card Drama Nonce

CSF Harvester
ClioSport Club Member
  Clio 182
The main reason I've always had my rad top-mounted.

Would it fit in there with the rad and two fans attached to it, up top?
Nah, I tried that. Turns out my case only supports 140mm fans up top, not fans plus a radiator. The top of the motherboard is too close to the top of the case, so the rad or fans would interfere whichever way you oriented them.

It's fine though. I need to sell my old system and was missing a PSU and a CPU cooler; now I'm only missing a PSU. I've lost no performance from the CPU, and the GPU can breathe easier and therefore performs much better. Basically got myself an upgrade by default 🤣
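For anyone wanting to grab the same before/after numbers, the GPU side is easy enough to log by polling nvidia-smi while FurMark and Cinebench are running. A rough sketch (the query flags are standard nvidia-smi ones; the duration and interval are just placeholders, and CPU temps would need a separate tool such as HWiNFO or LibreHardwareMonitor):

```python
import statistics
import subprocess
import time

def log_gpu_temps(duration_s=1800, interval_s=1.0):
    """Poll the GPU core temperature once a second while the stress test runs."""
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=temperature.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        )
        # First line only, i.e. assumes a single GPU in the system.
        samples.append(int(out.stdout.strip().splitlines()[0]))
        time.sleep(interval_s)
    return samples

if __name__ == "__main__":
    temps = log_gpu_temps(duration_s=60)  # short run just to sanity-check it
    print(f"min {min(temps)}°C  max {max(temps)}°C  avg {statistics.mean(temps):.1f}°C")
```

Run it once with the rad as intake and once with the air cooler and you get comparable numbers rather than eyeballing the monitoring software mid-test.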
 

Card Drama Nonce

CSF Harvester
ClioSport Club Member
  Clio 182
After missing out on all the latest Nvidia tech for the last few years I finally tried using DLSS today. f**king Hell, what sorcery is this?!

I'd been forced to use FSR on Hogwarts Legacy and although it was passable, it wasn't what I'd call an ideal solution.

Played The Last Of Us at 4k earlier, with a mix of high and ultra settings, with DLSS performance enabled and I can't believe how good it looks.

The reason for buying an Nvidia card over a comparable AMD card was purely the cutting-edge tech like ray tracing and DLSS, and tonight has really confirmed that I made the right decision. At 1080p I have zero issue getting the framerates I want, but to get the framerates I need at 4K with an AMD card I'd have had to spend a lot more and run the games at native resolution, because in my experience FSR simply isn't as good. It's not bad, but it's not the best.
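To put some rough numbers on why DLSS Performance at 4K still looks this good: each DLSS mode renders internally at a fixed fraction of the output resolution and reconstructs the rest. The per-axis scale factors below are the commonly quoted ones, so treat this as a back-of-the-envelope sketch rather than anything official:

```python
# Commonly quoted per-axis render scales for each DLSS mode (approximate).
DLSS_MODES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    scale = DLSS_MODES[mode]
    return round(out_w * scale), round(out_h * scale)

for mode in DLSS_MODES:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K output, DLSS {mode:<17} -> renders at {w}x{h}")

# Performance mode at 4K lands around 1920x1080, so the card is doing roughly
# native-1080p amounts of work and the upscaler reconstructs the rest.
```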

This was native resolution, running between 30-45 fps

And this was DLSS performance preset, getting a v-sync locked 60fps


If I zoom right in I can see some slight aliasing on the DLSS image, but even putting my face near the screen it's imperceptible in real life.

I honestly thought ray tracing was going to be the thing that amazed me with this card, but frankly it's barely noticeable in the games I've tested. Modern baked in lighting, global illumination, and SSR are so good that ray tracing is almost a waste of time. DLSS on the other hand is an absolute game changer.

In future I'm going to experiment with upscaling games using DLDSR and running DLSS at the same time to improve image quality at 1080p without sacrificing framerate like when using the old DSR system.
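To put rough numbers on that DLDSR + DLSS combination (the 2.25x DLDSR factor and the ~2/3 DLSS Quality scale are the commonly quoted values, so again just a sketch):

```python
import math

panel = (1920, 1080)
dldsr_pixels = 2.25      # DLDSR factor counts total pixels, so 1.5x per axis
dlss_quality = 2 / 3     # commonly quoted per-axis scale for DLSS Quality

axis = math.sqrt(dldsr_pixels)                                   # 1.5
target = (panel[0] * axis, panel[1] * axis)                      # 2880 x 1620
internal = (target[0] * dlss_quality, target[1] * dlss_quality)  # ~1920 x 1080

print(f"DLDSR target {target[0]:.0f}x{target[1]:.0f}, "
      f"DLSS renders internally at {internal[0]:.0f}x{internal[1]:.0f}")
```

So the card does roughly native-1080p amounts of work, but the final image has been through a 1620p reconstruct-and-downscale pass, which is where the extra quality should come from.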

Thanks for coming to my TED talk 🤣
 

Jonnio

ClioSport Club Member
  Punto HGT Abarth
After missing out on all the latest Nvidia tech for the last few years I finally tried using DLSS today. f**king Hell, what sorcery is this?!

I'd been forced to use FSR on Hogwarts Legacy and although it was passable, it wasn't what I'd call an ideal solution.

Played The Last Of Us at 4k earlier, with a mix of high and ultra settings, with DLSS performance enabled and I can't believe how good it looks.

The reason for buying an Nvidia card over a comparable AMD card was purely the cutting-edge tech like ray tracing and DLSS, and tonight has really confirmed that I made the right decision. At 1080p I have zero issue getting the framerates I want, but to get the framerates I need at 4K with an AMD card I'd have had to spend a lot more and run the games at native resolution, because in my experience FSR simply isn't as good. It's not bad, but it's not the best.

This was native resolution, running between 30-45 fps
View attachment 1701507
And this was DLSS performance preset, getting a v-sync locked 60fps
View attachment 1701508

If I zoom right in I can see some slight aliasing on the DLSS image, but even putting my face near the screen it's imperceptible in real life.

I honestly thought ray tracing was going to be the thing that amazed me with this card, but frankly it's barely noticeable in the games I've tested. Modern baked in lighting, global illumination, and SSR are so good that ray tracing is almost a waste of time. DLSS on the other hand is an absolute game changer.

In future I'm going to experiment with upscaling games using DLDSR and running DLSS at the same time to improve image quality at 1080p without sacrificing framerate like when using the old DSR system.

Thanks for coming to my TED talk 🤣

DLSS is cool, but ray/path tracing is the real game changer. 😋

 

Card Drama Nonce

CSF Harvester
ClioSport Club Member
  Clio 182
DLSS is cool, but ray/path tracing is the real game changer. 😋
It depends how it's implemented. Quake 2 (which I believe is fully path traced) is insane, Hogwarts Legacy was barely noticeable most of the time, Lego Builder's Journey looks great but also looks great using normal lighting and reflections, and Metro Exodus looks about the same as usual. Cyberpunk looks great no matter which lighting system is being used, although I don't have the horsepower required for the full path tracing experience so I can't comment on that. Minecraft shaders like PTGI are brilliant.

And DLSS lets you use all these fancy ray tracing things without suffering as much performance loss 😏

Framerate > visuals, always 🫡
 

Ph1 Tom

ClioSport Club Member
Tried a couple of games with ray tracing. Didn't notice it so I don't bother with it.

DLSS, I found, made the image quality worse even on the highest quality setting. FSR 3.0 was better but still not as good as native, so again I don't bother.
 

Card Drama Nonce

CSF Harvester
ClioSport Club Member
  Clio 182
Tried a couple of games with ray tracing. Didn't notice it so I don't bother with it.

DLSS, I found, made the image quality worse even on the highest quality setting. FSR 3.0 was better but still not as good as native, so again I don't bother.
I tried DLSS at 1080p and it was horrendous, but at 4K I'm struggling to see a difference with the naked eye. FSR for me seems to have the same ghosting issues as TAA, so I avoid it.

I got path tracing working on Cyberpunk by putting all settings on low and enabling DLSS but it looked no better than ultra ray tracing, which looked barely better than the standard game. Think I'll just stick to ultra settings and no upscaling and 140fps instead of 60fps with fancy lighting.
 

Card Drama Nonce

CSF Harvester
ClioSport Club Member
  Clio 182
I think it depends how sensitive you are to the changes in the image; some people seem to “see” it much more than others.
I'll be honest, I notice everything which is why I was so shocked at the DLSS performance in TLOU. It's the only thing I've tried so far though, so maybe I'll be disappointed by other titles. I know it was dogshit in Cyberpunk but I'm chalking that up to being 1080p rather than DLSS having issues.
 

SharkyUK

ClioSport Club Member
It depends how it's implemented. Quake 2 (which I believe is fully path traced) is insane, Hogwarts Legacy was barely noticeable most of the time, Lego Builder's Journey looks great but also looks great using normal lighting and reflections, and Metro Exodus looks about the same as usual. Cyberpunk looks great no matter which lighting system is being used, although I don't have the horsepower required for the full path tracing experience so I can't comment on that. Minecraft shaders like PTGI are brilliant.

And DLSS lets you use all these fancy ray tracing things without suffering as much performance loss 😏

Framerate > visuals, always 🫡

Ray tracing can bring so much more to the visuals and, thankfully, is the route graphics hardware is moving in. Given the uplift in performance required to utilise it, it will take a while to get there, and it doesn't help that current GPU architecture is still very much about rasterisation performance (which, of course, it needs to be, as pretty much everything out there is about rasterisation and getting those polygons rendered on screen). It's going to be some time before we see fully ray-traced/path-traced (RT/PT) visuals at a decent resolution and at acceptable framerates, but the move towards doing so is beneficial in what it brings to the table. At the moment, we are stuck in the very early stages and RT/PT is used "sparingly" and only to realise certain phenomena (and thus only satisfying a small part of the lighting equation). Hence, we get limited options to enable basic reflections, ray-traced shadows, global illumination (with more than one bounce), and so forth. As noted, the differences aren't always so noticeable - especially to an untrained eye.
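For reference, the lighting equation being chipped away at here is the rendering equation, roughly:

L_o(x, ω_o) = L_e(x, ω_o) + ∫_Ω f_r(x, ω_i, ω_o) · L_i(x, ω_i) · (n · ω_i) dω_i

In words: the light leaving a point is whatever it emits plus everything arriving from every direction, weighted by the material (the BRDF) and the angle of incidence. Each of those "sparing" RT effects approximates one slice of that integral; a full path tracer estimates the whole thing with random samples.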

Whilst RT/PT has its own set of problems that need to be overcome (especially that need for outright compute performance), it is quite a simple algorithm to implement and would mean that developers could do away with the multitude of smoke-and-mirrors hacks that comprise modern graphics engines. Modern engines still use somewhat simple lighting models and approximations that don't capture the complexity of real-world light interactions. Hence, there is this need for overly complex shaders and "hacks" in order to generate the visuals. Whilst modern [non-RT/PT] visuals are undoubtedly very impressive, the under-the-hood work that is going on is becoming quite a ballache to deal with, difficult to maintain, and scalability can be a big issue (along with the resource demands in terms of VRAM and shifting that data around). However, GPUs in their current form (more or less) have been around for over 3 decades now and, over that time, we have become quite good at working these hacks - to the point where they are fundamental to any modern engine out there that you care to mention. I think, in 30 years, RT/PT will be in a much better place. 🤣 It is the future and it is the direction that graphics hardware is moving in, but it is going to take time and folks will continue to bash and lament the technology in the meantime! 🤣
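To show what I mean by "quite a simple algorithm", here's a toy path tracer sketch in Python: one diffuse sphere under a fake sky, no acceleration structures, no denoising, nothing remotely production-worthy, just the core trace / bounce / average loop that every path tracer is built around.

```python
import math
import random

# Toy path tracer: one diffuse sphere lit by a simple sky gradient.
SPHERE_CENTRE = (0.0, 0.0, -3.0)
SPHERE_RADIUS = 1.0
ALBEDO = 0.7          # fraction of incoming light the surface reflects
MAX_BOUNCES = 4
SAMPLES = 64          # rays per pixel; more samples = less noise

def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
def add(a, b): return (a[0] + b[0], a[1] + b[1], a[2] + b[2])
def mul(a, s): return (a[0] * s, a[1] * s, a[2] * s)
def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
def norm(a):
    length = math.sqrt(dot(a, a))
    return mul(a, 1.0 / length)

def hit_sphere(origin, direction):
    """Distance along the ray to the sphere, or None if it misses."""
    oc = sub(origin, SPHERE_CENTRE)
    b = dot(oc, direction)
    c = dot(oc, oc) - SPHERE_RADIUS * SPHERE_RADIUS
    disc = b * b - c
    if disc < 0:
        return None
    t = -b - math.sqrt(disc)
    return t if t > 1e-4 else None

def random_hemisphere(normal):
    """Random bounce direction in the hemisphere around the surface normal."""
    while True:
        d = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
        if dot(d, d) <= 1.0:
            d = norm(d)
            return d if dot(d, normal) > 0 else mul(d, -1.0)

def sky(direction):
    """Fake sky light: brighter towards 'up'."""
    return 0.5 + 0.5 * direction[1]

def trace(origin, direction, depth=0):
    if depth >= MAX_BOUNCES:
        return 0.0
    t = hit_sphere(origin, direction)
    if t is None:
        return sky(direction)          # ray escaped, so take light from the sky
    hit = add(origin, mul(direction, t))
    normal = norm(sub(hit, SPHERE_CENTRE))
    bounce = random_hemisphere(normal)
    # Diffuse surface: recurse along the bounce, attenuate by albedo and angle
    # (the factor of 2 accounts for the uniform hemisphere sampling).
    return ALBEDO * dot(bounce, normal) * 2.0 * trace(hit, bounce, depth + 1)

# "Render" one pixel in the middle of the image: average many noisy samples.
pixel = sum(trace((0, 0, 0), norm((0, 0, -1))) for _ in range(SAMPLES)) / SAMPLES
print(f"brightness of the centre pixel: {pixel:.3f}")
```

Everything a real renderer adds on top of that loop (BVHs, importance sampling, denoisers, DLSS-style reconstruction) exists purely to make it affordable, which is exactly the compute-performance problem above.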
 

