
The Power PC Thread [f*ck off consoles]



Geddes

ClioSport Club Member
  Fiesta Mk8 ST-3
I'm having an issue with the WiFi card in my PC - it's not picking up any signal. It's the first one I've had, and the first time I've had an issue with it. It's loose inside, not screwed down and not connected to anything.
I've updated it, and is there anything else I could do, or do these things become faulty easily?
 

N0ddie

ClioSport Club Member
  Tesla Model 3
Additional monitor added to my setup today:

[image: desk setup with the additional monitor]

Xbox One X is hooked up to it along with my rig. Only got it today so cable tidying is not quite perfect.
 
  Evo 5 RS
Nice. I want to replace the TN Swift at some point. Was hoping to jump on the new HDR offerings but NVIDIA messed that up when they priced them out of the market.

Also your background offends me. (Just sold my VIII :()
 

N0ddie

ClioSport Club Member
  Tesla Model 3
Ha. I've got the wallpapers on rotate every minute from a "Speedhunters Wallpaper" folder. For a TN panel I must say I'm quite impressed. Obviously looking at the monitor straight on helps but quite pleased.
 
  Ph2 172, 106 Rallye
A good cantilever dual monitor mount would set that off nicely N0ddie. I didn't think I'd benefit from a monitor mount but I don't think I could live without it now. Clears up space in front of the keyboard too.
 

N0ddie

ClioSport Club Member
  Tesla Model 3
It's something I've been considering recently as the stand on the Ultrawide is quite big, though most solutions only go up to 32" where I need 34".
 

N0ddie

ClioSport Club Member
  Tesla Model 3
The Nvidia presentation was boring as fudge. It will be interesting to see how the new cards fare when real, non-bull$hit figures are revealed. The 2070's performance will definitely not be better than a current Titan X in terms of FPS in games.

Prices are ridiculous. Nearly £1300 for a good AIB card? I paid half that for my Ti when it was released.
 

SharkyUK

ClioSport Club Member
F**k it, placed a pre-order on the EVGA 2080 Ti (which I may not go through with). Also sent a letter (via my software development business) to see if nVidia will also send me a sample card to evaluate through their developer programme. They haven't said no! :p
 

SharkyUK

ClioSport Club Member
You fool.

What "real world" bump do you expect over the 1080Ti?
Believe it or not, it's a legit reason to buy mate - I'm not so much after massive gaming gains. I'm interested in the extra CUDA cores, Tensor cores and the RTX engine for real time rendering capabilities. I'd love to port my rendering software over to it and see what it can do. I'm not sure what gaming bumps we'll see as the architecture is a fair bit different - but there's a good chunk more throughput / bandwidth available. I would have liked to see 16GB RAM on it from a raytracing perspective but then it's really getting even more stupid in terms of selling to the mainstream (well, enthusiasts).
 
I'll wait for the benchmarks to see if it's worth the bump over my 1070 to a 2080 or whatever

At face value it doesn't seem like good value for money if you just want to play games
 

N0ddie

ClioSport Club Member
  Tesla Model 3
To unveil the thing at a gaming convention and not give any info on how games run on it surely tells you there isn't a significant performance jump? At the end of the day everyone will turn these features (RTX/shadows) off, as that's what people do to get better performance.

Also, look at the Battlefield 5 demo when ray tracing is off while they are showing the reflections on the car. It's not a simple ray tracing on/off comparison; it's ray tracing on vs reflections set at their lowest detail level.
 

SharkyUK

ClioSport Club Member
Real (high fidelity) ray tracing and global illumination in real time at high fps is still decades away. I'm surprised how much it is being pushed. That said, the AI / deep learning used for the de-noising stuff is incredible. Properly incredible. That's what makes this raytracing possible in real time right now (albeit with a lot of limiting factors and constraints). I expect a decent jump in gaming performance all considered (on the Ti), but the cost is off-putting. I know folks won't agree with me, but the technology packed into this new GPU is pretty impressive to say the least and I don't think the price is too bad considering the potential. Pricy for a pure gaming card though!
 
  Evo 5 RS
I think they picked the wrong crowd for the conference content. I know that RTX obviously revolves around ray tracing massively, but Gamescom probably isn't the best place. I'm personally really excited to see where things go next, but the fact that Jensen had to really push the boat out to explain just what a leap the technology is shows he was basically talking to a brick wall. They needed performance metrics to back it up (rasterisation comparisons with the 1080).
 

Darren S

ClioSport Club Member
I think they picked the wrong crowd for the conference content. I know that RTX obviously revolves around ray tracing massively, but Gamescom probably isn't the best place. I'm personally really excited to see where things go next, but the fact that Jensen had to really push the boat out to explain just what a leap the technology is shows he was basically talking to a brick wall. They needed performance metrics to back it up (rasterisation comparisons with the 1080).

Does seem a strange crowd to present to. Almost like advertising the new RS Megane on a stand at a Camping & Caravans exhibition. Trying to wow potential customers with the boot space isn't going to impress many when that's not what the car is about.
 

N0ddie

ClioSport Club Member
  Tesla Model 3
I think they picked the wrong crowd for the conference content. I know that RTX obviously revolves around ray tracing massively, but Gamescom probably isn't the best place. I'm personally really excited to see where things go next, but the fact that Jensen had to really push the boat out to explain just what a leap the technology is shows he was basically talking to a brick wall. They needed performance metrics to back it up (rasterisation comparisons with the 1080).

This.
 

SharkyUK

ClioSport Club Member
I've had to avoid massive rants on this over the past few days (i.e. nVidia RTX, etc.)... some of the comments and things I've been reading have been laughable. @Silent_Scone has nailed it somewhat. This card isn't really a 'gamers' card as such. I fully expect to see a performance increase over the equivalent 10x0 models when based purely on rasterisation [as used in current games] thanks to the improved shader cores, the number of cores, the increased memory bandwidth, etc. But this is without RTX / raytracing. As soon as that is enabled it will absolutely tank performance.

I maintain what I've already been saying for some time... that real time raytracing is still a long, long way away. I'm talking 10-15+ years. For current movie-level fidelity you're talking even longer - probably 25+ years given the current rate of technology performance increase. You simply CANNOT judge the new RTX range's performance against GTX and earlier as the architecture is too different. This represents a new step in a new direction for real time rendering that will take decades to realise fully. In my opinion, it's the first big step in embracing truly physically based rendering pipelines built on realistic light transport (ray tracing, path tracing and all that it brings), whilst still relying on rasterisation techniques to support existing games and to allow current tech to maintain the performance we have become accustomed to in modern games.

I am VERY excited by this technology as it's absolutely what I do as a hobby and professionally, and I fully understand what they are doing here. Which is why I don't think the price is too bad. And it's also the reason why I agree with Silent and think that the keynote should not have been delivered to gamers as such.

The cost? £1200+ for the 2080 Ti? It's not bad when you consider it's equivalent to past Titan GPUs. And make no mistake, this is effectively what the new 2080 Ti is - a Titan-like GPU with gaming ability and the fancy new RT and Tensor Core stuff. My understanding is that the Titan will be no more and the Ti is effectively the new Titan, bringing incredible compute performance as well as maintaining the tech and rasterisation performance we expect to see from new flagship GPUs. It represents a 'cheap' way of getting your hands on the new tech without having to spend $10,000 on a Quadro RTX card. Not necessarily good if you're only interested in gaming.

The problem is that nVidia released / announced this to a gaming crowd and folks instantly expect ray tracing in modern games at 60fps (or the high frame rates they've become accustomed to when buying new flagship GPUs). That is simply not going to happen. Not a cat in hell's chance. Gamers aren't generally interested in the 'how' or 'why', they just want performance (which is fair). But trying to explain to the average gamer why they won't get the performance they expect is like pissing into the wind! They aren't interested (which, again, is fair enough) and aren't willing to accept the reasons why. I fully understand why nVidia are so keen to push this tech (which is incredible, truly it is) but it's absolutely the wrong crowd to preach to.

The AI / deep-learning routines that can run across the Tensor Cores open a lot of doors in terms of funky processing, with very little impact on the rendering portion of the pipeline. I'm genuinely excited by the tech I've seen here: de-noising, DLSS (per game or generic), image upscale inference - we're talking mind-blowing stuff (at least to a geek like myself). I could post a load of stuff here that is happening now and will be making its way onto the likes of the RTX hardware. In fact, it's the AI and deep-learning stuff that makes the real time raytracing possible right now. It is the AI and deep-learning that is able to produce good quality visuals from the low sampling rates used in the ray tracing portions of the rendering pipeline. If you were to see the RTX ray tracing demos side by side with and without this enabled you'd be very surprised by the difference in quality.

TLDR;
Meh.

It's an expensive gamers' card, but represents the first step in a new direction in realising real time graphics on the GPU. Ironically the move to real ray / path tracing makes the algorithms and coding simpler but the computational costs increase by several orders of magnitude. I'll leave it there.

This is the most excited I've been for a hardware release in a long, long time.

(Sorry for the mistakes and bad grammar - typed in a hurry!)
 
  Evo 5 RS
It is real ray tracing, just with what NVIDIA is referring to in technical marketing as a "budget" of 1-2 samples per pixel, which is basically all that is possible right now. So the methodology is real, it's just at a low level and the rest is still trickery to make up for the difference. It shouldn't really take anything away from just how cool this is, though.
 

SharkyUK

ClioSport Club Member
@SharkyUK So if it's not real time ray tracing, what is it?
It's exactly as Scone says mate - it IS real ray tracing but at a VERY coarse level. As he already states, they are using only a handful of samples per pixel (i.e. sending out a few rays per pixel to be bounced around the scene, with a very limited bounce depth), which results in a very incomplete and noisy image. The AI technology running on the Tensor Cores then takes that less-than-ideal image and 'infers' (i.e. best-guesses) what the result should look like. It's hard to convey just how impressive that tech is. Without that de-noising to improve the image quality, the RTX card would need to be sending out thousands of samples per pixel, not just the 1-2 samples it uses currently. I'm sure the RTX card could be made to do that, but then you're looking at several hours per frame (maybe days) depending on the complexity of the 3D scene.
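To make the 'samples per pixel' idea concrete, here's a minimal CPU-side sketch (my own illustration, nothing to do with nVidia's actual implementation - tracePath is just a hypothetical stand-in for the expensive scene evaluation): each pixel is the average of N randomly jittered rays, so at 1-2 samples the estimate is inevitably noisy.

```cpp
// Minimal sketch of "samples per pixel" in a Monte Carlo tracer: each pixel's
// colour is the average of N randomly jittered rays, so a low N (1-2, as on
// RTX) gives a noisy estimate that a denoiser then has to clean up.
#include <random>
#include <cstdio>

struct Colour { double r, g, b; };

// Hypothetical stand-in for the expensive part: trace one ray through the
// scene and return the radiance it gathers (bounces, shadows, etc.).
Colour tracePath(int x, int y, double jitterX, double jitterY) {
    double v = (x * jitterX + y * jitterY) * 0.001;   // dummy value
    return { v, v, v };
}

Colour shadePixel(int x, int y, int samplesPerPixel, std::mt19937& rng) {
    std::uniform_real_distribution<double> jitter(0.0, 1.0);
    Colour sum{0, 0, 0};
    for (int s = 0; s < samplesPerPixel; ++s) {
        Colour c = tracePath(x, y, jitter(rng), jitter(rng));
        sum.r += c.r; sum.g += c.g; sum.b += c.b;
    }
    // Average the estimates - visible noise falls off roughly with 1/sqrt(N),
    // which is why 1-2 spp looks so rough next to 1000 spp.
    return { sum.r / samplesPerPixel, sum.g / samplesPerPixel, sum.b / samplesPerPixel };
}

int main() {
    std::mt19937 rng(42);
    Colour c = shadePixel(320, 240, 2, rng);   // 2 spp, like the RTX "budget"
    std::printf("pixel estimate: %f %f %f\n", c.r, c.g, c.b);
}
```

Bump samplesPerPixel up and the noise falls away - which is exactly the trade-off shown in the images further down the thread.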

Think of the ray tracing as an add-on feature for now that can be enabled/disabled much like any other GameWorks-esque feature. You can take advantage and reap the visual benefits, but you won't be winning any performance records with it enabled. Think of it as a nod to the future and a step in the direction towards real physical based rendering (i.e. based on how light acts in the real world).

On top of the rendering overhead, the GPU also needs to build an additional version of the 3D scene geometry that is spatially partitioned into an acceleration structure. Once that structure is built (which is incredibly intensive if the scene is dynamic, as games generally are) the RT core can then use the hardware to bounce rays around that acceleration structure and determine where light rays hit walls, floors, characters, etc., then determine the properties of the surface that was hit, then determine how the light should interact given the type of light and those surface properties, and then determine whether the next ray should 'bounce' from that point, terminate, reflect, refract, cast a shadow ray, etc. Ultimately it arrives at a decision as to what colour that pixel should be. All that effort for just a single pixel, which has taken several million instructions to get to. Multiply that by the number of pixels on a typical screen and the costs mount up, hence why nVidia currently only use a few samples per pixel. The additional BVH memory structures, etc. also eat into precious RAM, which is why I'm not sure 11GB is really enough for serious ray tracing, and why the pro-level Quadro RTX cards come with 48GB. It's not a cheap process!
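Roughly, the per-ray decision chain described above looks something like this (an assumed, heavily simplified structure - intersectScene here is a toy stand-in for the hardware BVH traversal, not how the RT cores actually work):

```cpp
// Rough sketch of the per-ray work: find the nearest hit via the acceleration
// structure, shade it, then decide whether to bounce again. Depth is capped,
// which is one of the "limiting factors" mentioned earlier.
#include <optional>
#include <cstdio>

struct Ray { double ox, oy, oz, dx, dy, dz; };
struct Hit { double px, py, pz; double reflectivity; };

// Stand-in for the BVH traversal the RT cores accelerate in hardware: test
// the ray against the spatially partitioned scene, return the closest hit.
std::optional<Hit> intersectScene(const Ray& ray) {
    // Hypothetical scene: everything below y=0 is a 50% reflective floor.
    if (ray.dy < 0.0) return Hit{ ray.ox, 0.0, ray.oz, 0.5 };
    return std::nullopt;                  // ray shoots off into space
}

double radiance(const Ray& ray, int depth, int maxDepth) {
    if (depth >= maxDepth) return 0.0;    // bounce budget exhausted
    auto hit = intersectScene(ray);
    if (!hit) return 1.0;                 // miss: return sky/background light
    // The surface properties decide what happens next: here just a mirror
    // bounce scaled by reflectivity; a real tracer would also fire shadow
    // rays, handle refraction, emissive surfaces, and so on.
    Ray bounced{ hit->px, hit->py, hit->pz, ray.dx, -ray.dy, ray.dz };
    return hit->reflectivity * radiance(bounced, depth + 1, maxDepth);
}

int main() {
    Ray primary{ 0, 1, 0, 0.2, -1.0, 0.1 };
    std::printf("radiance: %f\n", radiance(primary, 0, 3));
}
```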

We're a long way from movie quality CGI though. Monsters Inc (quite an old film now) was one of the first to use global illumination, and Pixar were using a render farm of 24,000 cores - each frame still took 29 hours to render on average. Those are the sorts of scales you're dealing with for this sort of ray / path tracing technology. It is this scale that is hard to impress on folks, and the reason why 'full' ray tracing is still a thing of dreams for gaming - and why we're going to be seeing this hybrid method for some time. DirectX and nVidia RTX add new shaders to the rendering pipeline that developers can use to hook into the raytracing functionality of the card. Not only do you have vertex, pixel and geometry shaders; you now have 'hit' shaders and 'miss' shaders that are called when a traced ray hits a surface or when a ray misses everything and shoots off into space (that's a very abstract explanation!).
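The hit/miss shader split is easier to picture in plain code. This is only the shape of the idea rendered as C++ callbacks (the real thing is HLSL shader code bound into the DXR/RTX pipeline, and traceRay below is a made-up dispatcher, not the actual API):

```cpp
// Conceptual only: the tracer calls one routine when a ray hits geometry and
// another when it escapes the scene, and the developer supplies both.
#include <functional>
#include <cstdio>

struct RayDesc { double origin[3]; double dir[3]; };
struct HitInfo { double t; int materialId; };

using HitShader  = std::function<double(const RayDesc&, const HitInfo&)>;
using MissShader = std::function<double(const RayDesc&)>;

// Hypothetical dispatcher standing in for the pipeline's trace call: decide
// hit or miss, then hand control to the appropriate user-supplied shader.
double traceRay(const RayDesc& ray, const HitShader& onHit, const MissShader& onMiss) {
    bool hitSomething = ray.dir[1] < 0.0;           // toy intersection test
    if (hitSomething) return onHit(ray, HitInfo{ 4.2, 0 });
    return onMiss(ray);
}

int main() {
    HitShader  hit  = [](const RayDesc&, const HitInfo& h) { return 0.8 / h.t; };
    MissShader miss = [](const RayDesc&) { return 0.1; };   // e.g. sky colour
    RayDesc r{ {0, 1, 0}, {0, -1, 0} };
    std::printf("shaded: %f\n", traceRay(r, hit, miss));
}
```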

I'll try and dig some stuff out to give examples later. :p
 

SharkyUK

ClioSport Club Member
In addition to my previous post on raytracing shizzle on the GPU and the performance vs. quality issue... feel free to skip if you can't be arsed, I won't be offended. :)

These images were taken a few moments ago from my own GPU path tracer. The rendering is running wholly on the GPU at 100% utilisation (the GPU being an EVGA 1080 Ti FTW3). I am rendering a single frame of a very, very simple scene. My software is fairly well optimised and, due to the nature of the tracing, produces accurate reflections, refractions, shadows, caustics, etc. - as you would expect from this sort of technology. No rasterisation is going on here - it's pure path tracing and all courtesy of CUDA, running millions of instructions per pixel per second in parallel in real time.

Below is the image sampled at 2 samples per pixel. There's a lot of 'noise' and the quality is far from acceptable. It took around 0.1s to render.

noise01.jpg


Below is the image sampled at 10 samples per pixel. There's still a lot of 'noise' and the quality is a little better. It took around 0.5s to render.

noise02.jpg


We're now at 100 samples per pixel (below) and things are showing improvement. It took around 6.5s to render the frame and the soft shadows are still noisy.

noise03.jpg


In the image below we're at 1000 samples per pixel and we're now starting to reach acceptable(ish) quality levels. It took a whole 71 seconds to reach this quality level. For a single frame. Assuming a gamer wants 60fps then that equates to a single frame being rendered and delivered to the screen in 16 milliseconds. This took 71 seconds - some 4400-odd times slower. Welcome to the computationally expensive world of ray tracing! :p

noise04.jpg


Here's another example (below) showing off something a little more 'raytracey'. The samples and times are shown again in the images.

noise2_01.jpg

noise2_02.jpg

noise2_03.jpg

noise2_04.jpg


Of course, the examples I give here are very simple and not particularly scientific. But they show how expensive the process is for even the simplest of 3D scenes. You're basically looking at in excess of 60 seconds per frame for the above rather than 60 frames per second. :p
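For anyone who wants to sanity-check those numbers, here's the frame-budget arithmetic spelled out (my own quick sketch, using the ~16 ms figure quoted above):

```cpp
// A 60 fps target gives you roughly 16 ms per frame, so a 71-second frame is
// several thousand times over budget before any denoising gets involved.
#include <cstdio>

int main() {
    const double frameBudgetMs = 16.0;           // ~60 fps budget quoted above
    const double renderTimeMs  = 71.0 * 1000.0;  // the 1000 spp frame
    std::printf("budget: %.1f ms, actual frame: %.0f ms, over budget by ~%.0fx\n",
                frameBudgetMs, renderTimeMs, renderTimeMs / frameBudgetMs);
    // 71000 / 16 = 4437.5 - the "4400-odd times slower" mentioned earlier.
}
```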

But that's where the technology gets really cool. nVidia are sampling at a very low rate, hence the results tend to resemble those seen in the first images of my examples. So how come the resulting images don't look that bad? That's where the AI, deep-learning and de-noising tech on the Tensor Cores comes into play. They can take that noisy image and 'estimate' what the final image should look like based on learning. They are effectively inferring what the final image should look like given a very poor quality input image. In fact, similar technology can also be used to upscale 1080p or 4K images to 8K, 16K+ resolutions whilst keeping quality losses to a minimum. The technology can amazingly 'infer' the detail and produce impressive results.
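To show where a de-noiser sits in the pipeline, here's a deliberately crude stand-in - a simple 3-tap average over a noisy 1D 'scanline'. The real thing is a trained neural network running on the Tensor Cores and is vastly smarter than this; the point is only that you clean up a cheap, noisy estimate rather than paying for more samples:

```cpp
// Crude stand-in for a denoiser: smooth a noisy, cheap estimate instead of
// rendering more samples. The learned denoiser "infers" far more cleverly.
#include <vector>
#include <cstdio>

std::vector<double> denoiseNaive(const std::vector<double>& noisy) {
    std::vector<double> clean(noisy.size());
    for (size_t i = 0; i < noisy.size(); ++i) {
        double sum = noisy[i];
        int count = 1;
        if (i > 0)                { sum += noisy[i - 1]; ++count; }
        if (i + 1 < noisy.size()) { sum += noisy[i + 1]; ++count; }
        clean[i] = sum / count;   // average with the neighbours
    }
    return clean;
}

int main() {
    std::vector<double> scanline{ 0.9, 0.1, 0.8, 0.2, 0.85, 0.15 };  // noisy low-spp values
    for (double v : denoiseNaive(scanline)) std::printf("%.3f ", v);
    std::printf("\n");
}
```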

Take a look at the following video. It shows a low-sampled ray traced image in real time alongside a technique that can (using AI and deep-learning) 'infer' the final result. This is actually pretty incredible: being able to 'guesstimate' what the final image should be from such a poor quality input. This sort of tech is becoming big in the movie industry and has real, tangible benefits in all realms of visuals / image generation. Path tracing combined with these de-noising techniques is key to bringing this tech into the realms of real time at acceptable fidelity.

[video: low-sample real time ray tracing shown alongside the AI-inferred result]

Another video showing noise reduction at work - part collaboration with nVidia. Basically training an AI to remove noise from images without the AI actually knowing what noise is and what the final image should look like. Sounds stupid but it works and the results are incredible. This is what the Tensor Cores will be used for in time - running these algorithms using computed knowledge and producing great visuals even from less than ideal input imagery. Yeah, I'm getting my geek on but I truly did not expect to see this level of technology for at least another 10-15 years.

[video: AI de-noising trained without clean reference images]

I'm done. I know I'm preaching to the few rather than the many but this is (in part) why the new RTX range is so impressive. Not so much purely for gamers but in terms of the future and what is coming. It's just going to take some time to get there. I have to be honest and say I welcome this change in direction and I hope AMD follow suit (and Intel seeing as they are entering the GPU arena in 2020). It's the right way to go if visual realism is the holy grail.
 

N0ddie

ClioSport Club Member
  Tesla Model 3
Don’t launch a product at a games convention if it’s not 100% for the gamers imo.

If this was presented as a stand-alone tech show there would be much less of a fuss made about it. RTX is cool, but 99% of gamers couldn’t give a f**k.
 

Darren S

ClioSport Club Member
Andy - going off your X-Wing examples above (which explain what you're saying very well, btw!) - I'm guessing there's no difference in rendering times if the images were purely monochrome?

Is the addition of colour not of much consequence for ray-tracing work?
 

SharkyUK

ClioSport Club Member
Andy - going off your X-Wing examples above (which explain what you're saying very well, btw!) - I'm guessing there's no difference in rendering times if the images were purely monochrome?

Is the addition of colour not of much consequence for ray-tracing work?
Define monochrome? ;) I'll explain in a second...

Good question! Generally, no - there's not going to be much difference in rendering times if rendering wholly in monochrome. The majority of the computational effort still has to happen. Monochrome renders are generally rendered as per any other and the conversion to monochrome typically happens as a post-process effect whereby the colour information is converted into luminosity / grayscale values. If anything, this is creating additional work for the renderer... although the conversion is so fast it's practically negligible.
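For example, a grayscale conversion really is just a cheap pass over the finished colour image - a minimal sketch below, using the common Rec. 709 luma weights (one reasonable choice of weighting among several, not the only one):

```cpp
// Grayscale as a post-process: all the expensive tracing has already happened
// by this point; this is a handful of multiplies per pixel, effectively free.
#include <array>
#include <cstdio>

struct Pixel { double r, g, b; };

double toLuma(const Pixel& p) {
    // Rec. 709 luma weights - convert colour to a single brightness value.
    return 0.2126 * p.r + 0.7152 * p.g + 0.0722 * p.b;
}

int main() {
    std::array<Pixel, 3> image{{ {1.0, 0.0, 0.0}, {0.0, 1.0, 0.0}, {0.5, 0.5, 0.5} }};
    for (const auto& p : image)
        std::printf("luma: %.4f\n", toLuma(p));
}
```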

BUT...

In this situation we are assuming that 'monochrome' means grayscale (i.e. devoid of colour information and represented as various shades of gray). However, in ray tracing terms, it could be argued that my colour images are also monochrome. Why? Simply because my renderer is cheating and assuming a single-wavelength colour (monochromatic). What this means is that I'm ignoring the laws of physics and how humans interpret light, instead choosing to assume that the RGB components of light share the same wavelength. As you are probably aware, this is not the case in nature, as red, green and blue light all have different wavelengths.

However, by treating RGB as monochromatic (single wavelength) I can still calculate a pixel colour by tracing a ray through the scene and arriving at a final radiance value. Generally this value is 'good enough' to produce realistic results. But it's not truly accurate. For that to happen we would need to write a spectral ray / path tracer. It's pretty much the same as my renderer, but the key difference is that each RGB component is treated independently with its correct wavelength. What does that mean? It means that where I previously generated and/or traced a single ray through the scene, I now have to do it three times - once for each of the RGB wavelengths. In the best case I have trebled the work required to arrive at a pixel colour value!
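A sketch of that difference (illustrative only, not my renderer's actual code - traceAtWavelength is a dummy standing in for a full path trace): the monochromatic shortcut does one trace and carries RGB along with it, while the spectral version fires a separate trace per wavelength, so the expensive part roughly trebles.

```cpp
// Monochromatic shortcut vs a simple spectral approach: the trace count
// (the expensive bit) goes from one per pixel sample to three.
#include <array>
#include <cstdio>

struct RGB { double r, g, b; };

// Stand-in for a full path trace at a given wavelength - the cost lives here.
double traceAtWavelength(double wavelengthNm) {
    return 500.0 / wavelengthNm;   // dummy radiance, purely for illustration
}

// Shortcut: one trace, RGB assumed to behave identically.
RGB shadeMonochromatic() {
    double radiance = traceAtWavelength(550.0);          // single "wavelength"
    return { radiance, radiance, radiance };
}

// Spectral-ish version: one trace per RGB wavelength - roughly 3x the work.
RGB shadeSpectral() {
    const std::array<double, 3> wavelengths{ 700.0, 546.0, 435.0 };  // approx R, G, B (nm)
    return { traceAtWavelength(wavelengths[0]),
             traceAtWavelength(wavelengths[1]),
             traceAtWavelength(wavelengths[2]) };
}

int main() {
    RGB m = shadeMonochromatic();
    RGB s = shadeSpectral();
    std::printf("monochromatic: %.3f %.3f %.3f\n", m.r, m.g, m.b);
    std::printf("spectral:      %.3f %.3f %.3f\n", s.r, s.g, s.b);
}
```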

I hope that makes sense :)
 
So the first reviews are flooding out for the 20XX series cards... it doesn't seem to be the generational leap people were (stupidly) expecting. I suppose there was a reason they went all out on the ray tracing stuff. Must be plenty of people regretting selling their 1080 Tis at this point.
 

