Always interesting to read Sharky's views on these topics.
Thanks guys. I'm sometimes a bit... 'reluctant' to post on threads like this. It's a subject I have a bit of experience in and a lot of interest in, and consequently I don't want to come across as a complete to$$er or know-it-all. I'm not and I don't! But it's good to know that I do make the odd post that has some worth now and again!
Bang on Andy. 512MB dedicated wasn't enough from the word go (I think 3GB would be wiser; it has to last 7 years, remember). It will also benefit the PC platform if DirectX 11 support gets introduced. Like you say, besides a handful of flagship titles that add it via a patch, or poorly optimised high-end modes, there really isn't any 'proper' support.
Sod it, I'm linking the Samaritan demo
Yes, 3 or 4GB would be fantastic. 2GB would be great. 1GB would be disappointing but most likely (with maybe a little bit of specialist embedded RAM to work directly with the GPU(s) - if they go down a route that would benefit from such a setup). Running a thin / lightweight console 'operating system' does mean that your RAM goes a fair bit further but time will tell...
I do hope that we see DirectX 11 being supported across the next-gen Xbox and PS (forget the Wii U as I'm not convinced it will be able to compete at the level of these two). This would potentially make cross-platform development easier, quicker and hopefully less fragmented (between the two consoles and PC development). It's inevitable that the PC will continue to evolve and leave the fixed-hardware consoles 'behind', but hopefully not to the point where PC games seem compromised when (if?) more focus and resources are aligned to console versions / ports.

I just hope that mobile gaming on smartphones / tablets doesn't also lead to less focus on the PC and consoles, as this area is growing quickly at the moment; it's surprising how many studios are looking to recruit smartphone developers (or will be very soon). I'm not a smartphone game fan to be honest. Not at the moment anyway.
The Samaritan demo... always worth a view.
Also, anyone who thinks that current graphics can't get much better (they look almost real now, etc.) only needs to take a look at the likes of the new Tintin movie to see what graphics can look like. Imagine that s**t in a playable environment.
Sharky, I read once that PC hardware will eventually be able to ray trace such graphics in real time. How far away do you think this kind of tech is? Or will ray tracing be replaced by something equally capable for real-time rendering but much, much faster?
Years off. You're looking at CPU bottlenecks more than anything.
I had a figure in my head of perhaps 10 years?
Agreed - Tintin looks absolutely fantastic. I've watched it a few times now and it never fails to impress me. Rasterised graphics (as with modern-day games) can produce some great visuals, but there's still an awful lot to come, I feel. Not through higher and higher screen resolutions or bigger and bigger displays, but through new and improved "uber-shaders", algorithms and techniques that more closely (and accurately) mimic the physics of what and how we see. Whether or not rasterised graphics are the future is hard to say, but they'll be around for some time yet; they can already produce some stunning visuals and, currently, do so orders of magnitude quicker than raytracing.
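If anyone's curious what "mimicking the physics" actually boils down to, the simplest possible example is Lambert's cosine law: diffuse light scales with the cosine of the angle between the surface normal and the light direction. A toy snippet (numbers picked by hand, nothing from a real engine):

```cpp
// Lambert's cosine law: diffuse intensity = max(dot(normal, light), 0).
// Both vectors are hand-picked unit vectors, purely for illustration.
#include <algorithm>
#include <cstdio>

int main()
{
    float n[3] = { 0.0f, 1.0f, 0.0f };       // surface normal (pointing up)
    float l[3] = { 0.0f, 0.7071f, 0.7071f }; // light direction, 45 degrees off
    float cosTheta = n[0]*l[0] + n[1]*l[1] + n[2]*l[2];
    float diffuse  = std::max(cosTheta, 0.0f); // ~0.707 here
    std::printf("diffuse factor: %g\n", diffuse);
}
```

The clever modern shaders are, at heart, increasingly sophisticated stacks of terms like that one.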
Roy - raytracing is an interesting field but current hardware doesn't lend itself well to the technique (in terms of realtime performance). Dedicated raytracing hardware would be a significantly different beast to the current devices we see from AMD and nVidia. In reality, a 'raytracing card' would probably be a phenomenally fast array of CPUs / cores optimised for massively parallel processing, with LOTS of oomph. Raytracing aside for one moment, I would love to have a beast of a processor that would allow me to run a software renderer at insane resolutions and framerates. I would happily dig out some of my old software engines and update them! With software there is potentially so much more you can do... not being locked into a given API, not having to send the geometry to a given GPU as a list of triangles, the list goes on...
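Just to show the sort of freedom I mean, here's a throwaway sketch (all made up for illustration, nothing from a real engine) of a software renderer's per-pixel loop - you own every pixel, with no graphics API or triangle lists in sight:

```cpp
// Throwaway software-renderer inner loop: shade each pixel however you like.
#include <cstdint>
#include <cmath>
#include <vector>

int main()
{
    const int w = 640, h = 480;
    std::vector<std::uint32_t> framebuffer(w * h);

    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
        {
            // Any technique fits here: raymarching, voxels, old-school plasma...
            // A cheap procedural gradient stands in for the real shading.
            auto r = std::uint8_t(255 * x / w);
            auto g = std::uint8_t(255 * y / h);
            auto b = std::uint8_t(127 + 127 * std::sin(x * 0.05f));
            framebuffer[y * w + x] = (r << 16) | (g << 8) | b;
        }
    // Blit 'framebuffer' with whatever windowing layer you fancy (SDL, GDI...).
}
```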
Getting back to Roy's point regarding raytracing, I don't see realtime raytracing being with us anytime soon. I've developed a couple of raytracers in the past and have also been to a few seminars where realtime raytracing was the subject. The last one I went to was a month or so before nVidia announced Fermi, and the then-new hardware was being used to accelerate raytracing of very small environments. To put it bluntly... it sucked. The director of the company producing the technology was admirably keen and enthusiastic, but I fear his predictions were a long way off. Due to NDAs I can't give specific details, but he believed he'd have realtime raytraced environments at playable framerates (which meant 15-20fps in this particular context) within 5-7 years. I disagreed.

Unless the scene is very basic (I'm talking VERY basic, and at less than HD resolutions) I don't see realtime raytracing being viable for a very long time. Not even in the next 10 years, to be fair. The sheer amount of processing required to trace rays through an environment, to bounce them, to reflect them... the figures are mind-boggling. I'd say there is NO way we will see a raytraced game at the level of Battlefield 3 in the next 15 years. If we did, I'd wager it wasn't properly raytraced!
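To put some very rough numbers on "mind-boggling" (my own back-of-the-envelope assumptions, not anyone's benchmark):

```cpp
// Back-of-the-envelope ray budget -- every figure here is my own assumption.
#include <cstdio>

int main()
{
    const double pixels     = 1280.0 * 720.0; // a mere 720p frame (~0.92M pixels)
    const double raysPerPix = 8.0;            // primary + a few shadow/bounce rays
    const double fps        = 30.0;
    const double raysPerSec = pixels * raysPerPix * fps;
    // ~221 million rays per second -- and EACH ray still needs dozens of
    // ray/box and ray/triangle intersection tests, even with a decent
    // acceleration structure (BVH, kd-tree, etc.). Multiply accordingly.
    std::printf("%.0f million rays/sec\n", raysPerSec / 1e6);
}
```

And that's before anti-aliasing, soft shadows or glossy reflections, which all multiply the ray count again.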
For the foreseeable future I see rasterised graphics leading the way (i.e. polygons and geometry being pushed to dedicated graphics hardware). The aforementioned "uber-shaders" will continue to improve and I've no doubt we'll begin to see more "raytrace-like" visuals in time. I just don't believe they will actually be raytraced; just very good and clever approximations.
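By way of example, here's roughly what one of those "clever approximations" looks like - a minimal sketch (my own toy maths, not any particular engine's code) of faking a reflection by mirroring the view vector for an environment-map lookup instead of tracing a real ray:

```cpp
// "Raytrace-like" approximation: mirror the view vector about the surface
// normal and use it to sample a pre-rendered environment (cube) map, rather
// than firing a genuine reflection ray back into the scene.
#include <cstdio>

struct Vec3 { float x, y, z; };

// Standard reflection formula: r = v - 2(v.n)n, with n assumed unit length.
Vec3 reflect(Vec3 v, Vec3 n)
{
    float d = 2.0f * (v.x * n.x + v.y * n.y + v.z * n.z);
    return { v.x - d * n.x, v.y - d * n.y, v.z - d * n.z };
}

int main()
{
    Vec3 view   = { 0.0f, -1.0f, 1.0f };  // incoming view direction
    Vec3 normal = { 0.0f,  1.0f, 0.0f };  // flat floor
    Vec3 r = reflect(view, normal);
    // A raytracer would now intersect 'r' against the whole scene; a
    // rasteriser just samples a cube map in direction 'r' -- no traversal.
    std::printf("reflected dir: (%g, %g, %g)\n", r.x, r.y, r.z);
}
```

It looks convincingly like a reflection at a tiny fraction of the cost, which is exactly why I expect approximations like this, not true raytracing, to carry us for a good while yet.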
Why is Sharky's opinion so valid? Does he work for some high-tech company?
Not questioning, just curious.
Hi mate - I spent over 10 years in the games industry working as a programmer; my speciality was 3D graphics engines and technology. I was lucky enough to work closely with the likes of nVidia and ATI / AMD at times on various projects. It's an area I still have a passion for.
He's in the games industry, Sebastian. A coder, I think.
I did, Roy. I moved out of the industry a while back due to the volatile nature of the workplace (redundancies, stupidly long working hours, etc.) but luckily my current occupation means I still get to work/code with 3D graphics technology and visualisations. I would like to add that I'm not a nerd. Honest.