
A bit of UE4 "tech stuff"



The 'Square Enix Luminous Engine' does look very impressive, more advanced than UE4.




Still can't believe this is meant to be actual gameplay and not CGI!
 

SharkyUK

ClioSport Club Member
Yeah, the Luminous Engine does look good but I'm not sure it's more advanced than UE4. Luminous Studio's demo is purely a cinematic playing a scripted cutscene (not gameplay). Whilst it does this very well, there's no real-time editor running in the background, no physics simulation managing the scene elements and their interactions, and so on. No doubt it looks great, but hats off to the UE4 tech team; I don't know of many 3D engines and editors out there that allow the designer/coder to alter character source code on the fly, compile it, run it and deploy it without having to drop out of the game!

I don't think the Luminous demo is indicative of what the first few waves of next-gen games will be like. It's 'too' good and obviously produced with a cinematic vibe to increase the drama (and does so with a lot of polish). I think that the UE4 demos - in this instance - are more indicative of what we can expect. In fact, I think the Samaritan demo will probably be a good yardstick by which to measure the forthcoming crop of next-gen games...

All in my humble opinion of course. :D
 

Darren S

ClioSport Club Member
Yeah, the Luminous Engine does look good but I'm not sure it's more advanced than UE4. Luminous Studio's demo is purely a cinematic playing a scripted cutscene (not gameplay). Whilst it does this very well, there's no real-time editor running in the background, no physics simulation managing the scene elements and their interactions, and so on. No doubt it looks great, but hats off to the UE4 tech team; I don't know of many 3D engines and editors out there that allow the designer/coder to alter character source code on the fly, compile it, run it and deploy it without having to drop out of the game!

I don't think the Luminous demo is indicative of what the first few waves of next-gen games will be like. It's 'too' good and obviously produced with a cinematic vibe to increase the drama (and does so with a lot of polish). I think that the UE4 demos - in this instance - are more indicative of what we can expect. In fact, I think the Samaritan demo will probably be a good yardstick by which to measure the forthcoming crop of next-gen games...

All in my humble opinion of course. :D

Do you think that it's still going to be GPU dependent for the most part then, m8? Or is the CPU going to start being 'in demand' more?

Sounds like the GPU of the next-gen is going to have a whole lot of tasks asked of it!

D.
 

SharkyUK

ClioSport Club Member
Do you think that it's still going to be GPU dependent for the most part then, m8? Or is the CPU going to start being 'in demand' more?

Sounds like the GPU of the next-gen is going to have a whole lot of tasks asked of it!

D.
The GPU is very much still being utilised in more ways and, increasingly, developers are looking for new and better ways to harness the raw processing power it offers. However, there is a trade-off and a bit of a balancing act to consider. It's all very well running physics routines, etc. on the GPU (and moving more general-purpose stuff over to the GPU) but it still has to have the ability to render the high-fidelity visuals we've come to expect from next-gen titles. There's only a limited amount of resources available at the end of the day, and how they get shared between 'rendering' and 'non-rendering' tasks is definitely something that has to be carefully thought about when designing new titles. It's actually quite easy to bring the latest all-singing, all-dancing GPU to its knees!

I wouldn't say I've seen a big shift either way as yet (either towards GPU or CPU) and it's still very much a case of determining which tasks run better on the CPU and which can be better executed on the GPU. CPUs and GPUs are simply getting faster and more powerful and developers are looking to take advantage of that in any way they can. Getting data to and from the GPU has always been a bit of an issue (especially reading back from the GPU) so there's some good work going on in terms of developing new (and existing) techniques for making that transfer more efficient and hardware friendly. Thankfully the new shader models and GPU chipset features get better with each iteration and I think the next generation of hardware will allow for some pretty cool (new) techniques to be developed... it's quite an exciting time to be a graphics/tech developer.
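
To give a flavour of what I mean by making readback more hardware-friendly, here's a very rough OpenGL sketch (not from any particular engine, names made up) of reading the framebuffer back asynchronously through a pixel buffer object rather than stalling the pipeline with a blocking read:

```cpp
// Rough sketch: asynchronous framebuffer readback via a pixel buffer object (PBO).
// A naive glReadPixels() into client memory forces the CPU to wait for the GPU;
// reading into a PBO lets the copy happen in the background, and we map it a frame later.
#include <GL/glew.h>

GLuint pbo = 0;

void initReadback(int width, int height)
{
    glGenBuffers(1, &pbo);
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
    // Allocate storage for one RGBA8 frame; no data uploaded, the GPU will fill it.
    glBufferData(GL_PIXEL_PACK_BUFFER, width * height * 4, nullptr, GL_STREAM_READ);
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
}

void kickOffReadback(int width, int height)
{
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
    // With a PBO bound, the final argument is an offset into the buffer rather than a
    // CPU pointer, so this call returns immediately and the transfer is queued on the GPU.
    glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
}

void consumeReadback()
{
    // Called a frame (or more) later, by which time the copy has hopefully finished.
    glBindBuffer(GL_PIXEL_PACK_BUFFER, pbo);
    const unsigned char* pixels =
        static_cast<const unsigned char*>(glMapBuffer(GL_PIXEL_PACK_BUFFER, GL_READ_ONLY));
    if (pixels)
    {
        // ... do whatever CPU-side processing is needed with the pixel data ...
        glUnmapBuffer(GL_PIXEL_PACK_BUFFER);
    }
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);
}
```

The same idea applies on other APIs and on consoles - you accept a frame or two of latency on the data instead of forcing the CPU to sit and wait for the GPU.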

Your last comment is indeed true. :D The GPU(s) of next-gen hardware won't be having an easy life.
 

Darren S

ClioSport Club Member
The GPU is very much still being utilised in more ways and, increasingly, developers are looking for new and better ways to harness the raw processing power it offers. However, there is a trade-off and a bit of a balancing act to consider. It's all very well running physics routines, etc. on the GPU (and moving more general-purpose stuff over to the GPU) but it still has to have the ability to render the high-fidelity visuals we've come to expect from next-gen titles. There's only a limited amount of resources available at the end of the day, and how they get shared between 'rendering' and 'non-rendering' tasks is definitely something that has to be carefully thought about when designing new titles. It's actually quite easy to bring the latest all-singing, all-dancing GPU to its knees!

I wouldn't say I've seen a big shift either way as yet (either towards GPU or CPU) and it's still very much a case of determining which tasks run better on the CPU and which can be better executed on the GPU. CPUs and GPUs are simply getting faster and more powerful and developers are looking to take advantage of that in any way they can. Getting data to and from the GPU has always been a bit of an issue (especially reading back from the GPU) so there's some good work going on in terms of developing new (and existing) techniques for making that transfer more efficient and hardware friendly. Thankfully the new shader models and GPU chipset features get better with each iteration and I think the next generation of hardware will allow for some pretty cool (new) techniques to be developed... it's quite an exciting time to be a graphics/tech developer.

Your last comment is indeed true. :D The GPU(s) of next-gen hardware won't be having an easy life.

Always an education with your posts, m8! :)

I wonder how they will address the cooling issue of quicker, more capable GPUs? The general public have suffered on both the 360 and PS3 with overheating issues. Even the PC platform isn't exempt from getting a little toasty - COD MW3 being a prime example where I have an nVidia rule running that whacks the fans up to 100% speed on the gfx cards. If I don't, it blue screens within 10-15 mins.

Water cooling seems a premium PC option at the minute - and the extra space required (maybe weight too?) would definitely put off the consolers. For the PC platform - they have a finite space in which to work - unless they consider triple-slot space for one gfx card to be the norm?

As you've said on many an occasion though - it's the data access - both in terms of speed and quantity of memory onboard that tends to be the issue. Maybe if they can hold off on the outright speed of GPUs and make a compromise in order to increase the amount of expensive gfx memory available - then that might be an option?

It's a similar issue for both consolers and PC owners to face. I just wonder how the manufacturers will address it??

D.
 
Looks amazing, but I'm not going to get excited until I see a game running at 60fps.

I remember back in the day when an early build of Half-Life 2 was leaked. It ran like warm butter on my machine. Then the game came out, with some actual content...
 
  Yaris Hybrid
I think it is going to be interesting for the next generation of consoles.

They are constrained by the maximum physical size of the casing, constrained by the permissible noise levels from the cooling and even with manufacturer subsidies they are heavily constrained in terms of cost.

Players are going to be asked to drop 300 notes on a new console and will expect something like the Watch Dogs demo running in 1080p at 60fps. Sure consoles are more efficient than PCs but I can't see that happening.

I showed my brother Skyrim running maxed out with the texture packs etc. on my PC. With it connected to the TV, and sat back from the screen with a gamepad, he didn't seem that impressed and claimed that from where he was sat it wasn't "that much" better than the 360 version. He pretty much said the same of all my games.

Gonna be interesting to see what happens if people like that are expected to shell out a big wad of cash for a similar experience to one that they currently claim is "not that much" better.

It will certainly be a challenge for developers to meet the expectations of next-gen console buyers given that they are expecting this UE4 experience in real games at 1080p/60fps with what amounts to the hardware equivalent of a mid-range PC these days.
 

SharkyUK

ClioSport Club Member
Always an education with your posts, m8! :)
You mean I waffle too much?! LOL! It's ok, I know I do!

I wonder how they will address the cooling issue of quicker, more capable GPUs? The general public have suffered on both the 360 and PS3 with overheating issues. Even the PC platform isn't exempt from getting a little toasty - COD MW3 being a prime example where I have an nVidia rule running that whacks the fans up to 100% speed on the gfx cards. If I don't, it blue screens within 10-15 mins.

Water cooling seems a premium PC option at the minute - and the extra space required (maybe weight too?) would definitely put off the consolers. For the PC platform - they have a finite space in which to work - unless they consider triple-slot space for one gfx card to be the norm?
The manufacturers have pumped a whole lot of money into the cooling systems that the new systems will utilise and I do believe that valuable lessons have been learnt. As to how effective these will be in the long run... as always, time will tell. Either way, it could be disastrous if the likes of Microsoft and/or Sony (or Nintendo) have a repeat scenario whereby units fail en masse with overheating issues. It hurt them bad with the current generation consoles and I don't think they had contingency plans that covered the level of failures they had to deal with.

As you've said on many an occasion though - it's the data access - both in terms of speed and quantity of memory onboard that tends to be the issue. Maybe if they can hold off on the outright speed of GPUs and make a compromise in order to increase the amount of expensive gfx memory available - then that might be an option?

It's a similar issue for both consolers and PC owners to face. I just wonder how the manufacturers will address it??

D.
As we've talked about before, the lack of plentiful, fast and accessible memory on a console is (for me at least) a bit of a sore point. Whilst the consoles don't have to run heavyweight and resource-hungry OSes, etc. they could certainly benefit from a boost in the memory department. Mind you, even with an increase in memory capacity it's not always quick and easy to get data back from the GPU. Admittedly things are getting better but there has always (at least in the past) been some level of trade-off. If you wanted to 'read' back data from the GPU then it was a given that you would have to sacrifice some level of performance to achieve this. That's why developers are always looking for new and better techniques to transfer data to and from the GPU-based subsystem(s) - whether through clever use of texture-encoded transfers, shader constants, combinations of the two or indeed other techniques. As GPUs tend more towards general-purpose duties it's not as much of a problem as it once was, but the GPU is generally much happier to have data thrown at it (for rendering or otherwise) than it is to report back to the host system what it's doing or to provide data back for further processing.
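
As a contrived example of the 'texture-encoded transfer' idea, here's a rough sketch (not lifted from any real engine - the layout and names are made up) of packing per-instance data into a floating-point texture so a shader can fetch it, rather than pushing everything through individual shader constants:

```cpp
// Rough sketch: packing per-instance transform data into a float texture ("texture-
// encoded transfer") so a shader can fetch it, instead of uploading hundreds of
// individual shader constants per draw call. Layout and names are made up.
#include <GL/glew.h>
#include <vector>

GLuint createInstanceDataTexture(const std::vector<float>& instanceData, int numInstances)
{
    // Each instance gets 4 RGBA texels = 16 floats (e.g. a 4x4 world matrix).
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    // No filtering - we want the raw values back out in the shader.
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

    // One row of (numInstances * 4) RGBA32F texels holding all the matrices.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, numInstances * 4, 1, 0,
                 GL_RGBA, GL_FLOAT, instanceData.data());
    glBindTexture(GL_TEXTURE_2D, 0);
    return tex;
}
// In the vertex shader you'd then reconstruct each matrix with a few texelFetch()
// calls indexed by gl_InstanceID - one texture update per frame instead of a
// constant upload per object.
```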

We could potentially see a massive improvement in performance if the underlying hardware architecture was re-factored to suit...! But I can't see that happening as we'd have to pretty much bin everything we've had to date. I don't think the world is ready to move wholesale to a whole new computing architecture just yet. :D

Looks amazing, but I'm not going to get excited until I see a game running at 60fps.

I remember back in the day when an early build of Half-Life 2 was leaked. It ran like warm butter on my machine. Then the game came out, with some actual content...
Very true, mate. What you see is not what you always get!

I think it is going to be interesting for the next generation of consoles.

They are constrained by the maximum physical size of the casing, constrained by the permissible noise levels from the cooling and even with manufacturer subsidies they are heavily constrained in terms of cost.

Players are going to be asked to drop 300 notes on a new console and will expect something like the Watch Dogs demo running in 1080p at 60fps. Sure consoles are more efficient than PCs but I can't see that happening.

I showed my brother Skyrim running maxed out with the texture packs etc. on my PC. With it connected to the TV, and sat back from the screen with a gamepad, he didn't seem that impressed and claimed that from where he was sat it wasn't "that much" better than the 360 version. He pretty much said the same of all my games.

Gonna be interesting to see what happens if people like that are expected to shell out a big wad of cash for a similar experience to one that they currently claim is "not that much" better.

It will certainly be a challenge for developers to meet the expectations of next-gen console buyers given that they are expecting this UE4 experience in real games at 1080p/60fps with what amounts to the hardware equivalent of a mid-range PC these days.
Some good points, mate.

I think the sheer size of the current console market means that there WILL be an awful lot of people out there who are happy to drop a wad of their hard-earned on the new consoles at their earliest opportunity (and I wouldn't be surprised to see the new consoles at the 400 quid mark in all truth - although I hope I'm wrong!) As with any new console release I expect there to be a wave of disappointment as many buyers realise that their new console isn't quite the capable beast they believe it to be, but I also believe there will be equally as many (or more) who think their console is the best thing ever (the fanboys and die-hard fans). Like it or not, the console gaming fraternity is a fair chunk larger than the PC one and I think its sheer size will ensure the next-gen consoles a healthy future. Whether or not they get their 1080p at 60fps is, I think, a moot point; in the sense that they'll buy a console at some point anyway... it's how these things seem to pan out.

I think that the technology on offer, whilst not perhaps being as cutting-edge as some may hope, will provide opportunity for some cool new features in game (visuals or otherwise). Impressive early demos and news on the grapevine are encouraging but the important thing is to have realistic expectations as to what the next-gen can bring. As a serious/hardcore gamer (or developer) I think it's far easier to be more realistic in your expectations too.
 
Been playing with UDK recently... this looks f**king amazing though. Can that kind of engine run on current gen PC hardware? I would guess the machine he is developing on is seriously highly specced...
 

SharkyUK

ClioSport Club Member
That trailer looks... Unreal!

Is that really all being rendered by the engine?
Yes mate.

Been playing with UDK recently... this looks f**king amazing though. Can that kind of engine run on current gen PC hardware? I would guess the machine he is developing on is seriously highly specced...
UE4 is aimed at forthcoming hardware but can be scaled back to suit lesser-powered systems (and platforms). If you've got the money to spend though you can buy a cutting-edge system that will be of a similar spec to that seen in the trailer.
 
  Evo 5 RS
That Unreal tech 4 footage is a way off from actual in-game stuff. Neither engine has any real examples to showcase yet, but if I'm honest I was definitely more impressed with the Luminous engine than Unreal 4.0.

Epic Games (Unreal tech) have been pushing for new console hardware for at least 3 years now, and it does help to show what exactly is possible on the hardware that's available for the PC.
I will say this now though: next-generation console hardware will be underwhelming to anyone who has had proper access and use of a DX11-capable machine. There is nothing these machines will have that will break any ground. I recently saw one of my nephews playing Crysis on Xbox 360. If you know what you're looking for you can see where Crytek have poured more than likely untold man hours into how the game renders objects, and when. It's clever tbh, but still ugly. The game engine will not render objects or terrain in areas that aren't in the player's field of view, to save precious resources. It does this to such an extent that more often than not you will see objects such as trees and rocks randomly appear, or suddenly render (more commonly known as texture pop). I think people like Epic will be lucky to produce games at half the visual quality of what they would like from the next gen, and that we will only see a lesser hindrance than we see now.
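
Roughly speaking the engine is doing something like this for every object, every frame - just a made-up sketch to illustrate the idea, nothing to do with Crytek's actual code:

```cpp
// Made-up sketch of the per-object visibility test an engine runs every frame:
// test an object's bounding sphere against the six planes of the camera frustum
// and simply skip drawing anything that falls completely outside. Streaming/LOD
// systems pushing this sort of thing too aggressively is what causes pop-in.
struct Plane  { float nx, ny, nz, d; };      // plane equation: n.p + d = 0
struct Sphere { float x, y, z, radius; };    // object bounding volume

bool isVisible(const Sphere& s, const Plane frustum[6])
{
    for (int i = 0; i < 6; ++i)
    {
        const float dist = frustum[i].nx * s.x + frustum[i].ny * s.y +
                           frustum[i].nz * s.z + frustum[i].d;
        if (dist < -s.radius)
            return false;   // completely behind this plane -> outside the frustum
    }
    return true;            // inside or straddling -> needs to be drawn
}

// In the render loop (pseudo-usage):
//   for (each object) if (isVisible(object.bounds, cameraFrustum)) drawList.add(object);
```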

:)


It's a similar issue for both consolers and PC owners to face. I just wonder how the manufacturers will address it??

D.

Not sure if you know this already mate, but something like GDDR5 is massively expensive. Factor that into the price of an all-in-one, off-the-shelf box and they would make a massive loss on each unit they sell. They did anyway with the 360.
 
Impressive rendering but the point stands that it's not a game. Until I'm actually using a mouse to move around in an environment of that quality, inside an actual action game running in Retina-style resolutions at 60 fps, I won't believe it. Ten years away at least IMO. Still, the aim is obviously to trick the 'next-gen' console players with money in their pocket into thinking that Pixar are gonna be buying a Playstation 4 to make their next movie on.

It's laughable.
 

Rubicon_

ClioSport Club Member
  Defender 110
I always hear people asking why games don't have the same graphics as CGI yet. I take it this is because the hardware just couldn't render it in real time? So in that sense, are we a long way away from near "real life" graphics?
 
Yeah. A frame in a game has to be rendered at least 30 times a second. IMO 60, but that's a personally imposed minimum. CGI movies don't have to be rendered in real time. It could take an hour, or a day, to render each frame, but then they're joined together and played back. At a wild guess, I would imagine it would take the average PC about a day to render one frame of Toy Story 3. That's some way short of the required one sixtieth of a second, and it's why people who think that CGI-quality games are just around the corner are living in fantasy land :eek:

Sharky is in a better position to say more. My ramblings aren't based on any real technical knowledge.
 

SharkyUK

ClioSport Club Member
Impressive rendering but the point stands that it's not a game. Until I'm actually using a mouse to move around in an environment of that quality, inside an actual action game running in Retina-style resolutions at 60 fps, I won't believe it. Ten years away at least IMO. Still, the aim is obviously to trick the 'next-gen' console players with money in their pocket into thinking that Pixar are gonna be buying a Playstation 4 to make their next movie on.

It's laughable.
I have to agree, mate - impressive for sure but it's not in-game per se as you state. It's purely running a directed cinematic (albeit in real-time) but without the additional costs of game logic, without the costs of full in-game physics, yada yada. It's great as an indication of things to come though and it's not too far away if things continue at the current rate of evolution (although don't expect it on the next-gen consoles as this level of detail will purely be in the domain of the ultra-performance PC owners).

I always hear people asking why games don't have the same graphics as CGI yet. I take it this is because the hardware just couldn't render it in real time? So in that sense, are we a long way away from near "real life" graphics?
I had an argument with a guy who was giving a lecture at a graphics symposium a few years ago, as he was adamant that raytracing would be quick and usable enough for realtime gaming and simulation within 5 years. That was 2-3 years ago. And, in my opinion, we are probably still 10 years plus away from that being an attainable/feasible reality. Probably longer. We are stuck with rasterised graphics (as seen in all modern-day games) for some time to come. To answer your question, we simply don't have hardware powerful enough to render a scene in real-time to an accurate and naturalistic level (i.e. "real life"), hence a whole host of rasterisation algorithms and techniques are used to achieve playable framerates in today's games, with much dumbed-down simplifications of how light bounces around a scene, etc.

Yeah. A frame in a game has to be rendered at least 30 times a second. IMO 60, but that's a personally imposed minimum. CGI movies don't have to be rendered in real time. It could take an hour, or a day, to render each frame, but then they're joined together and played back. At a wild guess, I would imagine it would take the average PC about a day to render one frame of Toy Story 3. That's some way short of the required one sixtieth of a second, and it's why people who think that CGI-quality games are just around the corner are living in fantasy land :eek:

Sharky is in a better position to say more. My ramblings aren't based on any real technical knowledge.
You're spot on, Roy. Pixar have two huge render farms consisting of thousands of processors dedicated to the task of rendering frames for their movies. These worked overtime when generating the final cut of Toy Story 3. I recall hearing that a typical frame took around 5-7 hours to render... and that's with the render farm doing the work! Some of the more complicated scenes took the best part of two days to render. On one of today's top-spec PCs you would have to leave it running 24/7 for about 120 years if you wanted to produce Toy Story 3 from home... LOL!
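
The back-of-the-envelope maths checks out, too - purely illustrative numbers here (roughly 7 hours per frame on a single machine, 24 frames per second of film):

```cpp
// Back-of-the-envelope estimate (illustrative numbers only): how long a single PC
// would take to render a feature-length CGI film at offline quality.
#include <cstdio>

int main()
{
    const double runtimeMinutes  = 103.0;   // approx. length of Toy Story 3
    const double framesPerSecond = 24.0;    // film frame rate
    const double hoursPerFrame   = 7.0;     // rough per-frame render cost on one machine

    const double totalFrames = runtimeMinutes * 60.0 * framesPerSecond;   // ~148,000 frames
    const double totalHours  = totalFrames * hoursPerFrame;
    const double totalYears  = totalHours / (24.0 * 365.0);

    std::printf("%.0f frames -> %.0f hours -> about %.0f years of non-stop rendering\n",
                totalFrames, totalHours, totalYears);
    // Prints roughly: 148320 frames -> 1038240 hours -> about 119 years
    return 0;
}
```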
 

Darren S

ClioSport Club Member
I had an argument with a guy who was giving a lecture at a graphics symposium a few years ago, as he was adamant that raytracing would be quick and usable enough for realtime gaming and simulation within 5 years. That was 2-3 years ago. And, in my opinion, we are probably still 10 years plus away from that being an attainable/feasible reality. Probably longer. We are stuck with rasterised graphics (as seen in all modern-day games) for some time to come. To answer your question, we simply don't have hardware powerful enough to render a scene in real-time to an accurate and naturalistic level (i.e. "real life"), hence a whole host of rasterisation algorithms and techniques are used to achieve playable framerates in today's games, with much dumbed-down simplifications of how light bounces around a scene, etc.

Andy, I guess that comment above is the real hardware-kicker - but just how 'blurry' is that line between lighting accuracy and convincing the human eye that what it's seeing in front of it is believable?

Although a shimmering water reflection from the sun projected onto a building wall might look good in an FPS - would I necessarily miss it when taking cover and firing pot shots back at the enemy? I'd guess not. BUT. If it was there, I wonder if my mind would notice it and actively think 'wow - that's pretty impressive'?

I'd suspect that the devs have a constant programming balancing act going on - with what is worthwhile eye-candy, without sacrificing frame-rate?

D.
 

Rubicon_

ClioSport Club Member
  Defender 110
Yeah. A frame in a game has to be rendered at least 30 times a second. IMO 60, but that's a personally imposed minimum. CGI movies don't have to be rendered in real time. It could take an hour, or a day, to render each frame, but then they're joined together and played back. At a wild guess, I would imagine it would take the average PC about a day to render one frame of Toy Story 3. That's some way short of the required one sixtieth of a second, and it's why people who think that CGI-quality games are just around the corner are living in fantasy land :eek:

Sharky is in a better position to say more. My ramblings aren't based on any real technical knowledge.

Ah right, I understand that better now, cheers. In that sense we certainly are years away then :(

I had an argument with a guy who was giving a lecture at a graphics symposium a few years ago, as he was adamant that raytracing would be quick and usable enough for realtime gaming and simulation within 5 years. That was 2-3 years ago. And, in my opinion, we are probably still 10 years plus away from that being an attainable/feasible reality. Probably longer. We are stuck with rasterised graphics (as seen in all modern-day games) for some time to come. To answer your question, we simply don't have hardware powerful enough to render a scene in real-time to an accurate and naturalistic level (i.e. "real life"), hence a whole host of rasterisation algorithms and techniques are used to achieve playable framerates in today's games, with much dumbed-down simplifications of how light bounces around a scene, etc.

Toy Story really is impressive. I remember watching a programme years ago about the making of Toy Story 1 - to think that back in 1995 computers were rendering animations like that is just crazy. Puts it more into perspective when you say it would take 120 years on today's computers if it was made from home lol.
 

SharkyUK

ClioSport Club Member
Andy, I guess that comment above is the real hardware-kicker - but just how 'blurry' is that line between lighting accuracy and convincing the human eye that what it's seeing in front of it is believable?

Although a shimmering water reflection from the sun projected onto a building wall might look good in an FPS - would I necessarily miss it when taking cover and firing pot shots back at the enemy? I'd guess not. BUT. If it was there, I wonder if my mind would notice it and actively think 'wow - that's pretty impressive'?

I'd suspect that the devs have a constant programming balancing act going on - with what is worthwhile eye-candy, without sacrificing frame-rate?

D.
It's all smoke and mirrors, mate (no pun intended). To achieve the high frame rates we see today, a whole heap of shortcuts are taken when it comes to visualising the gaming world in which we immerse ourselves. That's not to say that the techniques used don't have their roots in the real physical world; many do. But developers have to take huge liberties to ensure that the game is playable at a specific target frame rate on a particular hardware platform. Nowadays 30fps is really the minimum and many would argue that 60fps is the baseline figure to aim for. I guess it depends on whether you are more console or high-end PC aligned... but that's a whole different argument! ;)

One of the interesting aspects of today's graphically-impressive games is the fact that you can actually get away with a lot when it comes to 'faking' it. You are quite right when you point out the little details and whether or not you'd actually notice them whilst in the throes of an enjoyable gaming session. More often than not, a hardcore (for want of a better word) gamer expects a graphically stunning title to take advantage of their rig/console (and why not?) but a lot of the visual make-up is actually lost when it comes down to what the game is ultimately about; i.e. playing it. It's not a passive movie, it's an interactive challenge. And consequently the focus of the gamer often means that he/she doesn't really care for this effect, or for that effect. They just want it to look good as they progress through the game itself. So yes, it doesn't matter if the reflections are faked or only updated every other frame, it doesn't matter that the bloom is not physically correct and so on. It just looks good and that is all that matters. The truth is that the talented folks out there are now getting very clever with their engines and rasterised graphics can be coerced into producing some truly stunning imagery, as we see from the likes of the tech demos that get released from time-to-time.
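
As a silly but concrete example of that 'every other frame' trick, the logic really is no more complicated than something like this (made-up names, not any particular engine):

```cpp
// Made-up sketch of amortising an expensive effect: the planar reflection render
// target is only re-rendered on even frames and simply re-used on odd ones.
// At 60fps the reflection effectively updates at 30Hz and almost nobody notices.
struct ReflectionProbe
{
    unsigned lastUpdatedFrame = 0;
    // ... render target handle, reflection camera, etc. would live here ...
};

void updateReflections(ReflectionProbe& probe, unsigned frameIndex)
{
    if ((frameIndex & 1u) == 0u)                      // even frames only
    {
        // renderSceneToReflectionTarget(probe);      // the expensive bit (hypothetical call)
        probe.lastUpdatedFrame = frameIndex;
    }
    // On odd frames we just sample the texture rendered last time - half the cost,
    // and in the middle of a firefight nobody is studying the reflections.
}
```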

Personally, as stated elsewhere (I think), the key thing for me is the improvement in lighting. Global illumination (and other such buzzwords). These newer lighting models (made possible by the increase in GPU and CPU performance) really do help when it comes to making a more consistent-looking world and ensuring that a scene and its entities gel together nicely.

Frame rate was/is always a bit of an issue, more so on the fixed platforms where upgrading to the latest GPU is not possible. Hence, yes - there is a need to strike a balance between the visuals and what is deemed an acceptable interactive frame rate. When I worked on my first game the target frame rate (to be classed as playable) was 14-15fps...
 
Andy, I guess that comment above is the real hardware-kicker - but just how 'blurry' is that line between lighting accuracy and convincing the human eye that what it's seeing in front of it is believable?

Although a shimmering water reflection from the sun projected onto a building wall might look good in an FPS - would I necessarily miss it when taking cover and firing pot shots back at the enemy? I'd guess not. BUT. If it was there, I wonder if my mind would notice it and actively think 'wow - that's pretty impressive'?

I'd suspect that the devs have a constant programming balancing act going on - with what is worthwhile eye-candy, without sacrificing frame-rate?

D.

I would argue that light is everything if we are to make that leap into photo-realistic games. Considering that without light, we can't see anything, I imagine it's very difficult to trick our brains into thinking something is real without some extremely realistic and complex light calculations.
 

SharkyUK

ClioSport Club Member
Very true regarding the light - hence why I believe that it is the advancement in this area that will take games to the next level. The interplay of a single ray of light bouncing around the most basic of scenes soon becomes prohibitively expensive to calculate; and all - ultimately - to determine what the final colour of a pixel will be. To get around this massive computational overhead, engines such as the new Crytek and Unreal engines use a plethora of offline (non-realtime) processes to effectively bake global lighting information into encoded data formats that can then be quickly accessed in-game to simulate light interplay in the world. I won't go into too much detail but here's a very basic example of how indirect lighting (as part of a global illumination system) is typically handled in a rasterised game as opposed to doing it 'properly' via raytracing... (sorry if it's boring and feel free to skip it!)

Modern game engine:
Something called spherical harmonics can be used to encode indirect lighting in a scene - i.e. indirect light is light that has bounced off objects and has not come directly from a light source. In effect, this provides a base environment lighting colour on top of which other lighting models can be applied - such as diffuse lighting, specular highlights, reflections, etc. It is the indirect lighting that adds so much to the scene and that is responsible for getting consistent lighting in the game. Spherical harmonics allow an area to be defined in the world, and 'light probes' can be placed in the world by the artist / level designer. Once the probes are placed, the tool effectively transforms any light sources with influence in that area and determines what the base colour contribution would be for the scene where the light probe is positioned. This can be determined by running some heavier lighting calculations to bounce light off the environment so that the indirect lighting picks up colour from floors, walls, sky, etc. As this is an offline process it can take seconds, minutes or hours depending on the detail and complexity. Once a colour value is determined it is encoded using spherical harmonics and these values are made available in-game, to the GPU shaders, in some efficient data format. At this point, thousands if not millions of lighting calculations have been consolidated into a single colour for a known point in the world.

Now, when it comes to visualising that world in realtime, the 3D geometry in the scene can submit its world position to the shader and the indirect light information stored in the spherical-harmonic-encoded data can quickly be read. So for position x,y,z we can quickly determine what the indirect lighting colour should be (as a result of the offline processing that was done before). This means a lookup or two can be performed to get a base colour and we don't have to spend aaaaages calculating the ray's journey as it bounces time after time over thousands of polygons.
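
In code terms the runtime side boils down to something like this - a very simplified sketch (written as C++ here although it would normally live in a shader; real engines store coefficients per probe and blend between probes, and I'm assuming the cosine-lobe convolution was folded in when the data was baked):

```cpp
// Very simplified sketch of the runtime lookup: given 9 RGB spherical harmonic
// coefficients (baked offline per light probe) and a surface normal, reconstruct
// the indirect lighting colour for that point.
struct Vec3 { float x, y, z; };

Vec3 evaluateSH9(const Vec3 sh[9], const Vec3& n)
{
    // Standard real spherical harmonic basis functions for bands 0-2,
    // evaluated in the direction of the (unit length) normal n.
    const float basis[9] = {
        0.282095f,                                  // Y(0, 0)
        0.488603f * n.y,                            // Y(1,-1)
        0.488603f * n.z,                            // Y(1, 0)
        0.488603f * n.x,                            // Y(1, 1)
        1.092548f * n.x * n.y,                      // Y(2,-2)
        1.092548f * n.y * n.z,                      // Y(2,-1)
        0.315392f * (3.0f * n.z * n.z - 1.0f),      // Y(2, 0)
        1.092548f * n.x * n.z,                      // Y(2, 1)
        0.546274f * (n.x * n.x - n.y * n.y)         // Y(2, 2)
    };

    Vec3 colour = { 0.0f, 0.0f, 0.0f };
    for (int i = 0; i < 9; ++i)
    {
        colour.x += sh[i].x * basis[i];
        colour.y += sh[i].y * basis[i];
        colour.z += sh[i].z * basis[i];
    }
    return colour;   // a handful of multiply-adds instead of millions of bounced rays
}
```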

Raytracing:
Ok, we are in our 3D world and we need to determine the indirect lighting for that same x,y,z position. We shoot a primary ray from our viewpoint for the pixel that corresponds to that x,y,z position and calculate where the ray hits; that's the first hit point. At this point we have various things to do depending on the complexity of the material of the object (such as calculating reflections, refractions, etc.) but we're just interested in the indirect lighting. To do this we need to fire off a whole bunch of secondary rays - from the hit point - away from the object and see what they next hit in the scene. To make this look half decent we are probably looking at 500-1000 rays... and, of course, each and every one of those rays we've just fired out could also bounce from another object and spawn another 1000 rays each... When we hit another object we then query to see how much of an influence that object has on the scene for our very first hit point. That is, how much does the presence of the newly hit object affect our first hit object? Does it contribute some of its colour, for example? Ultimately we sample millions of rays and points to determine indirect lighting contributions for our original hit point and then use this to set that pixel to the correct colour. It takes a LONG time and is not feasible in realtime. But it does mean we can get some great effects such as colour bleeding, radiosity-like effects and so forth.
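
And for comparison, here's an equally rough sketch of that secondary-ray gathering step as a ray/path tracer might do it - the scene and intersection functions are hypothetical stand-ins, and a proper implementation would recurse and weight things correctly:

```cpp
// Rough sketch of gathering indirect lighting by brute force: fire lots of secondary
// rays from the first hit point and average what they bring back. Scene, intersect()
// and directLightAt() are hypothetical stand-ins for a real ray tracer's machinery.
struct Vec3 { float x, y, z; };
struct Hit  { bool valid; Vec3 position; Vec3 normal; Vec3 albedo; };

struct Scene
{
    Hit  intersect(const Vec3& origin, const Vec3& direction) const;  // hypothetical
    Vec3 directLightAt(const Hit& hit) const;                         // hypothetical
};

Vec3 randomDirectionInHemisphere(const Vec3& normal);                 // hypothetical helper

Vec3 indirectLight(const Scene& scene, const Hit& firstHit, int numSamples)
{
    Vec3 sum = { 0.0f, 0.0f, 0.0f };
    for (int i = 0; i < numSamples; ++i)                 // e.g. 500-1000 samples
    {
        const Vec3 dir    = randomDirectionInHemisphere(firstHit.normal);
        const Hit  bounce = scene.intersect(firstHit.position, dir);
        if (bounce.valid)
        {
            // What the bounced-off surface contributes back to our first hit point.
            // (A full path tracer would recurse here, spawning yet more rays.)
            const Vec3 light = scene.directLightAt(bounce);
            sum.x += light.x * bounce.albedo.x;
            sum.y += light.y * bounce.albedo.y;
            sum.z += light.z * bounce.albedo.z;
        }
    }
    const float inv = 1.0f / static_cast<float>(numSamples);
    return { sum.x * inv, sum.y * inv, sum.z * inv };     // averaged contribution
}
```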

^ Apologies if that doesn't make sense - it's kind of difficult to explain and is a very basic outline of one particular aspect of the lighting pipeline! Here's a couple of pics.

(Realtime) Direct lighting only - i.e. only light contributions are from objects directly hit by rays from the light sources...

121019_without_gi.jpg


(Realtime) With indirect lighting only - i.e. with light contributions from light 'bounces' as well as direct lighting from the light sources...

121019_with_gi.jpg


(Non-realtime) With indirect lighting contributions from shooting 1000 sample rays per secondary hit. Notice how light bleeds so you can see the wall colours hinted on the ceiling, etc.

cbox_pathtracing.gif


Crap explanation - I'm sorry but I lost enthusiasm half way through as it's actually quite difficult to explain. :eek:
 
So basically, mimicking the real behaviour of photons in graphic applications is extremely complex and massively processor intensive. I think we're now at the stage where software and hardware can just about trick us in pre-rendered video, but real-time interactive games??

Blimey. It seems that twenty years would be a more realistic target.
 

Darren S

ClioSport Club Member
There's a pretty striking difference in your room pictures above, Andy. I'm assuming, like Royston says, that there's a massive load dumped on the hardware to calculate the additional lighting 'bounces' from the other surfaces?

The 2nd pic really does look SO much better. The first pic is like a Doom 3 level.... ;)

D.
 
  Evo 5 RS
^ Not too shabby for a console. Certainly a step in the right direction.


Yep, although you can see a lot of the particle effects have been removed. That's within 2 months though, so I'm sure with a bit of work they'd be able to match or even better it. People on here don't seem to be grasping what an awesome bit of kit the PS is going to be this time round.
 

SharkyUK

ClioSport Club Member
There's a pretty striking difference in your room pictures above, Andy. I'm assuming, like Royston says, that there's a massive load dumped on the hardware to calculate the additional lighting 'bounces' from the other surfaces?

The 2nd pic really does look SO much better. The first pic is like a Doom 3 level.... ;)

D.
Even with the current range of tricks and techniques for simulating global illumination / indirect lighting, the number of calculations crunched by the GPU(s) is phenomenal; hence the reason why devs aim to pre-calculate as much information as possible beforehand (and offline) before sending it to the hardware for processing.

Roy hit the nail on the head and summed up what I was trying to say in a sentence! Real life is expensive! The processing requirements soon grow exponentially when trying to accurately and realistically simulate the physical world. So, in the meantime, we'll have to make do with clever approximations. :)

If you want to get a feel for where things are at in terms of real-time raytracing with true reflections, multiple bounces, etc. then you might be interested in this demo that was released last week. Whilst the developers claim it is completely real in terms of tracing the light rays through the scene, I'm sure there are a few hacks in there to make it run fairly well on today's hardware. If you plan to run the demo at the highest resolution then you'd better have a beast of a machine... and, of course, it hardly looks realistic despite being extremely impressive from a technical point of view.

http://pouet.net/prod.php?which=61211

The download link is on the right side of the demo screenshot. Here's the video if you don't fancy downloading and running the demo...



Yep, although you can see a lot of the particle effects have been removed. That's within 2 months though, so I'm sure with a bit of work they'd be able to match or even better it. People on here don't seem to be grasping what an awesome bit of kit the PS is going to be this time round.
Yes - definitely some dumbing down of visuals as you say, but nevertheless impressive. I totally agree that the PS4 (and new Xbox for that matter) are going to be pretty impressive when they launch. Whilst I can't fault the impressive sales of the PS2 and relative success of the PS3, I do wish that Sony had gone for a more PC-like architecture years ago. I think it would have benefited them even more (as well as the game dev and games-playing world as a whole). In my opinion of course. :)
 

SharkyUK

ClioSport Club Member
It's a bit of a cliché but I do like the popular "let's render a high detail head" demonstrations now and again... so I'll just throw this in here.

 
  Evo 5 RS
^ Saw this, it's impressive - and rumour has it the technology will be used in the next generation of CoD games. (yay?)


A breakdown of the Unreal 4 engine for those interested

 

SharkyUK

ClioSport Club Member
The Unreal 4 Engine has an incredible feature set and some of those features are fantastic; such as being able to jump into the code and make alterations on the fly. Stuff like this makes me want to go back to working on technology in the games industry.
 

SharkyUK

ClioSport Club Member
Loving the new CryEngine technology... some of the tech demos are pretty impressive! They are advertising for programmers at the moment and I'm tempted to see if they've got any roles in their tech team; especially with the redundancy issue that reared its head out of the blue last week! Anyways...

 

Darren S

ClioSport Club Member
Loving the new CryEngine technology... some of the tech demos are pretty impressive! They are advertising for programmers at the moment and I'm tempted to see if they've got any roles in their tech team; especially with the redundancy issue that reared its head out of the blue last week! Anyways...

M8, that's a pisser. :( Is it just speculation at the minute though?

D.
 

