GSync graphics cards and monitors



  BMW M4; S1000 RR
Is it worth it?

I've got an AMD 290 in my machine at the moment, but I'd quite like something shiny and I hate the tearing that I get on BF4 when turning quickly. Moving up to 2560x1440 at the same time would be quite nice too.

So it's really a toss-up between:

A 2nd R9 290 - I don't really want to do this; Crossfire scaling issues aside, I'd rather sell the 290 while it's still worth a bit.
A new AMD card - maybe, but the AIO cooler puts me off.
A 980Ti - proven performance in games, and I do like the reference cooler. Even if it's a bit louder on average than regular fans, it doesn't have the same whir that a normal fan has (my old HIS 7950 had the same style of fan, which I liked).
 

Darren S

ClioSport Club Member
It's definitely heading that way. I've not seen the actual results of a G-Sync'd monitor and card - but from the reviews I've read, it really is impressive.
Btw - G-Sync is a technology that's proprietary to nVidia - you would have to check if your 290 is compatible with a monitor that supports Freesync.

I would certainly be heading this way instead of 4K supported monitors. Again, it's only down to what I've read and not physically seen, but I've heard lots of niggles and issues with the current crop of 4K hardware.

With regards to BF4 as well - disable vsync in the game, but enable it via the ATI/AMD Catalyst Control Centre (if it's still called that?). There are a couple of titles I've had where screen tearing is common despite vsync being enabled within the game's options menu. Switching it off there and forcing it on via the graphics card's control panel usually works.
 
Don't bother with another 290.

I have two 290s currently, and Crossfire is absolutely dogshit. I won't be going Crossfire again.

Half the time you get better performance by disabling the 2nd GPU, and the other half of the time it isn't supported anyway.
 

Darren S

ClioSport Club Member
Don't bother with another 290.

I have two 290s currently, and Crossfire is absolutely dogshit. I won't be going Crossfire again.

Half the time you get better performance by disabling the 2nd GPU, and the other half of the time it isn't supported anyway.

I can't really get my head around why multi-GFX cards are an issue - especially when using two identical boards? Surely we've been using parallel processing and tasking for as long as we can remember - and simply splitting the load between two identical GPUs and reforming the image should be a given? Even if there is a GPU overhead to do this - so that you don't get a true 100% performance increase by adding another card - you should get a hefty hike in throughput.

What's worse is when mainstream devs like R* really do c**k up games like GTAV with SLi - yet smaller companies seem to cope with it fine. Even taking my rare setup of two GTX690 cards into account - in SLi mode (for which the game has a specific SLi profile) - the performance is horrendous - completely unplayable. Disable SLi - and it plays fine.

Let's face it - gaming GPUs are now either nVidia or ATi/AMD based. On top of that, the vast majority of people are running Windows 7 or Windows 8 - again with either DX10 or DX11 APIs. That's not a huge range of variance in order to cover all bases. The GPU manufacturers should either release code that makes multi-GPU implementations literally faultless, or they should start to withdraw SLi & Crossfire as a viable option. You couldn't sell a V8 AMG Merc to a customer and then tell them that, with its clever cylinder-bank switch-off ability, you get more performance on track running it as a four-pot...
 
I can't really get my head around why multi-GFX cards are an issue - especially when using two identical boards? Surely we've been using parallel processing and tasking for as long as we can remember - and simply splitting the load between two identical GPUs and reforming the image should be a given? Even if there is a GPU overhead to do this - so that you don't get a true 100% performance increase by adding another card - you should get a hefty hike in throughput.

What's worse is when mainstream devs like R* really do c**k up games like GTAV with SLi - yet smaller companies seem to cope with it fine. Even taking my rare setup of two GTX690 cards into account - in SLi mode (for which the game has a specific SLi profile) - the performance is horrendous - completely unplayable. Disable SLi - and it plays fine.

Let's face it - gaming GPUs are now either nVidia or ATi/AMD based. On top of that, the vast majority of people are running Windows 7 or Windows 8 - again with either DX10 or DX11 APIs. That's not a huge range of variance in order to cover all bases. The GPU manufacturers should either release code that makes multi-GPU implementations literally faultless, or they should start to withdraw SLi & Crossfire as a viable option. You couldn't sell a V8 AMG Merc to a customer and then tell them that, with its clever cylinder-bank switch-off ability, you get more performance on track running it as a four-pot...

I wish I understood it too, it's mental.

I don't actually remember the last time I enabled the 2nd GPU, so it's been sat here disabled - but still in the watercooling loop and drawing minimal power - for at least a year lol.
 
  BMW M4; S1000 RR
It does seem strange. But that's just the way it is for now sadly. I'll just sell my R9 290 and buy something new.

Don't suppose you're looking to go 3 way crossfire @Addicted ? ;)
 

SharkyUK

ClioSport Club Member
If only SLI were so easy... :) It's a real pain in the ass and the bottom line is that current architectures (hardware) are not best suited to it and a lot of game engines and rendering systems out there aren't best suited for SLI interoperability either. Rendering engines don't work on-demand and compositing the finalised frame buffer image from two (or more) discrete cards at the same time (where cards are already running several frames out of sync with the true state of the underlying engine) is not trivial; there's an awful lot of things that have to come together to make it work well. In a lot of cases, the overhead in ensuring everything is ready for SLI deployment results in lesser performance than sodding it off and running it on a single GPU! :)
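If it helps, here's a deliberately crude Python toy of alternate-frame rendering (the usual SLI/CF approach). Every number and the whole model are made up purely for illustration - it's nothing like how a real driver is written - but it shows why delivery cadence, not raw throughput, is the hard bit:

```python
# Toy model of alternate-frame rendering (AFR): GPU 0 takes even frames,
# GPU 1 takes odd frames. Each frame pays a made-up fixed cost for
# cross-GPU synchronisation (copying shared resources, fences, etc.).
# Illustrative only - real drivers add frame-pacing logic precisely
# because this naive version delivers frames in uneven bursts.

def afr_present_times(render_ms_per_frame, num_gpus=2, sync_overhead_ms=1.5):
    """Return the time (ms) at which each frame is ready to present."""
    gpu_free_at = [0.0] * num_gpus
    ready = []
    for i, cost in enumerate(render_ms_per_frame):
        gpu = i % num_gpus                       # which card gets this frame
        finish = gpu_free_at[gpu] + cost + sync_overhead_ms
        gpu_free_at[gpu] = finish
        ready.append(finish)
    return ready

# Ten frames of ~20 ms GPU work each (a steady 50 fps on one card).
times = afr_present_times([20.0] * 10)
gaps = [round(b - a, 1) for a, b in zip(times, times[1:])]
print(gaps)   # -> [0.0, 21.5, 0.0, 21.5, ...]: great average fps,
              #    but lumpy frame-to-frame delivery (micro-stutter)
```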

To be honest, having been in this game since day one, I just wish they'd ditch multi-GPU setups. And instead focus on bad-ass single GPU solutions. It may pi$$ off the SLI/CF enthusiasts but they are the minority. Having faffed around with all manner of cards and various configurations I much prefer just to buy the fastest single GPU solution I can afford when upgrade time comes around. I don't think SLI/CF will ever become mainstream. People are getting lazier and don't want the hassles; unless they are overclockers, enthusiasts or PC willy-wavers. :tongueout: I had problems with SLI back in the days of the Orchid Righteous 3D (albeit a different interpretation of SLI) and still have issues these days, hence I don't bother. If I want 4K gaming (which I don't think is truly with us just yet) then I'll buy whichever single GPU solution allows it to happen in a year or so time. In the meantime I'll happily stick with what I've got and play at lower resolutions or slightly less than Ultra settings. :)

That said, GSync monitors are fantastic. Sooooo buttery smooth in a nicely spec'd system! :)
 
  BMW M4; S1000 RR
That said, GSync monitors are fantastic. Sooooo buttery smooth in a nicely spec'd system! :smile:

Cheers for that last bit :(

Was thinking "yeah, you know what- maybe I don't need 1440 graphics on ultra settings".. I just don't want to buy a Freesync monitor and then wish I'd gone Gsync down the line.
 

SharkyUK

ClioSport Club Member
Cheers for that last bit :(
You're welcome! :tonguewink: It might be worth hanging on a bit if you can mate. I won't hide my bias; I prefer nVidia and will tend to recommend those over AMD. I've nothing against AMD but always had better experiences with nVidia; from developer relations to driver support and so forth. That stuck with me hence I'm still very much in the green camp today. I've not had any real experience of FreeSync technology at work so can't really comment.

One thing I will say is that it can get quite expensive. To see playable 2560x1440 with Ultra-like settings and 60+ fps is going to require a top-end card. On top of that a good quality GSync monitor will cost a fair few quid, too.
 
  BMW M4; S1000 RR
You're welcome! :tonguewink: It might be worth hanging on a bit if you can mate. I won't hide my bias; I prefer nVidia and will tend to recommend those over AMD. I've nothing against AMD but always had better experiences with nVidia; from developer relations to driver support and so forth. That stuck with me hence I'm still very much in the green camp today. I've not had any real experience of FreeSync technology at work so can't really comment.

One thing I will say is that it can get quite expensive. To see playable 2560x1440 with Ultra-like settings and 60+ fps is going to require a top-end card. On top of that a good quality GSync monitor will cost a fair few quid, too.
Worth hanging on, i.e. successor to 980Ti whatever that may be?
 

SharkyUK

ClioSport Club Member
Worth hanging on, i.e. successor to 980Ti whatever that may be?
I think it might be a while until the successor comes along mate, especially with both nVidia and AMD potentially looking to move to smaller fabrication in the not-too-distant future. I suggested maybe hanging on to see if the launch of AMD's new flagship card will affect pricing at the top-end of the market (although I suspect it won't given the relatively underwhelming reception the Fury X has received). It's also worth taking some time to read reviews of the current crop of G-Sync monitors. Some are very good, some have a few issues with excessive glow and suchlike.
 
  BMW M4; S1000 RR
Oh I see what you mean.

Yeah, I won't be buying anything over the next month, which sort of fits your advice anyway. I could sell my 290 in the meantime, since the only game I can't do without is HoN and I'm pretty sure that would play ok on the HD4600.

Currently got my eyes on a reference 980Ti and an Asus ROG Swift (as much as I hate the idea of paying the Asus tax).

Slightly OT, but I bought my Dad a PC upgrade for his birthday - got him a 4690K, bequiet PSU and a Noctua CPU cooler (plus RAM and mobo). It flippin flies. It's got a Crucial SSD in, and no word of a lie, it boots in 7 seconds from the BIOS beep. You don't even see the "Windows is starting" screen - it just goes black for a few seconds and you're on the desktop. He's made up, which is good :)
 
  MK4 Anni & MK5 Edt30
I'm currently rocking the Swift and a 980 and it's absolutely amazing. Been eyeing up the 980Ti's myself but might just wait for Pascal tbh. I'm smashing everything at 1440p as it is :)
 

SharkyUK

ClioSport Club Member
Slightly OT, but I bought my Dad a PC upgrade for his birthday - got him a 4690K, bequiet PSU and a Noctua CPU cooler (plus RAM and mobo). It flippin flies. It's got a Crucial SSD in, and no word of a lie, it boots in 7 seconds from the BIOS beep. You don't even see the "Windows is starting" screen - it just goes black for a few seconds and you're on the desktop. He's made up, which is good :smile:
Great stuff! I'd have been tempted to keep it for myself... :tonguewink:

I'm currently rocking the Swift and a 980 and it's absolutely amazing. Been eyeing up the 980Ti's myself but might just wait for Pascal tbh. I'm smashing everything at 1440p as it is :smile:
I'm torn between upgrading my 780 Ti SSC to a 980 Ti, or waiting for Pascal. I think I'll probably wait now. I can't really justify the 980 when the 780 is still delivering solid visuals at 1200p (which is what I'm currently playing at). As much as I'd like to make the jump to Maxwell, it's a lot of money for not a lot of difference (as things currently stand). I keep telling myself that, at least...
 

Darren S

ClioSport Club Member
Love the Noctua coolers - assuming I stay on air only (and as CPUs get cooler and less power-hungry, why wouldn't you? :smile: ) - I'll get another for my next system.

I think I'm just going to keep my setup for a while as I've already had my money's worth out of it, several times over! That is unless a wad of cash heads my way - though realistically that would go on the wedding next year!

I agree with Andy that for my next GPU purchase, I would probably go with a single-card setup. It frustrates the hell out of me that multi-GPU isn't a 100% guarantee - given how long the principle has been around and, let's face it, the none-too-shy investment from the manufacturers over the years. Motherboard manufacturers offer the slots. GPU manufacturers offer the hardware to work in parallel - and yet there's still the occasional (sometimes massive) gaming title that simply sucks at running on multiple GPU setups. It's almost an industry lie that has been going on for two decades, and people (like me) still buy into it.

At least when DX12 arrives - there are strong rumours that GPU memory pooling will exist. Who knows, I may even get to use all of the 8GB memory that I have on my cards! :smile:
 
  BMW M4; S1000 RR
Oh for sure, such quality.

I had a H100i on my old build; it had a weird ringing noise coming from the radiator. It's currently with Corsair on an RMA and when it's returned it will be going straight on eBay. Never buying an AIO cooler again tbh.

I see the appeal with a big waterloop, I think SilentScone posted pictures before of 3x cards being cooled by a loop which looked great and I'm sure it was quiet, but that's another level really.

I got a Noctua NH-D15 for my machine, which in benchmarks performs better than the H100i; it's quieter and, also, what's to go wrong? If the fan broke, a replacement is £17 from Amazon. The one thing I miss is the pretty LED colours on the Corsair, but I could always get some LED light strips to sit behind the fans for a similar effect I suppose.
 
  BMW M4; S1000 RR
Well, I've got an RMA sorted for my MSI 290 on the grounds that it can't seem to run without hitting its thermal limit anymore and one of the fans intermittently cuts out.

Going to pop it on eBay or here when it gets replaced. Hastily ordered a 980Ti reference design from Amazon (I love the reference cooler - even when the volume starts becoming audible, it's nowhere near as annoying as traditional fans spinning). It says 1-4 weeks delivery, but the last time I ordered something that said that, it showed up by the end of the week. Still, it'll be a nice thing to look forward to at some point.

Then I just have to find myself a new monitor. Part of me is tempted to stay with 1920x1080... but the other part wants to ignore the bad hearsay about the Acer 4K GSync monitor and give that a go... Not bad decisions to have to make, I guess!
 
  Yaris Hybrid
I'm currently using a ROG Swift with 980ti SLI.

I've found G-Sync to be a mixed bag. In Witcher 3 it was fantastic in the early days before optimisation as it would occasionally drop down into the mid 50fps range (with a 60 refresh) but there was obviously no v-sync stutter when it dipped under 60fps because g-sync did its job.

However, with GTA 5 it is unplayable when using G-Sync for me as it constantly stutters. If I use v-sync it is fine. No idea what is causing that bug. I've done a clean install (got a new SSD) in that time and that didn't change anything. I also clean installed when I went from 980s to Tis and again that didn't cure it. I see others reporting the same bug too, so it ain't my PC at fault, and I just use v-sync in that game now.

To be honest I don't notice any improvement in input lag (I am not a pro-gamer), so for me, if I had a PC that could maintain minimum frame rates above the monitor's refresh, I'd just stick with v-sync. E.g. if I was still using 1080p I wouldn't be fussed about g-sync, as v-sync stutter would never happen anyway, and I never use 144fps because I don't think the benefit over 60fps is worth the heat/noise (personally speaking).
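To put rough numbers on the stutter point (back-of-envelope only - this ignores triple buffering and assumes the simplest double-buffered v-sync case):

```python
# Why dipping under the refresh rate hurts with v-sync but not G-Sync.
# Simplest case: double-buffered v-sync on a 60 Hz panel - a frame that
# misses the 16.7 ms deadline has to wait for the next refresh.
import math

REFRESH_MS = 1000 / 60     # 16.7 ms per refresh at 60 Hz
render_ms = 1000 / 55      # ~18.2 ms per frame, i.e. the GPU managing 55 fps

vsync_ms = math.ceil(render_ms / REFRESH_MS) * REFRESH_MS   # snapped to a refresh
print(f"v-sync: {vsync_ms:.1f} ms per frame (~{1000 / vsync_ms:.0f} fps cadence)")
print(f"g-sync: {render_ms:.1f} ms per frame (~{1000 / render_ms:.0f} fps cadence)")
# v-sync: 33.3 ms per frame (~30 fps cadence)  <- the judder when you dip under 60
# g-sync: 18.2 ms per frame (~55 fps cadence)  <- the panel just waits for the frame
```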

Ultimately I'd always go for the monitor with the best image quality, with these clever sync techs coming second as a factor in my choice. I probably have some buyer's remorse over this ROG Swift, although it ain't bad for a TN.
 
  Yaris Hybrid
If only SLI were so easy... :smile: It's a real pain in the ass and the bottom line is that current architectures (hardware) are not best suited to it and a lot of game engines and rendering systems out there aren't best suited for SLI interoperability either. Rendering engines don't work on-demand and compositing the finalised frame buffer image from two (or more) discrete cards at the same time (where cards are already running several frames out of sync with the true state of the underlying engine) is not trivial; there's an awful lot of things that have to come together to make it work well. In a lot of cases, the overhead in ensuring everything is ready for SLI deployment results in lesser performance than sodding it off and running it on a single GPU! :smile:

To be honest, having been in this game since day one, I just wish they'd ditch multi-GPU setups. And instead focus on bad-ass single GPU solutions. It may pi$$ off the SLI/CF enthusiasts but they are the minority. Having faffed around with all manner of cards and various configurations I much prefer just to buy the fastest single GPU solution I can afford when upgrade time comes around. I don't think SLI/CF will ever become mainstream. People are getting lazier and don't want the hassles; unless they are overclockers, enthusiasts or PC willy-wavers. :tongueout: I had problems with SLI back in the days of the Orchid Righteous 3D (albeit a different interpretation of SLI) and still have issues these days, hence I don't bother. If I want 4K gaming (which I don't think is truly with us just yet) then I'll buy whichever single GPU solution allows it to happen in a year or so time. In the meantime I'll happily stick with what I've got and play at lower resolutions or slightly less than Ultra settings. :smile:

That said, GSync monitors are fantastic. Sooooo buttery smooth in a nicely spec'd system! :smile:
After Batman I am starting to think that way about SLI too.

The problem is that once you have SLI it is hard to go back. I mean when the next GPU comes out I could switch to a single system which would be a big power loss in some games but a gain in others.

I could wait until a single GPU would be as powerful as what I have now (when it works) but in the years until that time my system will be sub par on games that don't work well with SLI.

A part of me wishes I had never moved to SLI as it is a vicious little trap. Not to mention a £1000 bill every time new cards come out when I could just be spending £500. The more I think about that the more I wonder about going back to singles...
 
  BMW M4; S1000 RR
Ultimately I'd always go for the monitor with the best image quality, with these clever sync techs coming second as a factor in my choice. I probably have some buyer's remorse over this ROG Swift, although it ain't bad for a TN.

Surely a single 980Ti can handle 2560x1440 @ 60+fps in any game that's currently out?

When you say buyer's remorse for the ROG Swift, what would you go with instead? It's £550 on Amazon, which is at least £100 less than its initial RRP. I'm also learning that there's only a handful of GSync monitors atm, and I feel like a 1920x1080 monitor would be a bit of a waste but 4K is probably a bit too far the other way...

At least with Amazon I could theoretically just keep returning them until I find a nice compromise.
 

Darren S

ClioSport Club Member
I'm currently using a ROG Swift with 980ti SLI.

I've found G-Sync to be a mixed bag. In Witcher 3 it was fantastic in the early days before optimisation as it would occasionally drop down into the mid 50fps range (with a 60 refresh) but there was obviously no v-sync stutter when it dipped under 60fps because g-sync did its job.

However, with GTA 5 it is unplayable when using G-Sync for me as it constantly stutters. If I use v-sync it is fine. No idea what is causing that bug. I've done a clean install (got a new SSD) in that time and that didn't change anything. I also clean installed when I went from 980s to Tis and again that didn't cure it. I see others reporting the same bug too, so it ain't my PC at fault, and I just use v-sync in that game now.

To be honest I don't notice any improvement in input lag (I am not a pro-gamer), so for me, if I had a PC that could maintain minimum frame rates above the monitor's refresh, I'd just stick with v-sync. E.g. if I was still using 1080p I wouldn't be fussed about g-sync, as v-sync stutter would never happen anyway, and I never use 144fps because I don't think the benefit over 60fps is worth the heat/noise (personally speaking).

Ultimately I'd always go for the monitor with the best image quality, with these clever sync techs coming second as a factor in my choice. I probably have some buyer's remorse over this ROG Swift, although it ain't bad for a TN.

I honestly have no idea what the hell R* were/are playing at with GTAV and multi-GPU setups. Sure, it's still a slightly niche area - but it's 2015 now. And it's a demanding game graphically. Hello??? Anyone think of optimising it for multiple graphics cards?

I find it utterly ludicrous in this day and age that, to get GTAV running at any form of decent pace, I have to run it on a single GPU - when I have a quad-SLI system ready to go.
 
  BMW M4; S1000 RR
Ok.

4K Acer monitor arrived today.

Screen quality seems ok, better than my £100 Iiyama that preceded it. No noticeable defects initially either.

The resolution is stunning. I haven't played with scaling options much; I just went to Display and changed the setting from 100% to 150%, which I can only assume was a scaling option as it seems to have done the trick. The text on this webpage is so sharp it's ridiculous!!!

The ROG Swift (at nearly £100 more) will have to be mighty impressive in games for me to send this Acer back. f**k me!

BRB to rinse some games!!
 
  BMW M4; S1000 RR
So performance doesn't feel amazingly smooth so far. BF4 with AA turned off runs at what it says is 60fps. Gsync on but I'm not feeling the smoothness that people describe.

I will play with some settings tomorrow. The scaling options in Windows cause some funny behaviour in full-screen apps (invisible cursor, or the cursor not aligning with the actual cursor coordinates).
 
  MK4 Anni & MK5 Edt30
So performance doesn't feel amazingly smooth so far. BF4 with AA turned off runs at what it says is 60fps. Gsync on but I'm not feeling the smoothness that people describe.

I will play with some settings tomorrow. The scaling options in Windows cause some funny behaviour in full-screen apps (invisible cursor, or the cursor not aligning with the actual cursor coordinates).
Just wait for the ROG and experience super butter :D
 
  BMW M4; S1000 RR
Mainly Dota 2 mate. Currently racked up 3000 hours over 2 years haha.
I play HoN.

Do you genuinely feel like 144hz makes a difference to Dota? I was thinking HoN was going to be the game that 144hz wouldn't matter on... I'll tell you now it looks bloody brilliant at UHD.
 
  BMW M4; S1000 RR
Currently, as in the monitor I was using last week:

1920x1080 @ 60hz.

So both monitors are going to be big improvements, but currently I'm wowed by the resolution increase. Staying at 60hz feels normal - if 144hz is really that good then I'll probably 'downgrade' to 2560x1440 so that I can get away without scaling Windows 7 (as it doesn't play nicely with several games I have). But either way, it will truly be a sad moment packing away the 4K. (Which makes me think the ROG Swift will be going back as it stands.)
 

Darren S

ClioSport Club Member
Depends imo - are you on 60hz or 120hz currently? The jump from 60 to 120 is quite substantial, I imagine it'd be a fair bit less going from 120 to 144.

I'm running at 59hz at the minute with a plan to go for one of these Asus ROG monitors at the end of the year. I think I'll notice a difference! :)
 
  MK4 Anni & MK5 Edt30
Currently, as in the monitor I was using last week:

1920x1080 @ 60hz.

So both monitors are going to be big improvements, but currently I'm wowed by the resolution increase. Staying at 60hz feels normal - if 144hz is really that good then I'll probably 'downgrade' to 2560x1440 so that I can get away without scaling Windows 7 (as it doesn't play nicely with several games I have). But either way, it will truly be a sad moment packing away the 4K. (Which makes me think the ROG Swift will be going back as it stands.)
When does your ROG come, Jeff?
 
  Yaris Hybrid
Surely a single 980Ti can handle 2560x1440 @ 60+fps in any game that's currently out?

When you say buyer's remorse for the ROG Swift, what would you go with instead? It's £550 on Amazon, which is at least £100 less than its initial RRP. I'm also learning that there's only a handful of GSync monitors atm, and I feel like a 1920x1080 monitor would be a bit of a waste but 4K is probably a bit too far the other way...

At least with Amazon I could theoretically just keep returning them until I find a nice compromise.

Guess I'm too late replying but anyway....

Depends what you mean by "handle"?

As a PC gamer buying a top-of-the-range card you expect to play on ultra settings (albeit you can tolerate low-budget AA solutions), and in that case the answer to your first question is "not even remotely". You need SLI to play a large number of the latest titles on Ultra/1440, even with FXAA or no AA, and keep the minimum fps north of 60. If I disable SLI I have to slam the detail right down in games like Witcher 3.

Sure in fairness you can easily tweak the detail down and it will handle it and you may not notice much difference in the looks.

G-Sync of course can help as you don't religiously have to keep the fps above 60 to avoid v-sync stutter (I won't tolerate the tearing that comes with adaptive). In that sense g-sync offers greater benefits to owners of a single 980ti than those of us using multi GPU setups.

As for the second question, personally I'd like to go back to my old 1440p IPS screen. It had a better quality image but it also had more connectivity options. I could connect my PS4 to it and it actually looked fantastic despite not running native. It was 60hz and if a game is at least half optimised I can ensure it stays >60 on my PC. I hate being stuck with just one display port input.

If I had a single card I'd go back to a 1080p IPS panel. Others would opt for 1440p and just turn the detail down. Paying for 144hz in that scenario would be pointless though unless you are playing old games like CS competitively.

*These are purely my personal tastes*. Others have different preferences.
 
  BMW M4; S1000 RR
Guess I'm too late replying but anyway....

Depends what you mean by "handle"?

As a PC gamer buying a top-of-the-range card you expect to play on ultra settings (albeit you can tolerate low-budget AA solutions), and in that case the answer to your first question is "not even remotely". You need SLI to play a large number of the latest titles on Ultra/1440, even with FXAA or no AA, and keep the minimum fps north of 60. If I disable SLI I have to slam the detail right down in games like Witcher 3.

Sure in fairness you can easily tweak the detail down and it will handle it and you may not notice much difference in the looks.

G-Sync of course can help as you don't religiously have to keep the fps above 60 to avoid v-sync stutter (I won't tolerate the tearing that comes with adaptive). In that sense g-sync offers greater benefits to owners of a single 980ti than those of us using multi GPU setups.

As for the second question, personally I'd like to go back to my old 1440p IPS screen. It had a better quality image but it also had more connectivity options. I could connect my PS4 to it and it actually looked fantastic despite not running native. It was 60hz and if a game is at least half optimised I can ensure it stays >60 on my PC. I hate being stuck with just one display port input.

If I had a single card I'd go back to a 1080p IPS panel. Others would opt for 1440p and just turn the detail down. Paying for 144hz in that scenario would be pointless though unless you are playing old games like CS competitively.

*These are purely my personal tastes*. Others have different preferences.
Cool thanks.

I've got the 4K Acer monitor on my desk, ROG Swift hasn't arrived yet.

Honestly, I'm not sure I've noticed Gsync (other than it making the screen flicker during the loading screen on HoN, where the frame rate always dips to a few fps), and I'm even considering sending this back in favour of the £260 AOC 4K monitor.

I only use my monitor for the PC, no consoles so connectivity doesn't matter. I've seen a few IPS panels and really love the colours- but I'm not sure I'll be able to say bye to this resolution. I have found myself playing new demanding games to see what it's like at that res (of course I need to turn settings down). But so far I'd take 3840x2160 @ Medium over 1920x1080 @ Ultra. I guess that's just a personal taste too.

I'm at the point I feel like cancelling the Asus, just because there's no way it will be good enough to make me want to send this one back.
 
  Yaris Hybrid
I think the merits of the ROG Swift are swings and roundabouts but one thing that I think everyone can agree on is that for a 1440p TN panel it is way too expensive when you think about what you can get for half the price without g-sync. I'd cancel it ASAP.

I had thought about 4k before, but I heard that there are still scaling issues with regular Windows applications, such that text etc. is too small in those where you can't zoom? Also, if you don't run games at native on a 4k screen (e.g. you try 1080p or 1440p), how does it look?

I found PC games didn't look that good at 1080p on my ROG Swift, although the PS4 looked amazing scaled up on my old IPS unit. Does it not look so bad on 4k, as it's surely easier to scale up accurately when you have more pixels?
 
  BMW M4; S1000 RR
I think the merits of the ROG Swift are swings and roundabouts but one thing that I think everyone can agree on is that for a 1440p TN panel it is way too expensive when you think about what you can get for half the price without g-sync. I'd cancel it ASAP.

I had thought about 4k before, but I heard that there are still scaling issues with regular Windows applications, such that text etc. is too small in those where you can't zoom? Also, if you don't run games at native on a 4k screen (e.g. you try 1080p or 1440p), how does it look?

I found PC games didn't look that good at 1080p on my ROG Swift, although the PS4 looked amazing scaled up on my old IPS unit. Does it not look so bad on 4k, as it's surely easier to scale up accurately when you have more pixels?

Ah, see, now Amazon says "Preparing for dispatch" so it'd be rude to not even try it! I've been talked into giving it a good go, as I remember how smooth 100hz CRT screens were back in the day; my first experience of a TFT screen was "wtf, it only goes to 60hz", and it felt like it wasn't as smooth as the CRT, but it was certainly tolerable.

I've yet to encounter the Windows scaling problems tbh. I didn't do anything particularly special, just turned the Windows display setting from 100% to 150%. The issue comes when you play full-screen apps - they behave a little strangely. There is a compatibility setting that lets you disable Windows display scaling for the app, but then my favourite game doesn't alt-tab properly. That's my only niggle so far.
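My guess at what's behind the cursor weirdness, for what it's worth: if an app doesn't declare itself DPI-aware, Windows lets it think the screen is smaller and scales it up, so its coordinates end up out by the scale factor. Rough illustration only - the numbers are just my monitor's, and the maths is the idea rather than anything official:

```python
# Illustration of 150% display scaling and why a non-DPI-aware full-screen app
# can end up with a misaligned cursor: the app works in "logical" pixels while
# the mouse really lives in physical pixels. Numbers are purely for illustration.

SCALE = 1.5                          # Windows display scaling set to 150%
physical_w, physical_h = 3840, 2160  # the monitor's native resolution

# What a non-DPI-aware app "sees" as its screen size:
logical_w, logical_h = physical_w / SCALE, physical_h / SCALE
print(f"app thinks the screen is {logical_w:.0f} x {logical_h:.0f}")   # 2560 x 1440

# A point the app believes is at (1000, 500) is really drawn at:
x, y = 1000, 500
print(f"logical ({x}, {y}) -> physical ({round(x * SCALE)}, {round(y * SCALE)})")
# logical (1000, 500) -> physical (1500, 750) - hence the offset cursor
```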
 
  BMW M4; S1000 RR
Just booted up Witcher 3 and been playing with the graphics settings. Got it on 1080 - you instantly notice the degradation in detail; sitting a bit further back it still looks good overall, you just notice that it's not as sharp as it could be.

I think this is genuinely how 1080 would look at 28".

I've just turned it up to 1440 (leaving everything at ultra, including post-processing). Everything immediately looks sharper - a big improvement - and surprisingly things are still running perfectly smoothly.

When I minimise the game to type this, I think it disables the Gsync limit, as GPU usage jumps to 100% and the fans spin up - which tells me that in game it's being limited to 60fps.

Just switched up to 2160. That is, quite honestly, like the jump from 1080 to 1440 again. The difference is massive - the detail on the character is the most obvious thing, but everything is razor sharp. Grass and roof tops all look great. The game is definitely not running well at ultra with post-processing on high (the highest setting, if you've not played W3).

Just turned post-processing down to low (graphics preset is still on ultra) and the game is running pretty damn well. More than playable, although as the fans/GPU usage aren't changing in and out of game, I'd guess it's not hitting the 60fps limit.
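For a sense of why each jump hits the card so hard, here are the rough pixel counts (ballpark only - performance doesn't scale perfectly with pixel count):

```python
# Pixels per frame at each resolution, relative to 1080p.
resolutions = {
    "1920x1080": (1920, 1080),
    "2560x1440": (2560, 1440),
    "3840x2160": (3840, 2160),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} MP, {px / base:.2f}x the pixels of 1080p")
# 1920x1080: 2.1 MP, 1.00x
# 2560x1440: 3.7 MP, 1.78x
# 3840x2160: 8.3 MP, 4.00x
```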
 
  Yaris Hybrid
I find the stock fan profile on my reference 980ti's is weird.

The fans literally stay at idle speed until it's at 100% load/max temp or something. On Witcher 3 I play it all maxed out (but Hairworks off) and the cards are at 70% load or so, but at 83°C and totally silent.

So it basically lets the temps rise to the maximum 83 and then as you call on more performance after that point it finally starts waking up the fans.

I loaded up Afterburner and just enabled the default user fan profile, and immediately you hear the fans spin up and the temps drop down into the 60s.

I'm a silence freak though, so I leave it on the stock profile, as I guess it's designed to run at 83°C 24/7.
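Roughly what I mean by the two behaviours: the stock profile is effectively a temperature target (sit at ~83°C and only ramp the fans when forced to), whereas a user curve maps temperature straight to fan speed. The curve points below are numbers I've invented for the example, not the actual Afterburner defaults or the reference-cooler firmware:

```python
# A hand-rolled temp -> fan% curve, the sort of thing a custom Afterburner
# profile does. Curve points are made up purely for illustration.

def fan_duty(temp_c, curve=((30, 20), (50, 30), (65, 45), (75, 65), (83, 100))):
    """Linearly interpolate fan duty (%) for a given GPU temperature (C)."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]          # past the last point, pin the fan at max

for temp in (60, 70, 83):
    print(f"{temp}C -> {fan_duty(temp):.0f}% fan")
# 60C -> 40% fan, 70C -> 55% fan, 83C -> 100% fan
```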
 

