For us long-time PC gamers, those of us who grew up in an era when games were advancing so fast that yearly upgrades were a given, getting the most bang for your buck was often our primary concern. Often the key components, like the CPU, RAM and GPU, would get upgraded first, with other components falling by the wayside. Over the past few years, however, advances in some pieces of technology, like SSDs, provided such a huge benefit that they became the upgrade everyone wanted. Now I believe I've found the next upgrade everyone should get, and it comes to us via NVIDIA's new monitor technology: G-Sync.
For the uninitiated, G-Sync is a monitor technology from NVIDIA that allows the graphics card (which must be an NVIDIA card) to directly control the refresh rate of your monitor. This allows the graphics card to write each frame to the monitor as soon as it's available, dynamically altering the refresh rate to match the frame rate. G-Sync essentially gives you the benefits of having v-sync turned off and on at the same time: there's no frame tearing and no stutter or slowdown. As someone who can't stand either of those graphical artefacts, G-Sync sounded like the perfect technology for me, and now that I'm the proud owner of a GTX970 and two AOC G2460PGs I think that position is justified.
After getting the drivers installed and upping the refresh rate to 144Hz (more on that in a sec), the NVIDIA control panel informed me that I had G-Sync capable monitors and, strangely, told me to go enable it, even though when I went there it was already done. After that I dove into some old favourites to see how the monitors and new rig handled them and, honestly, it was like I was playing on a different kind of computer. Every game I threw at it that typically had horrendous tearing or stuttering ran like a dream, without a hint of those graphical issues in any frame. It was definitely worth waiting as long as I did so that I could get a native G-Sync capable monitor.
One thing G-Sync does highlight, however, is slowdown caused by other factors, like a game engine trying to load files or performing some background task that impedes the rendering engine. These things, which would previously have gone unnoticed, are impossible to ignore now that everything else runs so smoothly. Thankfully issues like that are few and far between; I've only noticed them shortly after loading into a level. Still, it's interesting to see them bubbling up now, signalling that the next must-have upgrade might be drive related once again.
I will admit that some of these benefits come from the hugely increased refresh rate of my new monitors, jumping me from a paltry 60Hz all the way up to 144Hz. The difference is quite stark when you turn it on in Windows and, should you have the grunt to power it, astounding in games. After spending so long with content running in the 30-60Hz range I had forgotten just how smooth higher frame rates are, and whilst I don't know if there's much benefit in going beyond 144Hz, that initial bump is most certainly worth it. Not a lot of other content (like videos, etc.) takes advantage of the higher frame rates however, something I didn't think would bother me until I started noticing it.
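To put some numbers on that jump, here's a quick back-of-the-envelope sketch of what a refresh rate means per frame (plain frame-interval arithmetic of my own, nothing G-Sync specific):

```python
def frame_interval_ms(refresh_hz: float) -> float:
    """Time between refreshes, in milliseconds, at a given refresh rate."""
    return 1000.0 / refresh_hz

# 60 Hz gives you a new frame every ~16.7 ms; 144 Hz more than halves that.
for hz in (30, 60, 120, 144):
    print(f"{hz:>3} Hz -> {frame_interval_ms(hz):.2f} ms per frame")
```

At 144Hz each frame is on screen for roughly 6.9ms instead of 16.7ms, which is most of why the desktop alone feels so much smoother.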
Suffice it to say I'm enamoured with G-Sync and consider the premium I paid for these TN panel monitors well worth it. I'm willing to admit that high frame rates and G-Sync aren't for everyone, especially if you're lusting after the better colour reproduction and high resolutions of IPS panels, but for someone like me who can't help but notice tearing and stuttering it's a dream come true. If you have the opportunity to see one in action I highly recommend it, as it's hard to describe just how much better it is until you see it for yourself.
If there's one thing I can't stand in any game it's visual tearing and stuttering. This is the main reason why I play all my games with v-sync on: whilst I, like any gamer, enjoy the higher frame rates that come with turning it off, it's not long before I'm turning it back on again after the tearing wreaks havoc on my visual experience. Unfortunately this has the downside of requiring me to over-spec my machine to ensure 60 FPS at all times (something I do anyway, but it doesn't last forever) or to lower the visual quality of the game, something which no one wants. It's been an issue for so long that I had given up on a fix, although there was some hope with a 120Hz monitor. As it turns out there is hope, and its name is G-SYNC.
The technology comes by way of NVIDIA and it's a revolutionary way of having the GPU and your monitor work in tandem to remove tearing and stuttering. Traditionally, when you're running with v-sync on as I do, your graphics card has to wait for the monitor's refresh interval every time it wants to write a frame to it. In a game with a highly variable frame rate (which is anything that's graphically intensive) this leads to stuttering, where repeated frames give the appearance of the game freezing up. Flipping v-sync off leads to the other problem: the GPU can write frames to the monitor whenever it wants. This means a new frame can start being written halfway through a scan cycle which, if there's even a skerrick of motion, leads to the frames being out of alignment, causing visual tears. G-SYNC allows the GPU to dictate when the monitor should refresh, eliminating both these issues as every frame is synced perfectly.
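The stutter half of that trade-off can be illustrated with a toy timing model (my own simplification, not anything from NVIDIA; real G-SYNC panels also have a minimum refresh rate this ignores): with v-sync on a fixed 60Hz display, a frame that misses a refresh tick is held until the next one, so its on-screen time rounds up to a whole multiple of the refresh interval, whereas a variable-refresh display just shows it for as long as it took to render.

```python
import math

REFRESH_MS = 1000.0 / 60.0  # fixed 60 Hz display: one scanout every ~16.7 ms

def displayed_time_vsync(render_ms: float) -> float:
    """With v-sync on, a frame is held until the next refresh tick, so its
    display time rounds UP to a multiple of the refresh interval."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def displayed_time_adaptive(render_ms: float) -> float:
    """With variable refresh, the display refreshes when the frame is ready
    (ignoring the panel's minimum refresh rate for simplicity)."""
    return render_ms

# A frame that takes 17 ms just misses the ~16.7 ms window with v-sync on,
# so it's held for two full intervals (~33.3 ms) -- the judder you feel.
print(displayed_time_vsync(17.0))
print(displayed_time_adaptive(17.0))
```

That one-millisecond miss effectively halving the displayed frame rate is exactly why a "mostly 60 FPS" game still feels like it's freezing up.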
For me this is basically monitor nirvana as it gives me the advantages of running v-sync without any of the drawbacks. Better still, all the monitors that support G-SYNC also run up to 144Hz, something which was going to be a requirement for my next monitor purchase. The only drawback I can see currently is that all these high refresh rate monitors are TN panels, which don't look as great as the shiny new IPS panels that have been flooding the market recently. Honestly though, I'm more than willing to trade away the massive resolutions and better colour reproduction to solve the visual gripe that's plagued me for the better part of 20 years.
Unfortunately your options for getting a G-SYNC capable monitor right now are fairly limited. Whilst there are a good number of monitors that were recently announced as supporting G-SYNC, none of them have become commercially available yet, with all of them scheduled for release in Q2 2014. You can, if you're so inclined, purchase an ASUS VG248QE, hit up NVIDIA directly for a G-SYNC upgrade kit (currently out of stock) and upgrade the monitor yourself, but doing so will require you to crack it open. There are places that will do this for you, though they too are out of stock. Still, for something like this I'm more than willing to wait and, hopefully, by then other components of my new computer build will have come down in price a touch, enough to justify the extra expenditure on these newfangled monitors.
3D is one of those technologies that I'm both endlessly infatuated with and frustrated by. Just over a year ago I saw Avatar in 3D and, for me, it was the first movie ever to use the technology in a way that wasn't gimmicky but instead served as a tool for creative expression. Cameron's work on getting the technology to that point is to be commended, but what unfortunately followed was a long stream of movies jumping on the 3D bandwagon, hoping it would be their ticket to Avatar-like success. Since then I've only bothered to see one other movie in 3D (Tron: Legacy), as no other film used 3D as anything more than a fad, utterly failing to understand the art of it.
Last year saw the debut of consumer-level 3D devices, with the initial forays being the usual TVs and 3D enabled media players. Soon afterwards we began to see the introduction of some 3D capable cameras, allowing the home user to create their very own 3D movies. Industry support for the format was well ahead of the curve, with media sharing sites like YouTube allowing users to view 3D clips and video editing software supporting the format long before it hit the consumer level. We even had Nintendo announce that their next generation portable would be called the 3DS and boast a glasses-free 3D screen up top. Truly 3D had hit the mainstream, as anyone and everyone jumped to get in on the latest technology craze.
Indeed the 3D trend has become so pervasive that even today, as I strolled through some of my RSS reader backlog, I came across not one but two articles relating to upcoming 3D products. The first is set to be the world's first 3D smartphone, the LG Optimus 3D. It boasts both a 3D capable camera and a glasses-free 3D screen, along with the usual specs we've come to expect from high end Android devices. The second was that NVIDIA's current roadmap shows they're planning to build 3D technology into part of their Tegra line (for tablets). Looking over all these products I can't help but feel there's really little point to having 3D on consumer devices, especially portable ones like smartphones.
3D in cinemas makes quite a lot of sense: it's another tool in the director's kit for self-expression when crafting a movie experience. On a handset or tablet you're not really there to be immersed in something; you're usually consuming small bits of information for short periods. Adding 3D really doesn't enhance that at all. In fact, I'd dare say it detracts from it, thanks to the depth of field placing objects in a virtual space that, in reality, is behind the hand holding the device. There is the possibility that 3D will enable a new kind of user interface, one far more intuitive to the regular user than what's currently available, but I fail to see how adding depth of field to a handheld device will accomplish that.
I could just be romanticising 3D technology as something best left to the creative types, but if the current fad is anything to go by, 3D is unfortunately more often misused as a cheap play to bilk consumers for a "better" experience. Sure, some of the technology improvements of the recent past can trace their roots back to 3D (hello, cheap 120Hz LCD screens), but for the most part 3D is just an excuse to charge more for the same experience. I've yet to see any convincing figures on how 3D products are doing in the market, but anecdotally it's failed to gain traction amongst those I know. Who knows, maybe the LG Optimus 3D will turn out to be something really groovy, but as far as I can tell it's simply yet another gimmick phone attempting to cash in on the current industry obsession with 3D, just like every other 3D consumer product out there.