The future of gaming?
NVIDIA’s new GPUs are here and promise to redefine the way you game. That’s the company’s elevator pitch for this year’s lineup, and beyond the usual jump in number-crunching capability, NVIDIA is promising features that change how games actually look.
The biggest promise here is real-time ray tracing. There are plenty of technical explanations on the web about what ray tracing is, but the long and short of it is that it’s a way to create realistic reflections, shadows and convincing cinematic lighting in your games. Ray tracing isn’t a new technology; the film industry has used it for years to produce convincing CGI effects. The difference is that its use was limited to pre-rendered scenes that required gigantic server farms, not real-time applications.
That’s changed with the arrival of NVIDIA’s new Turing architecture and RTX cards. The RTX 2080 Founder’s Edition that we’re reviewing today is the middle child of a trio of newly-announced RTX cards: the RTX 2070, RTX 2080 and RTX 2080 Ti.
What is it?
The RTX 2080 FE sits below NVIDIA’s more expensive, more powerful RTX 2080 Ti, which is the main contender for the title of fastest consumer video card in existence.
It looks quite different from FE cards of the past.
It sure does. NVIDIA has chucked their old Founder’s Edition aesthetic with the new RTX cards, and for the first time has included dual fans as well as a full-length vapor chamber that allows the card to run quieter and cooler than ever before.
Here’s the rub to all that nice cooling and those hardware upgrades, though: NVIDIA recommends that users run at least a 650-watt power supply to feed this monster, and the RTX 2080 itself draws around 225 watts on its own. Compare that to a regular GTX 1080 with its 500-watt recommended power supply and 180-watt power draw, and you can see that the RTX 2080’s power requirements are quite substantial.
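To put those numbers in perspective, here’s a rough back-of-the-envelope headroom check. The 225-watt GPU figure is NVIDIA’s rated board power; the CPU and “everything else” draws below are our own illustrative estimates, not measured values.

```python
def psu_headroom(psu_watts, component_draws):
    """Return the watts left over after summing estimated component draws."""
    return psu_watts - sum(component_draws.values())

# Assumed figures: only the GPU number comes from NVIDIA's spec sheet.
system = {
    "RTX 2080": 225,                  # NVIDIA's rated board power
    "Core i7-8700": 65,               # Intel's rated TDP; real draw can spike higher
    "motherboard/RAM/SSD/fans": 75,   # rough estimate for the rest of the build
}

print(psu_headroom(650, system))  # watts left over for load spikes and overclocks
```

Even with generous estimates, a 650-watt unit leaves a comfortable buffer, which is presumably why NVIDIA recommends that figure rather than something closer to the raw total.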
Is it bigger than current generation cards?
Not really. The RTX 2080 is about the same size as the GTX 1080 in the ASUS ROG GL12 test machine that we used for this review. The RTX 2080 draws power through one 8-pin and one 6-pin connector, and has three DisplayPort 1.4a ports, a single HDMI port, and a VirtualLink USB-C connector for VR headsets.
Alright, enough teasing, how does it fare in gaming?
Pretty good. We had the chance to test the RTX 2080 in three games at full HD, 1440P and 4K, thanks to ASUS’ excellent PG27UQ 27-inch monitor.
ASUS provided our test platform for this review in the form of the ROG GL12 that we reviewed a few weeks back. If there ever was a test bench that’s bottleneck-free, it’s the GL12. With an Intel Core i7-8700 processor at 4.3GHz (12M cache, up to 4.8GHz), 32GB of RAM and a speedy 512GB M.2 SATA SSD, you couldn’t really ask for a better testbed to put NVIDIA’s new graphics card through.
All the tests were done with settings cranked on ultra where applicable. Three games were selected for the tests: For Honor, Far Cry 5 and Middle Earth: Shadow of Mordor.
Enough talk, let’s get to the numbers:
Starting with full HD, the RTX 2080 is indeed faster than the stock GTX 1080 in the GL12, which is to be expected. You’re looking at a jump of around 40-50 FPS on average from the GTX 1080 to the RTX 2080, which isn’t too shabby for an upgrade.
Things get interesting when you start looking at 4K performance. The GTX 1080 struggles here, not even breaching a 50 FPS average in the three games we tested. While the RTX 2080 isn’t exactly a 60 FPS 4K card, it came pretty close, and we’re sure most people will be happy to turn down a graphical option or two to keep their games in the 60 FPS sweet spot.
For people looking to game at a higher resolution than full HD without running into the performance issues of 4K, 1440P is a good tradeoff. Here the RTX 2080 performs admirably, hitting around 90 to 100 FPS without any issues at all. This is where the RTX 2080 shines.
How well does that Ray Tracing magic work?
Uh, we wouldn’t know. Unfortunately for early adopters, NVIDIA’s headline feature for the RTX 2080 isn’t available for consumers yet. Yes, there are tech demos of it in action, but as of this writing, there are no RTX-enabled titles for consumers to buy yet, which is a disappointment considering that’s the entire point of the card’s steep price tag in the first place.
Even NVIDIA’s AI-powered anti-aliasing, which uses the company’s Deep Learning Super-Sampling (DLSS) tech that uses supercomputer farms to scan games before they are released and work out the most efficient way to render their graphics, isn’t available yet, since it’s supposed to be included in the Windows 10 October 2018 update that got pulled unceremoniously. So if you buy either the RTX 2080 or 2080 Ti right now, you’re not getting any of the headline features that NVIDIA showed off during the cards’ unveiling.
NVIDIA is promising that 25 games will support DLSS, and at least 11 will have ray tracing in the coming months as they’re released. Knowing NVIDIA, that’s a promise the company will be able to keep, though it’s something to bear in mind before you pick up this card or the more expensive RTX 2080 Ti.
Should you buy it?
Right now that’s a tough question to answer. While the RTX 2080 does manage to pull better numbers than the GTX 1080 it’s replacing, the cost of acquiring the card is hard to swallow. MSRP on Founder’s Edition cards is USD 699, with cards made by NVIDIA’s hardware partners like ASUS starting at around Php 56K/$1034. That’s an incredibly large amount of money compared to a factory-overclocked GTX 1080 Ti, which starts at around Php 45-50K/$831-$923.
Still, the promise of AI-powered AA plus ray tracing is intriguing, though at this point that’s exactly what it is: a promise. To be fair, no other company is as well positioned as NVIDIA to have its tech adopted by developers and publishers, though there’s still that off chance that the promise may end up unfulfilled.