How NVIDIA killed the GTX 1080


If you’ve seen our review of the GTX 1070 Ti, then you’ve heard us say that it offers similar performance to the GTX 1080. Now, let us show you what we mean by that. Here’s how NVIDIA killed the GTX 1080.

For this, we’re going to compare the performance of the GTX 1080 to that of the GTX 1070 Ti. The test setup we used for both cards is the same one mentioned in our review of the 1070 Ti. For reference, here it is.

CPU: Intel Core i7-6700K
Motherboard: Gigabyte Z170M-D3H
RAM: 2x8GB Corsair Vengeance LPX DDR4 @ 2666MHz
Storage: 250GB Samsung 850 Evo + 1TB WD Blue
PSU: Corsair RM650i
Case: Phanteks Eclipse P400 Tempered Glass Edition

We’re also throwing in the GTX 1070 for some benchmarks since we have access to its results. We’re using the numbers from our review of the Lenovo Y720 Cube Gaming PC, which used a Founders Edition GTX 1070 with a CPU and RAM configuration close enough to our test setup that performance wouldn’t be significantly bottlenecked. It packs an Intel Core i7-7700HQ and 16GB of DDR4 RAM.

Now that our setups have been disclosed, let’s begin with the synthetic benchmarks.

Note: Synthetic benchmarks are run at the highest settings at 1080p. The graphics cards were not overclocked.

It’s so close. The scores of the GTX 1070 Ti sit almost exactly in between those of the GTX 1070 and 1080, and the gap between those two cards was already fairly narrow, both in placement within the product stack and in pricing. Judging from these numbers, getting the GTX 1070 Ti to match the 1080 is only a matter of basic overclocking. Let’s move on to everyone’s favorite: gaming benchmarks.

At 1080p, the performance jump is fairly significant. Especially if you’re aiming for 144Hz or higher refresh rates, the 15-20fps jump is worth it. That isn’t to say that the 1070 Ti can’t achieve those numbers by bumping up its clock speeds. Also, consider that we ran these games at the highest possible graphics settings, so settling for High instead of Ultra will still give you a nice-looking game while significantly increasing your frame rates.

At 1440p and 4K, however, you start to see diminishing returns. The performance increase is now generally less than 15fps at 1440p and less than 10fps at 4K. We know that this will vary from game to game, and you can actually see that in the results. Still, there is definitely a trend here.

This is actually great for us, the consumers. We are now in a better position than ever when it comes to high-tier graphics cards. A Founders Edition GTX 1070 Ti is priced at $449, which is $100 cheaper than a Founders Edition GTX 1080. The price difference for custom cards should be similar. Is the slight performance bump worth that extra $100? We think not. The 1070 Ti arguably has the better value.
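To make that value argument concrete, here’s a minimal cost-per-frame sketch. The prices are the Founders Edition MSRPs quoted above, but the frame rates are hypothetical placeholders for illustration, not our measured results:

```python
# Cost-per-frame comparison sketch.
# Prices are the Founders Edition MSRPs quoted in the article; the
# average frame rates below are ASSUMED placeholder values, not our
# actual benchmark results.
cards = {
    "GTX 1070 Ti": {"price_usd": 449, "avg_fps_1440p": 75},  # assumed fps
    "GTX 1080":    {"price_usd": 549, "avg_fps_1440p": 85},  # assumed fps
}

for name, spec in cards.items():
    dollars_per_frame = spec["price_usd"] / spec["avg_fps_1440p"]
    print(f"{name}: ${dollars_per_frame:.2f} per average frame at 1440p")
```

With numbers in this ballpark, the 1070 Ti comes out a little cheaper per frame, which is the sense in which we call it the better value; plug in your own benchmark results for the games you play to check.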

From a macro perspective, we can’t help but ask: why was there a need to compete at this price point? Was it worth cannibalizing the GTX 1080?

The biggest reason on people’s minds is that this price point was previously occupied only by AMD’s Vega 56. Matching it is also the most logical business move for NVIDIA, as AMD is its biggest competitor in the GPU space. You could, however, argue that NVIDIA isn’t at a loss at all, since the GTX 1080 has already been out for a year and a half and has already sold well.

The “Ti” in NVIDIA’s graphics cards is supposedly short for “Titanium,” which is meant to signify improved performance and power efficiency. We don’t feel that as much with the GTX 1070 Ti. It’s less of a GTX 1070 Ti and more of a GTX 1080 Lite. Is the “Ti” losing its meaning? Has it become just a designation for filler cards?

ASUS ROG Strix GeForce GTX 1070 Ti

We sincerely hope not. With all that said, there’s nothing more we can do. Board partners like ASUS, MSI, Gigabyte, and Zotac already have their custom cards out. If you’re in the market for a new graphics card, now you know that the 1070 Ti arguably offers better value than the 1080.

(yugatech.com, https://goo.gl/4j3iNp)

