Ray tracing is all well and good, but how many frames per buck do you get from the new Nvidia RTX 2080 Ti and RTX 2080 cards?
Following their launch at Gamescom this year, I've slowly been putting what most people would consider the three flagship gaming cards through their paces: the RTX 2080 Ti and RTX 2080 Founders Editions, which cost $1899 and $1299. I've run them against the GTX 1080 Ti, the previous flagship, which you can pick up locally for somewhere between $1150 and $1250 depending on the brand.
Part of Nvidia's pitch: games can get better over time
A few hundred journalists, YouTubers and other tech media sat through about three hours of dense presentation. It was the centrepiece of Nvidia's Editors' Day, which was essentially the day Nvidia executives broke down the architecture of their upcoming graphics cards in exhaustive detail.
However, when the Nvidia cards first launched, two pieces were missing: games that supported ray tracing, and the Windows update needed to enable ray tracing in DirectX 12. That Windows update turned out to be a horror show for Microsoft, with the company forced to pull the update not once but twice due to system stability issues.
So with ray tracing taken out of the equation – which is what I'll do for the purposes of this article – how do the RTX cards stack up against modern games? Nvidia has released several preliminary charts and screenshots showing that the RTX 2080 can handle games like HITMAN and Final Fantasy XV at (or above) 60fps at 4K with HDR.
Some benchmarks (sort of) for the Nvidia RTX 2080
When Nvidia introduced its RTX 20 series cards ahead of their Gamescom launch, one element was missing: benchmarks. Specifically, video game benchmarks, the reliable yardstick for people weighing up the value of a new GPU. Following a closed-door session with the press, the GPU maker produced some further figures on how its cards behave in the real world.
But in the real world, with real-world drivers, released games and publicly available synthetic tests, how far will your $1300 or $1900 go?
Before we get to the tests, here's the system used. It's what most people would consider a good gaming platform, but it's not the best. The weak point in particular is the 7900X processor, Intel's 10-core offering. It plays games well, but it doesn't post results as strong as six- or eight-core parts that can run at higher clock speeds (for example, the recently released i9-9900K, or the popular i7-8700K with fewer physical cores but higher turbo clocks than other chips). Keep this in mind when reading the results below.
- CPU: Intel i9-7900X (stock speeds)
- RAM: 32GB DDR4-3200 G.Skill TridentZ RGB RAM
- GPU: GTX 1080 Ti Founders Edition / RTX 2080 / RTX 2080 Ti Founders Edition
- Motherboard: Gigabyte AORUS Gaming 7
- Monitors: Acer X27 4K HDR 144Hz / EIZO 23.5" 240Hz
- PSU: EVGA Supernova G2 850W
- GPU Drivers: 416.16 (October 4, 2018)
Many thanks to Nvidia for supplying the Acer X27 Predator screen for this testing.
For clarity: the 7900X runs at stock speeds on a Corsair H100i cooler, while the RAM runs at 14-14-14-34 timings at 1.35V (confirmed with CPU-Z). G-Sync was disabled for all tests, and the GPU was set to maximum performance in the Nvidia control panel.
The tests and games used were:
- 3DMark (Fire Strike, Fire Strike Extreme)
- Forza Horizon 4 (DX12)
- Total War: Warhammer 2 (DX11)
- Shadow of the Tomb Raider (DX12)
- Middle-earth: Shadow of War
At the time of writing, the Final Fantasy XV DLSS benchmark was available privately, but not publicly. It has since been released publicly, but DLSS support has not (and now looks like it never will be) added to the full game. I'll also run some 4K-specific tests with RTX-enabled games and other recent AAA titles such as Battlefield 5 on newer drivers later.
As for the games selected, I settled on this mix because of the range of engines involved. There's a spread of DX11 and DX12 – Warhammer 2 and Shadow of the Tomb Raider support both – and each game is built on a different engine. Almost all the games in this test are also relatively well optimized, with the exception of Total War: Warhammer 2. Creative Assembly's Warhammer RTS is more CPU-bound, but it's also the kind of game that attracts players who spend more on their PCs than most, so I've kept it in the rotation.
Due to the time of year and the general daily workload, I wasn't able to include any Ubisoft or Unreal Engine titles, and Battlefield 5 wasn't available at the time I ran these tests. I'll try to cover some of those games in follow-up pieces soon, though.
All games were tested at 1080p, 1440p and 4K using the three highest presets available in each game. 3DMark doesn't have presets as such, but since its different tests run at different rendering resolutions, you get the same effect.
All tests were also run multiple times, with the results averaged. Some tests are more consistent than others – Shadow of War tends to return similar results whether you run it 17 times or 70 – but this was done to avoid outliers skewing the data. I also disabled automatic updates wherever possible (which is easy to do for games running through Steam) to avoid discrepancies from future performance patches.
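As an aside, the run-averaging described above is simple to reason about. Here's a minimal sketch of the idea in Python – the frame times, the 10% outlier threshold and the function name are my own illustrative choices, not the actual tooling used for these tests:

```python
from statistics import mean, median

def average_fps(runs):
    """Average the mean FPS of several benchmark runs.

    Each run is a list of frame times in milliseconds; a run's FPS is
    1000 / (mean frame time). Runs that deviate more than 10% from the
    median run are discarded, so one anomalous run can't skew the result.
    """
    fps_per_run = [1000.0 / mean(frame_times) for frame_times in runs]
    mid = median(fps_per_run)
    kept = [fps for fps in fps_per_run if abs(fps - mid) <= 0.1 * mid]
    return mean(kept)

# Three runs of ~16.7ms frames (~60fps) and one anomalous stuttering run.
runs = [
    [16.7] * 100,
    [16.7] * 100,
    [16.6] * 100,
    [33.3] * 100,  # outlier: e.g. a background update kicked in mid-run
]
print(round(average_fps(runs)))  # → 60 (the outlier run is discarded)
```

This is why disabling auto-updates matters: a patch downloading in the background produces exactly the kind of outlier run that has to be thrown away.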
This proved particularly helpful for Shadow of the Tomb Raider: a later update actually caused stability problems for Nvidia owners, with the game complaining about memory errors – something I only discovered after I let the game patch itself post-testing. Fortunately, Square Enix lets people roll the game back to older versions via Steam's beta settings, a move other developers should consider supporting.
One important factor, and one I'll explain after the results: these tests were done without HDR. I'll get to that at the end. Dynamic resolution was also manually disabled in games where possible, to ensure consistency.
Let's start with the synthetic numbers. Clicking on the graphs below, or pressing the graphs button, will expand them for readability. All values are average frames per second.
3DMark Fire Strike
3DMark is the standard test du jour for gaming rigs. It runs through a series of tests that stress different parts of the system and the GPU, capped off by a combined combat test of the kind you've probably seen looping on display PCs.
The RTX 2080 Ti is king of the pack here, and remains so for the rest of the tests. The advantage the GTX 1080 Ti has over the RTX 2080 – which will be largely neck and neck for the remaining results you're about to see – is an extra 3GB of VRAM, a bit more memory bandwidth (484GB/s versus the RTX 2080's 448GB/s) and a wider memory bus.
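Those bandwidth figures aren't arbitrary: peak memory bandwidth is just the per-pin data rate multiplied by the bus width, converted to bytes. A quick sketch of the arithmetic, using the publicly listed spec-sheet data rates and bus widths for each card:

```python
def bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    """Peak memory bandwidth: per-pin data rate times bus width, in bytes."""
    return data_rate_gbps * bus_width_bits / 8

# GTX 1080 Ti: 11Gbps GDDR5X on a 352-bit bus
print(bandwidth_gb_s(11, 352))  # 484.0 GB/s
# RTX 2080: faster 14Gbps GDDR6, but on a narrower 256-bit bus
print(bandwidth_gb_s(14, 256))  # 448.0 GB/s
# RTX 2080 Ti: 14Gbps GDDR6 on the full 352-bit bus
print(bandwidth_gb_s(14, 352))  # 616.0 GB/s
```

In other words, the RTX 2080's faster GDDR6 doesn't fully make up for its narrower bus, which is why the older 1080 Ti still edges it on raw bandwidth.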
The RTX 2080 is the newer card, and in most cases it edges out the GTX 1080 Ti, with bonus hardware for future-proofing on top. The Fire Strike tests, however, are one area where it falls just behind. But it's very marginal and within the margin of error. It's also common to see incremental leaps in performance with future drivers, so keep that in mind as we move forward.
Shadow of the Tomb Raider
Lara's latest adventure was slated to be one of the first ray-traced games. Shadow hadn't been updated with ray-traced shadows at the time of testing, but its in-game benchmark is much better than the one that shipped with Rise of the Tomb Raider, offering a more representative recreation of in-game performance as Lara traverses a town centre and the jungle.
When Nvidia declares the "4K 60fps" dream realised, this is generally the kind of result it's talking about. The frame rate did drop below the 60fps line at the Ultra preset, but I'll remind everyone again: the 7900X isn't the best gaming CPU on the block. If these tests were run with an i7-8700K, one of the newer i9 processors, or the all-rounder Ryzen 7 2700X, the 2080 Ti would have more headroom at Ultra settings.
As for the RTX 2080 and GTX 1080 Ti, honestly, I'd just stay at 1440p. Having extra overhead matters for the most intense scenes, something you need to take into account when weighing a benchmark's averages against the general flow of the game. A fixed 60fps while the sun crests the horizon is nice. A fixed 60fps in the middle of battle is much, much better.
Middle-earth: Shadow of War
Monolith's orc-slaying/dominating simulator can be quite the looker when all the textures are cranked to their highest. It's also a fun game in its own right, especially now that the loot boxes have been removed and some massive expansions have been released.
Shadow of War has a built-in benchmark that pans across an unbroken scene, passing through some vegetation before descending into a bloody medieval castle siege and closing in on an orc boss in full armour. The game also supports HDR, provided you've enabled the required setting in your Windows display settings.
It's a well-optimized game, and all three cards have no problem powering ahead at 4K. The RTX 2080 Ti is the top performer by a distance, though the game still looks great at High settings, and the silky-smooth 144fps target (relevant for those with high refresh rate gaming monitors) is well within reach for all the flagships here.
That chain mail looks real nice at 4K. I'm looking forward to coming back to this one later this year, when I have some time.
Forza Horizon 4
It hasn't gotten as much praise as it deserves, but holy hell is Forza Horizon 4 well optimized. It's almost at Destiny 2 levels of performance for how well it runs across these three cards, and I'd expect similarly great results for users with RTX 2070, GTX 1070 and AMD cards too (given how much AMD hardware optimization experience Turn 10 has from the Xbox One X).
Even better: Forza Horizon 4 has one of the best in-game benchmarks, replicating a short race against the AI that isn't far removed from actual gameplay. And it's a great demonstration of how well all three Nvidia cards perform: all three are able to stay comfortably above 60fps at 4K on any preset.
The gap between the GTX 1080 Ti and RTX 2080 narrows as Forza Horizon 4 uses more VRAM, which is to be expected as the resolution starts to climb. It's also a good reminder of the frame rate cost separating the highest preset from the second- or third-best option.
The game running at 4K on High looks better than 1440p on the Ultra preset – you get sharper textures, the anti-aliasing algorithms don't have to work as hard, and clarity is nicer because you're running at native screen resolution, assuming you're playing on a 4K screen.
And even then, I'd recommend experimenting with resolution scaling if your frame rates allow it.
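For a sense of why that resolution jump costs so much, it's worth remembering the raw pixel counts involved. A quick back-of-the-envelope calculation:

```python
def pixel_count(width, height):
    """Total pixels the GPU has to shade each frame."""
    return width * height

uhd = pixel_count(3840, 2160)  # "4K" UHD
qhd = pixel_count(2560, 1440)  # 1440p
print(uhd / qhd)  # → 2.25: 4K pushes 2.25x the pixels of 1440p
```

That 2.25x factor is roughly why a card that sits comfortably above 60fps at 1440p can struggle to hold the same line at 4K.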
Total War: Warhammer 2
An old favourite, Total War: Warhammer 2 is a CPU-heavy game that throws tons and tons of units onto the battlefield while all sorts of blasts, effects and spells decimate the ground. For these tests, I used the heavier Skaven benchmark rather than the original battle or campaign benchmarks.
Warhammer 2 supports DX11, with "beta" support for DX12, although Nvidia cards typically perform better in DX11 mode, so that's where I left it.
Here you can see a clear bottleneck across the board: it's the CPU, not the GPU, which helps explain why the Ultra setting yields basically no performance difference between 1080p and 1440p on all three cards. Things change a little once Warhammer 2 starts eating more VRAM at 4K, but the generally worse performance here comes down to a level of optimization that isn't as refined as the other titles.
That's to be expected: this is the oldest game in this set, and I'll be keen to see what improvements Creative Assembly makes with the next Total War games, particularly in their implementation of DirectX 12. DX12 offers plenty of benefits for multi-threading that would be a natural fit for Total War games, so we'll have to sit tight until Total War: Warhammer 3 rolls around.
A word on HDR
HDR gaming has been possible on recent GPUs for a while. Support goes back as far as the GeForce 900 series, though only over HDMI, while every AMD card from the R9 380 and RX 460 up supports HDR via both DisplayPort and HDMI. It's a little different if you have a G-Sync monitor: only GTX 1050 or higher cards are supported.
HDR support is becoming increasingly standard among AAA PC games, especially since many of those studios are already building for HDR on consoles. Games like Destiny 2, Battlefield 1, the latest Assassin's Creed games, ARK: Survival Evolved and HITMAN are just some of the titles with HDR support. Of the games tested above, Shadow of War, Forza Horizon 4 and Shadow of the Tomb Raider all support HDR, while Total War: Warhammer 2 does not.
So you might ask: why not test everything in HDR?
The reasons are twofold. First, the vast majority of PC players still don't have a primary or secondary monitor that supports HDR. Priority still tends to go to high refresh rate monitors, or panels with better colour reproduction, rather than HDR. Monitors that support all of these things – such as the Acer X27 Predator that Nvidia supplied for testing – are extremely expensive. The Acer X27, which supports G-Sync, 144Hz, HDR and 4K, will set you back $2800 at the time of writing, or $3500 if you want the ASUS ROG Swift 27" screen.
If you want a 4K screen that does most of that, minus the 144Hz support, you're looking at about $770. But 144Hz is the priority for many PC gamers, and with good reason; having owned high refresh rate monitors since the first models became available in Australia almost a decade ago, I wouldn't argue against owning one.
HDR panels have been a long time coming to PC gaming, largely because manufacturers have concentrated on the other ends of the market: smaller screens for phones and larger panels for televisions. PC monitors are a smaller market with slimmer margins than either of those extremes, and as a result many PC players still go without.
The other hurdle for HDR is Windows. Windows' HDR support hasn't been fantastic over the past 12 months, and while the April update improved the way Windows handles SDR content, it's still far from perfect. SDR content can look washed out when HDR is enabled, and you're dealing with competing HDR implementations: some games support Dolby Vision, others HDR10, and others have sliders to adjust brightness so your eyes don't bleed.
But I did run one short test to illustrate a point: the lack of a performance difference between HDR and non-HDR. The GTX 10 series supports HDR, but it has always come with a mild performance hit. That's still slightly noticeable in the limited testing I ran, but for the most part, if you want to run a game in HDR and can tune the visuals to a comfortable, enjoyable point, performance shouldn't be a problem.
Before we get to the final verdict and break down the value of all these cards, there's one more feature we need to talk about: AI.
Deep Learning Super Sampling (DLSS)
The range of AI-powered technologies in the RTX cards, especially the updates coming to Ansel, is rather cool. But of all of them it's DLSS, Nvidia's neural-network-powered anti-aliasing technique, that will have the most impact on performance right now.
At the time of writing, two synthetic tests were available, and both only work with Nvidia RTX cards. One of these is Epic's Infiltrator demo. You can watch a video from Guru3D on YouTube below to see what we're talking about:
The second is a separate build of the Final Fantasy XV benchmark that supports DLSS. You can run the benchmark yourself, with or without DLSS, through the FFXV site here.
At the time of writing, that's the closest thing we have to a demonstration of DLSS's benefits. But there are some strong arguments for why you shouldn't put much stock in that test.
When it launched, the FFXV benchmark shipped with serious culling and stuttering problems, which were reported by Gamers Nexus earlier in the year. The general nature of these problems was that the benchmark incorrectly rendered objects and models well beyond the player's field of vision, and Square admitted on February 6 that the benchmark was affected by culling and level-of-detail problems that "will be addressed in the shipping game."
… for a wide range of settings.
The benchmark gives you an idea of how the game will look upon release, but for the reasons outlined above, it may not accurately reflect the final game. (2/2)
– Final Fantasy XV (@FFXVEN) February 6, 2018
For the most part, these issues were addressed in the final PC version. They simply weren't fixed in the benchmark, which is what causes all those quirks.
So while the FFXV benchmark shows a remarkable performance improvement with DLSS, it really is a pretty poor benchmark. It also reports a score rather than any standard metrics consistent with other tools, and the problems above make it too unreliable to comfortably use as a real-world performance gauge.
Having seen DLSS in action at Gamescom earlier this year, I still think it promises to be a powerful advantage for RTX owners once it starts shipping in games. I just don't think the FFXV benchmark meets that standard, and with further PC development of FFXV having been cancelled, it seems unlikely DLSS will ever be implemented in the full game. I think FFXV is still worth examining, especially given that Nvidia helped develop the PC version before release, but that's for a future article.
Whichever of the three flagship GPUs you choose, you'll be spending at least $1150. The situation varies internationally, but in Australia, local retailers are pricing the RTX 2080 at approximately the same level as the GTX 1080 Ti, which undercuts some of the value arguments made abroad, where GTX 1080 Ti prices have become more competitive.
More importantly, the RTX 2080 is becoming more affordable. I've even seen instances – though limited – of the RTX 2080 at $1150, though you'll need to shop at Newegg for it.
But for someone buying today – someone genuinely thinking about investing in a card that will last at least three years – I'd look further ahead.
There's a much stronger case for a card that can run 2018's games at the highest settings – with overhead to spare – than for spending over a grand on a card that will mostly get there. When you factor in the natural march of technology, especially as ray tracing becomes more commonplace – Nvidia isn't the only one investing in this space – someone with an 8th-gen Intel gaming rig or a second-generation Ryzen setup is going to get more mileage out of the RTX 2080 Ti, which shouldn't have any problems at 1440p, and even 4K (with some settings dropped), for a few years.
That's the best way to think about these cards. How much do you want to invest for the next few years? It's one thing to spend $600 or $700 on a graphics card now. But ultimately you have to think about how long you'll stand by this purchase, when you're likely to upgrade next, and how to get the best mileage out of the rest of your system.
If money were no object, or I had a reasonable system with a GPU two or more generations old – people still on 900 series GPUs, or the best of AMD's RX 480 or R9 390X – the RTX 2080 Ti offers a major leap in performance that will hold up for years.
If we're talking purely about the value of what you can buy today, the RTX 2080 offers better value for Australians. That's not the case abroad, where the GTX 1080 Ti is more affordable and competitive, but you can only play the cards you're dealt. Besides, it's the better overall proposition: performance on par with, if not slightly ahead of, the 1080 Ti; only a slight drop in memory bandwidth and VRAM; and you get the benefit of the upgraded NVENC encoder (which streamers will love), dedicated RT and tensor cores for ray tracing and AI, and a more power-efficient card.
But that's all predicated on one thing: that you're only considering these three cards. It doesn't account, for instance, for what $500 or $600 buys right now (with a view to picking up a second-generation RTX card in two or three years). Or what impact AMD's 7nm parts will have next year.
And it's that looming presence of AMD that could ultimately strengthen the argument for RTX cards, especially if AMD follows suit by supporting real-time ray tracing in a convincing fashion. Even if its performance doesn't match Nvidia's – and past experience suggests it won't, at least in the early stages – support from both manufacturers would cement the technology in some form, which would help drive developer support down the road.
And then there's the downward pressure on prices to take into account.
So I'll leave it at this. If you're lucky enough to be considering one of these cards, and raw value is less of a concern, then you'll come out fine either way. The RTX 2080 Ti is a fantastic card with plenty of overhead across the full range of games at any setting. If you're loaded, you won't be disappointed – at least not in raw power. Ray tracing is another matter entirely, though the Windows nightmares haven't helped there.
If you're due for an upgrade but unsure whether the GTX 1080 Ti is the better buy, the RTX 2080 is the better choice. It's priced comparably, it's more readily available locally, and you'll get more future-proofing over the next few years as developers get to grips with ray tracing and AI techniques in general. The only qualification I'd make is for people who do a lot of work in Adobe suites or professional rendering – the extra VRAM and CUDA cores of the GTX 1080 Ti could be much more practical, and you don't have to sacrifice much in games. But that's moot if supply remains limited.
If you're the kind of person for whom these cards are aspirational, and you baulked at the price of the GTX 1080 and 1080 Ti when they first dropped: carry on as you were. These are the best cards on the market, but they're hardly affordable.
As a gamer who grew up poor and played on plenty of ageing systems (courtesy of local banks that didn't want them or know what to do with them), I'm always drawn to the best bang for your buck. And that equation is going to change over the next year, as Nvidia faces more competition in the market and prices on AIB models fall below four digits. An RTX 2080 at around $800 or $900 is a very different proposition.
That said, there will always be players with the money today who want to spoil themselves. And for whoever buys an RTX 2080 Ti?
Just make sure you have a nice screen.