Nvidia RTX 2000 Series: A Tera-FLOP?

By Daniel Brown

After much anticipation, speculation and countless leaks, Nvidia has finally and officially revealed its new graphics card line-up: the GeForce RTX 2000 series. The series is a big deal, as the new Turing architecture was designed with more in mind than just the incremental performance increase of previous generations.

The Turing architecture introduces two newly designed processing cores, Ray Tracing (RT) cores and Tensor cores, as well as new, faster GDDR6 memory.

The ray tracing cores are the headline attraction of these GPUs, designed specifically to render scenes in greater detail using real-time ray tracing. The technique traces the paths of individual light rays (hence the name) as they are reflected or absorbed by the objects in a scene, and the lighting behaves accordingly. The result is an incredibly realistic image. However, the computing power required is immense, which is why ray tracing has so far been confined to pre-rendered CGI. Nvidia’s proposition is an interesting one, and it has largely succeeded: the new GPUs easily beat both the competition and their predecessors wherever ray tracing is required or can be enabled.
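Nvidia hasn’t published how the RT cores work internally, but the basic idea behind ray tracing is simple enough to sketch. Below is a minimal, purely illustrative C++ example (all the scene values and function names are my own, not anything from Nvidia): it fires a single ray from a camera, tests it against a sphere, and shades the hit point by how directly its surface faces a light.

```cpp
#include <cmath>
#include <cstdio>

// Illustrative sketch of the core ray-tracing step: fire a ray from the camera,
// test it against a sphere, and shade the hit point by how directly it faces a
// light source. A real renderer traces millions of rays with many bounces each;
// this traces a single primary ray, and every scene value here is made up.
struct Vec3 { double x, y, z; };

Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 scale(Vec3 a, double s) { return {a.x * s, a.y * s, a.z * s}; }
Vec3 normalize(Vec3 a) { return scale(a, 1.0 / std::sqrt(dot(a, a))); }

// Solve |origin + t*dir - centre|^2 = r^2 for the nearest positive t (quadratic formula).
bool hitSphere(Vec3 origin, Vec3 dir, Vec3 centre, double r, double& t) {
    Vec3 oc = sub(origin, centre);
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - r * r;
    double disc = b * b - 4.0 * c;       // dir is unit length, so a = 1
    if (disc < 0.0) return false;        // ray misses the sphere entirely
    t = (-b - std::sqrt(disc)) / 2.0;
    return t > 0.0;
}

int main() {
    Vec3 camera{0, 0, 0};
    Vec3 ray = normalize({0, 0, -1});    // one primary ray, straight ahead
    Vec3 sphere{0, 0, -5};
    Vec3 light = normalize({1, 1, 1});   // direction towards the light
    double t;
    if (hitSphere(camera, ray, sphere, 1.0, t)) {
        Vec3 hit = {camera.x + ray.x * t, camera.y + ray.y * t, camera.z + ray.z * t};
        Vec3 normal = normalize(sub(hit, sphere));
        double brightness = std::fmax(0.0, dot(normal, light));  // simple diffuse shading
        std::printf("hit at t=%.2f, brightness=%.2f\n", t, brightness);
    } else {
        std::printf("ray missed the scene\n");
    }
    return 0;
}
```

Repeating that intersection-and-shading step millions of times per frame, across many bounces per ray, is exactly the workload the dedicated RT cores are built to accelerate.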

Tensor cores have appeared on previous Nvidia cards, but this is their first implementation on more consumer-oriented hardware. A Tensor core can perform a small matrix multiply-accumulate in a single GPU clock cycle, which dramatically increases the deep learning capabilities of the cards.
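To make that concrete, here is a rough, CPU-side sketch of the D = A×B + C multiply-accumulate that a Tensor core executes in hardware. The tile size and function name below are mine for illustration, the real silicon works on half-precision inputs with higher-precision accumulation, and this plain loop obviously runs nowhere near one whole matrix operation per clock.

```cpp
#include <array>
#include <cstdio>

// Illustrative sketch of the operation a Tensor core performs in hardware:
// a fused multiply-accumulate on small matrix tiles, D = A * B + C.
// This plain CPU loop shows the arithmetic only, not the performance.
using Mat4 = std::array<std::array<float, 4>, 4>;

Mat4 fusedMultiplyAdd(const Mat4& A, const Mat4& B, const Mat4& C) {
    Mat4 D{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j) {
            float acc = C[i][j];               // start from the accumulator matrix
            for (int k = 0; k < 4; ++k)
                acc += A[i][k] * B[k][j];      // multiply-accumulate across row and column
            D[i][j] = acc;
        }
    return D;
}

int main() {
    Mat4 A{}, B{}, C{};
    for (int i = 0; i < 4; ++i) {
        A[i][i] = 2.0f;   // A = 2 * identity
        B[i][i] = 3.0f;   // B = 3 * identity
        C[i][i] = 1.0f;   // C = identity
    }
    Mat4 D = fusedMultiplyAdd(A, B, C);
    std::printf("D[0][0] = %.1f (expected 7.0)\n", D[0][0]);  // 2*3 + 1
    return 0;
}
```

Deep learning workloads are essentially huge stacks of these small matrix products, which is why dedicating hardware to them pays off so handsomely.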

At the Gamescom announcement, Nvidia revealed the three top-end cards: the RTX 2080 Ti, the RTX 2080 and the RTX 2070. The 2080 Ti flagship packs 4352 CUDA cores and 11GB of that new GDDR6 memory with a base clock of 1350MHz, but it comes with a hefty price tag of $1,200 (set to drop to $999 after the initial launch). The more accessible card, the 2070, is equipped with 2304 CUDA cores and 8GB of memory clocked at 1410MHz, with a launch price of $599.

With all this promise of superior ray tracing and the impressive hardware inside the cards, can they actually achieve what Nvidia proclaims? And, perhaps more importantly, are they worth the steep new price tags?

In short, there isn’t a yes or no.

There is just that annoying “it depends”. The cards may be worth buying if you’re in the niche community that genuinely needs that processing power for their intended uses (ray tracing and deep learning). If, however, you’re buying them to boost the performance and looks of your triple-A gaming experience, the answer is really no. As with all new hardware technologies, software that takes full advantage of the new capabilities doesn’t exist yet. Yes, a handful of very new games offer a ray tracing option, but it isn’t yet well optimised for the processing cores in the card. The 2000 series does offer an increase in frame rates, but I don’t think the high price is worth paying if that’s all you want it for.

Personally, I would say that if you were holding off on upgrading in anticipation of the new line-up (unless you’re in that niche I mentioned earlier), get yourself a great deal on one of the high-end GTX 10 series graphics cards. Their prices have dropped dramatically in the wake of the 2000 series unveiling and the crypto-mining boom quieting down, and they’re more than enough to handle the demands of the current consumer market.