Like any PC gamer, I have plans to upgrade my PC, to make it better than it already is. That essentially means deciding what parts to buy, whether they're compatible, whether the power supply is big enough, and so on. I put a lot of thought into what parts I want and how long it'll take me to afford them, whether buying them piecemeal or all at once. Earlier this year I was planning to completely rebuild my PC because it's been having problems more and more frequently. However, I never went through with it because this year's parts were on the horizon. There were rumors about Intel's 9th-generation CPUs and rumors about NVIDIA's 11-series cards, so I figured, "Ok, I'll wait until later this year and then upgrade. Hopefully my PC can hold out that long." Well, my PC has held out, but the performance of NVIDIA's 20-series GPUs has made me hesitate.

While we have no unbiased third-party benchmarks yet, NVIDIA has released some numbers of their own showing that with ray-tracing turned off, games can hit 4K at 60fps on the 2080 (no numbers for the flagship 2080 Ti). But as the demos for RTX games showed, ray-tracing carries such a performance hit that you'll likely end up playing at 1080p at 60 frames per second once the final retail code is out in the wild, since the demos were very early, very crude implementations.

But see, therein lies the rub of it all: the performance gains over the 10-series aren't that big of a leap. Ray-tracing in and of itself, running in real time at 1080p30, is still one of the greatest things to happen to gaming, and obviously it helps out other industries as well, but for the cards themselves to be only marginally better without RTX features turned on is still somewhat of a bummer. Obviously we'll have to see what third-party benchmarks have to say about performance, but even NVIDIA's own charts weren't game changers in the performance department. The kind of leap I assume gamers expected was to be able to play games at, at the very least, 4K60 with RTX turned on. And then there were the outliers who saw the 4K G-Sync HDR 144Hz monitors and thought the new cards would be able to attain that kind of performance, which I personally felt was a tad unrealistic, especially since you'd then be expected to pay $2,000 for a computer screen.

So now I'm looking at the 20-series GPUs, specifically the 2080 Ti, and I'm no longer dead set on having it. A part of me wants to wait until next year, or the year after, for the next set of cards, which should be built on a 7nm process and, being second-generation RTX cards, should perform better with RTX turned on once the tech has matured some more. But because my computer is on the verge of death, there is pressure to just toss out my 980 Ti and get the 2080 Ti. My current GPU isn't the part that's going bad, but I like having all the visual bells and whistles turned on, so as newer games come out, the performance dips bit by bit. It could probably last me a few more years before newer games start to fall below 30fps if I don't turn any settings down. And I'm a self-admitted graphics whore, so ray-tracing really excites me. As long as a game can maintain a minimum of 1080p at 30 frames per second with every setting cranked to maximum and RTX switched on, I'll be a happy camper. But because of the performance cost of ray-tracing, a part of me is concerned that the 2080 Ti won't hold that standard as long as I'd want it to, and so I question whether it's even a good investment. Upgrading to an i9-9900K, which should release in November for an estimated $450, feels like a far more worthwhile investment since I'd be coming from an i7-3770.

I'll likely still get the 2080 Ti at some point, but NVIDIA certainly isn't making it any harder to just wait for a better-performing refresh, or even the next generation.