MSI GeForce RTX 2080 Ti GAMING X TRIO Picture Leaks

I'd like to thank all the anti-GPP folks for letting the rest of us get a chance to buy the new Nvidia cards.

I wish I shared your optimism. Unfortunately, a lot of those people who were making grand declarations of "never again Nvidia cuz the GPP!" will be hammering F5 on Newegg on launch day like the rest of us. Very few people have real convictions. Not much you say online is particularly binding. Oftentimes it's just easy/fun to hop on a bandwagon or join an echo chamber because it feels good to be part of some movement, real or imagined.

As always, it comes down to that one universal truth: framerate is king.
 
No games that use RTX? ...Wrong.


So far, the only confirmed game to feature real-time ray tracing is Metro Exodus, now due to release on February 22nd, 2019. We’re hoping more game developers will come out and announce their support of the technology during next week’s Gamescom 2018. Our intrepid reporters Chris Wray and Rosh Kelly will be there, by the way, so expect plenty of articles on all the most exciting games in development showcased in Cologne.

https://wccftech.com/nvidia-rtx-ray-tracing-metro-exodus-demo/

https://wccftech.com/nvidia-rtx-demonstrated-cinematic-demo/
 
Ray tracing is the next step in graphics realism for gaming. They have to start somewhere. Besides, it looks like the Ti will be faster than any other card currently on the market, including the Titan V.

Nvidia future-proofing a retail GPU? Does Pascal even have full DX12 support yet?
 
It seems like NV is going straight for the jugular with a Ti launch card, hitting AMD hard while they're limping along in the GPU space to lock down a little more market share.
 
It seems like NV is going straight for the jugular with a Ti launch card, hitting AMD hard while they're limping along in the GPU space to lock down a little more market share.
I think it's more about going ahead and releasing nearly the whole product stack, because there's not a huge improvement over the previous cards in some ways. No one with a 1080 Ti would even consider the 2080 and give up 3 gigs of video RAM for just a small boost in performance. And I'm almost certain there will be a refresh, if not new cards, next year because of 7 nanometer. Plus I'm pretty sure Nvidia has been sitting on the 2080 cards for quite a while.
 
I wish I shared your optimism. Unfortunately, a lot of those people who were making grand declarations of "never again Nvidia cuz the GPP!" will be hammering F5 on Newegg on launch day like the rest of us. Very few people have real convictions. Not much you say online is particularly binding. Oftentimes it's just easy/fun to hop on a bandwagon or join an echo chamber because it feels good to be part of some movement, real or imagined.

As always, it comes down to that one universal truth: framerate is king.
Wasn't GPP withdrawn? If so, then that removes most of the objection people were talking about.

Note, I'm assuming GPP was the BS where they said, for example, ROG could only be an Nvidia brand.
 
If indeed there is going to be a Ti model right at launch, it will definitely be interesting. How fast and how much? Wouldn't it cannibalize the new $3K Titan card as well? Also, these chips are all fabbed on the 12nm process, I believe? And TSMC's 7nm process is going live early in the new year, meaning 7nm chips should be available by spring/summer, so will there be a fast refresh next year, somewhere around three quarters from now? I had expected the 2080 Ti to be held off and released as a 7nm product, but maybe not?
 
I think it's more about going ahead and releasing nearly the whole product stack, because there's not a huge improvement over the previous cards in some ways. No one with a 1080 Ti would even consider the 2080 and give up 3 gigs of video RAM for just a small boost in performance. And I'm almost certain there will be a refresh, if not new cards, next year because of 7 nanometer. Plus I'm pretty sure Nvidia has been sitting on the 2080 cards for quite a while.
Not so much sitting on them; AMD's incompetence plus Q1's crypto-mining surge has afforded them extra time to improve yields. Hence the 6+ months of slack we've traditionally seen between a new-gen launch and the Ti variant has been taken up.

It was folly for those people who assumed "they're just milking it and not releasing new cards cuz the mining" meant Nvidia was sitting around doing nothing. Behind the scenes they never took their foot off the gas.
 
Still hoping we see some 16 GB configurations. I figured the lowest-end cards from this new line would have 8 GB of VRAM; I was hoping most would have 12 GB, and then maybe the Ti with 16 GB at the very least. Now if they wanna put 16 GB on cards all the way down to the 2070, I'm game. Let's see what the AIBs end up doing. The GTX 970 didn't have too bad a launch price; can we go back to that? The 980 Ti was $650 at launch. I thought that was insane, but it keeps going up. I wonder if RTG really will have 7nm GPUs out next year. I'm ready to upgrade from my 970 now, but who knows when I'll actually have the cash for that. So for now I sit and wait and see how things play out over the coming months.
 
Still hoping we see some 16 GB configurations. I figured the lowest-end cards from this new line would have 8 GB of VRAM; I was hoping most would have 12 GB, and then maybe the Ti with 16 GB at the very least. Now if they wanna put 16 GB on cards all the way down to the 2070, I'm game. Let's see what the AIBs end up doing. The GTX 970 didn't have too bad a launch price; can we go back to that? The 980 Ti was $650 at launch. I thought that was insane, but it keeps going up. I wonder if RTG really will have 7nm GPUs out next year. I'm ready to upgrade from my 970 now, but who knows when I'll actually have the cash for that. So for now I sit and wait and see how things play out over the coming months.

Having 11 GB of VRAM is a good thing, as it means a 352-bit bus with GDDR6, so bandwidth will be considerably higher than the 1080 Ti's. If it were 16 GB, it would be a 256-bit bus, and bandwidth would not be much higher than the 1080 Ti's, depending on clocks. A 512-bit bus would be insanely expensive and power hungry.

11 GB is way more than enough for any games in the foreseeable future. If you need more for professional applications, the Titan V is always available.
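For a rough sense of the numbers, here's a back-of-the-envelope sketch. The data rates are assumptions: ~14 Gbps GDDR6 is only rumored until official specs land, and the hypothetical 256-bit/16 GB card is just for comparison.

```python
# Back-of-the-envelope memory bandwidth: GB/s = bus width (bits) / 8 * data rate (Gbps)
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

# Data rates are assumptions; 14 Gbps GDDR6 is the rumored speed, not confirmed.
cards = {
    "GTX 1080 Ti (352-bit GDDR5X @ 11 Gbps)":            (352, 11),
    "RTX 2080 Ti (352-bit GDDR6 @ 14 Gbps, rumored)":    (352, 14),
    "Hypothetical 16 GB card (256-bit GDDR6 @ 14 Gbps)": (256, 14),
}

for name, (bus, rate) in cards.items():
    print(f"{name}: {bandwidth_gb_s(bus, rate):.0f} GB/s")
```

That works out to roughly 484 GB/s, 616 GB/s, and 448 GB/s respectively, so at the same memory speed a 256-bit/16 GB card would actually land at or below 1080 Ti territory, which is the point about bandwidth above.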
 
You can max out the 11 GB in some games though...
Having 11 GB of VRAM is a good thing, as it means a 352-bit bus with GDDR6, so bandwidth will be considerably higher than the 1080 Ti's. If it were 16 GB, it would be a 256-bit bus, and bandwidth would not be much higher than the 1080 Ti's, depending on clocks. A 512-bit bus would be insanely expensive and power hungry.

11 GB is way more than enough for any games in the foreseeable future. If you need more for professional applications, the Titan V is always available.

You can max out the 11 GB in some games though...well maybe one or two just through the game allocating it if you have it. It is cool to see all 11 GB almost tapped out though.
 
What games use all 11 GB?
[Screenshot attachment: re7_2018_08_17_22_40_17_218.jpg]
 
Curious too... I am hoping 11 GB will be good for a few years at 4K.
Oh it will be. Again, some games will allocate more VRAM if you have it, so I was just saying it seems kind of cool and shocking when you see that much VRAM being used.
 
I can't believe we are going through this silly argument for 11 GB of vRAM. THERE ARE NO GAMES THAT NEED 11 GB OF VRAM! Some games will show insane utilization as they are being over-cached, but the only way to prove this is to test the game using an identical card with less vRam and compare FPS and frametimes. Some games will even utilize system ram for cache without any penalty. Please stop with the nonsense. The extra bandwidth will be WAY more beneficial.
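If anyone wants to see for themselves what those overlay numbers actually are, here's a minimal sketch (assuming an NVIDIA card, a driver that exposes NVML, and the pynvml Python bindings installed). It just polls the figure the driver reports, which is memory the driver has allocated, not memory the game strictly needs to hold its framerate.

```python
# Minimal VRAM polling sketch using pynvml (the nvidia-ml-py bindings).
# NOTE: the "used" figure NVML reports is allocated memory, not what a game
# actually needs; a game can over-cache and still run fine with less VRAM.
import time
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
try:
    for _ in range(60):                  # sample once a second for a minute
        info = nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM allocated: {info.used / 2**30:.1f} / {info.total / 2**30:.1f} GiB")
        time.sleep(1)
finally:
    nvmlShutdown()
```

As said above, the only real proof either way is running the same game on an otherwise identical card with less VRAM and comparing FPS and frametimes.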
 
I can't believe we are going through this silly argument for 11 GB of vRAM. THERE ARE NO GAMES THAT NEED 11 GB OF VRAM! Some games will show insane utilization as they are being over-cached, but the only way to prove this is to test the game using an identical card with less vRam and compare FPS and frametimes. Some games will even utilize system ram for cache without any penalty. Please stop with the nonsense. The extra bandwidth will be WAY more beneficial.
And where the fuck did anyone argue? I made it 100% clear that a couple games out there will allocate that much if you have it. Maybe learn to read next time before getting your panties in a bunch. :rolleyes:
 
And where the fuck did anyone argue? I made it 100% clear that a couple games out there will allocate that much if you have it. Maybe learn to read next time before getting your panties in a bunch. :rolleyes:

Because you used the terms 'allocated' (to assign or allot for a particular purpose), 'max out', and 'tapped out' in a previous comment:

You can max out the 11 GB in some games though...

You can max out the 11 GB in some games though...well maybe one or two just through the game allocating it if you have it. It is cool to see all 11 GB almost tapped out though.

You also show a screenshot of a dark wall and a chair using 11 GB of vRam in a game. Maybe that is why my 'panties are in a bunch'.
 
Because you used the terms 'allocated' (to assign or allot for a particular purpose), 'max out', and 'tapped out' in a previous comment:



You also show a screenshot of a dark wall and a chair using 11 GB of vRam in a game. Maybe that is why my 'panties are in a bunch'.
So maybe work on your reading comprehension and pay more attention to context. And 'allocate' does not mean 'need', so it's your own fault for not comprehending such a common term when discussing RAM/VRAM usage.
 
Good for you. You still used terms like 'max out' and 'tapped out'. Perhaps they were not in the exact context, but these are forums. People typically scan the comments. When you follow it with a big screenshot of a game using 11 GB, it is clearly MISLEADING for the casual viewer.
 
Good for you. You still used terms like 'max out' and 'tapped out'. Perhaps they were not in the exact context, but these are forums. People typically scan the comments. When you follow it with a big screenshot of a game using 11 GB, it is clearly MISLEADING for the casual viewer.
There was nothing misleading at all. I made it clear to anyone with basic reading comprehension that a couple of games out there can allocate 11 GB of VRAM if you have it, and nothing more. And AGAIN, if you could actually pay attention to the context, you would understand why the words 'maxed' and 'tapped out' were used. Maybe get someone to help you if you are still confused.
 
Whatever you say. There was absolutely no reason to throw in the other terms. People don't read forums like a textbook, Mr. Reading Comprehension.

It is clear that you are the one who is butt hurt.
 
Whatever you say. There was absolutely no reason to throw in the other terms. People don't read forums like a textbook, Mr. Reading Comprehension.

It is clear that you are the one who is butt hurt.
You are making my head hurt, not my butt.
 
What game is that?
Resident Evil 7. It does use a lot of VRAM on the Very High texture setting, and even a 6 GB card will have some stutter above 1080p, from what people on the forums say. It seems to "need" about 8 GB of VRAM on Very High textures, especially at 4K, but will allocate more if you have it.
 
It's a damn shame that MSI's RMA department eats the ass out of dead people, else I might consider their cards for an upgrade. But after four RMAs with my X99 board, each one coming back in more and more of a craptastic condition, I'm done. Not to mention the NA head of their RMA department being the biggest douche satchel I ever spoke to. MSI can eat a dick.
 
Resident Evil 7. It does use a lot of VRAM on the Very High texture setting, and even a 6 GB card will have some stutter above 1080p, from what people on the forums say. It seems to "need" about 8 GB of VRAM on Very High textures, especially at 4K, but will allocate more if you have it.
I've only walked up to the gate of the house so far. Planning on playing it when my Vega gets returned. Had no idea it was such a memory hog.
 