2080 Ti for $1000 confirmed, HDMI 2.1/VRR confirmed dead for another year

I am SO glad I picked up a 5-month-old EVGA 1080 Ti on eBay for $435 after a 15% discount about two weeks ago. I KNEW Nvidia was going to get stupid with the pricing.
Yup, glad I have one. Going to wait a long time for those prices to drop before I pick up another one.
 
I'm curious if I should start digging for 1080 Ti deals or wait for the 2080? The specs are shit and idgaf about ray tracing.
I don't want to wait a month for Nvidia to release a slower GPU that costs like $200 more.

What a weird launch.

Just sit tight and wait for some reviews/benchmarks to come out. Then it should be pretty clear.
 
I'm a sucker so I pre-ordered the 2080 FE card and am going to return my recently purchased 1080ti to Amazon. I wonder how long it will take EK to make water blocks for the new cards?
 


Seems, according to DICE, the effects in BFV will be in the game, and this wasn't some specially concocted demo to showcase them.
 
Didn't we go through this same bullshit with the 680, when it was realized that the 680 was using the smaller chip (GK104) instead of the bigger GK110 or something?

Yeah we did. But consumers let it happen, and kept paying for cut-down GPUs, so of course they're doing it again. They'll keep doing it until people stop paying, or until the competition actually competes. It's why, as annoyed as I am at Intel, I'm excited about their entering the market in 2020. We need disruption, badly.

Welp, 2019 will be my DCI P3 monitor purchase year. My GTX 1060 is holding up just fine at ultrawide 1080p.
 
Definitely waiting for benches and not buying an FE anyway - much prefer a Strix or FTW or something of that nature. I know we're all dying to see those benches though.
 
Yeah we did. But consumers let it happen, and kept paying for cut-down GPUs, so of course they're doing it again. They'll keep doing it until people stop paying, or until the competition actually competes. It's why, as annoyed as I am at Intel, I'm excited about their entering the market in 2020. We need disruption, badly.



Cut down? Didn't he say the 2080 Ti is the second-largest chip ever produced?
 
Hm the more I think about the specs, the more I think that for $1200 they could have given us 16GB of GDDR6, especially considering the sheer amount of compute power.
 
Hm the more I think about the specs, the more I think that for $1200 they could have given us 16GB of GDDR6, especially considering the sheer amount of compute power.

Of course they could have... But $.

Gotta have that baked-in, planned obsolescence (that's a weird looking word).
 
Hm the more I think about the specs, the more I think that for $1200 they could have given us 16GB of GDDR6, especially considering the sheer amount of compute power.
I agree. I do know that I played a heavy demo on my Rift and it used 10.8 GB of memory, and the 1080 Ti only has 11... The frame rates were at the minimum, so it was a pretty intense demo.
 
Of course they could have... But $.

Gotta have that baked-in, planned obsolescence (that's a weird looking word).

Yeah... This is all because of lack of competition. Luckily, Intel is stepping into that space in 2020, let's hope they release a compelling product. My guess is Nvidia's next GPU will launch around that time, possibly just after Intel's. Likely, that cold war has already begun.

You saw what Intel did for 10 years without competition from AMD, $350 quad cores for a decade with 5% generational improvement. A Core 2 Quad is still decent for 1080p gaming and general use. Let's hope this doesn't happen to GPUs. AMD made a killing last quarter, so maybe they'll devote more resources to their own GPU division.
 
How's ZOTAC as a company? Can we expect third party water blocks for the AMP models?
 
I think Zotac is underrated. Their AMP Extreme line has been very good.

This. I was super impressed with my 980 Ti AMP Extreme. Built like a tank, ran quiet, and was the highest factory-OC'd card on the market.
 
Not much was said about NVLink. Didn't he say that you can now pool VRAM across multiple cards?
 
Spending $1200 on a GPU that could be outdated in two years is dumb in my book. I'm better off buying another KRISS Vector for that much.
 
Everyone has cards up for preorder now. I am amazed: no reviews, nothing, and people are buying these. When was the last time we saw this? Nvidia hypes a feature with no direct performance numbers, opens preorders before any reviews, and people buy! I want to see some damn reviews. $1200 on hype alone? No thanks.
 
Everyone has cards up for preorder now. I am amazed: no reviews, nothing, and people are buying these. When was the last time we saw this? Nvidia hypes a feature with no direct performance numbers, opens preorders before any reviews, and people buy! I want to see some damn reviews. $1200 on hype alone? No thanks.

Prepare for some serious disappointment :)
 
Everyone has cards up for preorder now. I am amazed: no reviews, nothing, and people are buying these. When was the last time we saw this? Nvidia hypes a feature with no direct performance numbers, opens preorders before any reviews, and people buy! I want to see some damn reviews. $1200 on hype alone? No thanks.

Agreed. I enjoyed the presentation, even though it was way more technical on lighting dynamics than I personally cared for. It's fine to change benchmarks on a go-forward basis once a baseline has been established, but that also means the same comparison needs to be run, on the same metrics, against what is currently available today. Yes, the RTX 2080/2080 Ti sounds incredible on paper, but show it to me next to a 1080 Ti so I can see whether or not it is a justifiable upgrade at $1k+, not factoring in the resale value of the 1080 Ti.
 
It's funny to me how, as much as people complain about the prices, nearly every preorder is sold out, and every Ti preorder is sold out on Newegg... and people wonder why they get away with their pricing?
 
Agreed. I enjoyed the presentation, even though it was way more technical on lighting dynamics than I personally cared for. It's fine to change benchmarks on a go-forward basis once a baseline has been established, but that also means the same comparison needs to be run, on the same metrics, against what is currently available today. Yes, the RTX 2080/2080 Ti sounds incredible on paper, but show it to me next to a 1080 Ti so I can see whether or not it is a justifiable upgrade at $1k+, not factoring in the resale value of the 1080 Ti.

Call me a 4k Nvidia fanboy, but I pre-ordered. I didn't regret the 980Ti, or the 1080Ti. Hard to imagine I'll be disappointed when I'm wanting more beef for 4k and BF 5 is around the corner. Nvidia has not disappointed me in a while, not to say this isn't the generation they do, but we will see.
 
Call me a 4k Nvidia fanboy, but I pre-ordered. I didn't regret the 980Ti, or the 1080Ti. Hard to imagine I'll be disappointed when I'm wanting more beef for 4k and BF 5 is around the corner. Nvidia has not disappointed me in a while, not to say this isn't the generation they do, but we will see.

True, but they showed absolutely no benchmarks. How about games that don't have ray tracing? Would you be disappointed if it's only 20% or so faster? I think that might be the case, because Nvidia didn't give brute-force numbers. When Pascal launched, they gave out exact performance numbers for how much faster it would be on average. I think a lot of people bought into a 6x performance increase that was only related to ray tracing. I would order, but for the first time there was no hard game-comparison chart, so it was iffy to me. I would love to be proven wrong and have it be faster. But to me it seems actual performance won't be mind-blowing; it's just that Pascal won't support ray tracing.
 
What's to stop crypto from gobbling up these new RTX-series cards, escalating the price even higher, and making them hard to find in stock anywhere?

Right now crypto is only 1/2 or 1/3 of what it was last year, but it's bound to go up again, and if that happens the new cards will be swept up from all the stores.
 
If the GPU has the horsepower to make use of more than 11GB of RAM (which it seems even the 1080Ti was able to do), can you explain to me what else it could be other than planned obsolescence?

I've been a part of this hobby for far too long and seen too much to give Nvidia the benefit of the doubt.
Segmentation; it is really that simple.
What games use 11 GB of VRAM?

Are you confusing gaming cards with professional cards?
You cannot have your cake and eat it too.

My guess is you want a Pro card at the cost of a gaming card.
This has always been the case, so I guess your memory is failing you.
 
True, but they showed absolutely no benchmarks. How about games that don't have ray tracing? Would you be disappointed if it's only 20% or so faster? I think that might be the case, because Nvidia didn't give brute-force numbers. When Pascal launched, they gave out exact performance numbers for how much faster it would be on average. I think a lot of people bought into a 6x performance increase that was only related to ray tracing. I would order, but for the first time there was no hard game-comparison chart, so it was iffy to me. I would love to be proven wrong and have it be faster. But to me it seems actual performance won't be mind-blowing; it's just that Pascal won't support ray tracing.

Oh come on, only believe what the Wizard shows you and don't look behind that curtain. I mean, no one in marketing has ever stretched the truth, and it only means good things when they won't show hard numbers, or at least exaggerated charts, on how much better these new cards are.
 
True, but they showed absolutely no benchmarks. How about games that don't have ray tracing? Would you be disappointed if it's only 20% or so faster? I think that might be the case, because Nvidia didn't give brute-force numbers. When Pascal launched, they gave out exact performance numbers for how much faster it would be on average. I think a lot of people bought into a 6x performance increase that was only related to ray tracing. I would order, but for the first time there was no hard game-comparison chart, so it was iffy to me. I would love to be proven wrong and have it be faster. But to me it seems actual performance won't be mind-blowing; it's just that Pascal won't support ray tracing.

Why would I be disappointed? All I need is 20% more or so to keep riding the 4K wave. The 6x was a horseshit ray-tracing graph. It's up to games to support it; I'm sure it will be like HairWorks or something and be a pillar of BF5 eye candy.
 
Why would I be disappointed? All I need is 20% more or so to keep riding the 4K wave. The 6x was a horseshit ray-tracing graph. It's up to games to support it; I'm sure it will be like HairWorks or something and be a pillar of BF5 eye candy.

Damn! You would pay $1200 for 20% more performance. Respect, man. If you know what you're getting, all power to you. 4K 60 Hz will probably work fine.
 
Why would I be disappointed? All I need is 20% more or so to keep riding the 4K wave. The 6x was a horseshit ray-tracing graph. It's up to games to support it; I'm sure it will be like HairWorks or something and be a pillar of BF5 eye candy.

Looks like a lot more games will use it than you think:
[attached image: chart of announced games with RTX support]
 
Ray tracing looks impressive to me and elevates lighting, shadows, and reflections to another level. If we want more realism and evolution, things like that need to be supported and implemented. Otherwise we'll be stuck with fake environments forever.
 