2080 Ti for $1000 confirmed, HDMI 2.1/VRR confirmed dead for another year

Raytracing looks impressive to me and elevates lighting, shadows, and reflections to another level. If we want more realism and evolution, things like that need to be supported and implemented. Otherwise we'll be stuck with faked environments forever.

And more resources can be moved away from shadow maps, pre-baking lighting, etc. and put into stuff that adds value... I keep saying this, but this reminds me of the G80 launch... 11 years ago.
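
To make that tradeoff concrete, here's a toy sketch (a hypothetical list-of-spheres scene with made-up helpers, nowhere near a real engine's pipeline): a baked shadow map is a texture you render ahead of time and sample with hand-tuned resolution and bias, while a ray-traced shadow is just an occlusion query from the surface point to the light.

```python
import math

def occluded(point, light_pos, spheres):
    """Shadow ray: return True if any sphere blocks the path
    from `point` to `light_pos` (i.e. the point is in shadow)."""
    dx, dy, dz = (light_pos[i] - point[i] for i in range(3))
    dist = math.sqrt(dx*dx + dy*dy + dz*dz)
    d = (dx/dist, dy/dist, dz/dist)                    # unit ray direction
    for (cx, cy, cz), radius in spheres:
        # Ray/sphere intersection: solve |o + t*d|^2 = r^2 for t,
        # with o = point - center (d is unit length, so a = 1).
        ox, oy, oz = point[0]-cx, point[1]-cy, point[2]-cz
        b = 2 * (ox*d[0] + oy*d[1] + oz*d[2])
        c = ox*ox + oy*oy + oz*oz - radius*radius
        disc = b*b - 4*c
        if disc >= 0:
            t = (-b - math.sqrt(disc)) / 2             # nearest hit
            if 1e-4 < t < dist:                        # blocker before light
                return True
    return False

# A sphere hangs between the ground point and the overhead light:
print(occluded((0, 0, 0), (0, 10, 0), [((0, 5, 0), 1.0)]))  # True: shadowed
print(occluded((5, 0, 0), (0, 10, 0), [((0, 5, 0), 1.0)]))  # False: lit
```

No baking pass, no shadow-map resolution or bias to tune; the cost moves to per-query intersection tests, which is what the RT cores are there to accelerate.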

And just to put things in perspective:
https://www.hardocp.com/article/2007/05/02/nvidia_geforce_8800_ultra_sli/

When the GeForce 8800 GTX was introduced it had a suggested retail price of $599. You can currently find GeForce 8800 GTX cards online from $529 all the way up to $939 for an overclocked and water-cooled BFGTech GeForce 8800 GTX.

the GeForce 8800 Ultra has an MSRP of $829 USD

People have very short memories...
 
And more resources can be moved away from shadow maps, pre-baking lighting, etc. and put into stuff that adds value... I keep saying this, but this reminds me of the G80 launch... 11 years ago.

And just to put things in perspective:
https://www.hardocp.com/article/2007/05/02/nvidia_geforce_8800_ultra_sli/

People have very short memories...

Maybe they have short memories, but the sentiment probably hasn't changed.

What really taints this entire launch is the price of the GeForce 8800 Ultra, $829. What was NVIDIA thinking with an MSRP like this? Granted, when cards show up in stores and online retailers the prices may be different as manufacturers and retailers set those accordingly. However, from what we hear there may be limited availability of the GeForce 8800 Ultra according to NVIDIA because of "high demand" of "the most powerful graphics solutions available." That could certainly keep the price inflated. We hope this doesn't turn out to be another video card that is hard to find and when you do find one is way overpriced. Availability is quoted to be before May 15th.
 
Damn! You would pay $1200 for 20% more performance. Respect, man. If you know what you're getting, all power to you. 4K 60Hz will probably work fine.

I always figure I'll sell my previous card and recoup some. It ends up being $500-700 every year or two to put me on the bleeding edge. I typically try to stretch to 2 years, but the 1080 Ti was barely enough to do what I wanted at 4K.
 
I always figure I'll sell my previous card and recoup some. It ends up being $500-700 every year or two to put me on the bleeding edge. I typically try to stretch to 2 years, but the 1080 Ti was barely enough to do what I wanted at 4K.

Oh. Did you come from a 1080 Ti or a prior gen?
 
At this point, I feel like if AMD comes out with something with equal performance to a 1080ti, I'd probably buy the darn thing even though I already own a 1080ti just to switch teams. I'd also go pick up a nice Freesync monitor and throw some fingers up at Nvidia. Unfortunately, AMD doing that is probably a pipe dream.
 
Am I reading this right? The 2080 costs about the same as this 1080ti I've had for over a year, but is actually 15% slower (without knowing the effect of faster memory) in existing games?

Yeah, I'm gonna need to see a "killer app" utilizing raytracing before I consider a deal like that.
 
As of right now this is a no for me. GTX 1080 Ti handles all games just fine. That pricing is insane.
 
As long as Ray Tracing is an on/off setting, I don't mind waiting for this technology to mature and will just keep my 1080 till a later series.
 
I feel a tad antsy not taking part in the pre-order madness, but I struggle to think what a review might show that would convince me to buy one. I guess the simultaneous launch of a VR headset with double the resolution of current models (Vive Pro and Pimax included), and proof this card can handle any game at that resolution at 90 Hz?
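
For what it's worth, the raw pixel rate for that hypothetical headset isn't absurd (a rough sketch, assuming the Vive Pro's 1440x1600 per eye as the baseline and "double" meaning double the total pixels; real VR rendering adds supersampling on top of this):

```python
baseline = 1440 * 1600 * 2            # Vive Pro, both eyes: ~4.6 MPix
doubled = baseline * 2                # the hypothetical headset: ~9.2 MPix
refresh = 90                          # Hz

print(f"headset: {doubled * refresh / 1e9:.2f} GPix/s")   # ~0.83 GPix/s
print(f"4K @ 60: {3840 * 2160 * 60 / 1e9:.2f} GPix/s")    # ~0.50 GPix/s
```

So the ask is roughly 1.7x a 4K60 load before supersampling - steep, but not science fiction.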
 
Only way I would even consider going next gen is if they added RT to Overwatch and made it look really cool. Otherwise, I'm still good on my GTX 1080 for now. I just want to see some benches, though!!
 
Why would I be disappointed? All I need is 20% more or so to keep riding the 4K wave. That 6x figure came from a horseshit raytracing graph. It's up to games to support it - I'm sure it will be like HairWorks or something and be a pillar of BF5 eye candy.
LOL then what the fuck would you not buy?
 
Am I reading this right? The 2080 costs about the same as this 1080ti I've had for over a year, but is actually 15% slower (without knowing the effect of faster memory) in existing games?

Yeah, I'm gonna need to see a "killer app" utilizing raytracing before I consider a deal like that.
Where did you see this? I have a 1080 and pre-ordered a 2080, but if it's somehow slower....
 
That screenshot up above, showing In Death as one of the new RTX titles....If you guys have a VR headset and haven't played In Death, may I HIGHLY suggest you stop what you're doing and get it right now. It's absolutely spectacular. I'm stoked to see a small game like that being showcased by nVidia.
 
I'll wait. In fact, in Canada prices are insane; it should be about 1100-1200 bucks here. Until a lot of games come out, I might as well wait, because I have a 1080ti for now.
 
The ray tracing is going to look fantastic. That said, I will probably only have it on for single-player portions of the game. Why?! Because it divides the player base into haves and have-nots.

At the conference they talked about how you can now hide things in shadows that the old global illumination didn't produce. In the BFV demo the glass reflected and refracted light, often obscuring vision. This gives players on lower settings a competitive advantage.

It's like in Battlefield Vietnam: if you played on High settings, there was tall grass that completely hid your character until a player on Low settings took you out because his system didn't draw the grass. Except now we will have it with shadows and reflections.

Also I want to see benchmarks before I give anyone an interest-free $1000 loan.
 
All this together at once though? Nvidia can get fucked.

 
It's like in Battlefield Vietnam: if you played on High settings, there was tall grass that completely hid your character until a player on Low settings took you out because his system didn't draw the grass. Except now we will have it with shadows and reflections.

Sorry that I shot you, bro.
 
how much of a performance hit will enabling ray-tracing cause?...what does ray-tracing do that VXAO (ambient occlusion) can't?...how much ray-tracing will actually be in these 21 games that Nvidia touted at the announcement?...5 minutes of a glorified demo section or throughout the entire game?

it's going to take years before ray-tracing is refined and supported in most games...
 
The 2080 Ti looks tempting but I think I'll be dropping $1200+ on an 8 core Intel CPU + new mobo in the future instead. I'll wait till Turing hits 7nm and grab that.
 
Nvidia decided to try to push graphics development in the direction of ray tracing, and spent significant die area budget on this. So that's what you're paying for, the CUDA core counts are just to ensure that these GPUs aren't beaten by their Pascal counterparts. Given the prices, does that make these cards lame for current games? Probably, yes.

Here's the thing, though. There's no competition. AMD does not plan to release another high-end GPU until 2020-2021. Navi 10 in 2019 on 7nm will not come close to the 1080TI, heck there is a good chance it will be 1080 class at best. So if you want the best card you're buying a 2080TI and that's all there is to it. Either that or just wait until 7nm refresh of Turing (probably in 12+ months), which may provide bigger gains.

Honestly, if Nvidia was going to blow a bunch of die area on something that won't benefit benchmark charts tomorrow, this is a good time to do it. No one can challenge them on peak performance even when they tie one hand behind their back like this.
 
Segmentation, it is really that simple.
What games use 11GB of RAM?

Are you confusing gaming cards with professional cards?
AKA, you cannot have your cake and eat it too.

My guess is you want a Pro card at the cost of a gaming card.
This has always been the case, so I guess your memory is failing you.

Honestly... Nothing I'd be interested in playing (at least not at the settings that would require that much VRAM). At this point I think I'm just bitching to bitch because of my previously mentioned reasons.

Games that use high amounts of texture memory, like Wolfenstein II, GTA V, Shadow of War, etc., and some VR games/demos have been shown using every scrap of the 12GB available on the Titan X in some situations, though, which makes me wonder if games making use of all that memory would be more common if GPUs had more VRAM across the board.


Almost guaranteed.
 
Nvidia decided to try to push graphics development in the direction of ray tracing, and spent significant die area budget on this. So that's what you're paying for, the CUDA core counts are just to ensure that these GPUs aren't beaten by their Pascal counterparts. Given the prices, does that make these cards lame for current games? Probably, yes.

Here's the thing, though. There's no competition. AMD does not plan to release another high-end GPU until 2020-2021. Navi 10 in 2019 on 7nm will not come close to the 1080TI, heck there is a good chance it will be 1080 class at best. So if you want the best card you're buying a 2080TI and that's all there is to it. Either that or just wait until 7nm refresh of Turing (probably in 12+ months), which may provide bigger gains.

Honestly, if Nvidia was going to blow a bunch of die area on something that won't benefit benchmark charts tomorrow, this is a good time to do it. No one can challenge them on peak performance even when they tie one hand behind their back like this.

I am not so sure about that. They might leave Navi to consoles if it's like that. I mean, shrinking Vega to 7nm can bring them above the 1080 Ti easily, so I am not sure I buy the claim that AMD won't release anything in 2019. If Navi is designed for consoles, it really depends on how it can scale. If it's designed to be console-only, then 7nm Vega will likely come to gamers. But I do think Navi will at least match the 1080 Ti on a budget. I am not sure how you can possibly know for a fact that Navi won't even come close to the 1080 Ti. AMD has to have something in their pocket to have a decent card next year. Not saying it will be a Turing killer, but I think they could get it past the 1080 Ti if they wanted.


I wouldn't quote wccftech on anything. They make an article about anything and everything. Only now are they starting to give credit to videocardz.com. Remember their claim about the RX 480 being an overclocking beast? It was all hype.

Just go look at his articles now. He recycles articles and copy-pastes from old ones, and right now he is using phrases like "monumental leap in graphics performance" when there is not one frickin' bench yet, lol. They like hits; the more articles, the more hits.

Right now wccftech is busy talking about how great the Nvidia cooler is without actually having benched it, and how it has crazy amounts of OC headroom and runs so cool and quiet. Well, because he heard it at the PR event. Like I said, they talk about anything. The hype is unreal with them, lol.
 
Where did you see this? I have a 1080 and pre-ordered a 2080, but if it's somehow slower....

I'm just going directly off the official 2080 specs: 2944 CUDA cores and an 1800 MHz boost clock, vs. my 1080 Ti, which has 3584 CUDA cores and a 1700 MHz boost clock. The 2080 has GDDR6, though.
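
Back-of-the-envelope from those numbers (a sketch assuming raw FP32 throughput scales with cores x clock - it ignores any Turing architectural or memory gains, and uses the clocks quoted above rather than official reference boosts):

```python
cards = {
    "RTX 2080":    (2944, 1800),   # CUDA cores, boost MHz (figures above)
    "GTX 1080 Ti": (3584, 1700),
}

for name, (cores, mhz) in cards.items():
    tflops = cores * 2 * mhz / 1e6      # 2 FLOPs/core/cycle (FMA)
    print(f"{name}: {tflops:.1f} TFLOPS")

# RTX 2080:    10.6 TFLOPS
# GTX 1080 Ti: 12.2 TFLOPS  -> the 2080 trails by ~13% on paper
```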
 
I dunno how you can jump to that conclusion and so easily forget that the 1080 outperformed a 980 Ti while being at a memory bus/shader disadvantage.

These cards are priced @ where they perform + some. Expect a 1080 Ti to land somewhere between a 2070 and 2080.

I don't think you will see the same jump! It's not a full node shrink. They all have a few more cores and an added ray-tracing core, so we will see. Unless Nvidia is using Tensor cores to boost performance, I think this is more of a side-grade when it comes to overall performance. When using ray tracing you will see a bigger difference, but the thing is that feature likely won't work on Pascal anyway, so there probably won't be a direct comparison. So you are probably just paying for ray tracing at this point.

Also, you clearly can't claim that. They are priced at "we don't know where they perform" until we see some benches; I am not buying it. Nvidia at no point gave a direct comparison in actual gaming other than RTX ops.

It's funny that you are telling him he is jumping to conclusions when you are doing the same yourself, because even Nvidia didn't give direct overall gaming numbers. No reviews; just take their word for it and pre-order.
 
Impressive package altogether, although I'm curious to see how the performance looks against Pascal without the "RTX ops" thing. Out of my price range, though; if they're really nice I might consider a used one in a couple of years. :)

All this together at once though? Nvidia can get fucked.

[...]

I think I'm just going to say fuck it, and go with just a 2080.

:rolleyes: yeah, give them $800, that'll learn 'em.

IDK about you guys, but I actually LOVED the BF5 demo... it was awesome, really...

Agree, that was a pretty impressive demo. Really liked how the effects looked on it, although it seemed to me they were going from "effects with raytracing" to "no effects" to really up the "wow" factor of the demo. Still, can't deny that the reflections were really neat. I don't know how much I'd appreciate it in the middle of a match, but if it makes it easier on the developers and we get "free" extra image quality out of it, well, let's see some more raytracing hardware!

...still confused by: https://www.hardocp.com/news/2018/06/04/dont_expect_new_geforce_gpu_for_long_time

Is that because of the change in branding, moving away from GTX?

No, it's because of marketing. If you tell your target audience that you are going to release something new in two months' time and it will be 4x faster than your current top end card, it will hurt your sales. So of course when they are asked about it they will say "no, we don't have anything planned for a while."
 
the Battlefield demo was underwhelming...shadows and reflections don't look so detailed and pronounced in real life...looked too fake to me...
 
i thought it was a nice-looking 1st attempt at what they're going to be doing with the new hardware. it will get better...
 
I dunno how you can jump to that conclusion and so easily forget that the 1080 outperformed a 980 Ti while being at a memory bus/shader disadvantage.

These cards are priced @ where they perform + some. Expect a 1080 Ti to land somewhere between a 2070 and 2080.

The 1080 has a 60% higher clock than the 980 Ti. The 2080 has a 6% higher clock than a 1080 Ti.
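
Those ratios do check out against the reference boost clocks (980 Ti: 1075 MHz, 1080: 1733 MHz; the 2080/1080 Ti clocks are the 1800/1700 MHz figures quoted upthread, not official reference boosts):

```python
print(f"1080 over 980 Ti:  {1733 / 1075 - 1:.0%}")   # ~61%
print(f"2080 over 1080 Ti: {1800 / 1700 - 1:.0%}")   # ~6%
```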
 
:rolleyes: yeah, give them $800, that'll learn 'em.


Yup. I want to game @ 2560x1440 at a graphics level that's acceptable to me, and they're the only player in the game.

The point was to not give them anything afterwards. I'll probably give in and pick up a 2080Ti in the end just so it lasts as long as possible, but I'll likely be bowing out of enthusiast PCs for the foreseeable future after that.
 
Plenty of 1080 Ti's floating around for $550-600, which at this point, with absolutely ZERO RT games out, seems like a steal. But people get bored or get envious and want the latest even if it isn't the smartest purchase yet.
 
the Battlefield demo was underwhelming...shadows and reflections don't look so detailed and pronounced in real life...looked too fake to me...

They likely didn't account for edge diffraction, atmospheric light scatter, and room reflections.
 
Well, that price will get me both a PS5 and an Xbox Scarlett. With games like Horizon, FC5, or Origins looking better X Enhanced on a 55" 4K set than on a 1080 Ti at 1440p on a 27" monitor, I said goodbye to the "master race".

The 1080 Ti is still enough for the games I play and will be playing on PC - strategies and some RPGs; for the rest I have consoles, with the One X as my main gaming platform now.

Not going to spend 1200 euros a year to keep on top of the GPU race. That's like two weeks of decent all-inclusive holidays, so NV can go to hell with those prices.
 