2080 Ti for $1000 confirmed, HDMI 2.1/VRR confirmed dead for another year

They likely didn't account for edge diffraction, atmospheric light scatter, and room reflections.

When the flamethrower first turned on, I was surprised at the lack of light created by the flame. The reflections look good, but the fire didn't emit as much light or cast as many shadows as I expected.

It's a nice new feature, but just one of many - and still in its infancy. Not worth an upgrade from my 1080 Ti on RT alone.
 
Exactly. And honestly it’s a pretty cheap hobby.

nVidia’s gross profit has been flat since 2012.

Which is incredible considering what they have produced. Hence part of the reason why their stock has skyrocketed.
 
why is VRAM not increasing with the 2000 series?...still stuck on the same 8GB or 11GB (Ti)
 
why is VRAM not increasing with the 2000 series?

If I had to take a stab, probably because GDDR6 is still expensive, though if they are going to charge $1.2k per card, you'd think they would throw in at least 16GB. We will see if 11GB is enough for 4K gaming once reviews are out.
 
You're acting like these price points are new; Nvidia has been doing this since the 8800 Ultra. That thing was $800 on release day. The top-tier Nvidia graphics card is always going to cost way more than the mid tier. PC gaming is my favorite hobby, so I can justify dropping $1k on the top-tier graphics card. That's why they have the xx60 and xx70 to fill the spot where you're willing to spend xx dollars.

Exactly. And honestly it’s a pretty cheap hobby.

nVidia’s gross profit has been flat since 2012.

Well... so much for that eh?

All you do is complain. You complained about the 4K 144Hz displays and now you're complaining about this. Perhaps it's time you find another hobby, because your constant bellyaching is not helping anything.

You can simply stay one GPU gen behind and get all the price / performance value you want by purchasing last gen cards second hand once the latest and greatest comes out. In fact I have two wonderful Titan X Pascals ready for you to buy very soon.....all happy now?

Curious... How are you helping by NOT complaining? Will video cards/monitors become cheaper or be released faster by not complaining?
 
Hey guys, I thought they already released Vega? Am I in the right thread, this is the GPU line with useless features at the moment, and doesn't touch on real world performance... Oh wait....no this is an nVIDIA launch?! Guys??

Yeah, I don't think I would sign up for more than a 2070 at this point, even then it's probably the new X50Ti...


/end sarcasm
 
Honestly... Nothing I'd be interested in playing (at least not at the settings that would require that much VRAM). At this point I think I'm just bitching to bitch because of my previously mentioned reasons.

Games that use high amounts of texture memory like Wolfenstein II, GTA V, Shadow of War, etc., and some VR games/demos have been shown using every scrap of the 12GB available on the Titan X in some situations though, which makes me wonder if games making use of all that memory would be more common if GPUs had more VRAM across the board.



Almost guaranteed.

But that is because they have to allocate memory to shadow maps, pre-baked lighting, etc.
But don't buy.
If fake graphics are fine for you, no problem.

But don't whine as the rest of the world moves on.
 
Any theoretical/speculated/relative performance numbers been posted anywhere? i.e., we know these specific specs, let's try to extrapolate what the performance numbers would look like in scenario a, b, c.
 
I setup a bunch of price alerts on 1080 Ti's, will grab one for like $400 and then upgrade to the 3080 next year I guess.
Pretty weak showing, Nvidia.
 
I am not concerned about the price of the 2080 Ti (and trust me, in Sweden we pay a lot more for that than you in the US do), I'm concerned about the future. So why did we get the 2070, 2080 and 2080 Ti and not the 2070, 2080 and a new Titan? Nvidia milked early adopters with Titans for so many years, why the sudden change? I think we will see the real 2080 Ti in 2019.
I will wait until summer 2019; my 1080 Ti is more than adequate for 4K gaming.
 
As I also posted in another thread, Caseking.de has listed the EVGA 2080 Tis for a freaking €1,400, and no Ti below €1,229.

I imagine a new Titan or new Ti in 2019 will be even more expensive and then this gets ridiculous fast.
 
Nvidia is just a greedy bastard. Ti models were always around $700 and now $1200? Even the regular 2080 costs more than previous Ti models, approaching Titan pricing. Seriously Nvidia, you're fucking nuts. Middle finger to you.
 
I think they screwed themselves with their naming convention....
 
Seriously, where the F is AMD? This pricing is getting out of control. $1200 this time, $1500 next time, then $2000? Nvidia is going to keep gouging with no end in sight. Everyone, don't buy it and force Nvidia to lower prices.
 
If the number of people capable/willing to spend $1200 for this has reached "critical mass" then this is how it's going to be from now on. More people, with more income and the will to spend it on GPUs.
 
People spend $1000 on a phone to take selfies and go on Facebook, which any $200 phone can do.
 
People spend $1000 on a phone to take selfies and go on Facebook, which any $200 phone can do.

You know, we have an island in Greece called Mykonos where people spend $10,000 on dinner tables, and there have been bills of $150,000 for a night at certain tables.

There are extremes and there are extremely rich people, but this is not what worries me. What worries me is that the average prices of what used to be top-end models are climbing in almost everything, ranging from phones to kitchen stoves, tools, watches, cars, blenders, etc.

I have a big list of things I want to buy on Amazon, and the top-end models for certain things cost a shitload of money.


Are there many more consumers with higher incomes these days? Probably yes. Is there going to be a day when 20% of the population will be in a class of its own income-wise? I bet so.
And what are companies going to do about it? They are going to introduce product lines that cater to the 20%, the 10%, the 5%, and even the 1% if it amounts to a good number.

So everybody should get used to it and enjoy whatever doesn't cause them financial stress, rather than feeling they HAVE to get the BEST. The vast majority won't be able to, and this is becoming a fact sooner rather than later.

I predict that as PCs become more and more of a niche, top-end gear will skyrocket in price, not to mention pro PC hardware - that will become a serious investment.
 
You're acting like these price points are new; Nvidia has been doing this since the 8800 Ultra. That thing was $800 on release day.


Yes they did, and it wasn't worth it back then either. It dropped in price in less than a month. At least you could say that the 8800 GTX was a massive step over everything else; it's not looking like the performance of these new cards will be anything to shout about over their Pascal counterparts apart from games that have ray tracing. So you are paying hundreds of dollars for minimal improvement and features that can only be used in a handful of games.
 
Well I definitely picked the wrong year to leave my high paying senior management role to go and work for a charity for bits of string.

Guess I’ll skip this gen and save for the next one.
 
I still want to see the performance numbers. I was ready to pull the trigger at 1k. 1200 made me flinch a little.
 
The nice thing about a Pre-order is it can be canceled :) I'm tentatively in for one, but won't hesitate to cancel at the last minute if I don't see some solid numbers first.
 
The nice thing about a Pre-order is it can be canceled :) I'm tentatively in for one, but won't hesitate to cancel at the last minute if I don't see some solid numbers first.

Personally I expect less than a 25% difference for you from a Titan to a 2080 Ti, unless for some reason they overclock very well.
 
Yes they did, and it wasn't worth it back then either. It dropped in price in less than a month. At least you could say that the 8800 GTX was a massive step over everything else; it's not looking like the performance of these new cards will be anything to shout about over their Pascal counterparts apart from games that have ray tracing. So you are paying hundreds of dollars for minimal improvement and features that can only be used in a handful of games.
If you assume no increase in performance on a per-clock basis, the 2080 Ti apparently at least includes an increase in CUDA core count, so there is at least that. <shrug>

why is VRAM not increasing with the 2000 series?...still stuck on the same 8GB or 11GB (Ti)
Capacity does not seem to be the limiting factor to performance, currently. They could presumably include more memory than that, but it would go unused for any task they mean for you to use this for. If you need more memory than this, they have the Quadro and Titan lines available as well.
 
So some further speculation based on facts presented.

Facts: the 2070 has slightly fewer CUDA cores than the 1080 and is launching near the 1080's SRP. Clock speeds are within 100-200MHz of each other. Not very significant, I guess.

Assuming Turing has a 5-10% improvement over the Pascal arch, the rumors saying that the 2070 is about 8-10% faster than the 1080 would be plausible.

The question is... given the availability of the 1080/1080 Ti, what will be the 2070's selling point? To an extent, this affects the 2080 as well given its price.

Sure, it has ray tracing and everything but that's probably one game in the next 50 releases/2 years...

Wonder what the 2070 has up its sleeve?
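
A rough sketch of that kind of extrapolation, purely as an illustration: the core counts and boost clocks below are the announced/rumored figures as I understand them, and the per-clock uplift is a pure assumption, so treat the output as a method rather than a prediction.

```python
# Crude throughput estimate: relative perf ~ CUDA cores x boost clock x assumed per-clock uplift.
# The specs are announced/rumored figures and the uplift is a guess -- all placeholders.
gtx_1080 = {"cores": 2560, "boost_mhz": 1733}
rtx_2070 = {"cores": 2304, "boost_mhz": 1620}

IPC_UPLIFT = 1.10  # assumed 10% per-clock improvement for Turing (pure speculation)

def rel_throughput(card, uplift=1.0):
    # Proportional to raw shader throughput; ignores memory bandwidth, RT/Tensor units, etc.
    return card["cores"] * card["boost_mhz"] * uplift

ratio = rel_throughput(rtx_2070, IPC_UPLIFT) / rel_throughput(gtx_1080)
print(f"Estimated RTX 2070 vs GTX 1080: {ratio:.2f}x")
```

Whether that lands above or below the 1080 depends almost entirely on the per-clock uplift you plug in, which is exactly the part nobody outside Nvidia knows yet.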
 
How does Nvidia sell GPUs that cost hundreds of dollars more than their predecessors while also being slower? What happens when the Pascal market dries up and you're forced to spend more for less on a Turing GPU? PC graphics value is actually going to regress for the first time in history. And to top it all off, this is all happening despite Pascal being really old (2.5 years).

This is going to be the weirdest GPU launch ever, and may actually dethrone the FX 5000 series for "Biggest Piece of Shit" ever made by Nvidia. Can't wait to read the reviews.
 
There's no way they'll be slower. How much of a 'value' they're going to be at current prices remains to be seen. Probably not a great one.

I assume they're priced like they are in part to clear out that remaining Pascal inventory. Watch the prices drop on these things Q1/Q2 next year.
 
Something that hasn't been mentioned much is that using the RT and AI cores provides a double benefit. On one hand you have lighting effects, AA, and scaling that are heavily accelerated, but in addition to that, the CUDA cores are freed up from those tasks to do more shading. So in essence, you may only have 500 extra CUDA cores in TU102 vs. GP102, but you have a lot more that are freed up from doing lighting and anti-aliasing, and you can leverage AI to scale up to 4K without much of a performance penalty. All of that is contingent on developers taking advantage of those accelerators though.

So I also wonder if they'll be able to leverage the RT and Tensor cores for shading tasks in games that don't support RT/AI-assisted AA. It won't be efficient, but still if you can get another 10% improvement by emulating shading tasks on those cores, it may be worth it. I think 7nm + RT maturity + competition from Intel will likely make the next generation much more appealing.
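
Purely to illustrate that double benefit (every millisecond figure below is invented), it's easiest to think of as a frame-time budget: whatever AA and lighting work the dedicated units absorb is time the CUDA cores get back for shading.

```python
# Toy frame-time budget -- all numbers are made up for illustration only.
# If post-process AA and some lighting work move off the CUDA cores onto the
# Tensor/RT units, the shaders get that time back even before counting extra cores.
shading_ms  = 10.0  # time the CUDA cores spend on shading per frame
aa_ms       = 2.0   # AA currently running on the CUDA cores
lighting_ms = 2.5   # lighting/shadow work the RT cores could absorb

before = shading_ms + aa_ms + lighting_ms  # everything running on the CUDA cores
after = shading_ms                         # AA/lighting offloaded and running concurrently

print(f"CUDA-core time per frame: {before:.1f} ms -> {after:.1f} ms "
      f"({before / after:.2f}x headroom for extra shading work)")
```

Again, that headroom only materializes if developers actually route that work to the RT/Tensor units.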
 
I believe Nvidia is charging a lot because they know that the perf over the 10 series will be pretty huge.

Turing is not based on the Pascal architecture; it is a brand new architecture that Nvidia has worked on for 10 years, so basically we can't guess the perf based on how many cores it has, because we don't know how powerful the cores are on Turing yet, but my guess is that they are a lot more powerful than the Pascal cores.

I think that once the official benchmarks come out, a lot of people are going to be surprised how powerful Turing really is.
 
I believe Nvidia is charging a lot because they know that the perf over the 10 series will be pretty huge.

Turing is not based on the Pascal architecture; it is a brand new architecture that Nvidia has worked on for 10 years, so basically we can't guess the perf based on how many cores it has, because we don't know how powerful the cores are on Turing yet, but my guess is that they are a lot more powerful than the Pascal cores.

I think that once the official benchmarks come out, a lot of people are going to be surprised how powerful Turing really is.
Did they mention at the unveiling that it will be much more powerful than the 1080 Ti?
 
why is VRAM not increasing with the 2000 series?...still stuck on the same 8GB or 11GB (Ti)

It is interesting that Nvidia kept the 2080 Ti at 11GB. Maybe they are afraid of cannibalizing their professional (Volta) market. I've heard reports that 12GB or more is necessary for some enterprise uses; that 1GB can make all the difference. For $1200 I'm sure they could have given us 12GB, but it seems they want to segment their market between video game enthusiasts and professional workstation-type users.
 
Did they mention at the unveiling that it will be much more powerful than the 1080 Ti?

They didn't mention anything about how much more powerful it will be in non-ray-tracing titles. I liked QuantumBraced's comment above; the Tensor cores and AI should benefit older games too, propelling them to run at very high fps.
 
Looks like ray tracing takes a huge performance hit. Can't even run Shadow of the Tomb Raider at 60fps at 1080p, LOL. Huge gimmick. Why spend $1200 on features that force you to run at 1080p?

https://www.dsogaming.com/news/nvid...-tomb-raider-with-60fps-at-1080p-with-rtx-on/
And this is one of the reasons we're not seeing any benchmarks or reviews at this point. Nvidia knows damn well these cards are not going to perform all that great when running ray tracing. And looking at the specs on the card, it's likely not to be much of an improvement in regular games over the current GPUs. But hey, let's go pay $1,200 for a GPU with absolutely no fucking reviews and worry about it later...
 
They didn't mention anything about how much more powerful it will be in non-ray-tracing titles. I liked QuantumBraced's comment above; the Tensor cores and AI should benefit older games too, propelling them to run at very high fps.

We didn't get any perf gains from the tensor cores with the Titan V, so if it were capable of that then Nvidia must have been sandbagging with the drivers.
 