NVIDIA Controls AIB Launch and Driver Distribution @ [H]

I can't help but wonder whether AMD starting to utilize TSMC is for two reasons...

What do you mean starting?

Every AMD GPU made has been made at TSMC. If not all, damn near. I think Nvidia started using Samsung for a few parts, but TSMC has been the GPU fabrication center for decades.
 
What do you mean starting?

Every AMD GPU made has been made at TSMC. If not all, damn near. I think Nvidia started using Samsung for a few parts, but TSMC has been the GPU fabrication center for decades.
Didn't AMD own GF at some point, and then decide to split it off and sell it?
 
Didn't AMD own GF at some point, and then decide to split it off and sell it?

Sure, but they didn't build GPUs there; that was for CPUs. Only recently has TSMC (and others) proven competitive in CPU manufacturing, mostly because of Intel's tribulations with their own 10nm process.
 
Didn't AMD own GF at some point, and then decide to split it off and sell it?

Sure, but they didn't build GPUs there; that was for CPUs. Only recently has TSMC (and others) proven competitive in CPU manufacturing, mostly because of Intel's tribulations with their own 10nm process.

AMD's graphics division was ATI, and ATI had been fabbed at TSMC. AMD had been fabbing their own CPUs for decades but then split that off into GF.
 
AMD has momentum; I don't want them to rush something out just because.

I'm willing to wait for AMD's 7nm GPU...my 1080Ti is working just fine.

They have momentum in their CPU division. Vega was a letdown and AMD overhyped it. They also lost points for releasing the cards at such a high price and not selling cards at MSRP.

Say what you want about Nvidia, but at least they sold their Founders Edition cards at MSRP during the mining craze.

Hopefully AMD does better with whatever comes next. They did great with Ryzen.
 
This is exactly what happens when one company is allowed to grow more powerful while their competition lags behind. You see it across any industry.

Unfortunately it doesn't appear AMD will help anytime soon, and Intel will be a few gens behind before their place in dGPU is solidified. AIBs are also of no help, apparently.

So NV will continue to act like spoiled shitheads. It's a shame, because I actually think Huang is a good leader. It would appear he is allowing his legal/PR departments to run wild. But again, this is typical of a large, powerful company with little to no competition. You should read Boeing, Gulfstream, or Disney NDAs.

I am with what others have said: just not going to buy a 2XXX card this go-around. I want the fastest GPU I can get, just don't want to feel dirty about buying it. Between the lack of benches, the NDA, and the price increase (I mean, WTF), I just can't buy. I am fine paying for high-end cards, but bumping prices $200+ just fucking 'cause is nonsense.

Huang needs to get his shit together and tell his company to stop being dicks. It won't happen, but that is the only way to stop this crap since, as I said earlier, AMD and Intel will take too long. Eventually all NV's crap will catch up with them. Meanwhile we, the consumers, will either stay a gen behind and wait for competition, or ask NV nicely to buy us dinner first.
 
What do you mean starting?

Every AMD GPU made has been made at TSMC. If not all, damn near. I think Nvidia started using Samsung for a few parts, but TSMC has been the GPU fabrication center for decades.
You're (mostly) right; I was mistaken, and they had indeed been using TSMC for GPUs, except for Polaris and Vega.

Which perhaps is why Polaris made Raja look so bad... since it didn't manage to live up to maybe half of his hype. Perhaps the power efficiency and clocks would've been better with TSMC? *shrug* Just another thing we'll never know due to being a "what if" situation.
 
This is exactly what happens when one company is allowed to grow more powerful while their competition lags behind. You see it across any industry.

Unfortunately it doesn't appear AMD will help anytime soon, and Intel will be a few gens behind before their place in dGPU is solidified. AIBs are also of no help, apparently.

So NV will continue to act like spoiled shitheads. It's a shame, because I actually think Huang is a good leader. It would appear he is allowing his legal/PR departments to run wild. But again, this is typical of a large, powerful company with little to no competition. You should read Boeing, Gulfstream, or Disney NDAs.

I am with what others have said: just not going to buy a 2XXX card this go-around. I want the fastest GPU I can get, just don't want to feel dirty about buying it. Between the lack of benches, the NDA, and the price increase (I mean, WTF), I just can't buy. I am fine paying for high-end cards, but bumping prices $200+ just fucking 'cause is nonsense.

Huang needs to get his shit together and tell his company to stop being dicks. It won't happen, but that is the only way to stop this crap since, as I said earlier, AMD and Intel will take too long. Eventually all NV's crap will catch up with them. Meanwhile we, the consumers, will either stay a gen behind and wait for competition, or ask NV nicely to buy us dinner first.

Unfortunately, it doesn't really matter; if the performance increase is there, people will buy it.
 
Time to vote with your dollars, people.
Yep, Kyle refused to save me money, so I will have to continue my Patreon support. ;)

On a serious note, I can't overstate how much I appreciate someone taking a stand against this type of corporate behavior. Not sure why Nvidia needs to resort to these tactics when they obviously make a solid product. You win or lose on how good your product is... period.
 
Yep, Kyle refused to save me money, so I will have to continue my Patreon support. ;)

On a serious note, I can't overstate how much I appreciate someone taking a stand against this type of corporate behavior. Not sure why Nvidia needs to resort to these tactics when they obviously make a solid product. You win or lose on how good your product is... period.

It has more to do with the lack of free-market principles. When there are only one or two companies playing, there is no free market and therefore no pressure to be "better".
 
Unfortunately, it doesn't really matter; if the performance increase is there, people will buy it.

I totally agree! That is why it will take several gens for Intel to actually end up being competitive, and why AMD needs to get their shit together.

It is also why, if the performance is there, I can't say I won't look at the 20XX eventually. I will feel dirty about it, but I want more performance.
 
All of this happening with Nvidia is 50% AMD's fault, 30% the fault of government oversight committees and monopoly watchdogs, and 20% consumer fault.
The government under Obama bailed out 2 of the big 3 but couldn't give a shot in the arm to one of our own tech companies? Oh yeah, they did: the 2 or 3 failed solar companies. That's a prime example of why I hold the government responsible.

Consumers and the PayPal mentality are the last one. Let's be honest, some of us did really need to upgrade - looking at my fellow 980Ti/Radeon Fury folks here for 1440p and 4K. But most of the current buyers of the 2080/Ti that are selling out just wanted the latest and greatest. People would be shocked to find these cards pushing 1080p and not the higher resolutions intended. Steam's survey still shows 1080p as the main resolution. And another thing: are there really that many people with higher-paying jobs, say $50k+ a year single or over $100k married, buying this card for $800/1100 out of pocket? Nope. I live in a dorm, live off ramen noodles, have a $1500-2000 PayPal limit, HIT BUY DAMMIT! (no, not me)

What I am getting at is we are all responsible in some way for the way Ngreedia practices now. No saint here. I should have tried to bid on someone else's 1080Ti and not given Nvidia my money. But between using someone else's card, with no clue what they did to it, and having new out of box, well, new out of box won.

Point is, we really have ourselves to blame for the way they have become. WE DID NOT HAVE TO HIT THE BUY BUTTON! But we did. Now we have to deal with the devil.
 
I totally agree! That is why it will take several gens for Intel to actually end up being competitive, and why AMD needs to get their shit together.

It is also why, if the performance is there, I can't say I won't look at the 20XX eventually. I will feel dirty about it, but I want more performance.

Indeed, it will take a couple of generations for either AMD or Intel to catch up, and that is assuming Nvidia lets off the gas pedal. Nvidia is betting big on ray tracing and hoping it will be the next graphics revolution they can capitalize on. Whether it will succeed remains to be seen, but if it does, Nvidia will have a big advantage over AMD and Intel. If it fails, that is when AMD can catch up if they focus more on performance, with Intel being the wildcard out there.

Nothing wrong with wanting more performance, especially if Nvidia can provide it; you buy whatever suits your needs and should leave brand loyalty out of it.
 
All of this happening with Nvidia is 50% AMD's fault, 30% the fault of government oversight committees and monopoly watchdogs, and 20% consumer fault.
The government under Obama bailed out 2 of the big 3 but couldn't give a shot in the arm to one of our own tech companies? Oh yeah, they did: the 2 or 3 failed solar companies. That's a prime example of why I hold the government responsible.

Consumers and the PayPal mentality are the last one. Let's be honest, some of us did really need to upgrade - looking at my fellow 980Ti/Radeon Fury folks here for 1440p and 4K. But most of the current buyers of the 2080/Ti that are selling out just wanted the latest and greatest. People would be shocked to find these cards pushing 1080p and not the higher resolutions intended. Steam's survey still shows 1080p as the main resolution. And another thing: are there really that many people with higher-paying jobs, say $50k+ a year single or over $100k married, buying this card for $800/1100 out of pocket? Nope. I live in a dorm, live off ramen noodles, have a $1500-2000 PayPal limit, HIT BUY DAMMIT! (no, not me)

What I am getting at is we are all responsible in some way for the way Ngreedia practices now. No saint here. I should have tried to bid on someone else's 1080Ti and not given Nvidia my money. But between using someone else's card, with no clue what they did to it, and having new out of box, well, new out of box won.

Point is, we really have ourselves to blame for the way they have become. WE DID NOT HAVE TO HIT THE BUY BUTTON! But we did. Now we have to deal with the devil.

It is hard, since there are people out there who tried to support AMD but got burned by one letdown after another. I don't necessarily see people buying a 2080Ti as sellouts, since we don't exactly know what they are going to do with the purchase. Some may be intrigued by ray tracing, but others probably feel they can justify spending that much, especially if they can get the absolute best performance when the competition can't offer it, 'cause people who spend $1200 absolutely do not give a damn about $/performance.
 
'cause people who spend $1200 absolutely do not give a damn about $/performance.

Many do- but they have the $ and the performance still isn't enough, so that's what you get :D

[the 1080Ti at US$700, while expensive, was surprisingly competitive in terms of performance per dollar...]
 
Indeed, it will take a couple of generations for either AMD or Intel to catch up, and that is assuming Nvidia lets off the gas pedal. Nvidia is betting big on ray tracing and hoping it will be the next graphics revolution they can capitalize on. Whether it will succeed remains to be seen, but if it does, Nvidia will have a big advantage over AMD and Intel. If it fails, that is when AMD can catch up if they focus more on performance, with Intel being the wildcard out there.

Nothing wrong with wanting more performance, especially if Nvidia can provide it; you buy whatever suits your needs and should leave brand loyalty out of it.

Well, Nvidia is going for special features. It's still up in the air whether RTX is part of GameWorks or not. I think the RTX feature is based on DX12, so developers shouldn't have a tough time implementing it across all GPU vendors. But if AMD can come up with something similar and has enough horsepower, they might actually catch up. Overall, though, the way Nvidia is going feature-specific seems worrying. I think they are pushing developers to use their features instead of doing what Pascal did to the 980, meaning brute force, lol. I think AMD might leverage the built-in ray tracing of DX12. However, I do think RTX does the same with dedicated RT cores. The new Time Spy demo is based on the DX12 ray tracing feature; it's not specific to RTX.

It would suck, though, if enabling RTX in games is Nvidia-exclusive and ray tracing doesn't run on other hardware that might support it via DirectX. I highly doubt AMD is going to make something ray-tracing-exclusive for themselves in gaming. They will likely go for an open standard.
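For what it's worth, ray tracing is exposed to developers through the DXR layer of plain DX12, so in principle a game can query for support without caring about the vendor. A rough sketch of that check (assumes a Windows SDK recent enough to ship the DXR headers, 1809 or later; link against d3d12.lib):

#include <d3d12.h>
#include <cstdio>

// Sketch only: create a default D3D12 device and ask whether it reports any
// DXR (DirectX Raytracing) tier. The query itself is vendor-agnostic.
int main()
{
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 __uuidof(ID3D12Device),
                                 reinterpret_cast<void**>(&device))))
    {
        std::printf("No D3D12 device available.\n");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                              &opts5, sizeof(opts5))))
    {
        const bool hasDxr = opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
        std::printf("DXR supported: %s\n", hasDxr ? "yes" : "no");
    }

    device->Release();
    return 0;
}

Whether a given card reports a tier at all, and how fast it then runs, is up to each vendor's hardware and driver.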
 
I think what you don't realize here is that Nvidia and AMD work with game developers in different ways. A game developer may try to cut corners and write sloppy code to get things done sooner. Nvidia hand-holds and tweaks their drivers to maintain performance until the developer can get around to fixing their sloppy code. AMD just flat out says "you are doing this wrong" and is okay with subpar performance. This was obvious with PUBG back in the day, where Nvidia cards got the best performance by a large margin. Eventually the PUBG devs tightened up their code and AMD got some good performance boosts without doing anything to their own drivers.

Not necessarily. The thing is that nVidia GPUs are more dependent on driver tweaks, as they don't have a proper hardware scheduler the way AMD GPUs do. nVidia uses a code-morphing approach similar to the Tegra Denver cores: code morphing, software scheduling and dispatching to feed the cores. That offers the flexibility to overcome quite a lot of API deficits, like draw calls on DX11, which is why nVidia GPUs show nice gains when you implement multithreaded draw calls and the work is broken up across threads. AMD's approach is far less flexible, but more predictable in performance and less reliant on the driver to perform. That is why AMD's performance on DX11 is not as good as nVidia's when hitting a CPU or draw-call bottleneck, but amazing once you use a modern API that can tax those cores with compute and lots of parallel work queues.
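As an aside, "multithreaded draw calls" on DX11 usually means recording work on deferred contexts and replaying it on the immediate context. A rough sketch of that pattern, purely illustrative (pipeline state, resources and the render target are omitted; the draw is just a placeholder; link against d3d11.lib):

#include <d3d11.h>
#include <cstdio>

// Record draw commands on a deferred context (normally on a worker thread),
// then replay them on the immediate context from the render thread.
static ID3D11CommandList* RecordCommands(ID3D11Device* device)
{
    ID3D11DeviceContext* deferred = nullptr;
    if (FAILED(device->CreateDeferredContext(0, &deferred)))
        return nullptr;

    // ... pipeline state and resource bindings would go here ...
    deferred->Draw(36, 0); // hypothetical draw, e.g. one cube

    ID3D11CommandList* commands = nullptr;
    deferred->FinishCommandList(FALSE, &commands);
    deferred->Release();
    return commands;
}

int main()
{
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* immediate = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, &immediate)))
        return 1;

    if (ID3D11CommandList* commands = RecordCommands(device))
    {
        immediate->ExecuteCommandList(commands, TRUE); // replay recorded work
        commands->Release();
    }

    immediate->Release();
    device->Release();
    std::printf("Recorded and replayed a deferred command list.\n");
    return 0;
}

Whether the driver then actually spreads that submitted work across CPU cores is exactly the part the post above is arguing about; on DX12 and Vulkan the application records command lists on multiple threads explicitly.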
 
I highly doubt AMD is going to make something ray-tracing-exclusive for themselves in gaming. They will likely go for an open standard.
AMD has been working on ray tracing for a while and wants it to be an open standard...
Nvidia, likely not so much.
Most likely it will be CUDA-locked, like PhysX or Nvidia hairshit.
 
Well, Nvidia is going for special features. It's still up in the air whether RTX is part of GameWorks or not. I think the RTX feature is based on DX12, so developers shouldn't have a tough time implementing it across all GPU vendors.
Oh, get real, lmao. You'd have to be a fool (not saying you are, I know you're not!) to believe that nVidia would come out with an open standard, or rather, utilize one that is already there. That would show kindness and some semblance of caring, both of which go completely against nVidia's business model.

I mean, I get why they won't support FreeSync (Adaptive Sync, per the VESA standard's naming now); their G-Sync wouldn't be too enticing then. However, if they could come up with a DRM-equipped $50 adapter that the drivers see and that enables FreeSync support... that would actually be a good move by them, since it'd still get them $50 for free (I see this device costing like $1 to make) and would give people the opportunity to switch from AMD. I'm not for people switching... lol, I'm just for having more options. G-Sync is slightly superior to FreeSync given it IS a hardware interface. It has like 5ms lower input response (which, at the 25-30ms they're at, is imperceptible anyway), and the range is much larger as well. Whether or not FreeSync 2 has improved upon either of those points, I dunno... Again, it would give people a cheaper monitor to use with virtually the same benefits, or the ability to use nV but keep the monitor they already have.

But I digress, nV doesn't have that in them, so we won't see that happen. Just like I see it being a fat chance that their ray tracing is anything but their own proprietary tech. I can give you two really good reasons why it'll be this way...
#1 If it were 'open' then it could just as easily be implemented on their 10-series cards. Can't have GTX cards eating into RTX's main selling point.
#2 It would mean that AMD can use their shiny new tech, and what happens if by some miracle AMD's cards manage to execute nV's ray tracing with higher performance? They would completely lose their fucking minds over that!! lol ...Then they'd have to concoct some bullshit (which they are obviously happy to do and capable of) on why this technology is no longer 'open' and only functions on nVidia's RTX line.

We've seen it all before in various manifestations. Early days of SLi: you couldn't run CrossFire on those boards. Someone came up with a work-around via driver at some point, and I think nV countered to close that loophole. (Either that or they caved and allowed it; I don't recall specifically when SLi and CrossFire could happen regardless of chipset...)
Don't forget the PhysX fiasco, being able to run an ATi card for graphics and an nV card for PhysX computations... That's totally uncool, better patch that ability out of the drivers!! What would our mothers think if we allowed another company to benefit from our hardware PhysX, even IF it meant we might make a sale in the process?! :p
 
Well, Nvidia is going for special features. It's still up in the air whether RTX is part of GameWorks or not. I think the RTX feature is based on DX12, so developers shouldn't have a tough time implementing it across all GPU vendors. But if AMD can come up with something similar and has enough horsepower, they might actually catch up. Overall, though, the way Nvidia is going feature-specific seems worrying. I think they are pushing developers to use their features instead of doing what Pascal did to the 980, meaning brute force, lol. I think AMD might leverage the built-in ray tracing of DX12. However, I do think RTX does the same with dedicated RT cores. The new Time Spy demo is based on the DX12 ray tracing feature; it's not specific to RTX.

It would suck, though, if enabling RTX in games is Nvidia-exclusive and ray tracing doesn't run on other hardware that might support it via DirectX. I highly doubt AMD is going to make something ray-tracing-exclusive for themselves in gaming. They will likely go for an open standard.

I think overall ray tracing will just be part of DX12 DXR, with both Nvidia and AMD pushing out specific libraries for their respective hardware for game devs to use, pretty much no different from today. I am sure AMD can enable the same RTX-style effects but would probably take a very large performance hit in doing so.
 
AMD has been working on ray tracing for a while and wants it to be an open standard...
Nvidia, likely not so much.
Most likely it will be CUDA-locked, like PhysX or Nvidia hairshit.
Oh, get real, lmao. You'd have to be a fool (not saying you are, I know you're not!) to believe that nVidia would come out with an open standard, or rather, utilize one that is already there. That would show kindness and some semblance of caring, both of which go completely against nVidia's business model.

I mean, I get why they won't support FreeSync (Adaptive Sync, per the VESA standard's naming now); their G-Sync wouldn't be too enticing then. However, if they could come up with a DRM-equipped $50 adapter that the drivers see and that enables FreeSync support... that would actually be a good move by them, since it'd still get them $50 for free (I see this device costing like $1 to make) and would give people the opportunity to switch from AMD. I'm not for people switching... lol, I'm just for having more options. G-Sync is slightly superior to FreeSync given it IS a hardware interface. It has like 5ms lower input response (which, at the 25-30ms they're at, is imperceptible anyway), and the range is much larger as well. Whether or not FreeSync 2 has improved upon either of those points, I dunno... Again, it would give people a cheaper monitor to use with virtually the same benefits, or the ability to use nV but keep the monitor they already have.

But I digress, nV doesn't have that in them, so we won't see that happen. Just like I see it being a fat chance that their ray tracing is anything but their own proprietary tech. I can give you two really good reasons why it'll be this way...
#1 If it were 'open' then it could just as easily be implemented on their 10-series cards. Can't have GTX cards eating into RTX's main selling point.
#2 It would mean that AMD can use their shiny new tech, and what happens if by some miracle AMD's cards manage to execute nV's ray tracing with higher performance? They would completely lose their fucking minds over that!! lol ...Then they'd have to concoct some bullshit (which they are obviously happy to do and capable of) on why this technology is no longer 'open' and only functions on nVidia's RTX line.

We've seen it all before in various manifestations. Early days of SLi: you couldn't run CrossFire on those boards. Someone came up with a work-around via driver at some point, and I think nV countered to close that loophole. (Either that or they caved and allowed it; I don't recall specifically when SLi and CrossFire could happen regardless of chipset...)
Don't forget the PhysX fiasco, being able to run an ATi card for graphics and an nV card for PhysX computations... That's totally uncool, better patch that ability out of the drivers!! What would our mothers think if we allowed another company to benefit from our hardware PhysX, even IF it meant we might make a sale in the process?! :p

Of course nvidia wouldn't go for an open standard. They've invested millions developing the middleware; they're not going to just give it away.

Now, on your 2 points:

#1 Turing has dedicated ray tracing and AI (tensor core) hardware which Pascal lacks. Even if ray tracing worked on Pascal, performance would be abysmal.
#2 I wouldn't count on AMD performing faster in RT in the near future, given that AMD lacks specialized ray tracing hardware. I think AMD's ray tracing implementation uses async compute, but don't quote me on that one.
 
Of course nvidia wouldn't go for an open standard. They've invested millions developing the middleware; they're not going to just give it away.

Now, on your 2 points:

#1 Turing has dedicated ray tracing and AI (tensor core) hardware which Pascal lacks. Even if ray tracing worked on Pascal, performance would be abysmal.
#2 I wouldn't count on AMD performing faster in RT in the near future, given that AMD lacks specialized ray tracing hardware. I think AMD's ray tracing implementation uses async compute, but don't quote me on that one.

Who says you need special hardware to do it? Nvidia may just use part of their chip that way. There's no reason why AMD can't find a way to use their massive compute power for it as well, or add support for it in a new chip. Since we're not talking full-screen ray tracing anyway, it's all just a bit of a gimmick that makes everything look a bit too shiny for me right now.
 
This level of control mania sounds like the new product is going to suck.

It might not suck, and Nvidia might just be crazy, but if you have the market-leading product, what the fuck are you scared of?

You are afraid of a market in which there is little to no growth.

For GPUs, right now there's gaming, AI, and mining. Where are the sales growth areas? Who has the best value per dollar there? NV has gaming locked down right now, growth is not massive, and they have nowhere to go but down while existing in a market where your stock value is based on coke-snorting gambling fiends and the perception of your ability to grow.

They are charging $1200 for the top-of-the-line launch card. The last one was $700. That's a $500 increase on a card that can't possibly live up to their marketing hype.

In the best possible case, NV is saying it's 2x as fast as a 1080, so it is priced 2x as much as a 1080, and we are being assholes because we can.

Reality would suggest that's not the case.

Best case I can think of is if the card presents technology that can get them in the next gen consoles, if it isn't too late for that. If it does that, and they can find a niche market for it until version next, they have a path to growth. At that point they would be defining the feature set for cards via the consoles, and have a head start on the mass market version of a card with real time raytracing for actual use in actual games.
 
Who says you need special hardware to do it? Nvidia may just use part of their chip that way. There's no reason why AMD can't find a way to use their massive compute power for it as well, or add support for it in a new chip.

Potentially, but why have 'Tensor' cores or any specialized cores at all? Why not just have 'compute' cores a la Xeon Phi? Wouldn't that be just as fast?

I believe that the dedication of hardware is a necessity. This means that while AMD most certainly can leverage their current shipping hardware to do ray tracing, or any other compute-intensive task, they will also most certainly be slower at ray tracing until they start dedicating hardware. And if the Nvidia hardware is being panned for being too slow, well...

Since we're not talking full-screen ray tracing anyway, it's all just a bit of a gimmick that makes everything look a bit too shiny for me right now.

I can't say that I've been extremely wowed by the demos shown thus far, however, I can say that I've been extremely impressed with my reaction to how 'lifelike' light and shadow have appeared at times. In combination with a decent HDR implementation, itself currently a bit of a pipe-dream for PC gaming, I feel that ray tracing will present a significantly better experience. Add VR to that, and maybe a Ford Focus' MSRP in GPUs...

:D
 
Who says you need special hardware to do it? Nvidia may just use part of their chip that way. There's no reason why AMD can't find a way to use their massive compute power for it as well, or add support for it in a new chip. Since we're not talking full-screen ray tracing anyway, it's all just a bit of a gimmick that makes everything look a bit too shiny for me right now.

I'm not saying you need specialized HW to do ray tracing. I'm saying you probably need it to do it fast. Even then it's not "real" ray tracing, but a mixed-mode renderer, but still.
Sure, AMD could do an implementation that's faster than nvidia's, but history is not on their side. I mean, they can't even make a card that can compete with the 1080Ti (and can barely keep up with a GTX1080), but suddenly they will make a card that runs faster than a 2080Ti in ray tracing?
 
but suddenly they will make a card that runs faster than a 2080Ti in ray tracing?

Once we get to dedicated ray tracing cards - as in, ditch raster graphics altogether - then this is absolutely possible, and even probable. Of course, I'd expect Intel to be ahead of both, assuming they sort out their fab progression issues.

I see pure ray tracing as far simpler both to accelerate in hardware and to code for. AMD's biggest problem is that they have to be good at so very many things that they also have to predict, and they're simply not as nimble as Nvidia.
 
I'm not saying you need specialized HW to do ray tracing. I'm saying you probably need it to do it fast. Even then it's not "real" ray tracing, but a mixed-mode renderer, but still.
Sure, AMD could do an implementation that's faster than nvidia's, but history is not on their side. I mean, they can't even make a card that can compete with the 1080Ti (and can barely keep up with a GTX1080), but suddenly they will make a card that runs faster than a 2080Ti in ray tracing?

We don't know what is coming from AMD on the GPU front. We know they are spending their current resources in that area, so who knows what we will get, but it will also be on 7nm, which will likely help them quite a bit. Many thought Ryzen had zero chance of being competitive with Intel, and well, we've seen how that turned out. Plenty of times either company has dropped the ball and come back stronger. Also, 98% of the market is not considering a Ti, so it's not a huge loss not to compete there, and Vega has been selling quite well. Keep in mind most of the market is at 1070 power levels or below, and the 2000 series has seriously priced itself outside the majority of the market. I have run both companies' cards over the years, and it all comes down to value versus performance at the time. We'll see how it all shakes out for Nvidia in a few weeks.
 
First, what if all this hooting and hollering was nV's intention to keep the media and public from discussing the RTX's performance (whether or not it is there or is lacking)? In other words: a diversion tactic...
Implausible as hell. RTX performance is just too big a focus for the tech press to be distracted by something as incredibly petty as this by comparison. Seems all you're going by is this thread's discussion and not the real world. :rolleyes:
 
They [Nvidia] are incredibly sneaky.....and this will become obvious to all looking back in a few months from now.

Has anyone noticed that the reason they are holding the release of the 2070 until November is to deplete stock of 10-series cards, which they have a lot of and have forced AIBs to take and sell first if they want 20-series stock?

I am willing to bet that they are purposely creating this uncertainty in the market regarding the 20-series cards, including putting FRAPS up at their OWN event booth showing shitty FPS in the new Tomb Raider title. Doesn't this seem like a really bad idea at your own launch? This was the start of their beautiful 'trick' that is now in full motion and has caught so many hook, line and sinker.
Doesn't this seem completely counter-intuitive? Why would they purposely sabotage their own brand new GPU by showing a title playing at 30-40 FPS when the event was for gamers who would never play at those FPS? And notice how, as time has passed, we get leaks with those FPS starting to creep up?

This very uncertainty around the 20-series cards is causing the price of their 10-series cards to start creeping up again - Amazon 1080Ti cards have gone up between $50 and $100 in the last few days, now once again ABOVE launch MSRP. So they are making more money on last-gen tech than they did when it launched close to 1.5-2 years ago!

Then consider the unusual nature of all these NDAs and the tightly orchestrated release of 20-series benchmark info.
Because of the uncertainty and negativity they have created, you wouldn't be criticized for thinking that maybe these new cards are not that good. But I think if you hold this mindset, you will be wrong.

It's to sell as many of the 10-series cards as possible BEFORE the 20-series launch, through all the uncertainty and negativity that's being proliferated predominantly by people who don't really know what they are talking about/paid too much for a 10-series card during the crypto mining frenzy - a frenzy that I also believe was engineered by Nvidia controlling supply to cause prices to inflate to more than the cost of the 2080Ti!

When the real benchmarks come out for the 2080/2080ti they are going to be quite a bit higher than what all these rumors currently might suggest, and that's when the trick will have been played.
Anyone starting to see a trend here?

The fact that the 20-series cards are more expensive than the 10-series will be enough in most cases for people who panic-purchased 1080/1080Ti cards to keep them, even with the benchmarks of the 20 series being higher than anyone expected. Because $200 is a lot for a lot of people, and it's harder to part with your shiny upgrade when it's still showing great performance in the titles you currently play. So many reasons for most to justify just sticking with their newly purchased 10-series card.

This will have been a masterful marketing manipulation by Nvidia - love them or hate them for it, it's going to be next level.

And if I am wrong, then I'll be wrong.

But humor me for a second and forget about video cards.
If you don't already know, Nvidia's stock is up way over 1000% in less than a decade and they continue to control the GPU space.
Do you really think they would release a brand new card that's barely more powerful than the last gen, despite millions in R&D, manufacturing, etc.?
Do you think they would knowingly release a product that is a flop and will result in thousands of returns and a failure that will hit their stock price and anger their shareholders?

C'mon this is Nvidia we are talking about here. They are at the top for a reason, and I'm certain we will all witness just how well they played us for one purpose only.
The only real purpose that matters.
Money.
 
Implausible as hell. RTX performance is just too big a focus for the tech press to be distracted by something as incredibly petty as this by comparison. Seems all you're going by is this thread's discussion and not the real world. :rolleyes:

Actually, 'Formula 350' was thinking out of the box a little more than most. Just a little off the mark.
The intention was to keep the media and gaming rumors focused on the questionable RTX performance, thus creating the rumor that the 20 series is an overall poor-performing card/barely an increase over the 10 series. But he was correct in his suspicion that a purposeful diversion was and is going on. And it's working out beautifully so far.

But what would I know? I only spent a very large chunk of my professional career (the "real world" you were referring to) in sales and marketing for a tech company that could swallow Nvidia in one bite....
I'm probably just out of touch, and all the gaming kids' rumors are the real story and Nvidia screwed up big time with this card ;)
 
Potentially, but why have 'Tensor' cores or any specialized cores at all?
Which, on that note, are the Tensor cores being utilized on desktop machines in the way they were initially envisioned? Or did nV just treat them as general-compute cores, so on the desktop they are kinda being wasted?
In other words, is the reason that some here have felt the ray tracing done by the 2080 is underperforming entirely due to the fact that these Tensor cores aren't being utilized as intended? I.e., like running OpenGL on your CPU. In this instance, though, they are there, and may as well be utilized to do something interesting instead of being lasered off.

Implausible as hell. RTX performance is just too big a focus for the tech press to be distracted by something as incredibly petty as this by comparison. Seems all you're going by is this thread's discussion and not the real world. :rolleyes:
It probably is. I'm not trying to say it's a sure bet... I just wouldn't put it past nV is all. In reality I should've tagged that with [tinfoil hat] like I had before. As Globespy said though, I was merely thinking outside the box for no other reason than the joy of speculation. :p Just like the reply to the quote above this, I have no clue how off-base I am with that, but as I said it's still fun to speculate!
 
Which, on that note, are the Tensor cores being utilized on desktop machines in the way they were initially envisioned? Or did nV just treat them as general-compute cores, so on the desktop they are kinda being wasted?
In other words, is the reason that some here have felt the ray tracing done by the 2080 is underperforming entirely due to the fact that these Tensor cores aren't being utilized as intended? I.e., like running OpenGL on your CPU. In this instance, though, they are there, and may as well be utilized to do something interesting instead of being lasered off.


It probably is. I'm not trying to say it's a sure bet... I just wouldn't put it past nV is all. In reality I should've tagged that with [tinfoil hat] like I had before. As Globespy said though, I was merely thinking outside the box for no other reason than the joy of speculation. :p Just like the reply to the quote above this, I have no clue how off-base I am with that, but as I said it's still fun to speculate!

RT cores are their own thing AFAIK. They've been labeling them separately.

nVidia is using tensor cores for DLSS, which is exactly how deep learning works for practically all applications, but using it for games is pretty genius imo.
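For anyone wondering what a "tensor core" actually accelerates: the primitive is a small fused matrix multiply-accumulate, roughly D = A*B + C on tiny half-precision tiles. A purely conceptual CPU-side illustration of that single operation follows; this is not NVIDIA code and not how DLSS itself is written, just the arithmetic the hardware is specialized for.

#include <array>
#include <cstdio>

// Conceptual illustration only: the kind of 4x4 fused multiply-accumulate
// (D = A*B + C) that a tensor core performs as one hardware operation.
// Real tensor cores take half-precision inputs with wider accumulation.
using Mat4 = std::array<std::array<float, 4>, 4>;

Mat4 fusedMultiplyAccumulate(const Mat4& a, const Mat4& b, const Mat4& c)
{
    Mat4 d{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
        {
            float acc = c[i][j];
            for (int k = 0; k < 4; ++k)
                acc += a[i][k] * b[k][j];
            d[i][j] = acc;
        }
    return d;
}

int main()
{
    Mat4 a{}, b{}, c{};
    for (int i = 0; i < 4; ++i) { a[i][i] = 1.0f; b[i][i] = 2.0f; c[i][i] = 0.5f; }
    const Mat4 d = fusedMultiplyAccumulate(a, b, c);
    std::printf("d[0][0] = %.1f\n", d[0][0]); // 1*2 + 0.5 = 2.5
    return 0;
}

A network like the one behind DLSS is, underneath, enormous stacks of exactly this operation, which is why dedicating silicon to it pays off.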

 
Which, on that note, are the Tensor cores being utilized on desktop machines in the way they were initially envisioned? Or did nV just treat them as general-compute cores, so on the desktop they are kinda being wasted?
In other words, is the reason that some here have felt the ray tracing done by the 2080 is underperforming entirely due to the fact that these Tensor cores aren't being utilized as intended? I.e., like running OpenGL on your CPU. In this instance, though, they are there, and may as well be utilized to do something interesting instead of being lasered off.


It probably is. I'm not trying to say it's a sure bet... I just wouldn't put it past nV is all. In reality I should've tagged that with [tinfoil hat] like I had before. As Globespy said though, I was merely thinking outside the box for no other reason than the joy of speculation. :p Just like the reply to the quote above this, I have no clue how off-base I am with that, but as I said it's still fun to speculate!

The 2080/2080Ti will be amazing GPUs.
Nvidia just has to count all those dollar bills from panic-purchased 10-series cards before showing why people really should have waited for the benchmarks and updated drivers.
But they know most people are fickle and reactive.

Plus, most have a very short memory; once the 'sheep' all start jabbering about just how amazing the 20 series is, all that negativity vanishes like it never happened.
Good marketers are part psychologist, and humans are indeed quirky things; most can be turned on a dime, as if their previous opinion didn't matter, when all they see are pretty pictures at high frame rates.

I'm not cancelling my pre-order for my 2080ti - whether this gen of RT is a huge success or not, I'm buying for higher FPS and especially VR support which I use a lot.
 