AdoredTV: NVIDIA’s Performance Improvement Has Dropped from 70% to 30%

Stating that "NIVIDA hates innovation"

Names technologies innovated by NVIDA:
CUDA, G-Sync, GameWorks.


Not sure if you are trolling?


I would agree that Nvidia doesn't do much innovating. Gameworks and G-sync are closed / proprietary features firstly meant to lock in the Nvidia ecosystem, and not to innovate technology. And G-sync is a closed / proprietary variant of adaptive-sync - which was pioneered by AMD, and so isn't an Nvidia innovation.


Here are ATI / AMD innovations:

GDDR
HBM
APUs
GPU tessellation
low-level GPU technology (Mantle -> Vulkan / DX12)
multi-core x86 CPUs
on-die memory controllers
DisplayPort
64-bit x86 CPUs
async-computation in GPUs


And AMD makes their innovations free to use publicly. AMD innovates, Nvidia just tries to secure their market share.
 
Yes, the rate of performance increase declined post-Fermi due to a change in design philosophy at NVIDIA. They went from power at all cost to power efficiency. Not at all shocking that the performance deltas between generations declined as they did. We were also stuck on the same 28nm fab process for 5 years. Now that they have the efficiency down and the node has finally shrunk we are seeing good gains again, with performance improvements across the product stack averaging around 70% with Pascal over Maxwell.
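As a rough sketch of how an "average uplift across the stack" figure like that ~70% could be tallied, here's a minimal example in Python — the card pairings and FPS numbers are made up purely for illustration, not real benchmark results:

[CODE]
from math import prod

# Hypothetical average-FPS numbers per SKU pair (illustrative placeholders, not real data).
# Each entry: (Maxwell-era card, Pascal-era successor, maxwell_fps, pascal_fps)
pairs = [
    ("GTX 960",    "GTX 1060",    45.0,  78.0),
    ("GTX 970",    "GTX 1070",    60.0, 101.0),
    ("GTX 980",    "GTX 1080",    70.0, 120.0),
    ("GTX 980 Ti", "GTX 1080 Ti", 85.0, 148.0),
]

# Per-pair speedup ratios, then a geometric mean so no single SKU dominates the average.
ratios = [new / old for _, _, old, new in pairs]
geo_mean = prod(ratios) ** (1 / len(ratios))

for (old_name, new_name, _, _), r in zip(pairs, ratios):
    print(f"{old_name} -> {new_name}: {(r - 1) * 100:.0f}% faster")
print(f"Average uplift across the stack: {(geo_mean - 1) * 100:.0f}%")
[/CODE]

A geometric mean of the per-SKU ratios is used so one outlier card doesn't skew the stack-wide figure.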

Oh, really? I compiled the data on midrange NVIDIA cards going back to GeForce2.

[attached chart: compiled MSRPs of midrange NVIDIA cards by generation]

I couldn't find comparative performance data for the 7600 GT and prior, which is why they're empty, but the slope of the trend line shows an average increase of $3.79 per generation. Looks like there are an awful lot of $199 occurrences in there, though :whistle:.

And here are those MSRPs adjusted to 2016 USD.

Not that it means anything.

[attached chart: the same MSRPs adjusted to 2016 USD]
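Since the attachments don't survive here, this is roughly the kind of arithmetic behind the "slope of the trend line" and the 2016-USD adjustment — the launch MSRPs and CPI-style multipliers below are placeholders, not the figures from the charts:

[CODE]
# Fit a straight line to launch MSRPs by generation and restate each MSRP in 2016 dollars.
# The MSRPs, launch years and inflation multipliers are placeholders for illustration only.

cards = [
    # (generation index, launch year, nominal MSRP in USD at launch)
    (0, 2006, 199),
    (1, 2007, 199),
    (2, 2008, 229),
    (3, 2010, 229),
    (4, 2012, 229),
    (5, 2014, 199),
    (6, 2016, 249),
]

# Hypothetical CPI-style multipliers converting each launch year's dollars to 2016 dollars.
to_2016_usd = {2006: 1.19, 2007: 1.16, 2008: 1.11, 2010: 1.09, 2012: 1.04, 2014: 1.01, 2016: 1.00}

def slope(points):
    """Ordinary least-squares slope of y on x, i.e. dollars per generation."""
    n = len(points)
    mean_x = sum(x for x, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in points)
    den = sum((x - mean_x) ** 2 for x, _ in points)
    return num / den

nominal = [(gen, price) for gen, _, price in cards]
real = [(gen, price * to_2016_usd[year]) for gen, year, price in cards]

print(f"Nominal trend: ${slope(nominal):+.2f} per generation")
print(f"In 2016 USD:   ${slope(real):+.2f} per generation")
[/CODE]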
 
I would agree that Nvidia doesn't do much innovating. Gameworks and G-sync are closed / proprietary features firstly meant to lock in the Nvidia ecosystem, and not to innovate technology. And G-sync is a closed / proprietary variant of adaptive-sync - which was pioneered by AMD, and so isn't an Nvidia innovation.


Here are ATI / AMD innovations:

GDDR
HBM
APUs
GPU tessellation
low-level GPU technology (Mantle -> Vulkan / DX12)
multi-core x86 CPUs
on-die memory controllers
DisplayPort
64-bit x86 CPUs
async-computation in GPUs


And AMD makes their innovations free to use publicly. AMD innovates, Nvidia just tries to secure their market share.

Let me ask this then.

If the roles were switched, in that AMD is the dominant force in CPU and GPU while Intel and nVidia are left to fight over the scraps, would you have expected AMD to still have come out with those technologies, free of charge?

I for one, do not believe that, not in a million years.

AMD were already trailing significantly; they needed some PR in order to win some market share back from the giants. No company EVER researches tech and gives it out for free unless it believes that's in its best interest. Because AMD is trailing, it needs to make its new technology easier to access than nVidia's so developers and manufacturers have a better chance of adopting AMD's tech; otherwise a lot fewer people would try to use it.

I fully believe that, if the roles were switched and AMD were dominant in both the GPU and CPU segments, with nVidia and Intel picking up the scraps, a lot of nVidia's and Intel's in-house developed tech would be open to use too, and all of AMD's tech would be proprietary.

Same thing happened with iOS and Android when Android first arrived on the scene.
 
I would agree that Nvidia doesn't do much innovating. Gameworks and G-sync are closed / proprietary features firstly meant to lock in the Nvidia ecosystem, and not to innovate technology. And G-sync is a closed / proprietary variant of adaptive-sync - which was pioneered by AMD, and so isn't an Nvidia innovation.


Here are ATI / AMD innovations:

GDDR
HBM
APUs
GPU tessellation
low-level GPU technology (Mantle -> Vulkan / DX12)
multi-core x86 CPUs
on-die memory controllers
DisplayPort
64-bit x86 CPUs
async-computation in GPUs


And AMD makes their innovations free to use publicly. AMD innovates, Nvidia just tries to secure their market share.
Well, they were pioneers in antialiasing. They were the first to have MSAA on their cards with the 9700, they had Adaptive AA out before Nvidia did TrSSAA, and they also had MLAA out before Nvidia developed FXAA. I'm not sure any of them were invented by them, but I think they were the first to implement them at the hardware level in video cards.

There's probably a point about the proprietary stuff though. They had Truform back in 2001, ten years before tessellation emerged in DX11, and as far as I know, that was never an open standard. That was sort of equivalent to the Gameworks bullshit Nvidia does nowadays.
 
I would agree that Nvidia doesn't do much innovating. Gameworks and G-sync are closed / proprietary features firstly meant to lock in the Nvidia ecosystem, and not to innovate technology. And G-sync is a closed / proprietary variant of adaptive-sync - which was pioneered by AMD, and so isn't an Nvidia innovation.


Here are ATI / AMD innovations:

GDDR
HBM
APUs
GPU tessellation
low-level GPU technology (Mantle -> Vulkan / DX12)
multi-core x86 CPUs
on-die memory controllers
DisplayPort
64-bit x86 CPUs
async-computation in GPUs


And AMD makes their innovations free to use publicly. AMD innovates, Nvidia just tries to secure their market share.

I need to stop being so easily baited by ridiculousness. Your entire post is one of those posts that I can't respond properly to without violating H Forum rules.

I can only say you're wrong, and fundamentally so.

Every single bit of your list is incorrect.

Wow. OMG.

I think you gave me an aneurysm.
 
I generally upgrade when these are true.

1. My current card is not fast enough
2. New cards that are not astronomically priced are at least double the speed of my current card

OR when my current card dies.

The pricing of current cards alongside the fact that everything I play still gets at least 60fps with all the detail turned to max leaves me with 0 desire to buy a new video card now.

Maybe next year or if I get a really, really, really good deal on a card before then. With the lame mining craze sucking up all the cards I don't see that happening unless the mining craze crashes.

You are my hero. This is what I say.. I am going to do. Then I buy the Titan every time because it's a tiny bit faster. Then I usually get the TI because it's a tiny bit faster. I always say.. I am going to just do Ti to Ti and not between. That is the proper thing for me to do then I will be getting meaningful performance.
 
They do make those cards; they're called the Ti's. But they milk the non-Ti's since there's no competition. It's Business 101; everyone would do this if they were in Nvidia's position.

They could be Intel, who have really split up releases to the point where instead of 10 percent gains it's 3 and such. They string it out in a way that makes Nvidia envious as hell! haha
 
I would agree that Nvidia doesn't do much innovating. Gameworks and G-sync are closed / proprietary features firstly meant to lock in the Nvidia ecosystem, and not to innovate technology. And G-sync is a closed / proprietary variant of adaptive-sync - which was pioneered by AMD, and so isn't an Nvidia innovation.


Here are ATI / AMD innovations:

GDDR
HBM
APUs
GPU tessellation
low-level GPU technology (Mantle -> Vulkan / DX12)
multi-core x86 CPUs
on-die memory controllers
DisplayPort
64-bit x86 CPUs
async-computation in GPUs


And AMD makes their innovations free to use publicly. AMD innovates, Nvidia just tries to secure their market share.

You need to google the word "Innovation".
The meaning of the word has eluded you.
Stop inventing new meanings for words to push FUD...it is dishonest.

#WaitForADictionary...
 
I need to stop being so easily baited by ridiculousness. Your entire post is one of those posts that I can't respond properly to without violating H Forum rules.

I can only say you're wrong, and fundamentally so.

Every single bit of your list is incorrect.

Wow. OMG.

I think you gave me an aneurysm.
It did seem like an awful lot of claims, and some of them did strike me as wrong. However, you say every single bit of his list is incorrect. To the best of my knowledge, he's right about the first GPU with hardware-supported tessellation. That's exactly what Truform was back in 2001 by ATI. Unless you have a source for an earlier video card doing the same, I think you're wrong on that one.


You need to google the word "Innovation".
The meaning of the word has eluded you.
Stop inventing new meanings for words to push FUD...it is dishonest.

#WaitForADictionary...
Ah, he's just using it the way Apple does.
 
I would agree that Nvidia doesn't do much innovating. Gameworks and G-sync are closed / proprietary features firstly meant to lock in the Nvidia ecosystem, and not to innovate technology. And G-sync is a closed / proprietary variant of adaptive-sync - which was pioneered by AMD, and so isn't an Nvidia innovation.


Here are ATI / AMD innovations:

GDDR
HBM
APUs
GPU tessellation
low-level GPU technology (Mantle -> Vulkan / DX12)
multi-core x86 CPUs
on-die memory controllers
DisplayPort
64-bit x86 CPUs
async-computation in GPUs


And AMD makes their innovations free to use publicly. AMD innovates, Nvidia just tries to secure their market share.

Just to mention a few.
On-die memory controllers aren't AMD, not even remotely close. Try looking up something like the 386SL and 486SL, not to mention countless chips before them.
APUs? If you mean the name, yes; the concept, no. They were ages behind there.
GDDR and HBM, no. That's a myth. AMD, for example, only joined HBM two years after its development started.
Low-level APIs, certainly not. They are decades behind there, like the rest of the PC.
DisplayPort, no.

Companies only make something free when it's failed and/or they are trying to push all the cost onto someone else.
 
I think you're wrong on that. Adaptive sync was a response from AMD after Gsync had been announced the way I remember it.

Exactly. AMD was a couple of years behind there before they got it copied into a product.
 
I need to stop being so easily baited by ridiculousness. Your entire post is one of those posts that I can't respond properly to without violating H Forum rules.

I can only say you're wrong, and fundamentally so.

Every single bit of your list is incorrect.

Wow. OMG.

I think you gave me an aneurysm.


If you even think that half of the list I gave is wrong then you're less informed on the subject than I am - which makes the emphasis you put into your post very bizarre, as well as hypocritical.


https://www.amd.com/en-us/innovations/software-technologies/hbm

A legacy of enabling industry-wide innovation
AMD has a long history of pioneering innovations, spawning industry standards and spurring the entire industry to push the boundaries of what is possible. HBM is just the most recent in an impressive list that spans CPUs, graphics, servers, and more:

  • X86-64: The 64-bit version of the x86 instruction set found in all modern x86 CPUs
  • Wake-On-LAN: Co-invented w/ HP, this revolutionary computer networking standard enables remote computer wake-up
  • GDDR and now HBM: Pervasive industry standards for high-performance memory, invented by AMD with contributions from the Joint Electron Device Engineering Council (JEDEC) and industry partners
  • The first multi-core x86 processor: AMD’s Opteron™ 100 Series CPU was famously the first to bring multi-core computing to the PC space
  • DisplayPort™ Adaptive-Sync: Implemented by AMD as FreeSync™ technology, this VESA-ratified AMD proposal eliminates stutter for smoother gameplay
  • First on-die memory controller for x86: AMD’s “Hammer” architecture was first to integrate a memory controller onto consumer CPUs for peak performance
  • Mantle: The first low-overhead PC graphics API, sparking a revolution that now spans the entire PC graphics industry
  • First on-die GPU: AMD’s Accelerated Processing Units (APUs) were the first to explore the integration of a GPU with the CPU, eliminating the need for a bulky external GPU for compact or inexpensive PCs


HBM:
"The development of High Bandwidth Memory began at AMD in 2008 to solve the problem of ever increasing power usage and form factor of computer memory." - https://en.wikipedia.org/wiki/High_Bandwidth_Memory#History

APUs:
AMD started working on the concept in 2006 - https://en.wikipedia.org/wiki/AMD_Accelerated_Processing_Unit#History

GPU tessellation:
https://en.wikipedia.org/wiki/ATI_TruForm

low-level GPU technology (Mantle -> Vulkan / DX12):
Self-explanatory. Vulkan and DX12 both sparked from AMD's prototype Mantle technology that debuted in Battlefield 4.

multi-core x86 CPUs:
"Demonstrated the world’s first native dual- and quad-core x86 server processors (2004)" - http://www.amd.com/en-us/innovations/2000-2009
"AMD has won another race against Intel by demonstrating the first dual-core processor. AMD expects the dual-core AMD Opteron processor for servers and workstations to offer the best performance per watt in the market when it will be available in mid-2005." - http://www.tomshardware.com/news/amd-demos-x86-dual,188.html
"In April 2005, Intel's biggest rival, AMD, had x86 dual-core microprocessors intended for workstations and servers on the market, and was poised to launch a comparable product intended for desktop computers. As a response, Intel developed Smithfield, the first x86 dual-core microprocessor intended for desktop computers, beating AMD's Athlon 64 X2 by a few weeks" - https://en.wikipedia.org/wiki/Pentium_D#Smithfield

64-bit x86 CPUs:
I wonder who you think created the first 64-bit x86 CPU - https://en.wikipedia.org/wiki/X86-64#History
"A DECADE AGO AMD released the first Opteron processor and with it the first 64-bit x86 processor. AMD's lavish New York launch for the Opteron processor was far more than a product launch, it was AMD showing it could reproduce the success of the Athlon K7 chip and take the fight to Intel by developing ground breaking new features and not just one-upping its rival on some benchmarks. The firm's Opteron processor brought 64-bit computing to the commodity x86 chip market, along with an on-die memory controller and the Hypertransport bus all in one product." - https://www.theinquirer.net/inquirer/feature/2262881/amd-brought-64bit-to-x86-10-years-ago-today

GDDR3:
"Despite being designed by ATI, the first card to use the technology was nVidia's GeForce FX 5700 Ultra in early 2004, where it replaced the GDDR2 chips used up to that time" - https://en.wikipedia.org/wiki/GDDR3_SDRAM


Sorry for being more right than you could handle ;)
 


Talk about embarrassing: you people were / are for the large part wrong, and you're actually still trying to play it off - hilariously, with a single example.

I gave a list of items I believed AMD to have contributed significant innovation in, but I didn't present myself arrogantly about it or as though it was immaculately stated. You, on the other hand, are doing that, and that's why you've f***ed up.
 
I'm not 100% on this, but I think Intel might have beat AMD to market:

https://www.bit-tech.net/news/tech/cpus/intel-details-first-cpu-with-integrated-gpu/1/

That article is calling it the first cpu with integrated graphics, I'm not sure when their development started though.

Both Clarkdale and Sandy Bridge launched before the first APU. But countless SoCs with integrated graphics were produced long before, so it's a moot point. The only innovation AMD made was to call it an APU.
 
Copied the concept is perhaps a better wording ;)
Yeah, well the point is that's a stretch to call innovation. Doing something desirable that somebody else brought to market first isn't really innovation, just refinement. Again, as an example, MLAA was the first shader-based AA method brought to market for videocards by AMD. Nvidia responded with FXAA. While I would argue FXAA was better, that doesn't make Nvidia the innovators in that particular case. I think innovate doesn't mean you have to have invented a concept, but if you were the first to bring it to market where it simply wasn't present before and established a precedent for other companies to follow or change, that certainly counts.
 
Yeah, well the point is that's a stretch to call innovation. Doing something desirable that somebody else brought to market first isn't really innovation, just refinement. Again, as an example, MLAA was the first shader-based AA method brought to market for videocards by AMD. Nvidia responded with FXAA. While I would argue FXAA was better, that doesn't make Nvidia the innovators in that particular case. I think innovate doesn't mean you have to have invented a concept, but if you were the first to bring it to market where it simply wasn't present before and established a precedent for other companies to follow or change, that certainly counts.

Making a free, open-source software-based version of adaptive-sync technology that works natively with DisplayPort, when the alternative is proprietary, hardware-based, and expensive, is certainly an innovation.
 
Making a free, open-source software-based version of adaptive-sync technology that works natively with DisplayPort, when the alternative is proprietary, hardware-based, and expensive, is certainly an innovation.

It's not a software version.

It has nothing to do with innovation, just plain business. When you have something inferior and late, you can try to add it as value while putting the cost on someone else. In this case they were lucky that some monitor manufacturers were willing to pay the bill so they could add it as a value feature.
 
Making a free, open-source software-based version of adaptive-sync technology that works natively with DisplayPort, when the alternative is proprietary, hardware-based, and expensive, is certainly an innovation.
If you can show me any evidence that AMD was planning to do adaptive sync BEFORE Gsync was announced, then sure, I would call it innovation. Otherwise, it's responding to your competitor and doing something similar to what they're doing, except in a different way. I don't call that innovation.

Another way to look at it: If Nvidia never decided to make Gsync, Freesync probably wouldn't exist. That should tell you all you need to know about how innovative it is right there. I mean christ, the technology behind the concept had been around long enough even prior to Gsync. For something to be innovative, you have to do something new. I'm not trying to pick on Nvidia or AMD, again, AMD has a few examples of pioneering graphics tech that Nvidia followed after the fact. Both companies have had different innovations over time.
 
If you can show me any evidence that AMD was planning to do adaptive sync BEFORE Gsync was announced, then sure, I would call it innovation. Otherwise, it's responding to your competitor and doing something similar to what they're doing, except in a different way. I don't call that innovation.

That's precisely how the word Innovate's meaning differs from the word Invent's meaning. Yes, a lot of the time people innovate in response to what their competitor is doing.


http://www.dictionary.com/browse/innovate

to introduce something new; make changes in anything established.


http://www.dictionary.com/browse/innovation

the act of innovating; introduction of new things or methods.
 
Making a free, open-source software-based version of adaptive-sync technology that works natively with DisplayPort, when the alternative is proprietary, hardware-based, and expensive, is certainly an innovation.

By your "definition"...anyone that innovates and patents said innovation isn't innovating *sigh*

You know you have left reality when words are spun to mean something else...
 
The guy is wrong on one point in his video, though. The Ti started with the GF3 Ti 200 and 500, not the GF4 line of cards.

Pretty sure I can recall a GeForce 2 Ti model. Not sure if the Ti meant the same thing, but I do believe it started with the GeForce 2.
 
By your "definition"...anyone that innovates and patents said innovation isn't innovating *sigh*

*Sigh* That idea doesn't make a lick of sense. How you read into things to conclude that is equally an enigma. My definition of 'innovate' is the only definition of 'innovate'. Maybe seek out a dictionary instead of just assuming what words mean? I recommend that you also look up what FUD stands for while you're expanding your vocabulary.

You know you have left reality when words are spun to mean something else...

At least you acknowledge that.
 
That's precisely how the word Innovate's meaning differs from the word Invent's meaning. Yes, a lot of the time people innovate in response to what their competitor is doing.


http://www.dictionary.com/browse/innovate

to introduce something new; make changes in anything established.


http://www.dictionary.com/browse/innovation

the act of innovating; introduction of new things or methods.
Sure, if you want to take it by the broadest definition possible, just about ANY technological change is innovative. I consider incremental stuff that makes small changes to what was already established not that innovative, but if you want to be really broad, sure. Under that definition Microsoft was "innovating" when it created Internet Explorer, the Zune, Windows Phone, and Bing, despite having really obvious, safe, established precedents they were copying in functionality.

My definition of innovative is more narrow, you have to be doing something much more different than copying a competitor's existing product and tweaking aspects of it. Of course you can be innovative in response to competition, but that means you're doing something very different than they are. The Wii was an example of this. Neither Sony nor Microsoft were doing anything with motion controls, but they innovated and centered their whole console around the concept as a means to compete with them. That's a lot different than creating a Zune, which was a response to the already-existing iPod. See the difference?

Besides, it sounds like you're trying to have it both ways. You're saying any changes in anything established can count as innovation, fine. But then you turn around and claim Nvidia does NOT innovate, while AMD DOES, pointing to Gsync v. Freesync as an example. Fine, let's work with that:

When Gsync came out, there was NO existing solution for frame syncing with a variable framerate. It was either Vsync or tearing, the end. They were first on the scene with that. There wasn't an established precedent for this; they took the first step on that. I call that innovation. Freesync was announced later and used a different methodology to accomplish the exact same function: frame syncing with a variable framerate. So if you want to argue AMD was still innovating there, fine. But then how can you argue in that instance Nvidia wasn't innovating MORE? They did it FIRST and thus were unable to use their competitor's work as a model to build off of.
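To make the "Vsync or tearing" point concrete, here's a toy simulation of when frames actually reach the screen on a fixed 60 Hz vsync'd display versus a display that can refresh whenever a frame is ready — the timings are illustrative, and this is not how G-sync or Freesync is actually implemented in hardware:

[CODE]
# Toy comparison: frame presentation times with fixed 60 Hz vsync vs. a variable-refresh
# display that presents each frame as soon as it finishes rendering. Illustrative only.

REFRESH_MS = 1000 / 60  # fixed 60 Hz scanout interval

# Hypothetical GPU render times per frame, in milliseconds (deliberately irregular).
render_times_ms = [14.0, 22.0, 17.0, 30.0, 16.0, 19.0]

def present_fixed_vsync(frames):
    """Each frame waits for the next fixed refresh tick after it finishes rendering."""
    t, out = 0.0, []
    for ft in frames:
        t += ft                                  # frame finishes rendering at time t
        next_tick = (int(t // REFRESH_MS) + 1) * REFRESH_MS
        out.append(next_tick)                    # frame is shown on that refresh tick
    return out

def present_variable_refresh(frames):
    """The display refreshes whenever a frame is ready (within its supported range)."""
    t, out = 0.0, []
    for ft in frames:
        t += ft
        out.append(t)                            # shown immediately when rendering completes
    return out

for label, shown in (("fixed vsync", present_fixed_vsync(render_times_ms)),
                     ("variable refresh", present_variable_refresh(render_times_ms))):
    gaps = [round(b - a, 1) for a, b in zip(shown, shown[1:])]
    print(f"{label:16s} frame-to-frame gaps (ms): {gaps}")
[/CODE]

With fixed vsync the gaps snap to multiples of 16.7 ms (the stutter people complain about), while the variable-refresh column just follows the render cadence — which is the whole point of both G-sync and Freesync.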
 
The word "innovation" doesn't care what the entity do with the new idea.

So a company being innovative has no obligation to make it freely available to everyone on the planet. In fact, a lot of the innovations human kind have made have started out as NOT being open.

For example, GPS. It wasn't even open to civilian use, it was restricted strictly as military use, the technology is innovative in using artificial satellites to help pin point your exact location on the planet, but it was about "proprietary" as you could possibly get.

EDIT:
In fact, some posts in this thread actually suggest that copying someone else's idea and then making it more available to everyone is innovation.

Wanna know who fits that bill the best? Counterfeiters.

Not saying AMD is a counterfeiter (I consider them anything but), but if AMD is considered innovators in the field of FreeSync, then you could consider Counterfeiters as the next coming.
 
Sure, if you want to take it by the broadest definition possible, just about ANY technological change is innovative. I consider incremental stuff that makes small changes to what was already established not that innovative, but if you want to be really broad, sure. Under that definition Microsoft was "innovating" when it created Internet Explorer, the Zune, Windows Phone, and Bing, despite having really obvious, safe, established precedents they were copying in functionality.

Yes, all purposeful modifications are innovative, but I wouldn't count mundane, directly inherent progression as a noteworthy example of company innovation - and I don't believe that FreeSync is such a mundane development.


Besides, it sounds like you're trying to have it both ways. You're saying any changes in anything estabilshed can count as innovation, fine. But then you turn around and claim Nvidia does NOT innovate, while AMD DOES, pointing to Gsync v. Freesync as an example. Fine, let's work with that:

When Gsync came out, there was NO existing solution for frame syncing with a variable framerate. It was either Vsync or tearing, the end. They were first on the scene with that; there wasn't a precedent for this; they took the first step on that. I call that innovation. Freesync was announced later and used a different methodology to accomplish the exact same function: frame syncing with a variable framerate. So if you want to argue AMD was still innovating there, fine. But then how can you argue in that instance Nvidia wasn't innovating MORE? They did it FIRST and thus were unable to use their competitor's work as a model to build off of.

I did not say that Nvidia does not innovate. I said: "I would agree that Nvidia doesn't do much innovating. Gameworks and G-sync are closed / proprietary features firstly meant to lock in the Nvidia ecosystem, and not to innovate technology."

And I was wrong about G-sync following AMD's adaptive-sync, and I put up no fuss about being corrected on that. However, that doesn't change the picture enough to make it appear as though Nvidia does a lot of innovating.

There seems to be something other than my comment at work here. I made a mostly-accurate general remark featuring a personal viewpoint, and a string of posters thought it was their opportunity to make themselves feel better about themselves by, mostly erroneously, jumping on that post. It turned out some of those people were just full of shit and didn't even know what the word 'Innovate' means, and having been mostly wrong seems to be making some of them all the more desperate to feel right about something, as if they were entitled to be. I don't see the point.


The word "innovation" doesn't care what the entity do with the new idea.

So a company being innovative has no obligation to make it freely available to everyone on the planet. In fact, a lot of the innovations human kind have made have started out as NOT being open.

For example, GPS. It wasn't even open to civilian use, it was restricted strictly as military use, the technology is innovative in using artificial satellites to help pin point your exact location on the planet, but it was about "proprietary" as you could possibly get.

I haven't said or suggested this weird idea that proprietary technologies are not innovative. I said that Nvidia's relatively few innovations, coupled with the fact that they are made proprietary and exclusive to defend Nvidia's market and ecosystem, make it appear as though innovating technology is not a primary goal of Nvidia, and is instead done conservatively when it serves Nvidia's primary goal of brand dominance - substantiating my view that Nvidia doesn't do much innovating, either compared to AMD or compared to what could be expected from a company guided by a huge interest in technology development. And closed systems like Nvidia's Gameworks and G-sync harm innovation, and also say something about Nvidia's interest in producing innovation.
 
Sure, but without G-Sync, AMD would not have FreeSync. Even if nVidia's own innovations were entirely self-serving, they have had the effect that VRR technology is now available to both sides, however it is implemented.

Besides, I don't think any company would innovate out of the goodness of its heart. Pharmaceutical companies develop new medicines with the intention of having a monopoly on said medicine until its patents run out (even if only for a limited time before it becomes public domain). That new medicine may well lead to other companies making their own versions or perhaps making discoveries in other areas, but the origin of said medicine has always been the company's own gain.

Neither nVidia nor AMD is immune to this; this is what I am arguing. RTG would most likely have done the same thing if it were in nVidia's position, and nVidia would have had to use some other method to come up with a competing solution. With FreeSync, AMD solved many problems simultaneously: it off-loaded the implementation to the DisplayPort standard and off-loaded the hardware implementation to the monitor manufacturers, which in turn allows them to cut corners or bling it up as much as they desire, resulting in either cheaper alternatives to G-Sync or a better monitor selection, as well as helping to portray AMD as the 'hero' and nVidia as the 'villain' in this G-Sync vs FreeSync war.

But one thing is definitely certain: nVidia came up with the G-Sync idea first, they were the first to get a G-Sync monitor out, and their success directly led to FreeSync's existence and widespread adoption.

I would also argue that it's one of the few things remaining to keep RTG afloat, so nVidia probably HAS to play the 'villain' here.
 
I haven't said or suggested this weird idea that proprietary technologies are not innovative. I said that Nvidia's relatively few innovations, coupled with the fact that they are made proprietary and exclusive to defend Nvidia's market and ecosystem, make it appear as though innovating technology is not a primary goal of Nvidia, and is instead done conservatively when it serves Nvidia's primary goal of brand dominance - substantiating my view that Nvidia doesn't do much innovating, either compared to AMD or compared to what could be expected from a company guided by a huge interest in technology development. And closed systems like Nvidia's Gameworks and G-sync harm innovation, and also say something about Nvidia's interest in producing innovation.
I think there's been a lot of innovation on both sides; I'm not sure there's enough to declare a winner either way. I can't speak as much to particular architecture changes, more to the features that have come along with their cards. Here's a list of some I can think of off the top of my head, where one company was first before an equivalent emerged from the competition (in other words, I'm not listing FXAA since MLAA came first, and not listing FreeSync since G-sync came first, etc.):



Nvidia:
Bringing back SLI (though 3DFX was first obviously)
3D Vision (derived from buying Elsa, but still)
PhysX
CUDA
SSAA
Adaptive Vsync
G-sync
DSR
Geforce Experience
VRAM memory compression
Perspective surround (fixed perspective for multiple displays)
Gameworks (even though I consider this a negative)
VRworks
Ansel
Shield


AMD:
Anisotropic filtering support (not 100% on this one)
Truform
MSAA
Adaptive AA
MLAA
Temporal AA (they did this way before Nvidia)
Eyefinity
SSAA support for DX11 titles


I'm sure I'm missing some and feel free to fact check me on any of those. There's no doubt that open standards are better for the tech ecosystem, but I'm not sure where you're coming from thinking Nvidia doesn't do much innovating.
 
Fanboyism is really embarrassing.

We're talking about inanimate bits of technology. Who gives a fuck.
 
Fanboyism is really embarrassing.

We're talking about inanimate bits of technology. Who gives a fuck.

While I agree fanboyism is embarrassing, I do strongly believe that people should have some degree of passion for technology. Passion for technology is what drives engineers, scientists and innovators to want to create new stuff and make things work.

Everyone should be championing the advancement of technology, not blindly following one company at the expense of another.
 
Absolutely agree, but when it gets down to a purple one vs a blue one... yawn.

I spent 7 years working in the technology sector; even when something came from a competitor (and later that happened a lot) I'd still think it was cool. It was internal fanboys that killed us, because 'not invented here' was so strong.

It's easy to forget that a lot of successful companies are not successful innovators but successful at execution. Given NVidia have the performance lead and are going gangbusters with the CUDA stuff, I'd say their innovation practice is just fine and they're clearly executing well.

Really hoping that AMD pick up the ball again, but to the article's point, the days of 70% improvements are not going to happen in a mature market. Not unless the generations slow down, effectively jumping straight to the Ti version; that could be done, but they'd need to sell each generation longer to make back the R&D money instead of getting two bites of the generational architecture apple.
 
Geez, how did he come up with those conclusions lol. He still doesn't know history at all, and doesn't go into the fact that nV was nearly out of business with the FX line and NV1. We are still getting 60%-80% improvements when it's possible. Lately it wasn't possible because everyone outside of Intel was stuck on 28nm! What an idiot lol, he just doesn't know what he is talking about, as always. Major node changes paired with new architectures give us those types of improvements; as Phasenoise stated earlier, they are becoming few and far between, and reliance on architecture is becoming much more important than the other two facets of performance improvement (frequency and power) because of limitations in those two areas.

AMD (even back to ATi), nV and Intel stated this YEARS ago, hence why Moore's law is dying or pretty much dead now!

God, I can't believe it; even a non-technical idiot can understand this, but Adored spins it as competition? There is only so much any company can do; even with unlimited resources they hit the wall of physics and reality.

Now back to major node changes and performance increases. If Adored tallied them up instead of putting down his not-so-fellow reporters/journalists, he would see that 60-80% performance increases hit every single time a node change was accomplished alongside a new architecture, on both nV and AMD products. It was much easier in the past, as node (die size) and power limits weren't being hit either.

Well, another wasted video from Adored. To Adored's fellow YouTube followers: follow an idiot and become one, because he is feeding you a lot of BS. Do your own reading; YouTube is worse than TV. Pick up a book instead of rotting your brain; it will help you.

Granted, AMD has been behind the times in the past 2 gens, which everyone remembers because, well, let's face it, most people only remember recent things. But the performance drop per gen was not due to AMD not competing; it was due to the slowdown of process node technology and power limits.
 


Here ya go guys, starting at 6 minutes, for anyone that supports Adored: his latest video pretty much states that any enthusiast who buys nV products is a sheep. This is Adored for you, a flaming fanboy that says you are all sheep for buying the best products available on the market with your own money!

Let him tell you what to buy with your hard-earned cash, and then let him talk down to you for being an enthusiast of technology!

He is a f'in moron; I don't even know why people listen to his biased-ass shit. He talks about himself as if he knows more than anyone else, patting himself on the back while doing so, even though his information is mostly incorrect due to missing relevant facts.

This is Adored for you: he knows more than anyone else, he talks down to enthusiasts that spend their own money the way they want to spend it (on the best products possible for their $), and he talks down to journalists that have been in the tech industry for many more years than he has, who also know a shitload more about GPU and CPU tech than he does or ever will because of his arrogance. It's good to be arrogant at times when you know your shit and need to prove it, not when you don't know shit and try to prove that you do; then you fall flat on your face and look like an idiot.

He is probably a 20-25 year old kid that doesn't even have a degree in any tech field and was never exposed to tech as a young kid. When he finally got his first forum account somewhere he started getting into tech, listened to all the BS over at that forum, and started up a YouTube account thinking he knows what he is talking about. But as we all know, most forums don't have a lot of in-depth info about tech; those that do are few and far between.

This is how cults are born ;)

Wow, look at which other YouTube journalists are on his YouTube threads; all of them are a waste.......
 
He is probably a 20-25 year old kid that doesn't even have a degree in any tech field and was never exposed to tech as a young kid. When he finally got his first forum account somewhere he started getting into tech, listened to all the BS over at that forum, and started up a YouTube account thinking he knows what he is talking about. But as we all know, most forums don't have a lot of in-depth info about tech; those that do are few and far between.
Sounds like a few around here...
 
I think I will try to follow up on this post I made ~2 years ago and add process technology to the "picture"... I have a feeling it will reveal how big a tech-noob Adored really is:
https://hardforum.com/threads/390x-coming-soon-few-weeks.1848628/page-29#post-1041445280

So many people confuse PR names with SKU names.... I bet even more will fail when you add process technology to the mix.

But I like that I was right ~2 years ago:

I predict this "tick-tock" will continue with "Pascal":
GF114 -> GK104 -> GM204 -> GPxx4
GF100 -> GK110 -> GM200 -> GPxx0

I did however not see that NVIDIA would split it up even more:


Midrange: GF114 -> GK104 -> GM204 -> GP104 -> GV104
Highrange: GF100 -> GK110 -> GM200 -> GP102 -> GV102
Enterprise: GF100 -> GK110 -> GM200 -> GP100 -> GV100

But my point still stands ^^
 