EVGA will no longer do business with NVIDIA

Yeah, but how much better, in terms of longevity or performance, have those in-house designs actually gotten them? Was it a deciding factor for purchasers, or was it the warranty and customer service that brought customers to EVGA?

I purchased for a combination of those things, this generation and generations past.
 
GPUs aren't for gamers anymore. Rising R&D costs and rising wafer prices have pushed mid- to high-end gaming cards to higher and higher prices. AI/compute is where the money is coming from to give us hand-me-down GPUs to game on. It is what it is.

Maybe tiled/chiplet GPUs will save the gamer in the future. We can only hope.
 
Revenue =/= profits. You can sell $1,000,000 of product, all at a loss, and it is still revenue.
Revenue is the amount of money brought in. Salaries are one of the things you deduct from revenue to reach profit.
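If it helps, here's the same point as a trivial Python sketch; every figure below is made up for illustration:

```python
# Revenue vs. profit, with made-up figures: $1M sold entirely at a loss.
revenue = 1_000_000        # money brought in from sales
cost_of_goods = 950_000    # what the product cost to make or buy
salaries = 120_000         # deducted from revenue on the way to profit
other_expenses = 30_000    # rent, shipping, support, and so on

gross_profit = revenue - cost_of_goods
net_profit = gross_profit - salaries - other_expenses

print(f"Revenue:    ${revenue:,}")     # $1,000,000
print(f"Net profit: ${net_profit:,}")  # $-100,000 despite $1M in sales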
 
In a recent Q&A with Nvidia's CEO Jensen Huang, shortly after the company's GTC live stream, I asked the CEO what his thoughts were on EVGA's departure from that side of the business:

"Andrew [Han] wanted to wind down the business. And he's wanted to do that for a couple of years. Andrew and EVGA are great partners, and were great partners, and I'm sad to see them leave the market. But, you know, he's got other plans and he's been thinking about it for several years

"The market has a lot of great players. And it'll be served well after EVGA. But we'll always miss them. I'll always miss them. And they were an important part of our history, and Andrew is a great friend. And, you know, I think that it was just time for him to go do something else."

https://www.pcgamer.com/nvidia-ceo-...d-well-after-evga-but-it-was-a-great-partner/
 
"Andrew [Han] wanted to wind down the business. And he's wanted to do that for a couple of years. … I think that it was just time for him to go do something else."
That’s about as good a response as you can get there really…
 
GPUs aren't for gamers anymore.
They are now, for the most part. GPUs won't be used for mining ever again thanks to proof of stake. This is why EVGA is jumping ship.
Rising R&D costs and rising wafer prices have pushed mid- to high-end gaming cards to higher and higher prices.
I think this has more to do with Nvidia being a publicly traded company than with the cost of R&D and chip manufacturing. It's clear that prices need to come down, but that's not what shareholders want to see. You need to justify those prices by adding features.
AI/compute is where the money is coming from to give us hand-me-down GPUs to game on. It is what it is.
AI wouldn't use gaming hardware. Nowadays you want specialized chips for AI. Compute was certainly done on gaming hardware, but as we've learned, we can do away with that by changing things up.
Maybe tiled/chiplet GPUs will save the gamer in the future. We can only hope.
What we need is more competitors in the market to get prices down.
 
"Andrew [Han] wanted to wind down the business. And he's wanted to do that for a couple of years. Andrew and EVGA are great partners, and were great partners, and I'm sad to see them leave the market. But, you know, he's got other plans and he's been thinking about it for several years
Yeah, but if that's all there was to it, why didn't Andrew Han just come out and say that? Why not just issue a press release and move on? Why invite the YouTubers out to CA to stir up a bunch of drama?
 
Yeah, but if that's all there was to it, why didn't Andrew Han just come out and say that? Why not just issue a press release and move on? Why invite the YouTubers out to CA to stir up a bunch of drama?
Because they'd rather blame Nvidia than take the blame themselves?
It's most likely a combination of many things, one of which being Nvidia's slim margins (according to EVGA).
 
They are now, for the most part. GPUs won't be used for mining ever again thanks to proof of stake. This is why EVGA is jumping ship.
This right here. EVGA made more money in the last two years than in the ten before that; if you were ever looking for a time to close up shop and retire, that was it.
You need to justify those prices by adding features.
Feature creep is real. GPUs are reaching stupid levels of complexity, and it will eventually plateau, but that is still a few years off. We are in the middle of a complete shift in how and what GPUs are doing, and it's kind of wild.
AI wouldn't use gaming hardware. Nowadays you want specialized chips for AI. Compute was certainly done on gaming hardware, but as we've learned, we can do away with that by changing things up.
Specialized everything; power usage on generic hardware is getting to the point where it's too expensive to run for many tasks. This is why Nvidia is king in its space right now: CUDA has a sort of stranglehold on many enterprise-level workloads because, while proprietary, it allows for very detailed levels of optimization, which results in proportionately faster completion times and a significant reduction in power usage. This is also why Intel and AMD are both getting very nervous about the number of data centers purchasing ARM-based CPUs; for certain types of workloads, they offer a degree of specialization that x86 can't currently match.
AI wouldn't use gaming hardware. Nowadays you want specialized chips for AI. Compute was certainly done on gaming hardware, but as we've learned, we can do away with that by changing things up.
Any serious AI work at this stage is painful to do on consumer-level hardware, to the point where the time investment isn't worth the cost savings from not buying the specialized hardware. If you are just tinkering around at home, learning, or prototyping, then yes, of course you can do it; but if you are serious about it, you need the bigger stuff. The amount of data you have to move, at the speeds you need to move it, is just too limited by the memory and storage configurations in consumer hardware.
 
Because they'd rather blame Nvidia than take the blame themselves?
It's most likely a combination of many things, one of which being Nvidia's slim margins (according to EVGA).
It's most certainly a combination of many things. Kudos to Jensen for being diplomatic, but given the way Andrew made the announcement, I don't really see the two as "great friends" anymore.
 
Because they'd rather blame Nvidia than take the blame themselves?
It's most likely a combination of many things, one of which being Nvidia's slim margins (according to EVGA).
But Nvidia's other partners see comparatively larger margins. EVGA may make their own designs, but they outsourced everything else; the other partners have their own design teams plus manufacturing and logistics capabilities, which increase their profits. They have also had to call out EVGA's overly generous warranty program; GPUs are just too complex at this stage and too sensitive, with far too many ways for them to go wrong. I was reading (I will search for the link after work) that under their contract terms with Nvidia for the 3090s, it was on EVGA to replace 100% of them, as EVGA's customizations to power delivery were what caused the whole New World burnout problem, so those weren't covered by EVGA's warranty with Nvidia for the chips. That was ultimately expensive and left a really bad taste in their mouths as well.
 
But Nvidia's other partners see comparatively larger margins. EVGA may make their own designs, but they outsourced everything else; …
Nvidia's other partners' revenue channels don't primarily depend on selling Nvidia GPUs either.
 
GPUs aren't for gamers anymore. Rising R&D costs and rising wafer prices have pushed mid- to high-end gaming cards to higher and higher prices. AI/compute is where the money is coming from to give us hand-me-down GPUs to game on. It is what it is.
I think you're caught up in branding and thus are looking at this from the wrong perspective. What if, instead of the branding, we look at the performance?

Nvidia's presentation indicates that the 16GB 4080 can compute pixels faster than 98% of gamers' monitors can display them. The 4090 is looking to be faster than 99.8% of gamers' monitors. The 12GB 4080 might be faster than 90% of gamers' monitors. Thus, although the branding of the 12GB 4080 indicates that it is a mid-tier card, the functionality and performance of the 12GB 4080 have it serving only the top 15% of gamers. In this context, it is very much a top-end card and thus is priced as such.

This top-end pricing for a top-end card does no real harm to gamers. The price they don't want to pay does nothing more than buy performance they can't use anyway. Does the existence of a Ferrari mean that cars aren't for regular people anymore?

If we extrapolate down the chain based on my uninformed guessing, the 4070 will be priced around $500-600. This is squarely in the mid range from a branding perspective. Functionally, the 4070 is likely to be a very legitimate 4K120 card when DLSS3 is enabled. Who has a real 4K120 screen? The top ~5-10% of gamers. Without DLSS, it will likely be a very legitimate 2.7K120 card. Who has one of these screens? Maybe the top 25% of gamers. Thus, the functionality and performance of the 4070 are still top-end even though the branding is mid-range.

Going further down the chain, if the 4050 keeps up the trend of being 2x the performance of its 30x0 counterpart, we're looking at a card which will readily do 1080p120 at max quality in almost any current title without enabling DLSS3. A 1080p120 screen is still in the top 50% of gamers' displays - maybe even the top 30-40%. This would mean that the branding is entry-level but the functionality and performance are still comfortably above mid-range.

An entry-level card that exceeds the capability of the majority of gamers' monitors is a good thing.


Of course, I will probably end up eating these words once independent benchmarks are published. I'm just going off of a tweet from Otoy saying that the rendering performance of the 4090 is, indeed, over 2x that of the 3090, and then one from Otoy's CEO saying "more like 3X [in Brigade]." That makes it a monster.
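The "faster than X% of monitors" reasoning is easy to play with yourself. Here's a toy Python sketch; the monitor shares and frame rates below are invented for illustration, not Steam-survey or benchmark numbers:

```python
# Toy model: what fraction of monitors can a given card fully saturate?
# Every number here is an assumption for illustration only.

# (resolution, refresh rate in Hz, assumed share of gamers)
monitors = [
    ("1080p",  60, 0.45),
    ("1080p", 144, 0.20),
    ("1440p", 144, 0.20),
    ("4K",     60, 0.10),
    ("4K",    120, 0.05),
]

# Hypothetical sustained frame rates for one card at each resolution.
card_fps = {"1080p": 380, "1440p": 240, "4K": 110}

# A monitor is "outrun" if the card renders at or above its refresh rate.
outrun = sum(share for res, hz, share in monitors if card_fps[res] >= hz)
print(f"Card outruns {outrun:.0%} of monitors in this toy distribution")
```

With these made-up numbers the card outruns 95% of monitors; swap in real survey and benchmark data and you can test the claim properly.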
 
Nvidia's presentation indicates that the 16GB 4080 can compute pixels faster than 98% of gamers' monitors can display them. The 4090 is looking to be faster than 99.8% of gamers' monitors. The 12GB 4080 might be faster than 90% of gamers' monitors. Thus, although the branding of the 12GB 4080 indicates that it is a mid-tier card, the functionality and performance of the 12GB 4080 have it serving only the top 15% of gamers. In this context, it is very much a top-end card and thus is priced as such.
This is one of the most ridiculous lines of reasoning I've seen on this forum, and I've been here a while.

----

Gamers Nexus with some up-to-now-unknown Nvidia shenanigans. Didn't surprise me, but boy, aren't they one of the shittiest companies in the business...
 
This is one of the most ridiculous lines of reasoning I've seen on this forum, and I've been here a while.

----

Gamers Nexus with some up-to-now-unknown Nvidia shenanigans. Didn't surprise me, but boy, aren't they one of the shittiest companies in the business...

Hearing that Nvidia had set up their development tools to be online at all times so Nvidia could monitor everything they did. That is some next-level distrust.

Makes me wonder... was Nvidia really just concerned that EVGA might overclock something a bit too much? Or was it punishment for sneaking cards to blacklisted reviewers? As Kyle has pointed out, it's not like EVGA was some fair-review champion. Still, Nvidia has been throwing its weight around the last few years. Whatever the reason, it doesn't sound like anyone at EVGA who had to work with Nvidia was very happy about it.
 
Hearing that Nvidia had set up their development tools to be online at all times so Nvidia could monitor everything they did. That is some next-level distrust.

Makes me wonder... was Nvidia really just concerned that EVGA might overclock something a bit too much? Or was it punishment for sneaking cards to blacklisted reviewers? As Kyle has pointed out, it's not like EVGA was some fair-review champion. Still, Nvidia has been throwing its weight around the last few years. Whatever the reason, it doesn't sound like anyone at EVGA who had to work with Nvidia was very happy about it.
Nvidia, TSMC, and Samsung all have their development tools online-only, and they monitor everything; it's one of the ways they keep some of their stuff from being stolen.
 
Nvidia, TSMC, and Samsung all have their development tools online-only, and they monitor everything; it's one of the ways they keep some of their stuff from being stolen.
Heaven forbid vBIOS secrets get stolen. I'm sure that was Nvidia's reason. They just accidentally stumbled into locking their partners out of overclocking and customization. Probably slipped on a banana peel in front of a keyboard.
 
Nvidia, TSMC, and Samsung all have their development tools online-only, and they monitor everything; it's one of the ways they keep some of their stuff from being stolen.
Yeah, this is pretty standard practice in most industries. I don't see what all of the noise is about.
 
Yeah, this is pretty standard practice in most industries. I don't see what all of the noise is about.

Because in this instance it was used as a tool: EVGA was doing things to get around limits in order to make top-tier cards appealing to overclockers and enthusiasts at a high price, to increase margins or separate themselves from the rest of the competition, and Nvidia would watch what they did and immediately patch it out. There is only so long a company will fight a battle that falls on deaf ears.
 
Because in this instance it was used as a tool: EVGA was doing things to get around limits in order to make top-tier cards appealing to overclockers and enthusiasts at a high price, to increase margins or separate themselves from the rest of the competition, and Nvidia would watch what they did and immediately patch it out. There is only so long a company will fight a battle that falls on deaf ears.
Then EVGA should be blown away by their thriving AMD business where everyone is free to do anything they want.

.......right?
 
I bet EVGA's shows on Twitter are dead then, too.
The guys always showed off the RTX cards with gameplay and some random game giveaways or discount codes.
I hope those guys find a job somewhere.
 
Are you trying to say that AMD is the same way?
If EVGA's true calling were to just take an OEM's GPU and build a balls-out performing card around it but only if the GPU-maker allowed them, then why not just take a year off and switch to AMD? Why not switch to Intel and be their first AIB like with Nvidia? Why go so far as to formally announce a permanent end to all GPU business?
 
If EVGA's true calling were to just take an OEM's GPU and build a balls-out performing card around it but only if the GPU-maker allowed them, then why not just take a year off and switch to AMD? Why not switch to Intel and be their first AIB like with Nvidia? Why go so far as to formally announce a permanent end to all GPU business?
That's a question for them, not me. Seems they want to bow out of VGA.
 
If EVGA's true calling were to just take an OEM's GPU and build a balls-out performing card around it but only if the GPU-maker allowed them, then why not just take a year off and switch to AMD? Why not switch to Intel and be their first AIB like with Nvidia? Why go so far as to formally announce a permanent end to all GPU business?

Given AMD's GPU market share, they simply might not see it as worth the effort. It would cost a lot of money for them to switch over to AMD, and if they don't think the profit will be there, it would be a bad decision to make.
 
Given AMD's GPU market share, they simply might not see it as worth the effort.

They have all of this very unprofitable year and the next to negotiate with AMD and Nvidia about how to reenter the GPU market, while consolidating or expanding their remaining business, brand, and marketing.

It seems likely (though too early to call it) that AMD will have GPU chiplets working by RDNA4, and developmentally, Nvidia is a generation and a half behind at this point on the same stuff. They can say they're out of graphics, take a breather during a hurricane season, and come back with whoever they choose to.

Which might be no one, if they can get their other businesses up and cranking.
 
They have all of this very unprofitable year and the next to negotiate with AMD and Nvidia about how to reenter the GPU market, while consolidating or expanding their remaining business, brand, and marketing. …

If AMD can get itself into a better position and is willing to offer EVGA a good deal, I wouldn't be surprised to see them re-enter the market in a generation or two. That assumes they feel the need to do so and haven't fired all of their engineers beforehand. It's going to be interesting to see what moves they make in the next year. If people decide to start upgrading to ATX 3.0 PSUs and EVGA gets some good models out on the market, they might stand a chance of making good money.
 
If AMD can get itself into a better position and is willing to offer EVGA a good deal, I wouldn't be surprised to see them re-enter the market in a generation or two.

Did EVGA make AMD or ATI cards? I can't remember.
 
This is one of the most ridiculous lines of reasoning I've seen on this forum, and I've been here a while.

----

Gamers Nexus with some up-to-now-unknown Nvidia shenanigans. Didn't surprise me, but boy, aren't they one of the shittiest companies in the business...

So over the years, eVGA found ways to get around limits imposed on the chip, and then nVidia would close the loophole in the next-generation GPU. And nVidia controlled the vBIOS. My guess is that when nVidia started doing LHR BIOSes was when eVGA finally decided to stop trying to hack the chips... and there may not have been anything left for them to find and exploit. They stayed in the GPU business because these last few years have been insanely good thanks to crypto. But that's over; engineering a Kingpin card isn't fiscally responsible anymore, so the time to get out was right.

Not sure I would describe that as a "crazy level of control". Sounds overblown to me.

I know eVGA wanted ways to distinguish themselves in a sea of AIBs; can't blame them for that. But to me, if they had come out with a much better cooling solution like the 30-series FEs have (blowing some of the heat out of the back of the case), that would have been innovation enough for me to choose them over the others. I despised those older FEs with the single blower fan or otherwise merely average cooling; I never gave them a second thought. I went eVGA for the 980 Ti, 1080 Ti, and 2080 Ti. But for the 3090, I went with the FE solely based on its cooling solution.

Those Kingpin OC cards are neat, but they're a small fraction of eVGA's sales. That alone isn't enough of a market, and I can't say it justifies all of the engineering expense from a business perspective. I always thought those were cool, but double the price for another 5%? I think those cards gave eVGA some solid brand recognition for a while; top entries on 3DMark, plus the excellent customer service, helped put them at the top of the AIBs in most users' minds. But continuing that high expense doesn't feel like a smart move. The CEO even said as much about the margins getting too low to support all of the engineering expenses. nVidia's own design is what probably half of all the AIBs use for the PCB; they save themselves that cost and can get 10% margins. eVGA, on the other hand, was seeing only 5% due to the expenses. I can see why he felt the time to get out was now. But they could have continued: used nVidia's board design, come up with better cooling, and made a go of it.

I can't point to any one thing; it was many things that caused him to decide to step away from GPUs.
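To put rough numbers on that 10% vs. 5% margin point, here's a back-of-the-envelope Python sketch. The unit volume, selling price, and engineering overhead are all assumptions, and it treats the 5% as a figure before the custom-board R&D spend, which may not be exactly how the CEO framed it:

```python
# Back-of-the-envelope: why a 10% vs. 5% margin gap compounds when you
# also carry your own board-engineering overhead. All numbers assumed.
units_sold = 100_000
asp = 800                                   # avg selling price per card, USD

reference_profit = units_sold * asp * 0.10  # reference-design AIB
custom_profit = units_sold * asp * 0.05     # custom-design AIB
custom_profit -= 3_000_000                  # assumed annual board R&D

print(f"Reference-design profit: ${reference_profit:,.0f}")  # $8,000,000
print(f"Custom-design profit:    ${custom_profit:,.0f}")     # $1,000,000
```

Even with generous assumptions, the custom-board shop ends up with a fraction of the profit on the same volume, which is the CEO's argument in a nutshell.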
 
So over the years, eVGA found ways to get around limits imposed on the chip, and then nVidia would close the loophole in the next-generation GPU. …
Yeah, gone are the days when there was a measurable performance difference between the same card from different manufacturers. There was a time when you had to measure MSI, Gigabyte, EVGA, and … against each other, and there would be measurable differences in performance, with price tags to match. Now they are all the same card with minor differences in cooling, which has its ups and downs. Yeah, they are all the same generic card with some unicorn vomit for flair, but from a support perspective this is ideal for Nvidia: Nvidia can issue driver and BIOS updates as needed without fear of breaking something from their AIB vendors, an issue AMD struggled with until recently.

As far as the BIOS loopholes go, I think EVGA is thinking too much of itself here. If they could discover those loopholes in the BIOS, so could others. For as big as they were in the GPU space, the market of others modifying the vBIOS to circumvent limitations placed on consumer hardware was larger.
 
Kind of sad when people on [H]ardforum of all places are in favor of fewer AIB “pre-modded” cards like in the old days. So you’re just a happy consumer, content to consume the latest bland (in technical terms) product release where AIBs slap a ridiculous branding/logo and some RGB trash on “their” product?

eVGA hacking chips meant they were tuning the silicon as much as possible (and often overshadowing nvidia's own arbitrary “tiers” - highly pre-overclocked AIB 8800GT/GTS512 vs GTX/Ultra, anyone?). Obviously that’s unacceptable if you’re nvidia and trying to price-segment based on your own bullshit tiers.
 
So over the years, eVGA found ways to get around limits imposed on the chip, and then nVidia would close the loophole in the next-generation GPU. …
EVGA stopped trying before LHR, lol. There is a pattern of this over time. LHR had zero to do with any of it; that was something nvidia did as a publicity stunt to gain points with angry gamers who couldn't get cards.
Not sure how you're able to say what is and isn't fiscally responsible, or what a SKU represents within their overall sales and whether it can be justified. Seems like you're really firing from the hip here.
 
EVGA stopped trying before LHR, lol. There is a pattern of this over time. LHR had zero to do with any of it; that was something nvidia did as a publicity stunt to gain points with angry gamers who couldn't get cards.
Not sure how you're able to say what is and isn't fiscally responsible, or what a SKU represents within their overall sales and whether it can be justified. Seems like you're really firing from the hip here.
LHR was more a shot at the AIBs, who were making bank with inflated direct sales to miners, starving the consumer market in the process.
 
Kind of sad when people on [H]ardforum of all places are in favor of fewer AIB “pre-modded” cards like in the old days. So you’re just a happy consumer, content to consume the latest bland (in technical terms) product release where AIBs slap a ridiculous branding/logo and some RGB trash on “their” product?

eVGA hacking chips meant they were tuning the silicon as much as possible (and often overshadowing nvidia's own arbitrary “tiers” - highly pre-overclocked AIB 8800GT/GTS512 vs GTX/Ultra, anyone?). Obviously that’s unacceptable if you’re nvidia and trying to price-segment based on your own bullshit tiers.
I agree with you; it is a sorry state as it is. However, taking into consideration the prices these AIBs are charging for their custom designs, and the fact that even overclocking yields, in most cases, academic results, things are very expensive with very little to show for it. Whether it's Nvidia doing the price hiking or the AIBs makes no real difference to us as consumers. But yeah, it's cooler to hate Nvidia.
 