Most of AMD's Next-Gen Radeon Cards Rumored to Be Rebrands

And how long ago was that? It seems AMD is only good at copying others' mistakes.

NVIDIA has done this fairly recently too, with their low-end GPUs.
This is why it is wise to pay attention to the GPU's codename rather than the model number printed on the card.

I don't see how this was a "mistake", as NVIDIA made huge bank on rebadging their GPUs.
Also, did you forget about the recent bullshit that NVIDIA pulled with the GTX 970?

I don't think I've seen AMD ever stoop that low.
Not trying to attack NVIDIA or take sides, but I do have to say that NVIDIA has pulled some real shit over the last eight years, right up to this year.
 
Oh, you mean the cards that had Quiet mode turned on instead of Uber mode by default? Please, try something more applicable to the discussion, not false accusations.

Did you read the article or just immediately jump to AMD's defense?

"Beyond that, I think we've collected enough data to say with confidence that our initial R9 290X review unit, sample 1, is superior to the two retail cards we tested, regardless of the driver or firmware revision."
 
You act shocked that a company, specifically AMD, would do such a thing.
Seriously, every single tech company ever has done this.

Of course they give out cream-of-the-crop chips/cards for review testing.
Why the hell wouldn't they?

AMD didn't lie about their firmware settings, unlike NVIDIA, which lied again and again about the GTX 970.

This isn't an NVIDIA vs AMD pissing match, it's just reality.
Are you really trying to go there with the morality of these semi-mega corps??? ;)
 
Here's a nice lesson for you in "corporate ethics":

[image]
 
That checks out.

GT 610 is the old GT 520.
At least some of the GT 620s and 730s are the old GT 430.
Some of the GT 630s are the old GT 440.

I think all of these are derivatives of the GF108 (the GT 520/610 are actually its cut-down sibling, the GF119), and that Fermi silicon is about 4-1/2 years old.
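
If you want to sanity-check a model number before buying, here's a minimal Python sketch of that kind of lookup. The table just encodes the rebrands claimed above; it's illustrative, not an authoritative database, so verify against something like TechPowerUp's GPU database before trusting it.

```python
# Illustrative only: encodes the rebrand claims from this thread, keyed by
# retail model name. Not an authoritative database -- verify independently.
REBRANDS = {
    "GT 610": ("GT 520", "GF119 (Fermi)"),
    "GT 620": ("GT 430", "GF108 (Fermi)"),  # at least some variants
    "GT 630": ("GT 440", "GF108 (Fermi)"),  # some variants
    "GT 730": ("GT 430", "GF108 (Fermi)"),  # at least some variants
}

def what_is_it_really(model: str) -> str:
    """Map a retail model number to the older card/chip it's claimed to be."""
    entry = REBRANDS.get(model)
    if entry is None:
        return f"{model}: no rebrand on record here (which proves nothing)"
    old_card, chip = entry
    return f"{model}: rebranded {old_card}, underlying chip {chip}"

for model in ("GT 610", "GT 630", "GT 730"):
    print(what_is_it_really(model))
```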
 
I still don't think PRIME1 will agree with you, though. ;) :p
 
Pretty much. Nvidia and ATI both have dirt on them going way back.

ATI's drivers used to detect the Quake 3 executable and quietly lower render quality so the card would benchmark better (renaming the executable exposed it). Nvidia, I remember, released a driver claiming all these speed improvements, and ATI called them out on how badly it screwed up rendering accuracy, with pictures of sawtoothed water in the original Far Cry. And my favorite, I forget exactly when this was, but at one point Nvidia coded their driver to bump texture settings to "very high" whenever it detected a screenshot being taken, then switch back afterwards, so the screenshots made image quality look higher than what the benchmark was actually rendering at default settings.

And that's to say nothing of the backroom stuff going on with GameWorks and the whole 970 RAM issue. I'd actually say Nvidia has the shadier record at this point, but it's not like ATI/AMD have a squeaky-clean graphics history either.
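
On the 970 RAM issue specifically: the card advertises 4 GB, but the last ~0.5 GB sits on a much slower memory segment, and end users could demonstrate it themselves. The original probe (Nai's benchmark) was CUDA; below is a rough PyTorch approximation of the same idea, assuming a CUDA-capable PyTorch build. Chunk placement isn't guaranteed to follow allocation order, so treat it as a sketch, not a measurement tool.

```python
# Rough sketch of a GTX 970-style VRAM probe (approximating the idea behind
# "Nai's benchmark", which was CUDA): grab VRAM in fixed-size chunks, then
# time pure write traffic to each chunk. On a 970, chunks landing in the
# last ~0.5 GB segment showed far lower bandwidth than the first 3.5 GB.
# Caveat: the driver doesn't guarantee allocation order == physical order.
import time
import torch

CHUNK_MB = 128   # size of each probe chunk
ITERS = 50       # writes per chunk, to average out launch overhead

def probe() -> None:
    assert torch.cuda.is_available(), "needs a CUDA-capable PyTorch build"
    chunks = []
    try:
        while True:  # allocate until the card is full
            chunks.append(torch.empty(CHUNK_MB * 1024 * 1024 // 4,
                                      dtype=torch.float32, device="cuda"))
    except RuntimeError:  # out of memory -- stop allocating
        pass
    print(f"allocated {len(chunks)} chunks of {CHUNK_MB} MB")
    for i, chunk in enumerate(chunks):
        torch.cuda.synchronize()
        start = time.perf_counter()
        for _ in range(ITERS):
            chunk.fill_(1.0)  # pure write traffic into this chunk
        torch.cuda.synchronize()
        elapsed = time.perf_counter() - start
        gb_per_s = (CHUNK_MB / 1024) * ITERS / elapsed
        print(f"chunk {i:3d}: {gb_per_s:7.1f} GB/s")

if __name__ == "__main__":
    probe()
```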
 
Yeah, I think the GTX 970 was the final nail in the coffin for a lot of people.
Many of them are only using NVIDIA at this point because they are the only option for this generation of GPUs.

Hopefully AMD can do some good against them, otherwise I see GPU tech stagnating like Intel with their CPUs.
~5-10% improvement every year, woooo.
 