Nvidia Should Be Held Accountable for Overcharging While Delivering Lower Performance.

Actually, that's not really what people are talking about... I'd recommend reading said articles before commenting. And just because people are talking about yet more reasons to avoid AMD doesn't mean they're "embarrassing" themselves, unless you're a complete fanboy of one company or the other. Discussion is what forums are for, and that's what's (mostly) taking place.

Yeah, you sure showed me up. I wish I had learned to read so I wasn't just looking at the pictures next to the articles.

Or maybe you missed my point a bit, mate. I agree the articles are quality reading. I am also looking forward to the comparison between the 660 Ti and 7950 coming soon on pcper.com. That is the part most relevant to me, given what I am running in my comp.
What is not high quality is the rubbish being spewed across various forums, loosely based on these new (and incomplete) results. Laughably, people are actually claiming AMD cards are terrible.

Having read these articles, you would know that as far as single-GPU solutions go, this does not seem to be the case. 7970s still appear to be as good a buy as anything. It doesn't matter whether you measure performance with tools new or old, developed by nVidia or independently, in fps or in frame times: the 7970 is a beast, as we all already knew.

Crossfire, on the other hand, would appear to have some serious issues out of the box. Some people don't even agree with this, as they use frame limiters and other tweaks and claim to enjoy a smooth, input-lag-free experience. I wouldn't have a clue, as I have never tried a Crossfire system. Either way, it would be very interesting to see a bit more discussion around people's experiences with the issues: how they are combating them, whether they have succeeded, the pros and cons of their solutions, etc.

Maybe you don't agree, but, personally I think the most interesting discussions surrounding all of this are going to be:

Where the differences in performance are (differences between gpu generations, drivers, games, number of gpus, etc)
and solutions like RadeonPro: what it fixes, what it doesn't (and what it breaks, if anything)

Discussions falling into the categories of "AMD sucks the nutz" or "nVidia is a fking rip off" don't seem that exciting in comparison. Could you say taking part in them is maybe even a bit embarrassing?

Edit:
I thought people were simply discussing the issue more now that technical evidence has come to light confirming and explaining the existence of the problem? :rolleyes:
Is that really what you thought was happening in the majority of threads? :rolleyes:
 
I have to say that going from an EVGA vanilla 670 to a Sapphire 7950 was an upgrade. The 7950 runs cooler, quieter, and faster in games. Plus I got Crysis 3 and Bioshock; only Bioshock was worth a crap, but yeah, no games with the 670. So ATI is a better value, imo, at least now that the drivers are at least OK. I still have some issues with ATI drivers, but it is all on the desktop and not in-game, thank goodness.

Now if I was multi-gpu I would go nvidia no doubt.
 
Nvidia cards offer more features and scale better with SLI; why should they cost less?

Price vs. performance on single cards is pretty close too; it's not like it's night and day.
 
nVidia's reference cards aren't jet-engine vacuums like AMD's Radeon reference cards are, I would hope you know... :p. The reference cooler SUCKS on AMD but is good on nVidia.

Oh.

Very annoying Whurrrrr sound from the fan
Fan is quiet at idle but whines under load.
Very, very, very loud and very bad cooling performance.
I don't have an AMD card with a blower, but if Nvidia's are better, AMD's must really suck.
 
NVIDIA doesn't owe anybody anything. They can charge what they want for their products. It's up to you, the consumer, to do your own research and make educated decisions.

People need to stop blaming others for their stupid ass decisions.
 
I have to agree with the OP, and let's not even get into the whole Bumpgate fiasco... Nvidia always overpromises and underdelivers while overcharging.

Then don't buy their shit problem solved.
 
A Little Balance to the Threads Posted.

Typically, Nvidia's cards have cost more than their AMD counterparts while having less VRAM and producing lower FPS. With performance per dollar being the metric used in various reviews and guides, how do people feel about paying more for less?

Discuss...

Don't do this.

Someone close this thread and the others that seem to have caused it.
It's an insult to the intelligent people of [H].
 
Don't do this.

Someone close this thread and the others that seem to have caused it.
It's an insult to the intelligent people of [H].

So because something paints CF in a negative light, we should close the thread? That's absolute nonsense, when people should be made aware of this issue.

If AMD were held responsible, they would think twice about using such tricks to boost frame rates. This is a new article that tests the 7970s at Ares II speeds (which makes me feel bad for Ares owners).

http://www.pcper.com/reviews/Graphi...eForce-GTX-690-Radeon-HD-7990-HD-7970-CrossFi

It tests two 7970 cards at 1050 MHz in 6 major titles: Skyrim, Dirt 3, Crysis 3, Battlefield 3, Sleeping Dogs, and Far Cry 3. Only Dirt 3 ran without issues. Three of the titles (Battlefield 3, Crysis 3, and Sleeping Dogs) had their FPS boosted by runt frames, and the real performance is lower than the Titan's; in each of these cases, frame rates drop by 30 to 50 percent once the runt frames are discounted. The two other titles (Far Cry 3 and Skyrim) simply have issues: Far Cry 3 has massive frame time differences, while Skyrim just has bad scaling.

Techpowerup did an Ares II review a while back, and 1 in 3 titles had Crossfire problems (runt frame boosting wasn't being measured at the time). That ratio lines up well with this PCPerspective article, as 1 in 3 titles had outright issues there as well.

Now, with runt frame boosting measured, half the games had their FPS inflated via runt frames, and none of those titles overlapped with the 2 games that had issues to begin with.

That means only 1 out of 6 titles had no issues. I can understand people supporting the underdog when 1 out of 3 titles had issues, but 5 out of 6?
That's horrendous. Games with CF issues went up from 33 percent to 83 percent.

This completely changes what a consumer really purchased. Would anyone have bought a CF setup knowing 83 percent of their games had issues? I doubt it. The only people who could defend such an issue are people working at AMD, who have more to gain from AMD not losing money than from the gameplay experience.

This isn't shilling. This issue was only brought to light recently, and honestly it requires multiple discussions because it is so huge.

Only 1 out of 6 titles works! This isn't something that we should sweep under the rug and forget about.
 
You need to do a little research as to what it is they're calling a "runt frame".
 
What many people fail to realize is that the GTX Titan is much, much more than a gaming card. Basically, the GTX Titan is a low-end workstation card AND a gaming card, which hasn't been the case with previous generations.

Why can't people get that fact through their heads? :confused:
 
To be a workstation card it would have to let you use Nvidia's CAD drivers, since the accelerated drivers are the main selling point of the actual workstation cards.
 
It tests two 7970 cards at 1050 MHz in 6 major titles: Skyrim, Dirt 3, Crysis 3, Battlefield 3, Sleeping Dogs, and Far Cry 3. Only Dirt 3 ran without issues. Three of the titles (Battlefield 3, Crysis 3, and Sleeping Dogs) had their FPS boosted by runt frames, and the real performance is lower than the Titan's; in each of these cases, frame rates drop by 30 to 50 percent once the runt frames are discounted. The two other titles (Far Cry 3 and Skyrim) simply have issues: Far Cry 3 has massive frame time differences, while Skyrim just has bad scaling.
Only 1 out of 6 titles works! This isn't something that we should sweep under the rug and forget about.

You should know that HardOCP tested most of those titles on AMD cards and did not see the terrible performance you keep talking about. Of course, the guys at HardOCP actually PLAY the games.
 
You should know that HardOCP tested most of those titles on AMD cards and did not see the terrible performance you keep talking about. Of course, the guys at HardOCP actually PLAY the games.

Yeah, so do many other reviewers as well as forum users. Your point?

To be a workstation card it would have to let you use Nvidia's CAD drivers, since the accelerated drivers are the main selling point of the actual workstation cards.

They actually do let you enable a double-precision (DP) mode.
 
You need to do a little research as to what it is they're calling a "runt frame".

It's a partially rendered frame that Fraps counts as a fully rendered frame, artificially inflating the measured performance of the card and causing perceptible microstutter.

If I were an AMD user I'd be rightfully pissed off, as I can't see how this issue is going to be fixed in a way that won't massively impact the card's performance or latencies. There's no way that this is all a marketing lie, and I fail to see how AMD didn't know about it. AMD's excuses in this article just don't add up.
 
Last edited:
You should know that HardOCP tested most of those titles on AMD cards and did not see the terrible performance you keep talking about. Of course, the guys at HardOCP actually PLAY the games.

But they've also consistently said that SLI feels smoother than Crossfire, which is kind of what the frame time stuff is trying to measure.
 
It's a half-rendered frame that Fraps counts as a fully rendered frame, artificially inflating the measured performance of the card and causing perceptible microstutter.

Not trying to correct you, but here is the definition of a runt frame:

By default, the script's definition of a "runt frame" is one that occupies 20 scan lines or less, or one that comprises less than 25% of the length of the prior frame.
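Just to illustrate that rule in a few lines of toy Python (function names and the FPS-discount helper here are my own invention, not the actual analysis script the reviewers use):

```python
# Toy sketch of the runt-frame rule quoted above. A frame's "length"
# is the number of scan lines it occupied on screen before the next
# frame replaced it (vsync off, alternate frame rendering).

def is_runt(scanlines: int, prev_scanlines: int) -> bool:
    """Flag a frame as a runt if it covers 20 scan lines or fewer,
    or less than 25% of the previous frame's length."""
    return scanlines <= 20 or scanlines < 0.25 * prev_scanlines

def effective_fps(frame_scanlines: list[int], raw_fps: float) -> float:
    """Discount runt frames from a raw (Fraps-style) FPS figure.
    The first frame has no predecessor, so it is never a runt here."""
    runts = sum(
        is_runt(cur, prev)
        for prev, cur in zip(frame_scanlines, frame_scanlines[1:])
    )
    kept = len(frame_scanlines) - runts
    return raw_fps * kept / len(frame_scanlines)
```

So a capture where one frame in four is a runt would knock a reported 100 fps down to an effective 75 fps, which is roughly the kind of gap the article is describing.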
 
Soz demowhc, should have been more specific,

"A partially rendered frame"....;)
 
Soz demowhc, should have been more specific,

"A partially rendered frame"....;)

It isn't partially rendered.

It is partially shown, but it was fully formed on the frame buffer.

See, with alternate frame rendering, if the first frame from the first card is followed too quickly by the first frame from the second card, and vertical sync is off, then the system will cut off the display of that first frame and move on to the second. The full work did happen; it just wasn't shown.

Vsync fixes this because it gives you a schedule and it makes it so that each shown frame is a fully formed frame.

Adding a properly timed delay would also fix this. Getting that "properly timed" part down is what will take some time for AMD.
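To illustrate the "properly timed delay" idea (toy code, not AMD's actual driver logic): you just hold each finished frame back until a minimum interval has passed since the previous one was shown.

```python
# Toy sketch of frame pacing: given the times at which the two GPUs
# finish frames, return presentation times spaced at least
# min_interval apart, so no frame is cut off almost immediately.

def pace_frames(ready_times: list[float], min_interval: float) -> list[float]:
    """Delay each frame's presentation so consecutive frames are
    shown at least min_interval seconds apart."""
    shown = []
    last = float("-inf")  # no frame shown yet
    for t in ready_times:
        present = max(t, last + min_interval)  # wait out the gap if needed
        shown.append(present)
        last = present
    return shown
```

With a 16 ms interval, a frame that lands 1 ms after its predecessor gets pushed out to the 16 ms mark instead of flashing for a sliver of the screen. The hard part AMD faces is picking that interval dynamically without adding input lag, which is why it "will take some time."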
 
Yeah, fully rendered on the GPU, so it's not cheating in the sense that the GPU still has to do the work, but it's only partially shown on the display, making it useless and making the perceived frame rate lower than what Fraps reports.
 
You guys.
IT'S RENDERED BUT NOT DISPLAYED.
So they could get 300 fps if they didn't show anything at all.
Makes a nice smooth black on my screen.

That is not a defense of what we have here.
 
You guys.
IT'S RENDERED BUT NOT DISPLAYED.
So they could get 300 fps if they didn't show anything at all.
Makes a nice smooth black on my screen.

That is not a defense of what we have here.



You show a total lack of understanding of what is happening.

edit to correct, should show ... my mind played tricks on my fingers :p
 
Yeah, so do many other reviewers as well as forum users. Your point?

My point is that AMD cards are not as bad as the other screaming threads claim. BTW, check my sig (since that is the thing to do when someone makes a post in these threads); you will see Nvidia. I have actually never owned an AMD/ATI card; I just want to be fair.
 
Obviously we need a 3rd big name manufacturer of video cards to RULE THEM ALL
 
CORRECT. But HardOCP NEVER said a second AMD card was "useless".

And PCPer shouldn't have either. That one inflammatory statement gives everyone who doesn't want to believe this stuff an out: they just claim it is all bias, or that PCPer is a shill, or that the tools were developed by Nvidia so they must be broken/rigged.
 
You have to realize... there are some people in this world who are SUPER loyal to a brand name, and they begin to despise the competition: everything about their brand is superior and the competition stinks. This goes way back. There was a guy on the old Tom's delpee forums named AJTONY who was HARDCORE 3DFX; he was infamous for his hatred of Nvidia... back then ATI wasn't really in the game yet... and when 3DFX folded the man literally went mental and imploded, he was that brand loyal. You see it here too, at varying levels. Only when there is parity between the brands will you see such fanboi-ism. I am anti-brand... I go with price/performance every time. This is my first AMD video card in about 10 years.
 
Well no wonder. I never had to deal with runt frames with my SLI Diamond Monster II 8mb.
 
You have to realize... there are some people in this world who are SUPER loyal to a brand name, and they begin to despise the competition: everything about their brand is superior and the competition stinks. This goes way back. There was a guy on the old Tom's delpee forums named AJTONY who was HARDCORE 3DFX; he was infamous for his hatred of Nvidia... back then ATI wasn't really in the game yet... and when 3DFX folded the man literally went mental and imploded, he was that brand loyal. You see it here too, at varying levels. Only when there is parity between the brands will you see such fanboi-ism. I am anti-brand... I go with price/performance every time. This is my first AMD video card in about 10 years.

I agree 100 percent! It would be a better world if people learned the meaning of "logic"
 
I agree 100 percent! It would be a better world if people learned the meaning of "logic"

Yeah, it would be awesome. I switch back and forth between AMD and Nvidia, and I have both gaming consoles, PS3 and Xbox 360. If you read most forums, this is the equivalent of cheering for both teams at a Yankees vs. Red Sox game. Some people would probably even claim blasphemy.
 
After burning my brother-in-law with a 5970, I will never recommend AMD again. I will pay the price for drivers that work in SLI configurations.
 
I have no horse in this race, or both horses, depending on how you look at it. Actually, I thought AMD WAY overcharged on the 7970 and the 7950. In fact, I originally found the price difference between the 7950 and 7970 very insulting ($500 for the 7970 and $450 for the 7950), as there was a bit of a performance gulf between them at launch. Then the GTX680 came out for the same $500 as the 7970, and they were competitive. BUT the GTX670 (the 7950's analog) was $100 less than the top card, not a paltry $50, and performed so close to the GTX680 that it was almost foolish to pick a GTX680 over a GTX670.

In fact, I think AMD should really start correcting their prices due to the GTX 650 Ti w/boost (probably the worst-named card this generation... only the 7870-LE/Tahiti/Myst/w-boost comes close), as it competes well with the 7850 and decimates the 7790.

However, if you take bundles into account, then nVidia is completely outclassed. I planned on getting Tomb Raider and Bioshock Infinite anyway and wanted to upgrade my system, so I looked at the GTX660 Ti and the Radeon 7930 (what the 7870-LE should have been called). With the bundled games (a $110 value in reality... not the $150 value AMD is trying to peddle), the 7930 effectively cost me only $100 (I picked up the Myst card when it was $209.99), and it performs at an even higher level than the $300 GTX660 Ti. So, for a third of the price of the nVidia card, I have a much better-performing card.

I guess it's how you look at it. I will say that nVidia's bundle deal with the "in-game currency" is a non-deal for me, and I really can't factor it in, because I will guarantee beyond any shadow of a doubt that there is more interest in Bioshock Infinite and Tomb Raider than in the 3 freemium games the in-game currency is for.
 
If all companies were required to charge the same price for goods, we'd have to change our country's name to something more oppressive.
 
One thing that bugs me about nVidia's support is that their drivers were never optimised for older-gen cards. I had a GTX 590, and all recent drivers only mentioned that optimisation was done for the 600 series. When playing Crysis 3, my framerate drops to 15 during cutscenes, which the 600 series never experienced. The only advantage I got from newer drivers is SLI profiles for new games. While older reviews show that the GTX 590 is marginally faster than the 680, I doubt that this is still true today.
Another thing: I laughed when nVidia touted the Titan as having the most advanced fan system for blowing air out the back of the card; didn't AMD already have that a long time back?
BTW, I have a Titan now.
 
A Little Balance to the Threads Posted.

Typically, Nvidia's cards have cost more than their AMD counterparts while having less VRAM and producing lower FPS. With performance per dollar being the metric used in various reviews and guides, how do people feel about paying more for less?

Discuss...

By balance you mean reinterpreting Nvidia's positive attributes as negatives by dismissing certain features as unimportant in the value equation, even though they actually matter.


Reports like these only help us.

So far, the value proposition of each brand is:

AMD
Better price per performance in single cards - usually.
Gaming-related bundles are a crushingly better value.
Multimonitor - AMD is unique in supporting combined displays of different resolutions, and they offer greater flexibility than the standard 3x1 setup.
TressFX

Nvidia
Better driver support - This matters a lot if you buy games at launch; it's almost irrelevant if you wait for steep Steam discount sales.
Crushingly better investment value for multi-GPU setups.
Surround is superior to Eyefinity - It has more features and fewer drawbacks; you have to sacrifice a lot to make use of the features unique to AMD.
PhysX
Lightboost trick - Requires a specific type of 3D panel; image quality suffers unless you manually remove the matte coating.
Adaptive Vsync
 