AMD should be held responsible to CF owners who bought cards based on reviews.

And how do you do that, enable vsync and all your runt frames magically disappear?!

Honestly, I'm not meaning to take the piss, and we NV owners cop it as well as we give it; you AMD owners aren't saints either. But this issue is proven and documented, and not only that, AMD have admitted to it.

There comes a time when you have to admit there's something dicey going on over at AMD; they're a business, they aren't the good guys.

I use D3Doverrider and enjoy 120Hz gaming, runt frames or not I am getting butter smooth performance in most games at 100+ FPS. CFX disabled usually cannot provide me the same performance. I am not pretending I haven't had any issues. However I am happy with the performance I have had the last 13 months with these cards and they are actually getting better with time.
And if you want to talk about AMD vs nVidia fanboys, believe me the nVidia bunch seem to act much more immature with all their flaming... give me a break... I personally have zero loyalty to either company. If/When I look to upgrade this fall or early 2014 I certainly may go nVidia, we will see what 8000 series brings.
 
skyrimfpsfiltered.gif


bf3fps.gif


These are from The Tech Report.

Look at these two games. Notice there are a few sets of numbers: one where CrossFire has high frame rates, bolstered by runt frames.

Then there is another set where the runt frames are not counted, i.e. filtered out (five lines on a screen should not count as a rendered frame). As a result of removing these runt frames from the FPS average, CF scaling drops from 90 or 100 percent down to 10 or 40 percent.
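
The filtering itself is simple once you have per-frame scanline counts from a capture. Here is a rough Python sketch of the idea (my own illustration with made-up numbers and an assumed cutoff of about 21 scanlines; it is not the reviewers' actual tooling):

```python
RUNT_CUTOFF_LINES = 21   # assumed cutoff in scanlines; actual tools may use a different value

def fps_with_and_without_runts(scanlines_per_frame, capture_seconds):
    """Return (raw FPS, FPS with runt frames filtered out) for a capture."""
    total_frames = len(scanlines_per_frame)
    real_frames = sum(1 for lines in scanlines_per_frame if lines > RUNT_CUTOFF_LINES)
    return total_frames / capture_seconds, real_frames / capture_seconds

# Made-up 60-second capture where every other "frame" is a sliver of a few lines:
trace = [540, 8, 530, 12, 520, 6] * 1000
raw, filtered = fps_with_and_without_runts(trace, capture_seconds=60.0)
print(f"raw: {raw:.0f} FPS, runts filtered out: {filtered:.0f} FPS")   # raw: 100, filtered: 50
```

With numbers like these, the raw average looks twice as high as what actually reaches your eyes, which is the kind of gap shown in the graphs above.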

If people had seen these scaling figures in the first place, would they still have purchased a CrossFire setup? That is my question, and I think not.

We shouldn't blame the reviewers if the tools people normally use cannot catch the error. If AMD did this intentionally, they would definitely try to cheat using a method that is hard to detect. Heck, no one trusted HardOCP when they noted that CF just doesn't feel as smooth. That was because HardOCP had a hard time producing proof due to the lack of tools at the time; all they could offer was a qualitative inspection (which many AMD fans disregarded). Now that there are tools that can catch the runt frames that bolster averages, who's to say it wasn't a cheat? AMD's excuse is that they wanted to minimize the latency between frames. But if decreasing latency requires introducing a visual artifact that isn't a real frame, that excuse doesn't hold water.

For the people blaming the reviewers: they are really showing that they are team red. Reviewing cards takes a lot of work and effort, so having to change the testing methodology and build tools to catch such FPS cheats is not a one-website job. It takes a lot of work; simply measuring frame times takes a boatload of effort, which HardOCP has admitted it doesn't have the resources or time to do, particularly to catch runt frames.

I expect most informed buyers to rely on reviews, because we shouldn't have to buy video cards blind. And if there is a cheat present that can't be found until much later, when I can no longer return the cards, and I didn't get what I was expecting, I would be pissed off.

E.g. if I bought a car that was specified and government-tested to get 30 mpg, and I found out the government MPG ratings were falsified because the car company did something special to that particular test car and the actual MPG was 15, I would try to organize a class action lawsuit if the proof was solid enough.

AMD has turned me off PC gaming in general. The last desktop video cards I got were a 4870X2 and a 4890 in trifire. It was such an expensive setup (I bought water blocks too, as these cards were notoriously loud), and in the end it was a complete waste. Trifire didn't function at all, and the whole experience soured my taste for PC gaming, particularly AMD cards. I have never bought a desktop card from either company again. In some games, like Crysis, I couldn't even open the menus because there was so much glitching going on.

This issue shouldn't matter whether you have a preference for team red or green. This is a matter of a company cheating in a benchmark and selling more cards based on those review figures. Should the company that implemented the cheat get away with it, especially when the cheat has affected multiple games and concrete evidence exists that real performance is lower?

If Nvidia did this, people would be pissed (e.g. the Crysis 2 tessellation issue was much smaller than this) and would likely want restitution.

I would hope that people who paid 1,100 dollars for their cards could see that what AMD did was completely wrong and that they deserve to be treated better by the company. You guys deserve better, and if you don't see that, your emotional attachment to AMD (as the lovable, saintly underdog) is clouding your better judgement.
 
I use D3Doverrider and enjoy 120Hz gaming, runt frames or not I am getting butter smooth performance in most games at 100+ FPS. CFX disabled usually cannot provide me the same performance. I am not pretending I haven't had any issues. However I am happy with the performance I have had the last 13 months with these cards and they are actually getting better with time.
And if you want to talk about AMD vs nVidia fanboys, believe me the nVidia bunch seem to act much more immature with all their flaming... give me a break... I personally have zero loyalty to either company. If/When I look to upgrade this fall or early 2014 I certainly may go nVidia, we will see what 8000 series brings.

Valid points NukeDukem,

Bear in mind that we NV users copped a fair amount of flak for our loyalty, especially around the time of ATi's 5xxx series... ;)
 

Ah, interesting, Fooshnik, I missed that. So PCP actually noticed during testing what I suspected and commented on it.

I'm not interested in a flamewar/fanboy debacle; I just find this terribly interesting, if not a little deceptive on AMD's behalf. I remember that back in the 5xxx Eyefinity days there was a massive issue with stuttering while gaming, with performance that was inexplicably odd at times when running CF.
 
Whole, extra frames....Where do the extra frames go when vsync is enabled?

Remember, as far as I understand it, this issue is a whole lot different to microstutter. Microstutter has to do with the timing of the frames being displayed, it has nothing to do with runt (or extra) frames being displayed.



Runt frames are the result of frames being delivered too closely together, so the runt frame only takes up a few pixels worth of the display. In this sense, runt frames are microstutter.

Imagine this is 3 frames on the display during a single refresh/scan cycle. Frame 3 is displayed so quickly after frame 2, it makes frame 2 a "runt frame". There would be a tear line between 1 & 2, and 2 & 3.

11111111111
11111111111
11111111111
22222222222 < runt frame
33333333333
33333333333
33333333333

Vsync should eliminate runt frames, as there is one frame per refresh because it's synced, and each frame is displayed fully.
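
To put rough numbers on that last point, here is a tiny Python sketch (my own illustration, assuming a 60 Hz display and made-up frame completion times; no driver actually works like this): with vsync, a finished frame waits for the next refresh boundary and only one frame can flip per refresh, so every frame that reaches the screen gets a full scan.

```python
import math

REFRESH_MS = 1000 / 60  # ~16.7 ms per refresh at 60 Hz (assumed)

def vsynced_flips(frame_ready_ms):
    """With vsync, each finished frame waits for the next refresh boundary and at most
    one frame flips per refresh, so no displayed frame can become a runt."""
    flips, next_free_slot = [], 0.0
    for t in frame_ready_ms:
        slot = max(next_free_slot, math.ceil(t / REFRESH_MS) * REFRESH_MS)
        flips.append(round(slot, 1))
        next_free_slot = slot + REFRESH_MS
    return flips

# Frames 1 and 2 finish almost on top of each other (8.0 ms and 8.3 ms). Without vsync
# the first would be a runt; with vsync they simply flip on consecutive refreshes.
print(vsynced_flips([8.0, 8.3, 25.0]))   # [16.7, 33.3, 50.0]
```

The side effect is visible in the output: closely spaced frames get pushed onto later refreshes, which is also where the extra vsync input lag mentioned further down the thread comes from.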
 
In some games CrossFire will send two frames to the display at nearly the same time, causing the first to be truncated by the second when vsync is off. With vsync on it's fine. I'm not even going to bother linking the article, as it'll be reposted for the 20th time sometime in the next five minutes. I get nearly double the framerate in Tomb Raider that I do with one card, which is good, as it's unplayable on a single card. If AMD wants to give me money I'm okay with that, but they don't owe me anything, as I got what I paid for.

What you fail to mention after "with vsync on, it's fine" is the fact that turning on vsync adds noticeable latency to the experience (read: input lag).

I have two 7970s, and some games with vsync enabled show noticeable input lag; if I turn it off, it looks stuttery.

Not every game does this, but some do and it's pretty annoying. People need to stop trying to sweep this under the rug like it's not an issue.
 
This is the reason why I don't CF my 7970 and use that rig to run multi-display. Instead, I built a new rig with 680 SLI to run multi-display.
 
Runt frames are the result of frames being delivered too closely together, so the runt frame only takes up a few pixels worth of the display. In this sense, runt frames are microstutter.

Imagine this is 3 frames on the display during a single refresh/scan cycle. Frame 3 is displayed so quickly after frame 2, it makes frame 2 a "runt frame". There would be a tear line between 1 & 2, and 2 & 3.

11111111111
11111111111
11111111111
22222222222 < runt frame
33333333333
33333333333
33333333333

Vsync should eliminate runt frames, as there is one frame per refresh because it's synced, and each frame is displayed fully.

Can you explain it again, but simpler?
 
During each monitor refresh the image is drawn from top left to bottom right, line after line. At 60 Hz this takes ~16 ms. When a new frame is received, the monitor stops drawing the old frame at whatever point it was at during the refresh cycle. This might be halfway down the screen, at say 8 ms, where it then starts to draw the new frame (this is the tearing line). If the frame after that is received too quickly, say 1 ms later, the 2nd frame is only drawn for a few lines before the 3rd frame is received, which fills the rest of the screen.

This image has 3 frames: first is the gray frame, then the pink frame, then the yellow frame. The yellow frame was given to the monitor so quickly that barely any of the pink frame was drawn, making it a runt frame, meaning there was no point in drawing it at all. You can see the tearing lines at the points where the new frames are received.

fr_cf_3.jpg


Also note that the yellow frame will continue to be drawn at the top of the next refresh, and the gray frame was drawn at the bottom of the previous refresh, just so you don't think they are partially wasted too.
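
The arithmetic behind that description is easy to sketch (again, my own rough illustration, assuming a 60 Hz, 1080-line display and an arbitrary cutoff of about 20 lines for calling something a runt): without vsync, a frame owns the scan-out from the moment it arrives until the next frame arrives, so the delivery gap decides how many lines it gets.

```python
REFRESH_MS = 1000 / 60   # ~16.7 ms to scan the whole screen at 60 Hz (assumed)
TOTAL_LINES = 1080       # assumed panel height in scanlines
RUNT_LINES = 20          # arbitrary cutoff for calling a frame a runt

def scanlines_drawn(arrival_ms):
    """Without vsync: each frame is drawn from its arrival until the next arrival
    (or the end of the refresh), so the gap between arrivals sets its share of lines."""
    result = []
    for i, t in enumerate(arrival_ms):
        t_next = arrival_ms[i + 1] if i + 1 < len(arrival_ms) else REFRESH_MS
        gap = max(0.0, min(t_next, REFRESH_MS) - t)
        lines = round(gap / REFRESH_MS * TOTAL_LINES)
        result.append((lines, "runt" if lines < RUNT_LINES else "ok"))
    return result

# Grey arrives at 0 ms, pink at 8.0 ms, yellow only 0.3 ms after pink:
print(scanlines_drawn([0.0, 8.0, 8.3]))
# [(518, 'ok'), (19, 'runt'), (542, 'ok')]  -> pink is the runt, as in the screenshot
```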
 
Actually, here's a good diagram of evenly spaced/timed frames; imagine that between frames 1 & 2 there is a tiny sliver (a very quick runt frame).

FrameInterval_575px.jpg
 
So, it seems to me that this is a timing issue with the workloads of the AFR setup.

Vsync fixes it because vsync gives you a schedule to follow; without vsync they will have to write some kind of average timing analysis to accelerate or delay the workload.

It may be possible, but it will be hard as hell, which explains why they require months for a proper, all-encompassing fix: they need to dynamically time each frame just enough to make it meaningful without affecting the scaling too much.
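
For what a software-side fix might conceptually look like, here is a toy sketch (purely my own guess at the general idea, not AMD's or Nvidia's actual algorithm): track an estimate of the natural frame interval and hold back any frame that would otherwise land almost on top of the previous one.

```python
class FramePacer:
    """Toy frame pacer: delay frames that arrive much earlier than the recent average
    interval would suggest, so an AFR pair isn't dumped on the display back to back."""

    def __init__(self, alpha=0.1, floor=0.8):
        self.alpha = alpha          # smoothing factor for the interval estimate
        self.floor = floor          # present no sooner than floor * average interval
        self.avg_interval = None
        self.last_present = None

    def present_time(self, ready_ms):
        if self.last_present is None:
            self.last_present = ready_ms
            return ready_ms
        interval = ready_ms - self.last_present
        if self.avg_interval is None:
            self.avg_interval = interval
        else:
            # exponential moving average of the observed frame interval
            self.avg_interval += self.alpha * (interval - self.avg_interval)
        # Don't present sooner than roughly the average cadence.
        present = max(ready_ms, self.last_present + self.floor * self.avg_interval)
        self.last_present = present
        return present
```

Even a toy like this shows the trade-off being described: every millisecond a frame is held back is throughput given up and latency added, and a bad estimate either reintroduces the runts or hurts scaling.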
 
AMD should have to give these owners a partial refund or at least some free games, because this issue affects most games.

Agree 100% on the refunds to owners who think this is a significant issue. I also believe AMD should blacklist these customers, making them ineligible for all future promotions and warranties on AMD GPUs. CF has had problematic scaling and specific game issues since its inception, and any consumer sophisticated enough to actually set up a multi-GPU system should have known this before their purchase. The best move for AMD is to pay them to shut their holes, then discourage them from being future customers. AMD and most internet forums would be better off without these few crybabies.
 
It's been out of control for a while now, but lately more and more of it is coming out into the open.

Suddenly, micro-stuttering is a whole new thing that people didn't know about... :p

As much as I'm trying to ignore all these threads... my god... they are popping up every day now and it's just getting annoying.

I buy a video card, it lets me play games, it looks good, it plays well, the end. I much prefer looking at real-world performance than anything else, which is why I use [H] reviews more than anything. That's all I need.
 
On a faster display the "runt frames" would fill more of the screen, but they would still be just as uneven, because this is about the timing between "done frames" on each of the GPUs.

What seems to be happening without vsync is that you are kinda getting the frames in pairs with pauses between each pair, instead of a more continuous stream, so yeah, you do need to add a small delay at the start so that the frames aren't paired up...

Determining that delay in a non-detrimental way for the user will be a bitch to handle.
 
I don't think AMD is cheating, because if 1x 7970 can do say 60fps, then theoretically adding a second could go as high as 120fps with perfect AFR (micro-stuttering aside for now).

Nvidia has hardware-based frame metering. Does AMD? If they don't, then it's probably best they focus on getting something built in with their HD 8000 refresh. All this time we believed CF was some kind of bargain, when in fact it was actually the opposite.
 
During each monitor refresh the image is drawn from top left to bottom right, line after line. At 60 Hz this takes ~16 ms. When a new frame is received, the monitor stops drawing the old frame at whatever point it was at during the refresh cycle. This might be halfway down the screen, at say 8 ms, where it then starts to draw the new frame (this is the tearing line). If the frame after that is received too quickly, say 1 ms later, the 2nd frame is only drawn for a few lines before the 3rd frame is received, which fills the rest of the screen.

This image has 3 frames: first is the gray frame, then the pink frame, then the yellow frame. The yellow frame was given to the monitor so quickly that barely any of the pink frame was drawn, making it a runt frame, meaning there was no point in drawing it at all. You can see the tearing lines at the points where the new frames are received.

fr_cf_3.jpg


Also note that the yellow frame will continue to be drawn at the top of the next refresh, and the gray frame was drawn at the bottom of the previous refresh, just so you don't think they are partially wasted too.

It's the reverse of how you are describing it.

The yellow frame is the first frame; it was rendered in its entirety (hence it reached the bottom of the screen). The pink frame then came in and got halfway down the screen, but then the grey frame was received.

The display ditched the pink frame halfway through drawing it (leaving the bottom half of the display showing the yellow frame) and started to draw the grey frame.

People are asking why VSYNC fixes it; surely that's because it caps the framerate to the refresh rate of the monitor? Frames are only generated at a rate that means they ALL get drawn in their entirety 100% of the time. This is also why tearing is fixed. At least that is my understanding. Could be way off.
 
That doesn't make sense, because the screen refreshes from top left to bottom right. The order is grey, pink, yellow.

The yellow frame continues onto the next refresh as well, at the top. Remember this isn't like a film reel; these are the timings at which the frames are drawn, and they don't really have a "top" and "bottom" like a film cell, but rather are only points in time.

Look at this diagram again: the yellow frame in the screenshot above is like the red frame in the diagram (frame 2).

FrameInterval_575px.jpg
 
I never really thought there was an intentional campaign against AMD - potentially initiated by nVidia - until I saw these two threads side by side on this forum:

"AMD Crossfire a scam - Almost no benefit over single card"

"AMD should be held responsible to CF owners who bought cards based on reviews."


Thank you, nVidia; now I know to take any AMD bashing with a grain of salt and rely only on my own personal experience.
 
I never really thought there was an intentional campaign against AMD - potentially initiated by nVidia - until I saw these two threads side by side on this forum:

"AMD Crossfire a scam - Almost no benefit over single card"

"AMD should be held responsible to CF owners who bought cards based on reviews."


Thank you, nVidia; now I know to take any AMD bashing with a grain of salt and rely only on my own personal experience.

You only now learned to take bashing with a grain of salt? Also you follow up this revelation with blaming this all on Nvidia, which itself is baseless bashing. Thumbs up, you've sunk to everyone else's level! =D
 
During each monitor refresh the image is drawn from top left to bottom right, line after line. At 60 Hz this takes ~16 ms. When a new frame is received, the monitor stops drawing the old frame at whatever point it was at during the refresh cycle. This might be halfway down the screen, at say 8 ms, where it then starts to draw the new frame (this is the tearing line). If the frame after that is received too quickly, say 1 ms later, the 2nd frame is only drawn for a few lines before the 3rd frame is received, which fills the rest of the screen.

This image has 3 frames: first is the gray frame, then the pink frame, then the yellow frame. The yellow frame was given to the monitor so quickly that barely any of the pink frame was drawn, making it a runt frame, meaning there was no point in drawing it at all. You can see the tearing lines at the points where the new frames are received.


Also note that the yellow frame will continue to be drawn at the top of the next refresh, and the gray frame was drawn at the bottom of the previous refresh, just so you don't think they are partially wasted too.

They want the tearing evenly spaced out.
So instead of having those 3 frames overlapping near the top, they should be evenly spaced, taking a third of the screen each, giving you a tear a third of the way up and a third of the way down.
No thanks, I'll just take vsync.
 
I never really thought there was an intentional campaign against AMD - potentially initiated by nVidia - until I saw these two threads side by side on this forum:

"AMD Crossfire a scam - Almost no benefit over single card"

"AMD should be held responsible to CF owners who bought cards based on reviews."


Thank you, nVidia; now I know to take any AMD bashing with a grain of salt and rely only on my own personal experience.

The two threads are clearly bait, with terrible attempts to paint AMD in a bad light. I agree; these threads offer no insight into the problem that AMD is currently fixing (and has mostly fixed, too). In fact, I think there is more misinformation than actual good information.

@demohwc - I remember seeing that diagram somewhere; I was looking for it the other day when I saw the original thread and totally forgot about it when I got to work. I think it's a great way to represent the problem that is currently going on with Radeon cards.
 
Yeah, it seems to show what's going on well.

They want the tearing evenly spaced out.
So instead of having those 3 frames overlapping near the top, they should be evenly spaced, taking a third of the screen each, giving you a tear a third of the way up and a third of the way down.
No thanks, I'll just take vsync.

Many people don't use vsync because it introduces a number of issues. I play a lot of online FPS and vsync isn't an option. Tearing can be managed or minimised without the use of vsync, btw.
 
Yeah, it seems to show what's going on well.



Many people don't use vsync because it introduces a number of issues. I play a lot of online FPS and vsync isn't an option. Tearing can be managed or minimised without the use of vsync, btw.

Yes, the output lag is very real.

But there are some new tricks to reduce it: in BF3, setting the in-game cap to 59 while also having vsync on gives you no noticeable tearing while seriously reducing the output lag; I was pleasantly surprised.
Using the AB frame cap set to 59 didn't have the same effect in BF3 (I had to use the in-game cap), but for some games the AB frame cap set to 59 does work.
Vsync output lag is the lesser of two evils for me personally.

I don't use RadeonPro, which seems to have more options in that field.
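
The cap itself is conceptually simple. A toy sketch of what a limiter does (the `render_frame` and `keep_running` callables are placeholders, and real limiters such as the BF3 in-game cap or Afterburner's hook in at the engine or driver level rather than like this): by never starting the next frame sooner than 1/59 s after the last, the game stays just under the refresh rate, so frames stop queueing up behind vsync, which is roughly where the extra lag comes from.

```python
import time

CAP_FPS = 59
MIN_FRAME_TIME = 1.0 / CAP_FPS   # ~16.95 ms budget per frame

def capped_loop(render_frame, keep_running):
    """Toy 59 fps limiter: sleep away whatever is left of the 1/59 s budget each frame."""
    last = time.perf_counter()
    while keep_running():
        render_frame()
        elapsed = time.perf_counter() - last
        if elapsed < MIN_FRAME_TIME:
            time.sleep(MIN_FRAME_TIME - elapsed)
        last = time.perf_counter()
```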
 
Agree 100% on the refunds to owners who think this is a significant issue. I also believe AMD should blacklist these customers, making them ineligible for all future promotions and warranties on AMD GPUs. CF has had problematic scaling and specific game issues since its inception, and any consumer sophisticated enough to actually set up a multi-GPU system should have known this before their purchase. The best move for AMD is to pay them to shut their holes, then discourage them from being future customers. AMD and most internet forums would be better off without these few crybabies.

Blacklist customers because they stood up for themselves? That's insane. It would make AMD look ten times worse and ensure they end up looking like the bad guy. Checking reviews across multiple websites is already tech-savvy enough that people shouldn't be blamed for their purchasing decisions, especially if a cheat is present that review websites cannot detect with normal tools, only with visual inspection.

CrossFire scaling problems are detectable when runt frames are not boosting the average. But with runts boosting the numbers, the real CF scaling is much harder to measure using tools such as FRAPS; it was first noticed by users through visual inspection, and reviewers could only report it as an observation. The problem is that most reviews simply comment on the FPS recorded in FRAPS.

You already admit that CF has a lot of issues in your post. Do you think runt frames boosting the FPS of games like BF3, where scaling is supposed to be fantastic (based on reviews), is acceptable?

Customers already bought CF setups in good faith, even with the issues they knew about from reviews from the get-go. Having a new issue pop up that potentially affects the remaining, supposedly working games (and this isn't insignificant; it's an issue that cuts the number of real frames roughly in half) is completely unacceptable. Customers were already willing in good faith to ignore or accept the detectable problems found in reviews; having to deal with another widespread issue that is hard to detect using FRAPS should be unacceptable.

How can you possibly think this is okay? Both issues combined render a CF setup a complete waste, considering you have to double your purchase price to receive an issue-free experience in only a handful of games. With all this information beforehand, and without an emotional attachment, people wouldn't have bought a second card for CrossFire, or would have gotten an Nvidia setup if they wanted a dual-card experience.


If Nvidia did the same thing, there would be 10 times the uproar.
And with Nvidia you wouldn't have nearly as many issues in the first place, as reviews have shown. Letting people know about this issue, when it is this significant, honestly benefits you AMD purchasers too. Do you think AMD is going to sit idle on this issue if people are noticing and talking about it? All this fanboy cover-up and ignoring of the issue has gotten CF to the sorry state it is in today. It is time people stopped ignoring the issue so that AMD takes notice and makes fixing it a priority. It took the same kind of reaction for AMD to fix its Enduro and GPU utilization issues (or at least attempt to).

Do you expect people not to react at all, so the problems never get fixed?

AMD has already admitted that this issue is real, so there is no denying it exists. If AMD gave people a portion of their money back, or a free game, it would restore some of the goodwill lost over this issue. This thread should be about CF owners being compensated for an unacceptable gaming experience.
 
Runt frames are the result of frames being delivered too closely together, so the runt frame only takes up a few pixels worth of the display. In this sense, runt frames are microstutter.

Valid point demowhc. How do you explain the apparent loss of any performance gain by adding a second card?

Surely even with perceivable microstutter you'd still see an actual increase in performance?
 
Valid point demowhc. How do you explain the apparent loss of any performance gain by adding a second card?

Surely even with perceivable microstutter you'd still see an actual increase in performance?
What? You mean all those extra settings brent enabled in the hardware reviews aren't because of extra performance?
 
Valid point demowhc. How do you explain the apparent loss of any performance gain by adding a second card?

Surely even with perceivable microstutter you'd still see an actual increase in performance?

I don't know, and that's a good question. I can only guess there are instances of groups of frames, say three, delivered in rapid succession?

And yes, I would have thought so too, flu!d; I have always seen a perceivable increase in performance with the three CFX systems I've owned. I don't buy the claims that it's useless, but it's no secret that SLI feels smoother at equal or even lower FPS. This goes a long way toward explaining that.

Yes, the output lag is very real.

But there are some new tricks to reduce it: in BF3, setting the in-game cap to 59 while also having vsync on gives you no noticeable tearing while seriously reducing the output lag; I was pleasantly surprised.
Using the AB frame cap set to 59 didn't have the same effect in BF3 (I had to use the in-game cap), but for some games the AB frame cap set to 59 does work.
Vsync output lag is the lesser of two evils for me personally.

I don't use RadeonPro, which seems to have more options in that field.

Yes, I know; I've been using a 59 fps cap + vsync since my first CFX system back in '06 for single-player games, and I still do on my 1440p 60 Hz display with SLI. It's still not an option for online FPS for me, though, but it's great for SP games.

For online games I have used a 125 or 250 fps cap for about a decade at 60 Hz, because it has a lot of benefits, which include minimising tearing. I've been using a 120 Hz display for the last three years for online FPS and still use those caps (125 or 250); vsync isn't needed at all, imo.
 
A lot of reviews have called out AMD on their crossfire performance before, it's just gotten more attention than usual lately. If you didn't do any research on the product you are buying, why should AMD be held responsible? I don't get it...
 
For online games I have used a 125 or 250 fps cap for about a decade at 60 Hz, because it has a lot of benefits, which include minimising tearing. I've been using a 120 Hz display for the last three years for online FPS and still use those caps (125 or 250); vsync isn't needed at all, imo.

A lot of people complain about vsync input lag with Counter-Strike and BF3. I use Radeon Pro's dynamic framerate control and have never experienced it. This is at 1600p, which unfortunately won't be available at 120 Hz anytime soon. At 120 Hz the maximum lag should be about 16 ms, with 8-10 ms typical, which isn't bad considering a good human response time is around 226 ms.
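
For reference, the numbers being quoted work out roughly like this (my own back-of-the-envelope arithmetic, assuming double-buffered vsync where a frame can wait up to about one extra refresh on top of the scan-out):

```python
refresh_hz = 120
refresh_ms = 1000 / refresh_hz        # ~8.3 ms per refresh
worst_case_ms = 2 * refresh_ms        # ~16.7 ms if a frame just misses a vblank
typical_ms = refresh_ms               # ~8.3 ms on average, in line with the 8-10 ms figure
print(round(refresh_ms, 1), round(worst_case_ms, 1), round(typical_ms, 1))   # 8.3 16.7 8.3
```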
 
I can feel input lag with vsync at 120 Hz; as I said, it's not an option. There's no need for it at 120 Hz either, especially if you can maintain a solid 125 or 250 fps frame cap. There are 1440p 120 Hz displays too, btw.

Radeon Pro's dynamic frame rate control is just an FPS limiter btw, it's not vsync.
 
I use it with vsync or I get tearing. Newer games won't keep frame times under 16ms even with two cards.
 
I don't know, and that's a good question. I can only guess there are instances of groups of frames, say three, delivered in rapid succession?

And yes, I would have thought so too, flu!d; I have always seen a perceivable increase in performance with the three CFX systems I've owned. I don't buy the claims that it's useless, but it's no secret that SLI feels smoother at equal or even lower FPS. This goes a long way toward explaining that.



Yes, I know; I've been using a 59 fps cap + vsync since my first CFX system back in '06 for single-player games, and I still do on my 1440p 60 Hz display with SLI. It's still not an option for online FPS for me, though, but it's great for SP games.

For online games I have used a 125 or 250 fps cap for about a decade at 60 Hz, because it has a lot of benefits, which include minimising tearing. I've been using a 120 Hz display for the last three years for online FPS and still use those caps (125 or 250); vsync isn't needed at all, imo.

I agree; I did the 120 fps and 240 fps caps on COD4 and the tearing was minimal. I was running 4x single-slot 3870s back then, but I'm fine with a 59 fps cap + vsync even for online.
 
You only now learned to take bashing with a grain of salt? Also you follow up this revelation with blaming this all on Nvidia, which itself is baseless bashing. Thumbs up, you've sunk to everyone else's level! =D

Actually, yeah, I only learned this just now! I always thought that bashing was based on a legitimate reason! Now I see that, possibly, a lot of bashing that I see isn't based on anything legitimate but something that's pure malice with no constructive intent.

Occam's razor would say that the explanation making the least assumptions is the most probable. Therefore, is it not reasonable to suspect that the one initiating the malice would be AMD's top competitor? Seems the simplest and most probable source to me. Don't you think?
 

If Nvidia did the same thing, there would be 10 times the uproar.

Nvidia have done it the same way for years. Are you new to video cards? It's only with the GTX 6xx series that Nvidia implemented frame metering to decrease microstuttering with SLI. And it comes with a tradeoff (though the tradeoff is justified in my opinion):

Frame metering sounds like a pretty cool technology, but there is a trade-off involved. To cushion jitter, Nvidia is increasing the amount of lag in the graphics subsystem as it inserts that delay between the completion of the rendered frame and its exposure to the display. In most cases, we're talking about tens of milliseconds or less; that sort of contribution to lag probably isn't perceptible. Still, this is an interesting and previously hidden trade-off in SLI systems that gamers will want to consider.
http://techreport.com/review/21516/inside-the-second-a-new-look-at-game-benchmarking/11

AMD is taking a page out of Nvidia's book here and is implementing a choice to delay frames this summer. But Nvidia started with this on their 600 series, so the big stink you are trying to create about this is kinda stupid, unless you want to hold Nvidia responsible to all SLI customers pre-600 series? Don't forget that Tom Petersen from Nvidia tells/admits to TR in the link above that this was an issue with SLI and explains how they are trying to solve it with frame metering, in case you want to deny that it was an issue before frame metering.
 
Actually, in the article you linked it says Nvidia has been implementing frame metering since the 8800 GTX.

Great that AMD will offer a choice; I think that's excellent and hope NV follows suit.
 
Actually, in the article you linked it says Nvidia has been implementing frame metering since the 8800 GTX.

Great that AMD will offer a choice; I think that's excellent and hope NV follows suit.

Capabilities have been there since G80, but hardware-based frame metering was introduced with the 600 series/Kepler:
Kepler introduces hardware based frame rate metering, a technology that helps to minimize stuttering.
http://www.geforce.com/whats-new/articles/article-keynote

Microstuttering has been an issue for some (not all) since the beginning of multi-GPU. Many GeForce owners have chosen not to SLI because of microstutter.

Yes, it's good that AMD will offer a choice. For some, input lag is always a problem, and for others it is only a problem in competitive games. :)

I am happy that there is a lot of focus on smoothness in games, since it's one of the most important factors in my opinion. But there is a lot of troll bait coming along with it, which is bad and misleading.
 