Scott Wasson Discusses Flawed Benchmarking and New AMD Adrenalin Features

cageymaru

AMD Senior Manager of Product Management Scott Wasson was interviewed by Gamers Nexus, where he explains why FPS metrics and benchmarks are flawed. In the second video, he demonstrates some of the exciting new features in the AMD Adrenalin 2019 Edition 18.12.2 drivers, like setting custom memory timings for GPUs.

AMD's Scott Wasson walked us through the new Adrenalin driver update while the team visited GN HQ.
 
Also coming up, the Oakland Raiders talk about how touchdowns are not a good indicator of football prowess.
To be fair, average framerate numbers are pretty meaningless on their own, and showing an actual plot of FPS over time actually indicates proper smoothness (or the lack of it, in some circumstances) the way [H] does.

That being said, it does come off as "We can't produce a high-end product, so please use this friendlier way to compare our products to others."
 
To be fair, average framerate numbers are pretty meaningless on their own, and showing an actual plot of FPS over time actually indicates proper smoothness (or the lack of it, in some circumstances) the way [H] does.

That being said, it does come off as "We can't produce a high-end product, so please use this friendlier way to compare our products to others."

I'm not saying there's no merit to either argument, just that it shouldn't be coming from a team that hasn't made the playoffs in a decade. There is way more to gaming performance than FPS, and more to football than touchdowns.
 
Anybody with half a brain will tell you that framerate means nothing as long as you are in the FreeSync range, especially when you can do it with minimal frame times. Look up the FreeSync vs. G-Sync comparison that Kyle and the [H] team have done.
Come back here with your thoughts... oh, and rewatch the video, because clearly you have no clue. :D
 
There is only one number I've always cared about: minimum FPS, particularly that spot in your favorite game where FPS gets crushed by copious numbers of characters running around, complex architecture, explosions, and all the eye candy maxed out. Once you've achieved that, the rest is gravy.
 
Anybody with half a brain will tell you that framerate means nothing as long as you are in the FreeSync range, especially when you can do it with minimal frame times. Look up the FreeSync vs. G-Sync comparison that Kyle and the [H] team have done.
Come back here with your thoughts... oh, and rewatch the video, because clearly you have no clue. :D
Not sure if sarcasm is failing to come through due to this being text, because regardless of what AMD's marketing tries to claim, FPS (especially MINIMUM FPS) still matters.
 
Before anyone goes crazy: I want a competitive AMD, but pretending they are competitive by manipulating data and talking down important distinctions is wrong and helps no one; it just deceives casual gamers. We need real competition, not disingenuous praise presenting a false narrative of competition when there currently is none at the high end.
 
I can see where he is coming from when we get into the higher FPS numbers, though: take a 30 FPS increase; the higher the "base" FPS is, the lower the percentage difference for base + 30 FPS, even though it is still a difference of 30 FPS.

30 FPS vs. 60 FPS = 100% increase from the "base" of 30

100 FPS vs. 130 FPS = 30% increase from the "base" of 100

250 FPS vs. 280 FPS = 12% increase from the "base" of 250

etc.

It is all relative.
Personally, as long as I get a smooth frame rate (~30+), I don't care what the difference in speed is, but I don't do bleeding-edge gaming.
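To put rough numbers on the examples above (a quick sketch, not from the video; the base framerates are just the ones listed in the post), here is the same +30 FPS bump expressed both as a percentage and as an actual frame-time saving:

```python
# Sketch: how a fixed +30 FPS bump shrinks with a higher base framerate,
# both as a percentage and as milliseconds of frame time saved.
for base in (30, 100, 250):
    boosted = base + 30
    pct_gain = 100 * 30 / base                  # percentage increase over the base
    ms_saved = 1000 / base - 1000 / boosted     # frame-time saving in milliseconds
    print(f"{base:>3} -> {boosted:>3} FPS: +{pct_gain:5.1f}%, {ms_saved:5.2f} ms less per frame")
```

The 30 -> 60 jump shaves about 16.7 ms off every frame, while 250 -> 280 saves well under half a millisecond, which is why the same 30 FPS delta feels so different at the two ends.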
 
I can see where he is coming from when we get into the higher FPS numbers, though: take a 30 FPS increase; the higher the "base" FPS is, the lower the percentage difference for base + 30 FPS, even though it is still a difference of 30 FPS.

30 FPS vs. 60 FPS = 100% increase from the "base" of 30

100 FPS vs. 130 FPS = 30% increase from the "base" of 100

250 FPS vs. 280 FPS = 12% increase from the "base" of 250

etc.

It is all relative.
Personally, as long as I get a smooth frame rate (~30+), I don't care what the difference in speed is, but I don't do bleeding-edge gaming.
You're talking about frame time variance, and several review outlets already do this.
 
Before anyone goes crazy: I want a competitive AMD, but pretending they are competitive by manipulating data and talking down important distinctions is wrong and helps no one; it just deceives casual gamers. We need real competition, not disingenuous praise presenting a false narrative of competition when there currently is none at the high end.
Definitely. I want to buy the best-performing product I can afford, and AMD hasn't delivered that in, well... ages. It's not that I'm an Nvidia fanboy who only buys Nvidia; hell, even if I were, I'd want them to have some competition (like AMD has finally provided again in the CPU market) just so the pricing and performance aren't as much of a joke as the current 2080 Ti.
 
The FreeSync vs. G-Sync comparison begs to differ. Read my first sentence. Not everyone needs a 2080 Ti to play in 4K... or, in some cases, to burn their place down. :D
 
Funny they show it on that laptop; I have it, and Wattman doesn't work on it at all because it's a mobile part (even though it's a full Vega 56). I have to use Afterburner, unfortunately.
 
Well, then you have to do more research and rewatch the videos. Maybe you'll grasp what I'm talking about; until then, there is no point discussing it any further. Have fun... ;)
 
Well, then you have to do more research and rewatch the videos. Maybe you'll grasp what I'm talking about; until then, there is no point discussing it any further. Have fun... ;)
Research on what? I have a G-Sync display; neither technology has anything to do with shitty minimum framerates.
"Hurr durr, you're wrong but I won't explain why." If you're not going to explain yourself, why bother posting?
 
You are talking about highest framerates, while I and the video in question are talking about frame time. The FreeSync vs. G-Sync comparison is what you need to look for. :) Again, have fun!!!
 
You are talking about highest framerates, while I and the video in question are talking about frame time. The FreeSync vs. G-Sync comparison is what you need to look for. :) Again, have fun!!!
Can you not read? I've typed MINIMUM twice now when referring to shitty framerates. Where the hell do you get the idea that I'm talking about highest framerates? I already addressed the issue of highest FPS and average being meaningless in my first reply to this thread.

AT NO POINT IN THIS THREAD HAVE I SAID A DAMN THING ABOUT HIGHEST FRAMERATE.
 
Not sure if sarcasm is failing to come through due to this being text, because regardless of what AMD's marketing tries to claim, FPS (especially MINIMUM FPS) still matters.

No kidding. Imhotep is one of the most clueless users on this site. Is he just a young fool, or a master puppet master who's trolling us all? Stay tuned for more!

Edit: I saw Kyle's post too late. Watching the video now.
 
No kidding. Imhotep is one of the most clueless users on this site. Is he just a young fool, or a master puppet master who's trolling us all? Stay tuned for more!

Edit: I saw Kyle's post too late. Watching the video now.
So, back to actual discussion of the topic: the guy in the video isn't wrong, it's just that he comes off as shilling for AMD (which he is). All it really boils down to is that average and highest FPS are not the most relevant things in a benchmark. Giving a figure like average frame time isn't any better either; what really needs to happen is posting actual FPS or frame-time plots (which effectively show the same thing) as a line, so you can compare them and see where and how often the drops that pull down the minimum framerate actually occur.
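For what it's worth, you don't need a full review toolchain for that. Given any per-frame time log (the numbers below are made up purely for illustration), a few lines are enough to report the average and flag exactly where the drops happen:

```python
# Sketch with made-up data: report average FPS and flag frame-time spikes in a capture.
frame_times_ms = [16.7] * 120 + [45.0, 16.7, 16.7, 50.0] + [16.7] * 120  # pretend log

avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
spikes = [(i, ms) for i, ms in enumerate(frame_times_ms) if ms > 25.0]  # >25 ms is roughly sub-40 FPS

print(f"average: {avg_fps:.1f} FPS, worst frame: {max(frame_times_ms):.1f} ms")
print(f"{len(spikes)} spike(s) at frame indices:", spikes)
```

The average comes out just under 60 FPS, but the two flagged frames are where the hitching would actually be felt; plotting the same list over time is what the line-graph style of review shows.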
 
So, back to actual discussion of the topic: the guy in the video isn't wrong, it's just that he comes off as shilling for AMD (which he is). All it really boils down to is that average and highest FPS are not the most relevant things in a benchmark. Giving a figure like average frame time isn't any better either; what really needs to happen is posting actual FPS or frame-time plots (which effectively show the same thing) as a line, so you can compare them and see where and how often the drops that pull down the minimum framerate actually occur.

I am at minute 10 and have seen no shilling at all. He's right, and he's been saying that for years. In fact, it was him (at The Tech Report) who broke the "news" to the morons a few years ago with their piece on frames and frame times.
 
I am at minute 10 and have seen no shilling at all. He's right, and he's been saying that for years. In fact, it was him (at The Tech Report) who broke the "news" to the morons a few years ago with their piece on frames and frame times.
The reason I say it comes off as shilling for AMD (which it is, in the sense that it's a GN video with an AMD guy in it, though since the guy is an AMD employee it's not really shilling at that point) is that, regardless of his writing about frame times in the past, the only reason I see for anyone from AMD to be pushing this is that they still don't have anything to compete on the high end. The thing is, except for some of the worst-case scenarios (usually due to the game itself), top-end framerate performance does generally translate to better performance at the bottom end of the chart when looking at minimum FPS. For example, I don't have a 1080 Ti so I can get a billion FPS; I have one so that when the framerates do tank, it's generally not as bad as on a lower-performing card.

Like I said, it's definitely not wrong, it just comes off as not being genuine when it's from a company that hasn't had anything competitive on the high end for a few years now. I would say the exact same thing if it were an Nvidia rep trying to talk up maximum and average FPS.
 
The reason I say it comes off as shilling for AMD (which it is, in the sense that it's a GN video with an AMD guy in it, though since the guy is an AMD employee it's not really shilling at that point) is that, regardless of his writing about frame times in the past, the only reason I see for anyone from AMD to be pushing this is that they still don't have anything to compete on the high end. The thing is, except for some of the worst-case scenarios (usually due to the game itself), top-end framerate performance does generally translate to better performance at the bottom end of the chart when looking at minimum FPS. For example, I don't have a 1080 Ti so I can get a billion FPS; I have one so that when the framerates do tank, it's generally not as bad as on a lower-performing card.

Like I said, it's definitely not wrong, it just comes off as not being genuine when it's from a company that hasn't had anything competitive on the high end for a few years now. I would say the exact same thing if it were an Nvidia rep trying to talk up maximum and average FPS.

I feel the opposite. I salute honest talk from company representatives. So what if an Nvidia engineer is trying to talk up maximum and average FPS? I am all for people hyping high refresh rates. I want 1 kHz movies and games.
 
There is only one number I've always cared about: minimum FPS, particularly that spot in your favorite game where FPS gets crushed by copious numbers of characters running around, complex architecture, explosions, and all the eye candy maxed out. Once you've achieved that, the rest is gravy.

The problem with talking about "frames per second", be it maximum, average, or minimum... is that they are measured over the course of a second. In one second, you could have one 500 ms frame and sixty 8.3 ms frames, and you'd have 61 FPS, yet it'd look more like 2 FPS. Hard stuttering if it repeats.

This is why frame-time analysis is a big deal. Frame times are what we've measured to generate FPS numbers, but actually looking at them tells us how gameplay actually feels.
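Working that exact scenario through (a quick sketch of the example above, nothing more):

```python
# Sketch of the example above: one 500 ms hitch plus sixty 8.3 ms frames in roughly one second.
frames_ms = [500.0] + [8.3] * 60

seconds = sum(frames_ms) / 1000            # ~1 second of capture
fps = len(frames_ms) / seconds             # the headline number: ~61 FPS
worst_equiv_fps = 1000 / max(frames_ms)    # what the worst frame alone feels like: 2 FPS

print(f"{fps:.0f} FPS on paper, worst frame {max(frames_ms):.0f} ms "
      f"(equivalent to {worst_equiv_fps:.0f} FPS)")
```

The per-second counter happily reports about 61 FPS even though half of that second was a single frozen frame, which is exactly the kind of thing only the frame-time data exposes.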
 
The reason I say it comes off as shilling for AMD (which it is, in the sense that it's a GN video with an AMD guy in it, though since the guy is an AMD employee it's not really shilling at that point) is that, regardless of his writing about frame times in the past, the only reason I see for anyone from AMD to be pushing this is that they still don't have anything to compete on the high end. The thing is, except for some of the worst-case scenarios (usually due to the game itself), top-end framerate performance does generally translate to better performance at the bottom end of the chart when looking at minimum FPS. For example, I don't have a 1080 Ti so I can get a billion FPS; I have one so that when the framerates do tank, it's generally not as bad as on a lower-performing card.

Like I said, it's definitely not wrong, it just comes off as not being genuine when it's from a company that hasn't had anything competitive on the high end for a few years now. I would say the exact same thing if it were an Nvidia rep trying to talk up maximum and average FPS.

I just want to point out there is a whole other market that matters as well. AMD may not have the highest-end card, but they have great cards for the rest of the market. I have also seen many sites that like to point out max FPS like it means something, which is part of what started the canned-benchmark craze. Stuff it all in a buffer and watch those max frames soar, but it's nothing like how the game will play. Part of the reason I like the video card reviews here is that they actually play the game like I would. Nothing wrong with reminding people to look at a bigger picture than just one number on a chart.
 
It'd be a lot better if AMD would just make better products to compete with Nvidia on the high end. The last time that happened was with the 7970s, and I ran two in CrossFire. Make a better product and you will not need to expend the cycles spinning how blue is the new red.
 
I understand why frame time is important. I also understand the point in the video about the difference between 200 and 230 FPS being less meaningful. However, I have not had the money in my life to build many systems that run a 2K monitor at 230+ FPS in the games I play. What I'm most interested in is how a card runs the games I want to play at 2K, because I have a 2K monitor.

I agree with what Merc says about minimums being more telling than maximums. If a game can't maintain greater than 30 FPS, I don't want it, even if it maintains consistent frame times and doesn't tear.

I also think it's worth noting that system configuration, game code, and many, many other factors contribute to both FPS and frame times. Vanilla Skyrim probably still stutters on a 2080 Ti even though the FPS is fine. I have seen too many posts from people complaining that "(insert video card here) sucks" and then explaining how they are "running a top-of-the-line (insert shitty CPU here)" with a brand-new "(insert 5,400 RPM shitty drive here)".

If frame rates or frame times get you excited, great!

I think Kyle and his crew have the right of it. The best testing is real-world experience testing: how does my game, or a similar game, play on the card? Is it smooth, or does it suck?
 
I think everything changed for me once I got a pair of CrossFired 7970s. I was running games over 60 FPS at all times, yet somehow it felt like shit.

Lo and behold, frame-time measurements started to become a huge deal, and everyone realized just how important minimum frame rates and frame times are.

The only way I was ever able to make CrossFire feel alright was to set my max frame limit to 59 with no syncing going on. If I enabled V-Sync, the input lag was horrible.
 
I think everything changed for me once I got a pair of CrossFired 7970s. I was running games over 60 FPS at all times, yet somehow it felt like shit.

Lo and behold, frame-time measurements started to become a huge deal, and everyone realized just how important minimum frame rates and frame times are.

The only way I was ever able to make CrossFire feel alright was to set my max frame limit to 59 with no syncing going on. If I enabled V-Sync, the input lag was horrible.
Trying to sort out frame times on 5870 E6 CrossFire is what finally drove me to set up a [H] forum account after lurking for the better part of a decade. :D
 
To be fair, average framerate numbers are pretty meaningless on their own, and showing an actual plot of FPS over time actually indicates proper smoothness (or the lack of it, in some circumstances) the way [H] does.

That being said, it does come off as "We can't produce a high-end product, so please use this friendlier way to compare our products to others."

He wasn't saying that at all. He was just trying to explain why frame time is a better metric for analysis, and that frame time isn't linear with FPS. 30 to 60 FPS is a huge leap in frame time, but you get diminishing returns in theoretical frame time with each doubling of frame rate. Also, real-world frame times on real-world hardware start to suffer and have hiccups that can't be seen on an average-FPS graph.

Going from 144 FPS to 300 FPS is still kinda noticeable, but it's much less so than any previous doubling, which is surprising as it's computationally hugely more expensive. This is because the frame time is much less significantly affected, only a few milliseconds.

That's all the guy was saying. I understand the host kept wanting to drum up some controversy and steer the conversation, but the guy wasn't taking the bait, and sincerely wanted to explain this fact to people.
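For reference, here are the actual frame-time numbers behind those doublings (a quick sketch; the 30-to-480 ladder is just for illustration):

```python
# Sketch: frame time at each doubling of frame rate, and how many ms the next doubling saves.
fps = 30
while fps <= 480:
    ms = 1000 / fps                          # frame time at this rate
    saved = 1000 / fps - 1000 / (fps * 2)    # milliseconds shaved off by doubling from here
    print(f"{fps:>3} FPS = {ms:5.2f} ms/frame (next doubling saves {saved:5.2f} ms)")
    fps *= 2
```

Each doubling halves the frame time, so every successive doubling buys half as many milliseconds as the one before it: 30 -> 60 is worth about 16.7 ms per frame, while 240 -> 480 is worth barely 2 ms.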
 