Time to Take Moore's Law Is Dead's Nvidia Article Seriously...

Since when have "day 1 reviews" stood the test of time when the product had a glaring issue?

I don't care how long you guys have been around or how much you think you know, if you are regurgitating this line, you don't know as much as you think you do.

Let me ask you guys this: is the GTX 970 known for its day 1 reviews or its 3.5GB of VRAM?
 
Type "GTX 970 review" into your favorite search engine and tell me how much you see about 3.5GB issues.
 
Since when have "day 1 reviews" stood the test of time when the product had a glaring issue?

And to further this point: what are people talking about today regarding the Ampere launch? The day 1 reviews, or the capacitor/CTD issues? Not just with GPUs, but with any product...
 
Like I said, you guys know more about this than I do. I am just guessing...... /s
 
Type "GTX 970 review" into your favorite search engine and tell me how much you see about 3.5GB issues.

From the very first link...

Despite the impressive specification and efficiency improvements, the GTX 970 is fighting an uphill battle. Its total throughput of 3.49 TFLOPS is impressive, but AMD’s Radeon R9 290 – a card that’s around £30 cheaper – serves up 4.84 TFLOPS of power.

The GTX 970’s memory situation isn’t great, either. There may be 4GB of GDDR5 soldered to the PCB, but that’s not the whole story: half a gigabyte of that memory is partitioned into a slower section that’s used for less demanding workloads, which means that only 3.5GB is available for high-end games. That’s not the only issue – Nvidia’s initial specification lists said that the GTX 970 had 64 ROPs and 2MB of cache, but it’s actually got 56 ROPs and 1.75MB of cache.
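The quoted numbers are worth doing the arithmetic on. A minimal sketch, using only the figures from the review excerpt above (the variable names are mine, not any official tool):

```python
# GTX 970: advertised spec vs what actually shipped, per the review above.
advertised = {"vram_gb": 4.0, "rops": 64, "l2_cache_mb": 2.0}
actual = {"fast_vram_gb": 3.5, "slow_vram_gb": 0.5, "rops": 56, "l2_cache_mb": 1.75}

# Only the fast partition is useful for high-end games.
shortfall_pct = 100 * (advertised["vram_gb"] - actual["fast_vram_gb"]) / advertised["vram_gb"]
rop_shortfall = advertised["rops"] - actual["rops"]

print(f"Usable fast VRAM: {actual['fast_vram_gb']} GB "
      f"({shortfall_pct:.1f}% less than advertised)")
print(f"ROPs: {actual['rops']} shipped vs {advertised['rops']} listed "
      f"({rop_shortfall} fewer)")
```

So an eighth of the advertised VRAM sits in the slow partition, on top of 8 missing ROPs and a quarter-megabyte less cache than the launch spec sheets claimed.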
 
 
And the next 5 links are all launch day reviews with no mention of it :)

With the exception of the "5 years later" one.

The first link has it, and everyone who knows anything about GPUs is also well aware of it. The 3.5GB fiasco tainted any goodwill the day 1 reviews could ever possibly give it. All your comment shows is how Google's algorithm works, and even with that, the first link still mentions it. What do you think of when someone mentions the 970? How many people do you know that don't have a clue about PC builds who are gonna go and buy one without asking a friend or family member with experience first? This whole BS about day 1 reviews is just that... BS.
 

How many people in 2020 are actually looking to buy, and/or looking up reviews of, the GTX 970? It exists in Google search for historical purposes only.
 
Cool with me. NVIDIA got exactly what they wanted out of this: shining day 1 reviews which will stand for the entire product cycle. And I think MLID said that was specifically part of their agenda.

Except performance seems to be the same with the new driver.
 
Interesting. From the data I have seen, the difference is between 1% and 5%. Hardly seems the same to me. But after reading this thread, I know that if I go back into the review business, I certainly won't need to put as much work into it to satisfy a lot of the readership.
 

I thought that was a given now? Most people seem to just take everything from influencers at face value more than anything now. I find it rather depressing how most reviews seem to be a promotion of the product rather than a review of the product, and people just eat it up. There are very few "reviews" that I truly consider to be solid nowadays. It's very sad.

On topic, I am more curious what the outcry would be like if this were AMD. For some reason, there are way more people who just let Nvidia do whatever they want and don't make a peep about it.
 
Sadly, AMD is not devious enough or coordinated enough, and doesn't have enough of a death grip on its partners, to actually be able to pull something like this off.
 
I would say all reviews should have an update note at the beginning when needed, one that would link or walk someone to something more current and applicable. I don't think reviews should sit on an island by themselves, which in the end can mislead people with obsolete or inaccurate data, findings, problems, etc. Better yet, if more substantial information is known, put up a warning that the information is there for historical perspective and that the current understanding and tests have been updated.

For example, when the RDNA2 cards come out with new findings, the original 3080/90/70/60 reviews should have a link/update to the more applicable findings at the beginning of the first review (at least for a given generation). This would automatically give the site more hits. No one seems to do this. You would think a good programmer or database could do this automatically, or nearly so. It would be more helpful to the viewer and would earn more members or viewers, since the continuity helps the person navigate to the most current information.
 

Is that data public? Anecdotal reports from Ampere owners on Reddit seem to say otherwise. I don't think any sites have benched the new driver yet.

Edit: PCWorld tested HZD on a non-production EVGA card and an FE.

The “fix” results in such a minor speed difference that you won’t practically notice it in the game itself. After repeatedly hammering the benchmark, I managed to get HZD to complete a 1440p run once with the original, borked drivers. It averaged 129 frames per second. With the new, slightly lower-clocked drivers installed, it averaged either 127 or 128 frames per second across three runs. That’s essentially the same performance. When I tested the Nvidia RTX 3080 Founders Edition with the new drivers, there was no performance or power draw change whatsoever, though it didn’t come close to breaking the 2GHz barrier.

https://www.pcworld.com/article/358...-3080-crashes-by-sacrificing-clock-speed.html
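Those PCWorld figures are easy to put in percentage terms. A quick sketch using only the numbers from the quote above (129 fps on the original driver, 127-128 fps on the fixed one):

```python
# Percent slowdown implied by the PCWorld HZD runs quoted above.
old_fps = 129  # original, crash-prone driver
for new_fps in (127, 128):  # fixed driver, across three runs
    delta_pct = 100 * (old_fps - new_fps) / old_fps
    print(f"{old_fps} -> {new_fps} fps: {delta_pct:.2f}% slower")
```

Worst case that is about a 1.6% drop and best case under 1%, which is why "essentially the same performance" is a fair summary, and also why the "~1%" and "1 to 5%" readings in this thread can both point at the same data.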
 
For example, when the RDNA2 cards come out with new findings, the original 3080/90/70/60 reviews should have a link/update to the more applicable findings at the beginning of the first review (at least for a given generation).

Actually that brings up a good point. People will be far more interested in day 1 Navi2x reviews that also include Ampere cards than day 1 Ampere reviews without Navi.
 
And to further this point: what are people talking about today regarding the Ampere launch? The day 1 reviews, or the capacitor/CTD issues? Not just with GPUs, but with any product...

People are talking about the capacitor (non-)issue right now because there is a lot of confusion and misinformation, combined with a lot of upset individuals looking for any and every reason to complain because they haven't copped a GPU yet. Nvidia's latest driver fixed this issue. Pretty much everyone, from the mighty Asus to the shitty Zotac, is reporting the problem is gone and cards are stable and boosting properly. The conversation has already shifted from Nvidia fucking up to JayzTwoCents being a moron who jumped the gun despite having little to no technical understanding of what he was "reporting" on.

In another week, nobody will be talking about it at all. We will all just be back to complaining about how bad we want a 3080 and how annoying it is that it's so hard to get one. Eventually, the product line will flesh out, stock will stabilize, and that conversation dies out as well. There will come a time (soon) when no one is talking about all the complaints levied against Nvidia this week. But the stellar reviews, those aren't going anywhere. They will be a talking point in all future video card reviews, both AMD and Nvidia, for years to come.
 
Some really good testing here. der8auer took off two of the SP-CAPs and replaced them with 20 MLCC caps on a 3080. It could only OC to "+70" on SP-CAPs; after changing over to MLCCs, it could boost to "+100". So certainly the MLCCs add some clock headroom.

 



Yep, the general consensus still seems to be that MLCC caps are better. But I've seen plenty of folks over on Reddit saying their crashing problems are solved on Zotac and Gigabyte cards that are all SP-CAPs. I saw one report of a Zotac card boosting above 2100MHz stable after the driver update.

Bottom line is, from what I can tell, it's fine now. Premium models are shipping with higher numbers of MLCC caps, as they should, but the base model SKUs appear to be totally fine without them. No recalls. No broken cards. No chaos. It's just... fine.

I'm curious to know, though: are there diminishing returns here? If 20 MLCC caps are better, would 40 see even greater gains? Or does it not matter above a certain point? Everyone is acting like Asus is making God-tier cards because they are all MLCC... but do we know if that even matters?
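On the diminishing-returns question: the general electronics answer is that identical capacitors in parallel multiply capacitance and divide parasitic inductance and resistance by the count, which is why a bank of small MLCCs handles fast current spikes better than one large polymer cap. A sketch of that scaling, with illustrative component values (my guesses, not the actual BOM of any 3080):

```python
# N identical caps in parallel: C scales up by N, ESL and ESR scale down by N.
def parallel_bank(n, cap_uf, esl_nh, esr_mohm):
    """Effective values for a bank of n identical capacitors in parallel."""
    return {"C_uF": n * cap_uf, "ESL_nH": esl_nh / n, "ESR_mOhm": esr_mohm / n}

# Illustrative guess at a single small MLCC vs a 10-wide bank of the same part.
single = parallel_bank(1, cap_uf=47, esl_nh=0.5, esr_mohm=5)
bank = parallel_bank(10, cap_uf=47, esl_nh=0.5, esr_mohm=5)
print(single)
print(bank)
```

Each doubling of the count halves the parasitics, so the gains shrink: going 20 to 40 buys half as much as 10 to 20, and at some point board and package inductance dominate anyway. Whether the extra headroom actually matters on Ampere is exactly what der8auer's swap test was probing.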
 
Premium models are shipping with higher numbers of MLCC caps, as they should, but the base model SKUs appear to be totally fine without them. No recalls. No broken cards. No chaos. It's just... fine.

Yeah, it seems all the hardware configs are fine and the issue was an overeager boost algorithm. Now we just need to get some benchmarks with the new driver to see if the restrained boost has any performance impact. The little evidence we have so far points to a problem with short-lived clock spikes and not sustained clocks.
 

Jayz was not the guy who first reported the issue. Respected sources like Igor's Lab and Buildzoid went into detail on what might be the issue. Don't let your hate of one YouTuber cloud your views on the issue at hand. Jayz posted a video on it just like dozens of other YouTubers.

The driver 'fix' is just Nvidia lowering the boost to no longer hit that 2000+MHz number. Nvidia didn't give AIB partners enough time with the cards, and it probably had to do with getting those early Founders reviews out.

* I don't think day 1 reviews are what people most remember about a product (especially if it has issues), as some people on here do, but they do make the product look really good for the first few days, when a lot of people are likely to buy it.
 

A review from years later hardly shows anything; show something from 2014/15 if you want to go down that road.
 

It wasn't discovered right away. All you're proving is that reviews from before the discovery were never changed. That isn't the claim all the MLID worshippers are making. They are saying all anyone will pay attention to is day 1 reviews, and that those will supersede anything that comes after, no matter how negative.

The 970 is known in the community for its VRAM issue, not its glowing day 1 reviews or the value it provided on day 1. Period. How many times are you guys going to shift the goalposts?

If you meant to say day 1 reviews won't themselves be amended, you should've said so from the beginning. But that's not what you guys are saying. You're saying day 1 reviews are all that matter and are what everyone will remember. The 970 is proof that's complete BS.
 

Igor's Lab speculated about what the issue could be. It was a level-headed and well-written article about what could be happening.

Jay took that article and turned it into sensational drivel, presented as fact to an audience that doesn't know any better.

Both were inaccurate, but presentation means a lot.
 
While people may remember this launch, it doesn't help that the YouTubers are covering their own asses BIG time. Of all of them, the one I take the most umbrage with is Gamers Nexus. Even after their video explaining the problems with the cards, they still insisted that the 3080s were good cards. This is AFTER finding out that, because of clocks, these cards need clean power in order to hit the advertised clocks. I say Gamers Nexus because they know better. An 80-series card requiring 100 watts more power to hit its performance numbers after a die shrink isn't something I would write home about. Might as well throw Hardware Unboxed in there too.

This is definitely a wait-and-see situation, to see if AMD screws this up or not. But overall, if AMD hits 90% of the performance at 75% of the power, then it's going to be very clear what's going on. The 3090 is completely safe; it will be at the top. The 3080, well, if that card loses in anything it's going to be in trouble, because AMD, from the process alone, is likely going to come in lower in the TDP department.

Hardware Unboxed / GN dismiss the power increases. But one thing that's going to annoy people, maybe not now but definitely in summer, is when the weather starts to really hit these cards. 350W in anything dissipating that kind of heat is what it is. Winter, you'll love it. Summer? Umm, not so much.
 
I'm looking at the numbers people are reporting on Reddit: if you undervolt the cards by 25%-30%, you are only losing out on 10% performance, in one of the posts. Nvidia is desperately wringing every little drop out of these cards and rushing every step along the way. I get that they want and need to stay ahead, but I think they should have pulled back on the power limits and GPU Boost, leaving it up to the end users to get more if they chose to. Doing so would have given AMD a lower target, I know, but I don't think it's worth it. I'll be playing around with undervolting my card; I won't care too much about the wife's.
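That trade-off is easy to sanity-check as performance-per-watt. A sketch using the figures from the Reddit reports above, with the 3080's official 320W board power assumed as the stock baseline (the ratios don't actually depend on the wattage chosen):

```python
# Undervolting: cut board power 25-30% for a ~10% performance loss,
# per the Reddit reports above. Baseline wattage is an assumption.
stock_perf, stock_power_w = 1.00, 320

for power_cut in (0.25, 0.30):
    uv_power_w = stock_power_w * (1 - power_cut)
    uv_perf = stock_perf * 0.90  # ~10% perf loss
    ppw_gain = (uv_perf / uv_power_w) / (stock_perf / stock_power_w)
    print(f"-{power_cut:.0%} power: {uv_power_w:.0f} W, perf/W x{ppw_gain:.2f}")
```

A 20-29% perf/W improvement for a 10% frame-rate cost is a strong hint the stock cards are pushed well past their efficiency sweet spot.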
 

I mean, are they not good cards? Unless something else crops up, it looks like the latest drivers fix the boosting and crash issues without losing any performance. AMD has been less efficient than Nvidia even while being on a smaller node, so I'm not sure why you expect them to suddenly become more power efficient. Also, you can always buy a different card if you don't like the high power consumption. If you can't keep your room cool enough for these cards, buy a different one or change your cooling (A/C).

So far it looks like MLID was "right" about the limited availability. If you want to give him the AIBs' cards being worse, I guess you can, except they perform as good as or better than the FE, so not really.
 
Looking at the power draw... in all honesty, no. Everything I'm looking at has absolutely nothing to do with AMD. Process shrinks are supposed to bring efficiency. This card brings a 10%-15% efficiency gain over the previous generation. That's not a lot.

Here's the deal. For many, many releases, somewhere around 250W was the limit of where enthusiasts lived. At 100W more, you're talking about above Titan X territory. At that point, someone should at least pause. An extra 100 watts is nothing to sneeze at inside of a case, or a room. And no, I'm not going to get a new A/C unit because of one video card... lol.

Could Nvidia surprise us all with performance that's held back by drivers? Yes. There's a doubling of cores, so the performance has to be somewhere. That's what I'm looking at and waiting for.
 

Complaining about efficiency for a gaming GPU (especially when you only got a small gain) is like buying a Ford Super Duty and complaining about gas mileage.
 
Some may have issues with heat due to their configuration; others have been dealing with way more heat than 350W, as in 2x 1080 Tis or even 2x 2080 Tis, and know how to keep their rig cool. The AMD Vega 64 Liquid Cooled was rated at 345W, and AMD gave you a whole 50%+ power target :D. It would do it! Over 500W! The card was built to handle that amount of power; the cooler was really good up to 450W. You just got virtually nothing out of it for the effort besides a better heater. A 2-slot card with an AIO. Anyway, I just don't think 50W-100W is that significant; yes, for some it would bring them over the top => solved by getting more fans or a better case. The ASUS 480W card -> bring it on.

As for coolers, I think Nvidia did an outstanding job. Gamers Nexus did a good review of the 3080 FE cooler and how it affects the CPU, PCH, RAM, and other stuff. The cooler design is solid, and I think it is a step up from the huge three-fan coolers taking up to 3 slots. A 2-slot cooler giving a 3-slot cooler a run for its money is a good thing.

Here in Florida, central air is good for cooling a room, but start dumping a lot of heat into it and you either freeze the house out and pay huge electrical bills to keep your one room nice enough to use for gaming or whatever else, or you put in an A/C unit, either windowed or an extra one ducted just for that one room. I have an 18,000 BTU unit, lol -> had a lot of crap going on for a while. Still, a 6,000 BTU unit is light, cheap, installs in less than 15 minutes, and will cool most small rooms well for a heat-generating computer. Winter comes around, remove it in less than 15 minutes, except in Florida Winter never comes around. You could even make your own ducting to and from that room if you don't want a window unit due to noise or appearance (I couldn't care less). It would take some effort and some holes, which may not be what one would want. You could just duct through a window with an adapter, which is then easily removed when not needed. I just can't stand a hot room; I can't enjoy VR or anything on the computer unless it is nice and cool as well as dry. If you have extra money, then an air handler with the evaporator/fan in your room or attic and the compressor/condenser outside is the slick way.
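For anyone pricing out A/C for this, the heat-load conversion is simple: essentially all of a card's board power ends up as heat in the room, and 1 W ≈ 3.412 BTU/hr. A minimal sketch (the 500 W figure is just an illustrative whole-rig guess, not a measured number):

```python
# Convert electrical load to the heat an A/C unit has to remove.
def watts_to_btu_per_hr(watts):
    return watts * 3.412  # 1 W ≈ 3.412 BTU/hr

for load_w in (350, 500):  # 3080-class card alone, and a rough whole-rig load
    print(f"{load_w} W ≈ {watts_to_btu_per_hr(load_w):.0f} BTU/hr")
```

So even a full 500 W rig is well under a third of a 6,000 BTU window unit's capacity; the unit just has to remove that heat on top of the room's normal load.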
 
Huh? Not really complaining; I'm underlining the difference between the older models and this one, and that's about it. Using your own analogy: if a Ford Focus was sold as a Ford Focus but upon refresh turned into a Ford Super Duty, then there are going to be questions, especially if on launch the Ford Focus was crashing to the desktop.
 
Hardware Unboxed / GN dismiss the power increases.

Yes, power consumption is off the charts but Ampere is still the most power efficient architecture on the market today. It’s pretty difficult for reviewers to complain about power consumption right now. It would be silly to complain about the “best”.

Now if Navi 2 drops with much better power efficiency you will see opinions about Ampere change very quickly.
 
Some may have issues with heat due to their configuration; others have been dealing with way more heat than 350W, as in 2x 1080 Tis or even 2x 2080 Tis, and know how to keep their rig cool. The AMD Vega 64 Liquid Cooled was rated at 345W, and AMD gave you a whole 50%+ power target :D. It would do it! Over 500W! The card was built to handle that amount of power; the cooler was really good up to 450W. You just got virtually nothing out of it for the effort besides a better heater. A 2-slot card with an AIO. Anyway, I just don't think 50W-100W is that significant; yes, for some it would bring them over the top => solved by getting more fans or a better case. The ASUS 480W card -> bring it on.
This is a little bit more interesting to think about, specifically the 2x scenario. If you look at the card that way, I suppose it doesn't look too bad. It basically runs somewhat like an SLI config at less power. That's a good way to look at it, I suppose.

As for coolers, I think Nvidia did an outstanding job. Gamers Nexus did a good review of the 3080 FE cooler and how it affects the CPU, PCH, RAM, and other stuff. The cooler design is solid, and I think it is a step up from the huge three-fan coolers taking up to 3 slots. A 2-slot cooler giving a 3-slot cooler a run for its money is a good thing.
No problem with the cooler I agree on that front. It's actually pretty damn cool when it comes to that.

Here in Florida, central air is good for cooling a room, but start dumping a lot of heat into it and you either freeze the house out and pay huge electrical bills to keep your one room nice enough to use for gaming or whatever else, or you put in an A/C unit, either windowed or an extra one ducted just for that one room. I have an 18,000 BTU unit, lol -> had a lot of crap going on for a while. Still, a 6,000 BTU unit is light, cheap, installs in less than 15 minutes, and will cool most small rooms well for a heat-generating computer. Winter comes around, remove it in less than 15 minutes, except in Florida Winter never comes around. You could even make your own ducting to and from that room if you don't want a window unit due to noise or appearance (I couldn't care less). It would take some effort and some holes, which may not be what one would want. You could just duct through a window with an adapter, which is then easily removed when not needed. I just can't stand a hot room; I can't enjoy VR or anything on the computer unless it is nice and cool as well as dry. If you have extra money, then an air handler with the evaporator/fan in your room or attic and the compressor/condenser outside is the slick way.
We shall see. I already have a 4u server to cool. I think I'm going to wait. I could always go for a lesser model to be honest.
 
Yes, power consumption is off the charts but Ampere is still the most power efficient architecture on the market today. It’s pretty difficult for reviewers to complain about power consumption right now. It would be silly to complain about the “best”.

Now if Navi 2 drops with much better power efficiency you will see opinions about Ampere change very quickly.
I agree. Not complaining about the best. Just was looking at the power and was scratching my head at the increase. Basically I wasn't planning on it.
 
ROFL!!! How are people not seeing that MLID has been on point? I can't understand this corporate loyalty bullshit. Unless NVIDIA paid me six figures, I wouldn't waste my time defending a multi-billion-dollar company whose MO is to fleece me. Some people are fucking strange.
facts!
 

You aren't using my analogy correctly. I'll help. Using my analogy would be like buying a 2021 Super Duty that can tow more, and is more efficient than the 2019 Super Duty when towing up to the 2019's tow capacity but uses more fuel when you tow at ITS towing capacity, and complaining about it.
 

That is interesting. Can you post the reviews you're seeing? Particularly the ones showing a 5% difference? I'll post the ones I've seen, because the ones I've seen show virtually identical performance. Sometimes there's a variance of 1 FPS, and it's not always in the old driver's favor.
 
Good to see. Hardware Unboxed is my go-to for solid information, and overall he seems to say that the perf difference is negligible, ~-1%.

 