What is going on with [H] reviews now leaving value out of them?!

xAlex79

Limp Gawd
Joined
Nov 21, 2010
Messages
179
In the last few reviews you somehow completely leave the value part out of your reviews?

Why is that?

I mean in the Skyrim review you keep saying two GTX 580s in SLI deliver better performance. They do, but at what price?

A single GTX 580 is what, $450~500? How much is a 6970? $300~350? Why is that left out of your reviews? The one thing I liked about you guys is that you always had the "value" metrics along with the apples-to-apples. Where is the value now? Did Nvidia buy you out too? *sigh*

Edit: I mean from a value standpoint, what you should be comparing is 6970 tri-fire versus GTX 580 SLI....

With Anand and Tom being terribly biased and never presenting any value metrics, this was the last place left...
 
Mostly because it's a computer enthusiast site, not a computer value site. But yeah, I've always wanted them to review games with a rig that's 2.5 to 3 years old, so us poor folks get an idea of the gameplay experience on our old rigs.
 
[H] usually adds value to the equation, but you can't expect them to do a single-card, CrossFire, and tri-fire review 3 days AFTER the drivers come out. What, do you think they have a time machine? And if so, can I get in on that action? I wouldn't mind buying some MSFT / Apple stock before it got too pricey :D
 
Mostly because it's a computer enthusiast site, not a computer value site. But yeah, I've always wanted them to review games with a rig that's 2.5 to 3 years old, so us poor folks get an idea of the gameplay experience on our old rigs.

Exactly. As an enthusiast, if my $900 can buy me either three 6970s or two GTX 580s, I want to know which performs better...

When the value of the setup isn't matched it's not much of a comparison.

No one builds a system on hopes and dreams. Everyone has a budget. In the enthusiast market the common limit for GPUs is $1000-ish.

I can't believe I have to make this argument here.... :(
 
[H] usually adds value to the equation, but you can't expect them to do a single-card, CrossFire, and tri-fire review 3 days AFTER the drivers come out. What, do you think they have a time machine? And if so, can I get in on that action? I wouldn't mind buying some MSFT / Apple stock before it got too pricey :D

My problem is not that they didn't test tri-fire. It's that they left out the value metric entirely.

Because dollar for dollar, I am not so sure Nvidia is the better option. It almost never is.
 
I agree.

At least in previous reviews they would mention that the 580 costs quite a bit more (especially in pairs) compared to the 6970.
 
In the last few reviews you somehow completely leave the value part out of your reviews?

Why is that?

I mean in the Skyrim review you keep saying two GTX 580s in SLI deliver better performance. They do, but at what price?

A single GTX 580 is what, $450~500? How much is a 6970? $300~350? Why is that left out of your reviews? The one thing I liked about you guys is that you always had the "value" metrics along with the apples-to-apples. Where is the value now? Did Nvidia buy you out too? *sigh*

Edit: I mean from a value standpoint, what you should be comparing is 6970 tri-fire versus GTX 580 SLI....

With Anand and Tom being terribly biased and never presenting any value metrics, this was the last place left...


Because that review had absolutely nothing to do with value; it was meant to show the performance difference between AMD's top-tier card and Nvidia's top-tier card. There's no reason to go back and test them all; it doesn't take a rocket scientist to figure out the performance of the cards below them.

If you want to see value then go read the primary Skyrim review, which had the entire lineup: GTX 580 SLI / HD 6970 CFX, GTX 570 / HD 6950, GTX 560 / HD 6870.
 
I agree with the OP; this is why reading reviews takes so long. You have to read the review, cross-reference it with another review for a second opinion, then go find all the pricing on your own. Then figure out CFX/SLI PCIe x4/x8/x16 performance, and CPU scaling... LOL! Energy consumption at idle/load/SLI or CFX. Features: 3D Vision vs. multi-monitor on one card.

And to see someone get frame-capped in Rage or NFS The Run with SLI 580s is nuts! A 60 or 30 FPS cap! So after reading all that, analyzing the data is just so messed up. Or Skyrim using like 1-2 threads and capping out where it shouldn't, when Bethesda could have just hired a few better coders to optimize their game code for quads or hexes... hmm. Wow. Life sucks for PC gamers.

We spend 2, 3, or 4 times as much as console gamers, but do we really get that much more excitement in return? I'm starting to doubt one of my favorite hobbies' future in 2012. I hope I'm wrong and there will be surprises in 2012 for us die-hard PC gamers... (shrugs shoulders, cracks neck) I bet I'm correct and there ain't shit for us in 2012 tho, lmao :D Oh well, get an Xbox 360 and play the Forza exclusives, and all the other PC ports TWIMTBP on an Xbox ;) Hey, I made a funny!

Seriously though, this year I'm playing Batman AA and AC (in 3D Vision), NFS S2, NFS HP, NFS The Run, Rage, Skyrim, Mass Effect, Dirt 3, etc. I'm sorta scratching my head wondering why.

Except for Metro 2033 and BF3 I didn't miss much... hmmm, Confucius thoughts now pollute my brain... damn you PC gaming, damn you!
 
Because those reviews had absolutely nothing to do with value; they were meant to show the performance between AMD's top-tier card and Nvidia's top-tier card. Duh!

If you want to see value then go read the other Skyrim review.

... I have, and they only mention the price of the Nvidia cards. What I wanted to see was: 580s in SLI give 86.1 FPS, so 86.1/900 = 0.096 FPS per $; an AMD 6970 CrossFire gives 68.4 FPS, so 68.4/600 = 0.114 FPS per $. Thus AMD is ~19% better value than Nvidia per dollar out of your pocket.

When I say they used to provide a value metric, I mean a performance/$ one.

I can buy three 6970s for the cost of two 580s. And let me tell you, three 6970s just take the lunch money of a 580 SLI setup, period. FOR THE SAME PRICE.

Why on earth would I go out and buy two cards that will perform worse???
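The value metric being asked for here is just frames per second divided by total setup price. A minimal sketch, using the ballpark prices and Skyrim averages quoted in this thread (assumptions, not measured data):

```python
# Ballpark street prices and Skyrim averages as quoted in this thread
# (assumptions for illustration, not official figures).
setups = {
    "GTX 580 SLI": {"price": 900, "fps": 86.1},
    "HD 6970 CFX": {"price": 600, "fps": 68.4},
}

def fps_per_dollar(name):
    """Performance-per-dollar value metric: FPS divided by setup cost."""
    s = setups[name]
    return s["fps"] / s["price"]

for name in setups:
    print(f"{name}: {fps_per_dollar(name):.4f} FPS/$")

# How much more value per dollar the cheaper setup delivers:
ratio = fps_per_dollar("HD 6970 CFX") / fps_per_dollar("GTX 580 SLI")
print(f"HD 6970 CFX gives {ratio - 1:.0%} more FPS per dollar")  # → 19%
```

Running the division rather than eyeballing it shows the CrossFire setup's advantage is closer to 19% than 17%.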
 
The last few years have been tough for me; I got in over my head with too much credit and have been working a second job off and on for as many years. With that said, I have a Q6600 CPU and an 8800 GT video card.

I'm finally to the point where I feel like I HAVE to upgrade; turning everything down as low as it will go in the latest games is not always enough to make the game run fast.

After researching for a few days, I came to the conclusion that my Q6600 will probably be OK for another year or two. Not great, but it's my video card that's really holding me back.

So I started trying to find reviews that show how today's games run with very old technology like an 8800 GT and Q6600. They don't exist, unless you use synthetic benchmarks (which is what I ended up using).

I would like to see a comparison showing the fastest video cards with everything turned up, and a series of different CPUs at different clocks, showing at what point the games become CPU-bottlenecked, and which graphical settings depend more on the GPU or the CPU.

I was playing Civ 5 on my laptop, which has a 1.6 GHz Core Duo CPU and a mobile 8800 GPU, and I had everything turned down all the way. I played that way for months before it dawned on me that I was totally CPU-bound. I turned everything up as high as it would go in the graphics settings and it still played fine; I might have been getting lower FPS, but that hardly matters in a turn-based game. The graphics looked great and the time between turns did not noticeably increase.

I just ordered a Radeon 4890 to replace the 8800 GT in my main rig for $70.00. All those years of paying down credit and not buying ANYTHING, on credit or even cash, paid off. I would just like to be able to make better decisions about what hardware to buy if I can't afford the latest generation. My next upgrade will probably be the video card yet again; it will probably offer me the largest gains for the money.

But like I said, I would really like to see what the slowest CPU is that you can have and still not be bottlenecked in the latest games.

They will turn settings DOWN to make a game become CPU-bottlenecked, but they won't use older CPUs with settings turned up to show when games become bottlenecked.

Also, I would be interested in seeing how many people play at a resolution greater than 1900x. I bought a 42" LCD, which in my opinion gives a much better gaming experience than a 24" or 27" LCD at a higher resolution.
 
The last few years have been tough for me; I got in over my head with too much credit and have been working a second job off and on for as many years. With that said, I have a Q6600 CPU and an 8800 GT video card.

I'm finally to the point where I feel like I HAVE to upgrade; turning everything down as low as it will go in the latest games is not always enough to make the game run fast.

After researching for a few days, I came to the conclusion that my Q6600 will probably be OK for another year or two. Not great, but it's my video card that's really holding me back.

So I started trying to find reviews that show how today's games run with very old technology like an 8800 GT and Q6600. They don't exist, unless you use synthetic benchmarks (which is what I ended up using).

I would like to see a comparison showing the fastest video cards with everything turned up, and a series of different CPUs at different clocks, showing at what point the games become CPU-bottlenecked, and which graphical settings depend more on the GPU or the CPU.

I was playing Civ 5 on my laptop, which has a 1.6 GHz Core Duo CPU and a mobile 8800 GPU, and I had everything turned down all the way. I played that way for months before it dawned on me that I was totally CPU-bound. I turned everything up as high as it would go in the graphics settings and it still played fine; I might have been getting lower FPS, but that hardly matters in a turn-based game. The graphics looked great and the time between turns did not noticeably increase.

I just ordered a Radeon 4890 to replace the 8800 GT in my main rig for $70.00. All those years of paying down credit and not buying ANYTHING, on credit or even cash, paid off. I would just like to be able to make better decisions about what hardware to buy if I can't afford the latest generation. My next upgrade will probably be the video card yet again; it will probably offer me the largest gains for the money.

But like I said, I would really like to see what the slowest CPU is that you can have and still not be bottlenecked in the latest games.

They will turn settings DOWN to make a game become CPU-bottlenecked, but they won't use older CPUs with settings turned up to show when games become bottlenecked.

Also, I would be interested in seeing how many people play at a resolution greater than 1900x. I bought a 42" LCD, which in my opinion gives a much better gaming experience than a 24" or 27" LCD at a higher resolution.

For "gaming" only, save your money and get an i3-2130 w/HT clocked at 3.4 GHz; you will save more than a hundred bucks compared to an i5 or i7, and it will not hold you back in games at all, unless you get a high-end dual or triple GPU setup.

The 2500K and up are overrated for gaming. They are no doubt awesome for productivity, but it really ends there. I almost regret buying my 2600K... it literally never goes above 20% utilization no matter what game I play. More often than not it's floating at about 10%, and I have an OC'd 6990 that often drives upwards of 100 FPS.
 
I agree. CrossFire 6970s cost $600. SLI GTX 570s cost $600.

That's the comparison. I always expect GTX 580s to win, because it will cost $900 to get them in SLI.

A 50% cost increase had better give much better performance!
 
this was the last place left...

for what?

Honest reviews that take value into account. Because it's the only metric that matters.

Everyone looking to buy anything is looking for the best performance their money can afford. Recommending a GTX 580 SLI setup if you have $900 to spend is just downright silly from a value standpoint.

The ONLY way a GTX 580 is a good buy is if you do not want a multi-card setup, because then top performance is the end-all, since the top budget is still under $500.
 
Honest reviews that take value into account. Because it's the only metric that matters.

Everyone looking to buy anything is looking for the best performance their money can afford. Recommending a GTX 580 SLI setup if you have $900 to spend is just downright silly from a value standpoint.

The ONLY way a GTX 580 is a good buy is if you do not want a multi-card setup, because then top performance is the end-all, since the top budget is still under $500.

Value is not the only metric that matters; value is important as well, but honestly, it has been mentioned 50,000 times by [H] reviewers already that the 6970 is considerably cheaper than the 580, and that tri-fire is the same price as 580s in SLI.

On top of that, absolute performance comes before value for me: first I want to know what will give me the best performance in this game, and then I want to know by how much compared to the competition and what the price/performance looks like. If we go by what you're saying, a pair of 6950s is the best choice, or maybe even 6870s?
 
... I have, and they only mention the price of the Nvidia cards. What I wanted to see was: 580s in SLI give 86.1 FPS, so 86.1/900 = 0.096 FPS per $; an AMD 6970 CrossFire gives 68.4 FPS, so 68.4/600 = 0.114 FPS per $. Thus AMD is ~19% better value than Nvidia per dollar out of your pocket.

When I say they used to provide a value metric, I mean a performance/$ one.

I can buy three 6970s for the cost of two 580s. And let me tell you, three 6970s just take the lunch money of a 580 SLI setup, period. FOR THE SAME PRICE.

Why on earth would I go out and buy two cards that will perform worse???

Because in buying the Nvidia cards you get better driver support (working SLI at any major game release, for starters) and you less often have to contend with SLI/CFX microstutter. As an Nvidia SLI user I had to read about microstutter to find out what it was all about; I have yet to experience it in any game. From what I read, tri-CFX is almost needed to make sure you can't detect microstutter in many games, while dual SLI setups work just fine in the vast majority of games. Then you've also got the fact that Nvidia supports their cards on other OSes like Linux, so if you run into problems with community-made drivers you can always use Nvidia's proprietary drivers. I get the value part with AMD cards, but for me personally consistency and reliability are more important, and I would hate to run into problems playing a new game when it comes out, especially after dumping $500+ on video cards, only to have to wait for AMD to fix their damn drivers before something is playable.
 
For "gaming" only, save your money and get an i3-2130 w/HT clocked at 3.4 GHz; you will save more than a hundred bucks compared to an i5 or i7, and it will not hold you back in games at all, unless you get a high-end dual or triple GPU setup.

The 2500K and up are overrated for gaming. They are no doubt awesome for productivity, but it really ends there. I almost regret buying my 2600K... it literally never goes above 20% utilization no matter what game I play. More often than not it's floating at about 10%, and I have an OC'd 6990 that often drives upwards of 100 FPS.

You haven't played BF3 yet then, have you? In benchmarks I have done on my system in multiplayer, my minimum frame rates, as long as I'm only running high settings with no AA, are determined by how fast my CPU is: the higher I overclock it, the higher my min and avg frame rates go. BF3 is the first game that has forced me to upgrade the Core 2 Quad I had been running for many years.
 
Value is not the only metric that matters; value is important as well, but honestly, it has been mentioned 50,000 times by [H] reviewers already that the 6970 is considerably cheaper than the 580, and that tri-fire is the same price as 580s in SLI.

On top of that, absolute performance comes before value for me: first I want to know what will give me the best performance in this game, and then I want to know by how much compared to the competition and what the price/performance looks like. If we go by what you're saying, a pair of 6950s is the best choice, or maybe even 6870s?

So you reckon it's fair to compare a $900 setup to a $600 one, and conclude that Nvidia is better?

Not in my book... and it did not use to be in [H]'s book either, is all I am saying.
 
Because in buying the Nvidia cards you get better driver support (working SLI at any major game release, for starters) and you less often have to contend with SLI/CFX microstutter. As an Nvidia SLI user I had to read about microstutter to find out what it was all about; I have yet to experience it in any game. From what I read, tri-CFX is almost needed to make sure you can't detect microstutter in many games, while dual SLI setups work just fine in the vast majority of games. Then you've also got the fact that Nvidia supports their cards on other OSes like Linux, so if you run into problems with community-made drivers you can always use Nvidia's proprietary drivers. I get the value part with AMD cards, but for me personally consistency and reliability are more important, and I would hate to run into problems playing a new game when it comes out, especially after dumping $500+ on video cards, only to have to wait for AMD to fix their damn drivers before something is playable.

We are not talking about drivers here. I have owned SLI setups myself and had as many problems with them.

There is stuttering with SLI as well; you cannot start picking the games that work better with SLI to make your case.

What I am saying is that comparing a 6970 CFX setup that costs $600 to a 580 SLI that costs $900 does not make sense.
 
So you reckon it's fair to compare a $900 setup to a $600 one, and conclude that Nvidia is better?

Not in my book... and it did not use to be in [H]'s book either, is all I am saying.

Yes, it's OK in the context that it's presented: AMD's best vs. Nvidia's best. Look at the original tri-fire vs. SLI review; they were compared already, and we already know that tri-fire provides more performance for the same price. We don't need to constantly hear that 3 cards are better than 2. There are some situations where 6970 CrossFire is the same or faster than 580 SLI; that is more interesting, IMO.
 
Not to mention power usage and heat output. That certainly factors into my definition of "value".
 
Yes, it's OK in the context that it's presented: AMD's best vs. Nvidia's best. Look at the original tri-fire vs. SLI review; they were compared already, and we already know that tri-fire provides more performance for the same price. We don't need to constantly hear that 3 cards are better than 2. There are some situations where 6970 CrossFire is the same or faster than 580 SLI; that is more interesting, IMO.

Right. As said above, plus how is seeing two 580s constantly beat two 6970s any more interesting? Except for the occasional major fail for Nvidia, when a $300-cheaper setup beats them? Really?

The point of reviews is to help consumers make educated choices based on the performance of graphics cards. Everyone buys with a budget; it's only logical to make comparisons based on price point. Pitting setups with wildly different costs against each other does not make any sense.

If you want to see a 580 SLI in there, you HAVE to pit it against a 6970 tri-fire. If you want to put in a 6970 CrossFire, you have to pit it against a 570 SLI. And so on.
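The price-matching rule being argued for can be sketched as a simple check. The single-card prices below are rough street prices assumed purely for illustration:

```python
# Rough single-card street prices (assumptions for illustration only).
CARD_PRICE = {"GTX 580": 450, "GTX 570": 300, "HD 6970": 300}

def config_price(card, count):
    """Total cost of a multi-GPU config built from identical cards."""
    return CARD_PRICE[card] * count

def price_matched(a, b, tolerance=0.10):
    """True when two (card, count) configs are within `tolerance` of each
    other's total price, i.e. a fair value comparison."""
    pa, pb = config_price(*a), config_price(*b)
    return abs(pa - pb) <= tolerance * max(pa, pb)

# The pairings argued for in the thread:
print(price_matched(("GTX 580", 2), ("HD 6970", 3)))  # $900 vs $900 -> True
print(price_matched(("GTX 570", 2), ("HD 6970", 2)))  # $600 vs $600 -> True
print(price_matched(("GTX 580", 2), ("HD 6970", 2)))  # $900 vs $600 -> False
```

Under these assumed prices, 580 SLI only pairs fairly with 6970 tri-fire, and 570 SLI with 6970 CrossFire, which is exactly the pairing rule stated above.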
 
You haven't played BF3 yet then, have you? In benchmarks I have done on my system in multiplayer, my minimum frame rates, as long as I'm only running high settings with no AA, are determined by how fast my CPU is: the higher I overclock it, the higher my min and avg frame rates go. BF3 is the first game that has forced me to upgrade the Core 2 Quad I had been running for many years.

The i3-2130 is much faster than your Core 2 and clocked higher. I'm sure it can run BF3 fine at 1200p or 1080p.

The guy said he was on a budget and he did not mention productivity. The high-end i3s are perfect for gaming; you benefit much more from spending the extra money on the GPU. Anyway, you are free to disagree with me.
 
The i3-2130 is much faster than your Core 2 and clocked higher. I'm sure it can run BF3 fine at 1200p or 1080p.

The guy said he was on a budget and he did not mention productivity. The high-end i3s are perfect for gaming; you benefit much more from spending the extra money on the GPU. Anyway, you are free to disagree with me.

Benchmarks I did on my 2500K with 2 cores disabled and downclocked to 3.4 GHz resulted in very bad bottlenecking on my system. The game was playable in multiplayer at 1080p, but it had CPU-induced frame rate dips into the low 30s and very apparent stuttering when this occurred.

My benchmarks: http://hardforum.com/showthread.php?t=1654043

A decent quad core is a must in this game if you want to stay around or above 60 FPS in multiplayer, regardless of what GPU you have. If I ran a 4.2 GHz overclock with just 2 cores, performance was much better, with min frame rates at 50 FPS, but you cannot overclock i3s, so this point is moot.
 
Benchmarks I did on my 2500K with 2 cores disabled and downclocked to 3.4 GHz resulted in very bad bottlenecking on my system. The game was playable in multiplayer at 1080p, but it had CPU-induced frame rate dips into the low 30s and very apparent stuttering when this occurred.

My benchmarks: http://hardforum.com/showthread.php?t=1654043

A decent quad core is a must in this game if you want to stay around or above 60 FPS in multiplayer, regardless of what GPU you have.

Those i3s have HT, so they technically have 4 cores... so how would it do with 4 cores at 3.4 GHz, since you cannot OC an i3?
 
Those i3s have HT, so they technically have 4 cores... so how would it do with 4 cores at 3.4 GHz, since you cannot OC an i3?

No, hyperthreading does not make a dual-core CPU technically have 4 cores; quite the opposite, it makes the CPU virtually have 4 cores, i.e. software sees 4 cores, but really it's just 2 paths into each core. This only results in a performance increase in ALU-intensive multi-threaded applications, and even then it will never double performance. Games that rely on the CPU for physics calculations almost purely use the FPU section of a CPU, and hyperthreading makes little to no difference in performance.
 
No, hyperthreading does not make a dual-core CPU technically have 4 cores; quite the opposite, it makes the CPU virtually have 4 cores, i.e. software sees 4 cores, but really it's just 2 paths into each core. This only results in a performance increase in ALU-intensive multi-threaded applications, and even then it will never double performance. Games that rely on the CPU for physics calculations almost purely use the FPU section of a CPU, and hyperthreading makes little to no difference in performance.

OK, well, I will not argue, since I do not have one to test. So in that ONE game, in multiplayer only, it does make a difference.

Otherwise, refer to Tom's review, where it performed on par with even the 2600K at stock.
 
OK, well, I will not argue, since I do not have one to test. So in that ONE game, in multiplayer only, it does make a difference.

Otherwise, refer to Tom's review, where it performed on par with even the 2600K at stock.

If you read my benchmark thread, that review Tom's did, as well as many other sites', is what drove me to do those benchmarks. Tom's, like the vast majority of review sites, made a fatal mistake when they tested CPU performance in the game: they only tested single player, and the level they tested was probably one of the least graphics-intensive, as it was on an aircraft carrier, going down a narrow hallway, then out onto the main deck and into a jet. For the rest of the level you don't do anything but ride as a passenger in the jet, targeting for the pilot. There is very little physics going on there compared to multiplayer, where you've got buildings being destroyed around you and other crazy destruction happening, all of which relies on the CPU.

This Swedish review site did the best CPU benchmarks on the game of any I have found so far: http://hardforum.com/showpost.php?p=1038065637&postcount=15
 
If you read my benchmark thread, that review Tom's did, as well as many other sites', is what drove me to do those benchmarks. Tom's, like the vast majority of review sites, made a fatal mistake when they tested CPU performance in the game: they only tested single player, and the level they tested was probably one of the least graphics-intensive, as it was on an aircraft carrier, going down a narrow hallway, then out onto the main deck and into a jet. For the rest of the level you don't do anything but ride as a passenger in the jet, targeting for the pilot. There is very little physics going on there compared to multiplayer, where you've got buildings being destroyed around you and other crazy destruction happening, all of which relies on the CPU.

This Swedish review site did the best CPU benchmarks on the game of any I have found so far: http://hardforum.com/showpost.php?p=1038065637&postcount=15

Unfortunately the i3 isn't there. Look, I am not saying an i3 is a beast CPU that can handle anything. BF3 multiplayer is probably the most CPU-hungry game ever. If you are on a budget and cannot splurge, it's great value for a budget build, is all I was saying.

Say that $100 you save gets you a 6950 instead of a 6850. It should result in better performance 99% of the time, bar maybe BF3, where you will average 66 FPS, which is not bad for a budget computer.
 
Unfortunately the i3 isn't there. Look, I am not saying an i3 is a beast CPU that can handle anything. BF3 multiplayer is probably the most CPU-hungry game ever. If you are on a budget and cannot splurge, it's great value for a budget build, is all I was saying.

Say that $100 you save gets you a 6950 instead of a 6850. It should result in better performance 99% of the time, bar maybe BF3, where you will average 66 FPS, which is not bad for a budget computer.

I don't disagree; for the vast majority of games a Sandy Bridge i3 is perfectly fine. People looking to buy one for gaming should just know that in BF3 it is more than likely going to bottleneck any decent video card in multiplayer and lead to sub-60 FPS performance at times, regardless of what video settings you run in the game. This is the issue I have with so many review sites: they are not properly testing CPU performance and are misleading the public by telling them all you need is a quad core. If that were true I would still be running my Q8200 at 2.8 GHz and still be perfectly happy with it, because it was fast enough in every game I played until I started playing BF3 multiplayer.
 
I use [H] reviews because the performance they show usually ends up matching my own pretty closely. Some other sites, I have no idea where they get their performance numbers from; they're not what I get, at any rate.

That said, I go by what the numbers say, not [H]'s conclusions, thoughts, or awards about "xyz" product. Those are mostly hit and miss, with more misses lately for me. Like the 580 vs. 6970: the numbers are good, but subjectively, it's not really the comparison.

Just take what you need from it.
 