bigdogchris
Fully [H]
- Joined
- Feb 19, 2008
- Messages
- 18,706
First, obviously we all love [H] and they have the best reviews online, no doubt about it. One thing concerned me, though, when I was checking out some of the more recent reviews.
The specific review I'm referring to is the recent GTX260 Black review. The card is considered a 'gamer' mid-level card, yet the testing is being done with extremely high-end processors. I understand this is to eliminate the possibility of a CPU bottleneck and help deliver more real-world performance. But is a $1000+ CPU paired with a $220 GPU in any way a real-world setup, representative of the performance the people who would actually run that GPU are going to see?
I think people who have $1000 CPUs are more likely to be running the best GPU available, probably in SLI/XFire, than running a mid-level card. Yes, it allows the GPU to be shown to its fullest extent, but that doesn't represent the real-world performance the majority of people who buy this card are going to get. People buying this card are more likely to have $150-$300 CPUs, right? Not in all cases, but most. In a way it's similar to a synthetic benchmark: purely the best available throughput of the card, not necessarily matching what you will actually get.
Think of it another way. If you review this card with a $1000 extreme edition Core 2, a $300 Core i7, and a $150 Core 2, you're going to get different results. And at the lower resolutions that people with a GTX260 would actually run games at, the CPU plays a much larger role than it does at, say, 2560x1600, where people would be running high-end GPUs matched with the kind of high-end CPU used in these reviews.
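To make the bottleneck argument concrete, here's a toy model of it: if CPU and GPU work per frame can't fully overlap, FPS is roughly limited by whichever is slower, so the GPU time scales with resolution while the CPU time mostly doesn't. All the millisecond figures below are made-up illustrations, not measured data from any review.

```python
def fps(cpu_ms, gpu_ms):
    """Approximate FPS when the slower of CPU and GPU limits each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical per-frame costs (illustrative only):
gpu_low_res = 8.0    # e.g. 1440x900 -- GPU finishes each frame quickly
gpu_high_res = 25.0  # e.g. 2560x1600 -- GPU is clearly the limiter

fast_cpu = 5.0  # stand-in for a $1000 extreme edition CPU
mid_cpu = 12.0  # stand-in for a $150-$300 CPU

# At low resolution, the CPU choice changes the result a lot...
print(fps(fast_cpu, gpu_low_res))   # 125.0 FPS
print(fps(mid_cpu, gpu_low_res))    # ~83.3 FPS

# ...but at high resolution, both CPUs give the same GPU-bound number.
print(fps(fast_cpu, gpu_high_res))  # 40.0 FPS
print(fps(mid_cpu, gpu_high_res))   # 40.0 FPS
```

So a review run only with the fast CPU shows the 125 FPS case, while a mid-level buyer at a low resolution would see something closer to 83.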
The reason I brought this up is because I have a 3 GHz Core 2 and game at 1440x900. When I look at the reviews, I can't decide whether the card is an upgrade for me, because with the review's CPU alone my FPS would skyrocket even on my current GPU setup. I think that alone is a very legitimate argument about this methodology for gamers like me looking at a mid-level card to match our mid-level CPU.
I think the attitude is right with 'real world' performance over canned benchmarks (except FC2, which we know is actually a legit tool). But all in all, wouldn't the reviews for a mid-level GPU help us mid-level gamers more if the CPUs/motherboards/memory/resolution were what we have? Just like how the guys with the $1000 CPUs and 2-4 GPUs expect them to be benchmarked with hardware they likely own? I understand hardware costs money and money doesn't grow on trees, but it's something to at least think about for the future: match high-end GPUs with high-end CPUs and mid-level GPUs with mid-level CPUs. That way the target audience for each product knows what they are going to get.