Radeon HD 2900 XTX: Doomed from the Start

I never did either, but we were all hoping it was going to lead to a better market with more competition and superior products. As it stands right now, consumers could end up losing what little competition there was in the graphics and processor markets. I'm wondering more and more if the merger worsened the R600 and Barcelona delays. Compound that with an unforeseen glitch in the choice to use GDDR4 memory that looks like it will cause yet another delay, and this year must rank as one of the worst in both AMD's and ATI's histories. Everyone, including users who purchased NVidia products, should hope to God these benchmarks are not representative of the final products, or we're all going to be spending a lot more for our future systems, and I'm not only talking about video cards.

I agree. If these numbers are real and if R600 is not what everyone was expecting it to be, I really hope that at least Barcelona is. I would like to see AMD beating Intel with it. At least, that would give AMD a way to get back on its feet and regain market share.
 
Everyone, including users who purchased NVidia products, should hope to God these benchmarks are not representative of the final products, or we're all going to be spending a lot more for our future systems, and I'm not only talking about video cards.

QFmfT

And I've bought 3 Nvidia products in the past two years (not all for one system) and was hoping to finally have some red blood in the house to run me some DC, à la Folding@Home.
 
Well more fps now = more fps later

So say you're playing Oblivion now @ 150fps on the 8800GTX, and the ATI card runs 200fps; in a year, maybe it ends up being the 8800GTX @ 30fps while the ATI 2900 is at 50fps. That's why more fps now generally means more fps later.
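
To make the math behind that theory explicit, here's a toy Python sketch using the made-up numbers from the post above (nothing here is measured):

    # Toy model of "more fps now = more fps later": assume future games are
    # simply N times heavier, and each card's throughput scales down by N.
    # All numbers are the hypothetical ones from the post, not benchmarks.

    def projected_fps(fps_today, workload_factor):
        """Naive projection: divide today's fps by how much heavier games get."""
        return fps_today / workload_factor

    fps_gtx, fps_ati = 150, 200   # hypothetical Oblivion fps today
    factor = 5.0                  # assume games get 5x heavier in a year

    print(projected_fps(fps_gtx, factor))  # 30.0
    print(projected_fps(fps_ati, factor))  # 40.0

Worth noting: under strict proportional scaling the 200fps card lands at 40fps, not 50. The 50fps figure only happens if the faster card also ages better, so the theory as stated assumes the newer architecture holds up at least proportionally in future workloads.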
 
Sounds like a plausible theory. Personally, at this point I'm just waiting to see the final benchmarks, as I think most people here are.

On another note, how are you getting four AIWs in the same box? Are two of them PCI, or some kind of PCIe AGP hybrid board? Sounds interesting, I'd be curious to see that running. I always kind of assumed that even if you could get them all working on the hardware level the driver would freak out. Pretty impressive though.
sorry I wasn't clear...
should have read: I've owned an AIW 9600Pro AGP, an AIW X800XL AGP (downclocked whitebox version without remote), and recently an AIW X1900 PCIe. I would like to own an AIW 2900 if they change their minds about cancelling the AIW series.
Hey Sherlock, try reading: they say it is an OEM XTX and not a final retail board.

It has the 1GB of GDDR4, so it is not an XT.

Here is what likely happened. There was meant to be a significant clock speed delta between the 1GB XTX and 512MB XT. Most likely 750MHz (or a bit higher, but still not enough to catch GTX) is the fastest they can run R600 reliably. It turns out this is only good enough to challenge the GTS. So the XT gets the 750 MHz speed and the XTX is back to the drawing board to await a respin with much higher clock speed or more execution units to try and catch GTX.
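
A back-of-the-envelope check on that theory, assuming (naively) that performance scales linearly with core clock and that the GTX leads GTS-class performance by roughly 25-30%. Both of those are assumptions for illustration, not specs:

    # If a 745 MHz R600 (the XT) only challenges the GTS, how high would the
    # same chip need to clock to catch a GTX? Assume, very naively, linear
    # scaling with core clock and a 25-30% GTX lead over GTS-class cards.

    xt_clock = 745  # MHz, the XT's reported core clock
    for gtx_lead in (1.25, 1.30):
        needed = xt_clock * gtx_lead
        print(f"If GTX is {(gtx_lead - 1) * 100:.0f}% ahead -> need ~{needed:.0f} MHz")
        # ~931 MHz and ~968 MHz respectively

Either estimate lands well past what the silicon reportedly runs reliably, which is consistent with the respin theory above.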
DailyTech has already overclocked the RETAIL 2900XT to 845MHz core and 995MHz (1.99GHz effective) memory...
http://www.dailytech.com/Overclocking+the+R600/article7044.htm
In that article he mentions he is using a factory overclocked GTX at 650MHz core (reference 575MHz) and 2GHz mem (reference 1.8GHz). That looks suspiciously like the $940 BFG Watercooled 8800GTX.:rolleyes: Are there any cards cheaper than that running at those speeds?
http://www.newegg.com/Product/Product.aspx?Item=N82E16814143085
And that OVERCLOCKED XT came damn close to that factory OC GTX in 3DMark06... I wonder why he didn't post any other benchmarks.:confused: :rolleyes:
Then again DailyTech's credibility is questionable ATM (no proof, no screenshots), so who knows if they actually overclocked the XT to those speeds.:rolleyes:
If the XTX is a better-binned chip than the XT, then what's keeping ATI from clocking the XTX around 800-825MHz at stock and letting OverDrive take it to 850+MHz?
Surely 800+MHz is enough to compete with a stock-clocked 8800GTX.:confused:
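
For what it's worth, the overclocking headroom implied by the clocks quoted in these posts (all figures taken from the posts themselves, not independently verified) works out as follows:

    def oc_percent(stock, oc):
        """Percent gain from stock to overclocked."""
        return (oc / stock - 1) * 100

    # 2900 XT per DailyTech: 745/1600 stock -> 845/1990 overclocked
    print(f"XT core: +{oc_percent(745, 845):.1f}%")    # +13.4%
    print(f"XT mem:  +{oc_percent(1600, 1990):.1f}%")  # +24.4%

    # Factory-OC 8800 GTX cited above: reference 575/1800 -> 650/2000
    print(f"GTX core: +{oc_percent(575, 650):.1f}%")   # +13.0%
    print(f"GTX mem:  +{oc_percent(1800, 2000):.1f}%") # +11.1%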
 
That is really, really sad. This is not good for competition or for AMD. Come on AMD, you can do better. :(
 
Well, this is not AMD, it's still ATI all the way. If it's true it's so sad... :(
 
In that article he mentions he is using a factory overclocked GTX at 650MHz core (reference 575MHz) and 2GHz mem (reference 1.8GHz). That looks suspiciously like the $940 BFG Watercooled 8800GTX.:rolleyes: Are there any cards cheaper than that running at those speeds?

What are you getting at? Who cares what the card is? You don't need watercooling to get those speeds. If they could have said the name, they would have. They didn't say they slid over to the store and picked up that GTX.

Then again DailyTech's credibility is questionable ATM (no proof, no screenshots), so who knows if they actually overclocked the XT to those speeds.:rolleyes:

Where did that come from? Since when has DailyTech been known to just gin up benchmarks? What would that prove? Did they also make up the part about AIB partners not being satisfied with the cards? What proof do you want?
 
You people who jumped on these results would make the worst scientists ever. Do not jump to conclusions unless you double- and triple-check your facts (data); right now you only have one source to rely upon, yet you've already decided that the X2900XT sucks. It could be a bug in the drivers, it could be some compatibility issue with the mobo they use, it could be any number of reasons why these results are much lower than expected. Do you really think the engineers working for ATI (the people who came up with the 9700 Pro, X800XT, X1800XT/X1900XT and many other competitive products) would let this GPU roll out of the fabs if, after taping it out, they found it was only as good as a last-generation product? It doesn't make any sense to me if they did. Besides, I think ATI has proven to be a decent competitor to NVIDIA since the Radeon 8500, and I doubt the management would let the engineering staff drop the ball big time like this.
 
Well, this will end in two possible ways:

1. DailyTech was right and the card sucks and AMD will lose a lot of sales over this.
2. DailyTech was wrong and they completely ruin their reputation as being a credible source.

Only a few more days until we find out which it is. ;)
 
Well, this will end in two possible ways:

1. DailyTech was right and the card sucks and AMD will lose a lot of sales over this.
2. DailyTech was wrong and they completely ruin their reputation as being a credible source.

Only a few more days until we find out which it is. ;)

QFT.
 
Well, this will end in two possible ways:

1. DailyTech was right and the card sucks and AMD will lose a lot of sales over this.
2. DailyTech was wrong and they completely ruin their reputation as being a credible source.

Only a few more days until we find out which it is. ;)

I agree with #2 as they don't have a production card based on the thread at beyond3d. Something just isn't right.
 
You people who jumped on these results would make the worst scientists ever. Do not jump to conclusions unless you double- and triple-check your facts (data); right now you only have one source to rely upon, yet you've already decided that the X2900XT sucks. It could be a bug in the drivers, it could be some compatibility issue with the mobo they use, it could be any number of reasons why these results are much lower than expected. Do you really think the engineers working for ATI (the people who came up with the 9700 Pro, X800XT, X1800XT/X1900XT and many other competitive products) would let this GPU roll out of the fabs if, after taping it out, they found it was only as good as a last-generation product? It doesn't make any sense to me if they did. Besides, I think ATI has proven to be a decent competitor to NVIDIA since the Radeon 8500, and I doubt the management would let the engineering staff drop the ball big time like this.

+1

80% of the people in this thread are jumping down AMD's throat saying they are going to fail miserably because of a bs benchmark.
 
The fact that they think the XTX is the OEM 12in version kind of kills their credibility from the get-go.
 
I have it on good authority that the #'s will be a bit better than what DTech has shown, for various reasons. However, it still will not come close to GTX performance, as far as the XTX goes.

Though this could bode well for the XT, as this same reasoning should give it slightly better numbers too.
 
I would be willing to bet that DirectX 9 will be more relevant at least until 2009.

I would rather have a card that performs well in most games, instead of some games.

DirectX 10 will be mostly a marketing term, and maybe an unsupported patch to a couple of games, for another year or so.

Look how long it took SM3.0 to really take hold. It came out with the NVIDIA 6 series, but no real games fully supported it until the 7 series.

It could be that by the time R700/G90 cards are out, DX10 truly matters.

According to current public release dates, you should see 5-6 DX10 games launch by September.
 
But from a marketing perspective, and at this point in time, that doesn't make sense at all.
We've been discussing R600 rumors for the past 5-6 months and most of them were not good at all.
And now, this close to the actual launch, they purposely "want" to show their card as weak and helpless against a 6-month-old 8800 GTX?

As was discussed before, all this hype surrounding R600 just hurts it. Expectations are so high (it being so late in the game) that it doesn't matter how well it performs; people will expect more from it.
If anything, AMD/ATI should've come forward earlier and shown that all those "bad" rumors were false. If you have a card that delivers what's expected, why wouldn't you want to show that it creams the competition?

I still have my doubts about these benchmarks, since they contradict what others have said: basically, that the XT should be on par with the GTX. However, since they are all rumors and unconfirmed benchmarks, just like what that "supposed" AMD/ATI employee said, we have to take them with a grain of salt.
About the XTX, I strongly believe it will not be much faster than the XT. ATI has done this at least twice before, so I have no reason to believe it will be different this time around.

Those professionally marketing something know that rumors die hard, so they wouldn't risk putting something out there that could hurt their sales for months to come as the rumor propagates.
 
I really doubt those XTX numbers are what we will end up seeing when it finally does ship. There's way too much horsepower in the clock speed, memory, etc. for such a low performance threshold. I cannot believe any engineer over at AMD/ATI is so stupid as to design something that powerful and yet have it barely outdo last gen. It just does not make any common sense... But I guess we will find out soon enough...

Except that is not what Epic and Crytek have been telling us, as both claim that their next games, due out this year, will make full use of DX10 cards. You also have other AAA titles like HL2: EP2 and TF that will supposedly make use of some DX10 functionality. Now, to what extent is still unknown, but that's what they are saying.

Also, SM3.0 vs DX10 is a huge difference. Almost everything you could do with SM3.0 you could do with SM2.0x; that is not the case this time with DX10. That, and the fact that Vista will force DX10 on us no matter what, means we will see a much quicker adoption of DX10 than we did of SM3.0.

The funny part is that pretty much everything you can do in DX10 you can do in DX9. It has some optimizations for having more on screen, allowing a developer to add to the realism by upping scenery, etc., but the graphics EFFECTS can all be created in DX9c.
 
80% of the people in this thread are jumping down AMD's throat saying they are going to fail miserably because of a bs benchmark.

It is not just one BS benchmark. There have been a number of leaks about the XT (not the XTX), and some things are clearly emerging:

1: The 2900XT is a competitor for the GTS; it will be priced competitively with the GTS. I don't think there is any disagreement on this point.

2: Many sources indicate that initially, the XT will be ATI's fastest card. Again there doesn't seem to be much disagreement here. The GTX competitor will be MIA.

So that alone is enough to point out that ATI screwed up. They are six months late and they don't yet have an answer to the GTX.

Now, all the DailyTech benchmark does is give some explanation as to why. The XTX is based on the exact same processor as the XT; it really can't hope to catch the GTX, which has 33% more processing units than the GTS. You should be able to figure that out without a benchmark.

All the DailyTech benchmark really does is point out the obvious to those who couldn't work it out for themselves, and give deniers something to squawk about because the test was done in a haphazard manner.

What happens Wednesday is likely a hard launch of the 2900XT and a paper launch of some others. Probably no fixed day for the XTX, as it has to go back to the drawing board awaiting a better chip version. But it is clear that ATI's best shot was late and short of the mark. The XTX benchmarks are all but irrelevant to that outcome.
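
For reference, that "33% more processing units" figure checks out against the public G80 specs (128 stream processors on the GTX versus 96 on the GTS):

    # Public G80 specs: 8800 GTX vs 8800 GTS
    gtx_sps, gts_sps = 128, 96
    print(f"GTX has {(gtx_sps / gts_sps - 1) * 100:.0f}% more stream processors")  # 33%

    # The GTX also has a wider memory bus (384-bit vs 320-bit) and more ROPs
    # (24 vs 20), so the gap is structural, not something a clock bump on
    # GTS-class silicon can close.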
 
I posted this elsewhere on the same topic and thought it might add a bit to the discussion here too.

_________________________________________________________________________
Their investment parts are actually fine. The XT (and presumably below) compete decently and seem to beat the 8XXX series.

i'm not so sure the XT is that competitive with the GTS. the "leaked" reviews were gpu limited @ 1280x1024, and they used a reference GTS running at 500MHz/1.6GHz... and the ref. GTS still won 2 of 4 benchmarks (tho the XT dominated one; sounds fishy at that res tho). how many GTS's do you see running 500MHz???

will be interesting to see performance comparisons in situations where the test isn't gpu limited, and against a more reasonable GTS running in the 550-600MHz range (or o/c them both to the max and see how they compare).
 
i'm not so sure the XT is that competitive with the GTS. the "leaked" reviews were gpu limited @ 1280x1024, and they used a reference GTS running at 500MHz/1.6GHz... and the ref. GTS still won 2 of 4 benchmarks (tho the XT dominated one; sounds fishy at that res tho). how many GTS's do you see running 500MHz???

will be interesting to see performance comparisons in situations where the test isn't gpu limited, and against a more reasonable GTS running in the 550-600MHz range (or o/c them both to the max and see how they compare).

You mean CPU limited. In my opinion the comparisons should be either stock for stock or overclocked for overclocked (although that element introduces the inherent randomness as to whether you've got God's gift to GTSs in your hands or an overclocking dud).

DailyTech was able to get the XT to 845/1.99 up from 745/1.6
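
On the CPU-limited point: one rough way to spot a CPU-limited benchmark is to sweep resolutions and watch whether fps stays flat. A quick sketch with invented numbers for illustration:

    # If fps barely moves as the pixel count grows, the CPU (not the GPU) is
    # the bottleneck, and the benchmark says little about the cards.
    # The fps values below are made up for illustration.

    results = [
        ((1280, 1024), 84.0),
        ((1600, 1200), 83.5),   # ~flat: CPU-limited region
        ((1920, 1200), 71.0),   # dropping: GPU becoming the bottleneck
        ((2560, 1600), 44.0),
    ]

    prev_fps = None
    for (w, h), fps in results:
        note = ""
        if prev_fps is not None and fps > 0.95 * prev_fps:
            note = "  <- likely CPU-limited (fps flat despite more pixels)"
        print(f"{w}x{h}: {fps:5.1f} fps{note}")
        prev_fps = fps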
 
How many people own a GTX, you say? The highest-end market does not generate huge revenues like people always assume it does. You might be right about ATI not having a GTX competitor (I am still waiting for more thorough reviews on this), but how this relates to their downfall is a total mystery to me. Many people tend to assume that AMD will lose huge revenues because the X2900XT/XTX won't be as fast as a GTX, which I believe is not going to happen.

Perhaps someone can elaborate on this, but in my opinion, as long as ATI has decent cards for the $100-150 and $150-200 segments, with decent drivers to back them, they will be in direct competition with NVIDIA for the major bucks in the GPU market. The 9700 Pro was faster and more advanced than the Ti 4600, then there was the 5800 disaster, the 9800XT was faster and cooler than the 5950 Ultra, and the 9600XT was faster than the 5700 Ultra, yet NVIDIA still kept a decent market share, because in this business what determines your financial situation is your board partners and OEM buyers, who are more interested in the reliability and price of the product than in the 5% fps advantage it has over the competition.

Some might consider ATI late to enter the DX10 market, but without a single DX10 game launched, and considering the fact that last generation's highest-end GPUs (CrossFire or SLI setups included) can still push decent framerates in recent games, I certainly can't see the market rushing for DX10 yet. I'm not defending ATI here; all I'm trying to say is that there are many unknowns (even if the framerates in those benchies are accurate), like image quality, like how the two GPUs will perform in DX10 titles, etc., before we chalk one up for NVIDIA.

edit: I was replying to snowdog's post, but me being a slow typist, I was late as usual...
 
Don't you mean damn Texans and Americans? AMD and ATI are American-owned :p

Oh, come on, you know full well that ATI was a Canadian company, and *then* they were bought by an American company. It's not like the nationality of the employees has actually, physically changed just because some executives signed some papers. Sheesh. :)
 
How many people own a GTX, you say? The highest-end market does not generate huge revenues like people always assume it does. You might be right about ATI not having a GTX competitor (I am still waiting for more thorough reviews on this), but how this relates to their downfall is a total mystery to me.

I think a lot of people agree with you. There is some financial damage to be had from losing the high end though.

In an earlier post I made the same point, that a disappointing benchmark for the XT doesn't equate to a financial disappointment for AMD. It certainly doesn't help AMD financially though. And with AMD in their current position they need all the help they can get. But you are right, even poor performance for the 2900 generation does not in any way spell "doom" for AMD.
 
I actually think falling behind in the high end is a pretty bad thing, because they are always trying to court console manufacturers to use their technology in their next console. If they flop in the high end, it could be perceived as an indication that they won't be able to deliver on their promises. They really need a solid flagship product.
 
I'm gonna spit out one very last detail (before they get me behind bars for disclosing information)

The 2900XTX is 4fps ahead of the 8800GTX (on average) in Crysis (information directly from a developer of the game)

The 2900XTX won't be the G80 killer that everybody was expecting, but it'll be in front in most cases, with small to good margins depending on the game and settings.

nVIDIA did a great job with the G80 this time, and the performance leap over the previous generation is huge, comparable only to the R300 launch in the past.

from the AMD CPU forum 0.0
 
From Rage3d.com:

You want some truth?

People who have the cards, are affiliated, and are replying to those DailyTech reports:


Kombatant: www.kombatant.com AMD/ATI employee, (ex) Beyond3D/Rage3D mod:

On a serious note, I still see some stuff out there that has no merit. Patience.

They christened the OEM version we've known and loved for quite a few months as an "XTX". That should tell you a lot about their credibility actually.

Ok, so before this goes totally out of hand, let me say this, and this will be my final say on the matter until the NDA is lifted: AMD made certain decisions concerning this card. I took a hard look out there to see what's being leaked, and it seems there is still some stuff that is totally made up - tbh, it smells like a FUD campaign to me, if I take into consideration certain emails that have been flying around lately. Certainly some of the stuff out there is true, and you will know which is which when the NDA lifts soonish. The journalists that were in Tunis certainly know, and are probably laughing at some of it at this minute.

So, to recap: What I've said in the past stands about the card (Sound_Card has been on a mission to put all of my quotes in his sig so that we don't miss anything). Unfortunately I can't reveal more at this point due to the NDA. And for those who are wondering, no, I am not a moderator/staff on Rage3D anymore; I stepped down two months ago, because with the new job I now have it wouldn't be ethical, imo, to continue doing work here.

As I said, some info out there is accurate, some is not. Whatever rumours are out there certainly won't force AMD to reveal stuff sooner; that simply doesn't work, whether you're called Intel, AMD, nVIDIA or whatever.


Bum_JCRules @ THG (under NDA with cards):

Total crap... well, almost:
While I am required to follow the NDA, the stuff up on DailyTech today is almost worthless. Yes, AnandTech was present in Tunisia (signing non-disclosure agreements, like the Inquirer); why they are posting this stuff is beyond me, because their numbers are off. They must be using only the XP drivers and OS, because the numbers in CF vs the GTX are very much different. So until I can officially comment on the architecture and the performance... hold all of this as useless until the rest of the world writes about it.

I really would love to comment on this stuff...

I understand that DT and Anand are separate, but that is so childish. Derek was there, and his cards got to his place of business before he returned home from Tunisia. That long board they have... not what Derek should have gotten in his delivery. That is all I will say before I go too far.


Kink (under NDA with cards):

DailyTech's benchmarks are inaccurate. At least in terms of 3DMark06 (commenting on the HD 2900 XT)


Metro.cl (under NDA with cards): www.chilehardware.com

laughable (whistling emoticon (wasn't me))


BenchZowner (under NDA and has card):

1) These benchies from DailyTech are quite far off from reality, that's all I can say.

2) The 2900XTX & the 8800GTX are performing on par in Crysis at the moment (direct info from a developer of the game)

3) Is it déjà vu? Remember the Mr. Sanders case? The editor of Hardware Analysis who was not invited to the press event at Ibiza and wanted to "punish" ATi by publishing his... numbers of desire for the R580? He he, déjà vu

http://i12.tinypic.com/2py29hz.png

"Do I really have to express myself here ?
If these guys are so called my colleagues ( as of being reviewers like me ) then I should feel ashamed, really Evil or Very Mad

What are we looking at here?

a) They used a better test system today and better drivers (supposedly), and they managed to get the 2900XT to perform worse than in their previous bench session with a worse test system & worse drivers?
How come?
On April 24 they got 84FPS in F.E.A.R. with the 2900XT with a QX6700 and pre-release drivers, and they got 79FPS today with a QX6800 and retail drivers? Oh really? Very Happy

b) Where's the result for the 2900XT in Company of Heroes today? Why N/A?

c) In Half-Life 2: E1 today they got the 2900XT to score 1FPS more than the 2900XTX.
What could've caused this? A typo? Quite angelic.
Something else? Using a CPU-limited resolution, which would cause both cards to behave like they're the same. And then there's the GTX surpassing the R600s by ~40 FPS. Quite the real leap over the Radeon X1950XTX in that game. Evil? heh

d) Now, the best part... they scored ~48FPS in Oblivion on the 24th, and now they present us with a 54FPS gain just from the move from the QX6700 to the QX6800 and the small gain from running the RAM at a 1T command rate?

e) A reviewer, in order to conduct a comparable, valuable & trustworthy review, must use the same testbed; quite the opposite is what they did (if they really went through a performance testing process)

f) And for what reason would somebody present unclear results in combination with unclear driver & filter (AF & AA) settings?
***** right boy Very Happy

My two cents (oh wait, I have another one) [I'll save it for later]

P.S. The stock core clock for the 2900XTX as of now is 800MHz, and not 745MHz as they state.

P.S.2. That's pretty much all I can say at the moment.

P.S.3. Now I have to finish a memory roundup and then pack my stuff for a trip.


What did the Crysis lead developer (who has the cards) say about the two (G80 GTX and R600 XT)?

And remember, Crysis is a DX9 game with an additional DirectX 10 codepath, which will only help in getting better performance (gains) for those with DX10 cards.

28th April:
I'm gonna spit out one very last detail (before they get me behind bars for disclosing information)

The 2900XTX is 4fps ahead of the 8800GTX (on average) in Crysis (information directly from a developer of the game)

The 2900XTX won't be the G80 killer that everybody was expecting, but it'll be in front in most cases, with small to good margins depending on the game and settings.

nVIDIA did a great job with the G80 this time, and the performance leap over the previous generation is huge, comparable only to the R300 launch in the past.


Those are STOCK clocks, guys; it can OC high


BTW, from Fudzilla by "mouth": the GeForce 8800 Ultra gets a 14000 score in 3DMark06, with no detail of the test platform and no driver detail:
http://www.fudzilla.com/index....=view&id=678&Itemid=1

From DailyTech: the Radeon HD 2900XT gets a 14005 score in 3DMark06, with full detail of the test platform and driver:
http://www.dailytech.com/Overc...+R600/article7044.htm


AND NONE of them are with release drivers. Look at the GPU clocks very carefully as well, look for the silicon version, look at the benchmarks used and the detail in each setting (compare), and look at the motherboard used too; that 8800GTX is nowhere to be found EXCEPT from BFG at $950 as a watercooled card. PLUS, those are EARLY pre-release sample cards, ES samples, if you understand what that means. And do you remember the abysmal performance of the faulty early 8800GTX ES samples with the wrong resistor values at release? What about the driver optimization that took place over months on end to get decent performance?

Joe is on the JEDEC board BTW, and an AMD/ATi employee who helped produce GDDR4 for the X1950XTX and made it a killer product. So now you are telling me that after 2 years spent on it (as nVidia had spent 4 years on the G80) and millions of dollars, it simply doesn't beat the X1950 in benchmarks, and with no higher clock on the GDDR4? BTW, those benchmarks are not remotely correct either. We have hundreds of GTXs spread around our corporation, and my own work system has 2 in SLI at 650/2100 with watercooling. Seriously, some of those benchmarks, for let's say Oblivion, are 100% inflated by h*ll knows what!

Hint #200: Memory clocks can go much higher and bandwidth (i.e. performance too) is much greater in upcoming titles and higher res/detail.

BTW the owner of Fudzilla, Fuad, is the ex-Graphics Editor of The Inquirer if you didn't already know. Take everything with a brick of salt.


I'll leave you with a little pre-release bang....
 
That big-ass Rage3D post looks like damage control. Now everyone you've never heard of is under NDA, has a card, and is blabbing about it over multiple forums and blogs. If someone says, "Hey, brother, I can get a card out of the lab for like 15 minutes, you want to run some benches?", what is DT going to say? No?

I really expect an ATI/AMD employee to say, "Hells Yeah! This R600 shit sucks balls! Our engineers were looking over the DT writer's shoulder making sure he emphasized that we are late to the dance, power sucking, and still can't beat the GTX."

What does the card being an OEM version have to do with being an XTX or not? Are they mutually exclusive now? Is it the new FireGL card? If so, that's bigger news than the quickie benches.

Everyone is saying that the benches are off. If ATI isn't doing daily driver updates to squeeze out every last bit of performance, they are retards and deserve to lose.

If the GTX and XTX are getting the same performance in Crysis according to a developer, I have to say that AMD is sucking ass unless that XTX card is selling for $399.
 
That big-ass Rage3D post looks like damage control. Now everyone you've never heard of is under NDA, has a card, and is blabbing about it over multiple forums and blogs. If someone says, "Hey, brother, I can get a card out of the lab for like 15 minutes, you want to run some benches?", what is DT going to say? No?

I really expect an ATI/AMD employee to say, "Hells Yeah! This R600 shit sucks balls! Our engineers were looking over the DT writer's shoulder making sure he emphasized that we are late to the dance, power sucking, and still can't beat the GTX."

What does the card being an OEM version have to do with being an XTX or not? Are they mutually exclusive now? Is it the new FireGL card? If so, that's bigger news than the quickie benches.

Everyone is saying that the benches are off. If ATI isn't doing daily driver updates to squeeze out every last bit of performance, they are retards and deserve to lose.

If the GTX and XTX are getting the same performance in Crysis according to a developer, I have to say that AMD is sucking ass unless that XTX card is selling for $399.


Why would ATI sell their XTX for $399 when it competes with a $600 card? A bit silly, huh? Yeah, it is "late", but it still competes with a $600 card. There is no logical reason, unless they can make them REALLY cheap and need to regain ground at the cost of profits.
 
It's either damage control, or maybe, just maybe, all these "crap" stats about the 2900 are bullsh!t started by people on Nvidia's payroll.

Ever heard of "attack ads" in politics? You think the same thing wouldn't happen (even in a more discreet manner, such as pointing out flaws or poor benchmarks in a competitor's product...) in the tech industry?

The problem is a LOT of consumers are NOT enthusiasts, and believe what most "reputable" sites have to say about hardware.
 