VisionTek Radeon X800Pro Review

I have said a lot of crap about [H], Kyle and company in the past for various reasons. But the fact is, [H] now puts out the best review for a *gamer* that's ever been produced. Your new methodology is the best I see going anywhere. This is indeed the kind of information that people who buy these cards to play games actually need.

What can I play at in game *X* if I buy this card? How does that compare to other cards?

There is flatly too much driver hackery going on all over the place for people to take "apples to apples" comparisons seriously any longer. They are pretty much worthless, IMO.

I tip my hat with a deep, respectful bow to Kyle, Brent and the rest of the staff here for trying (and succeeding) to pull gaming card reviews out of the PR mire.
 
Another excellent review. Graphing FPS versus time and listing the minimum FPS is a lot more useful than listing the average frame rate in a bar graph, and I'm glad to see there is a drop in the number of people complaining about the graphs.
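The value of a minimum-FPS figure over a bar-graph average is easy to demonstrate with a few lines of arithmetic; the frame times below are invented purely for illustration:

```python
# Why minimum FPS tells you more than the average: a short stutter
# barely moves the average but tanks the minimum.
frame_times_ms = [16, 17, 16, 15, 60, 58, 16, 17, 16, 15]  # hypothetical log

fps_per_frame = [1000.0 / t for t in frame_times_ms]

# Average FPS over the whole run: frames rendered / total seconds.
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
# Minimum FPS: the worst single frame, i.e. the stutter you actually feel.
min_fps = min(fps_per_frame)

print(f"average FPS: {avg_fps:.1f}")  # ~40 FPS, looks comfortably playable
print(f"minimum FPS: {min_fps:.1f}")  # ~17 FPS, the dip the average hides
```

A bar graph would show only the ~40 FPS average; the FPS-versus-time plot is what exposes the two 60 ms frames.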

How many people here ACTUALLY play any games at 1600x1200???

I play all the games I can run at 1600x1200 or @ 1280x1024/960. In fact, I can't think of any game I run below 1280 right now. If your monitor cannot support high resolutions at reasonable refresh rates, I'd suggest not paying $400 for a video card without first upgrading your monitor.
 
Hellbinder said:
I have said a lot of crap about [H], Kyle and company in the past for various reasons. But the fact is, [H] now puts out the best review for a *gamer* that's ever been produced. Your new methodology is the best I see going anywhere. This is indeed the kind of information that people who buy these cards to play games actually need.

What can I play at in game *X* if I buy this card? How does that compare to other cards?

There is flatly too much driver hackery going on all over the place for people to take "apples to apples" comparisons seriously any longer. They are pretty much worthless, IMO.

I tip my hat with a deep, respectful bow to Kyle, Brent and the rest of the staff here for trying (and succeeding) to pull gaming card reviews out of the PR mire.

I agree 100% that an apples to apples comparison is impossible these days. Kyle took a lot of abuse from a lot of people last year by talking about implementing this new era of "benchmarking". He still gets fanboy flak occasionally, but more people are starting to see the light.
 
Based on a 3GHz/1GB RAM machine and either an x800XT or 6800U:

With the new game engines coming out, will the feature set be so demanding that we won't be using AA/AF settings and will just choose to increase the resolution? Or will we be running at 1024x768? 1280x1024? 1600x1200? Or...
Is it impossible to make an accurate prediction until the games are released?

Can't Wait! :D
 
I have to admit I was totally wrong about the new methodology for [H]'s reviews. I was dead set against the idea, but it's a great format: very informative and useful. Good job.
 
Just wanted to add my kudos on the review (and the new fora). I haven't been following the video card/cpu/gaming scene for a few months and this was a nice way to get back into things.
 
From the Review:
We have been very strong supporters of buying ATI brand boxed video cards, but now that we have seen Vision Tek weather a choppy business climate over the last year and come out swinging, they are once again turning our heads. If warranty and support are an issue for you when you buy a video card, the Vision Tek X800 Pro is looking to possibly become the de facto standard for ATI cards in the USA.


I was looking in the Nvid forum and it seems that Vision Tek as of late isn't all it's cracked up to be in terms of support. For my own part, I attempted to connect to their site several times over the last 20 minutes and was completely unsuccessful.

I understand that this was a review of the product and its capabilities in the scope of what it's designed to do. I think the comment on the company, although admittedly an aside, is out of scope in the context of the review. I feel it would have been more appropriate to ask what your fans (and believe me, I am one) think of the support services, etc., instead of imposing your own subjective estimation, when we all know things like customer care and support can in fact be measured quite objectively (I work in out-sourcing, so please don't think I'm being sarcastic).

As an emissary to the consumer, to the Enthusiast, it would be interesting if you all went apples to apples on support services and customer care from various game manufacturers--develop your own Service Level Credits and benchmarks for this aspect of the industry.

---------------------------
That said, I have a question that tangents a bit: is there any way that you, as people in contact with ATi, could let us, the consumers, know why the Rage Theatre chip is apparently only shipping on the European models of these new graphics units? For something so expensive, one would expect a little more elegance from the premier chip maker. Since I've gotten into tech, three of my four cards have been of ATi manufacture (I took an ATi sabbatical with a Riva TNT2 Ultra in 1999), all of them robust. But this brilinear, Rage Theatre-less card is a bit thin for the ilk we've come to expect from them. What's the deal?
 
There are just a couple of things I don't quite get. It's an informative review, but please, even if you are comparing so-called "playable" settings, for god's sake never compare a set of screenshots to other screenshots at a lower resolution. Even lowering the resolution is questionable unless you also list the same resolution at which the other cards were tested.

For instance, instead of just comparing benchmarks at 12x10 for the VT, 12x10 for the bbATI, and 10x7 for the 6800u, add the results from the 6800u @ 12x10 as well and just say it was slower and unplayable, or let the numbers speak for themselves. Not listing it at all is just plain incomplete and looks suspicious, the same way as making IQ comparison screenshots with only one card shown at a lower resolution... and that's the card that costs 100 dollars more! I can't get over how flaky that is :rolleyes: .
Why go through the trouble if you leave it incomplete? I understand the point behind it, just not the execution. Please rethink those decisions; I'm telling you, it looks bad. Don't take this the wrong way, but these things bugged the hell out of me, especially when it's supposed to be "apples to apples". I won't even start about the "manual" tests. It would be nice to see the results with all tests done with 16x aniso too, especially the ones you only used 8x with. A nice review, but one of the strangest I've ever read.
 
BWX said:
There are just a couple of things I don't quite get. It's an informative review, but please, even if you are comparing so-called "playable" settings, for god's sake never compare a set of screenshots to other screenshots at a lower resolution. Even lowering the resolution is questionable unless you also list the same resolution at which the other cards were tested.

For instance, instead of just comparing benchmarks at 12x10 for the VT, 12x10 for the bbATI, and 10x7 for the 6800u, add the results from the 6800u @ 12x10 as well and just say it was slower and unplayable, or let the numbers speak for themselves. Not listing it at all is just plain incomplete and looks suspicious, the same way as making IQ comparison screenshots with only one card shown at a lower resolution... and that's the card that costs 100 dollars more! I can't get over how flaky that is :rolleyes: .
Why go through the trouble if you leave it incomplete? I understand the point behind it, just not the execution. Please rethink those decisions; I'm telling you, it looks bad. Don't take this the wrong way, but these things bugged the hell out of me, especially when it's supposed to be "apples to apples". I won't even start about the "manual" tests. It would be nice to see the results with all tests done with 16x aniso too, especially the ones you only used 8x with. A nice review, but one of the strangest I've ever read.

I think they are telling you this was their impression of the most playable settings for each given card. Reading back, it looks like the ability to maintain 30fps.
 
MrHappyGoLucky said:
I think they are telling you this was their impression of the most playable settings for each given card. Reading back, it looks like the ability to maintain 30fps.

Yeah, I totally agree with that part. My only gripe is that they should also include the same resolution (1280x1024) in the benchmark for continuity.
Same with the IQ test. I just want to see the missing ones right alongside, with a "*" next to them saying "unplayable", instead of leaving the bench test out entirely. Also include 16x aniso in all tests; these are almost the fastest video cards on the planet, so crank the IQ up, man... know what I mean? I can understand not including 6xAA; no one uses that, at least they shouldn't need to unless they're on a monitor that only does 800x600 or something, and even then, buy a new monitor and crank the resolution. You cannot replace aniso with resolution, so I say crank it up.
 
I like the reviewing format. However, when I am looking at these cards, they all have the performance for today's games; having spent a fortune on a graphics card, what I want to know is how well it will perform in the future. Say in two years' time I still have this card and want to buy the latest, greatest game. I have to ask:
1) Will the card have the performance to play the game at all? Obviously the x800xt is king here because it's the fastest, but the x800pro and GeForce 6800 Ultra are also pretty fast. Is the non-Ultra 6800 too slow?
2) These cards are so fast that in the future the factor stopping me from playing new games is my processor speed. Will I be able to upgrade my CPU and still use this graphics card? Answer: probably not, as PCI Express will have arrived, and these cards won't work on those boards (yes, I know some PCI Express boards have an AGP slot, but if you look into the details, these actually run on the old 133MB/s PCI bus, so they're not suitable for a high-end graphics card).
3) Will these cards have not only the speed but also the features to play next-generation games? It looks like displacement mapping, only supported by the GeForces, will become a key feature (if you want the best IQ) at some point.

Particularly for such high-end cards, you could really do with a bit more thought about future-proofing. I want to buy the next GeForce4 Ti or Radeon 9700, great cards that keep performing well even in their old age, not an nVidia 5800 Ultra. 'Nuff said.
 
sbuckler said:
I like the reviewing format. However, when I am looking at these cards, they all have the performance for today's games; having spent a fortune on a graphics card, what I want to know is how well it will perform in the future. Say in two years' time I still have this card and want to buy the latest, greatest game. I have to ask:
1) Will the card have the performance to play the game at all? Obviously the x800xt is king here because it's the fastest, but the x800pro and GeForce 6800 Ultra are also pretty fast. Is the non-Ultra 6800 too slow?
2) These cards are so fast that in the future the factor stopping me from playing new games is my processor speed. Will I be able to upgrade my CPU and still use this graphics card? Answer: probably not, as PCI Express will have arrived, and these cards won't work on those boards (yes, I know some PCI Express boards have an AGP slot, but if you look into the details, these actually run on the old 133MB/s PCI bus, so they're not suitable for a high-end graphics card).
3) Will these cards have not only the speed but also the features to play next-generation games? It looks like displacement mapping, only supported by the GeForces, will become a key feature (if you want the best IQ) at some point.

Particularly for such high-end cards, you could really do with a bit more thought about future-proofing. I want to buy the next GeForce4 Ti or Radeon 9700, great cards that keep performing well even in their old age, not an nVidia 5800 Ultra. 'Nuff said.

I just installed my new VisionTek x800 Pro and it's able to run Far Cry at max resolution with all features turned all the way up, 4X AA and 4X AF, with absolutely no problem. If it can handle Far Cry that well, then I can't see anything coming along in the next year that can significantly slow this card down.
 
Dijonase said:
I just installed my new VisionTek x800 Pro and it's able to run Far Cry at max resolution with all features turned all the way up, 4X AA and 4X AF, with absolutely no problem. If it can handle Far Cry that well, then I can't see anything coming along in the next year that can significantly slow this card down.



Overclocked it yet? :)
 
PopCorn said:
Overclocked it yet? :)

Nah, not yet. I'm sure I'll start to play with it eventually, but when it can handle Far Cry as well as it does, I figure I'll leave it be for now. I upgraded from a VisionTek GeForce4 Ti 4400, so it's quite the performance increase. :cool:
 
The MSI X800 Pro overclocks like a m0f0 :)

x800.JPG
 
Dijonase said:
Nah, not yet. I'm sure I'll start to play with it eventually, but when it can handle Far Cry as well as it does, I figure I'll leave it be for now. I upgraded from a VisionTek GeForce4 Ti 4400, so it's quite the performance increase. :cool:

;) enjoy your new card.
 
I plan to. I'm a bit of a n00b at overclocking, and I want to play a bit with this card. From the VisionTek x800 Pro review it sounds like it's quite overclockable, so I can have some fun. From what I've seen, ATITool seems to be a great overclocking tool. Any other suggestions?
 
I really like PowerStrip (I've used it for years): http://www.majorgeeks.com/download.php?det=718. You'll need to use the 3.50 beta to get x800 support, but it works perfectly.

Be careful with ATITool. It has a setting to "auto find" the maximum overclock. I don't trust this. I think it's best to find your overclock manually by taking it slow and testing along the way for visual anomalies. ATITool is great for finding info about your card, monitoring the temps of the x800 Pro using driver-independent probing (which is very nice), and scanning for artifacts. Just don't let any application auto-find anything; do it yourself to be safe, IMHO.

BTW, you can start with 500/1000 on your card; that should be very easy and safe. Use ATITool to monitor your temps while overclocking.

Later
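The manual approach described above boils down to a step-and-test loop: nudge the clock up, scan for artifacts, and settle on the last speed that passed. A rough sketch, where `set_core_clock` and `passes_artifact_scan` are hypothetical stand-ins for whatever your tuning tool (ATITool, PowerStrip, etc.) exposes:

```python
def find_stable_clock(start_mhz, step_mhz, max_mhz,
                      set_core_clock, passes_artifact_scan):
    """Raise the core clock in small steps; back off at the first failure."""
    stable = start_mhz
    clock = start_mhz
    while clock + step_mhz <= max_mhz:
        clock += step_mhz
        set_core_clock(clock)
        if not passes_artifact_scan():
            break              # artifacts appeared: the previous step was the limit
        stable = clock         # this speed tested clean, remember it
    set_core_clock(stable)     # settle on the highest clean clock found
    return stable
```

With a card that starts artifacting above 600 MHz, stepping up from 475 in 5 MHz increments would stop at 600 and leave the card running there.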
 
PopCorn said:
I really like PowerStrip (I've used it for years): http://www.majorgeeks.com/download.php?det=718. You'll need to use the 3.50 beta to get x800 support, but it works perfectly.

Be careful with ATITool. It has a setting to "auto find" the maximum overclock. I don't trust this. I think it's best to find your overclock manually by taking it slow and testing along the way for visual anomalies. ATITool is great for finding info about your card, monitoring the temps of the x800 Pro using driver-independent probing (which is very nice), and scanning for artifacts. Just don't let any application auto-find anything; do it yourself to be safe, IMHO.

BTW, you can start with 500/1000 on your card; that should be very easy and safe. Use ATITool to monitor your temps while overclocking.

Later

Excellent, thanks for the pointers. I guess it's time to play. :D
 
PopCorn said:
Be careful with ATITool. It has a setting to "auto find" the maximum overclock. I don't trust this. I think it's best to find your overclock manually by taking it slow and testing along the way for visual anomalies.

He is correct! I used the "auto find" option and watched it take my core to 617 before it locked the system up. I have been able to achieve a 604 core but have not found the max on the memory.
 
Hellbinder said:
I have said a lot of crap about [H], Kyle and company in the past for various reasons. But the fact is, [H] now puts out the best review for a *gamer* that's ever been produced. Your new methodology is the best I see going anywhere. This is indeed the kind of information that people who buy these cards to play games actually need.

What can I play at in game *X* if I buy this card? How does that compare to other cards?

There is flatly too much driver hackery going on all over the place for people to take "apples to apples" comparisons seriously any longer. They are pretty much worthless, IMO.

I tip my hat with a deep, respectful bow to Kyle, Brent and the rest of the staff here for trying (and succeeding) to pull gaming card reviews out of the PR mire.

Holy shit. I have fallen and I can't get up. :eek: (speechless, for once)
 
I like that other suggestion, though, even if the guy might not have delivered it in the way that usually gets things considered. In such a review, where there is clearly one competitor, or should I say comparison card, throw up the "unplayable" line for that one item. So when the Radeons are doing 1280x1024 at one IQ and the GeForces are doing 1024x768 at another, show us on the same graph the GeForce's results at the 1280x1024 ATi settings. You guys still get your point across while maybe satisfying a bit of the "apples to apples" question that surely pops into readers' heads. It shouldn't take much more effort; I say that only because I am really trying to consider the added work in my suggestion. This will satisfy many and make it feel more complete. Consider it. I know one card's 4xAA might be another's 6xAA quality, but that's hard to avoid. I think this method settles a few cons that people might feel about the approach, and then we're only left with unavoidable inconsistencies. So the review would still be max playable, with that dash of apples to apples.

Simply put: whenever one card is laying some damage on another, show us the other one's pain in attempting to hang with the big boy. You've already gone through a hell of an ordeal finding the maxes; just force the higher standard on the lower-performing card in that instance and watch the slideshow. I gotta say, 1024 res was used on the GeForce way more than I thought it would be, so this added line would sure shut me up.

The unplayable line would be low and probably wouldn't clutter the graph too badly. I know you guys sometimes compare many cards, but there are usually one or two that people mostly want to see compared. Even in a review with three different cards, there would be, say, the two lines for the two "losers" just trying to make it playable, as additional info alongside their playable graphed lines, of course.
 
I'm actually playing Far Cry with an x800 and all of the in-game settings (including shadows) cranked all the way up, and I haven't encountered this problem. Is it a universal thing, or does it only crop up every once in a while? I'd say that I just haven't noticed it yet, but it seems like something you'd notice.
 
Dijonase said:
I'm actually playing Far Cry with an x800 and all of the in-game settings (including shadows) cranked all the way up, and I haven't encountered this problem. Is it a universal thing, or does it only crop up every once in a while? I'd say that I just haven't noticed it yet, but it seems like something you'd notice.

Have you loaded up the boat level?
 
The only issue I have with the review is that it doesn't add in a 9800-series card or a GeForce FX 59xx card, because some of us, ya know, are poor. :p
 
I'm sorry but I'm still not satisfied as to the validity of this entire review.

Please help me to understand here - what is the comparable ATI part to the 6800 Ultra?

I *thought* it was the x800 XT and NOT the PRO version. Or is it that ATI has released their XT early and the comparable Nvidia part isn't out yet?

As I see it now, it seems a bit like comparing using the 9800 pro in a review when you should have been using the 9800 XT.

So you've got:

6800 Ultra -------- x800 XT PE
6800 GT --------- x800 Pro
6800 -------- ?

Or would the x800 XT be comparable to the upcoming 6800 Ultra Extreme?

Man this crap is getting confusing again, lol
 
Darknyt said:
I'm sorry but I'm still not satisfied as to the validity of this entire review.

Please help me to understand here - what is the comparable ATI part to the 6800 Ultra?

I *thought* it was the x800 XT and NOT the PRO version. Or is it that ATI has released their XT early and the comparable Nvidia part isn't out yet?

As I see it now, it seems a bit like comparing using the 9800 pro in a review when you should have been using the 9800 XT.

So you've got:

6800 Ultra -------- x800 XT PE
6800 GT --------- x800 Pro
6800 -------- ?

Or would the x800 XT be comparable to the upcoming 6800 Ultra Extreme?

Man this crap is getting confusing again, lol
That is where you are missing the point.

This review frees people from being forced to think in the little comparison box that marketing would like to create. For the games you play, you can see the resolution and performance over time that cards X, Y, and Z provide. If card X gives you the best experience for $100 less, why care what is "supposed" to be compared or equal?

When PowerVR Series 5 hits the market (God willing), it's going to be even crazier, as its SM3 performance in some games is going to simply crush what you see today, thanks to deferred shading technology. Yet it may only be $250.
 
Hellbinder said:
When PowerVR Series 5 hits the market (God willing), it's going to be even crazier, as its SM3 performance in some games is going to simply crush what you see today, thanks to deferred shading technology. Yet it may only be $250.

Can you get me a link to that info about performance? I have been waiting since my 4 Kyro 2s went out of style, and I haven't seen any info about it (besides pure speculation, not even rumors).
I do hope it will bring some much-needed "open that window, this crap stinks" fresh air into the market.
 
So has anyone successfully XT-modded these Visionteks? :confused:

I might have to pull the trigger on one...
 
I have not been back around [H] for quite some time, as I usually hang over at OC, but I have to say that this was an excellent review, IMO.

There are quite a few peeps from OC that love most of your reviews here at [H].

Kudos to ya, and thx ;)
 