The real reason behind Crossfire shortcomings?

Hopefully ATI resolves this with the R520 or R580; I don't think they're thát stupid as to make this a long-term issue.
 
I think the Inq said it was a problem that carries over to the R520 and R580 and isn't fixed until the R600. But it's the Inq...

*insert grain of salt here*
 
IMO, it's simply that it conforms to the DVI spec.

http://en.wikipedia.org/wiki/Digital_Visual_Interface

1600x1200 at 60 Hz with GTF timings works out to about 161 MHz of pixel clock, which is essentially the maximum that single-link DVI can handle (165 MHz) before you have to go to dual-link mode.

If the DVI spec is someday raised, or the new HDMI connector shows up, then I'd be willing to bet CrossFire will support the higher modes... The problem right now isn't the capability of the video card or the monitor; it's the cable in between that tends to lose signal.

Even if you are running dual 7800s in SLI and using a DVI-D connection, you are probably running at 60 Hz if you are running digitally on an LCD monitor. It's a pretty big weakness of digital that the standards were set awfully low, IMO. Analog combining (the original SLI) actually has a big advantage in that it allows as high a resolution and as high a refresh rate as necessary over one interface, albeit with some loss of visual quality due to signal degradation in the cable.

I.e., if you want to run 3840x2400 and still remain in spec, a single DVI link only gets you about 13 Hz; that's why it takes four DVI cables just to drive that resolution at a usable refresh rate. Digital does have drawbacks.
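A quick back-of-the-envelope sketch of those numbers (mine, not from any spec document; the ~40% blanking overhead is only an approximation of what GTF adds at these modes, and 165 MHz is the single-link TMDS clock cap):

```python
# Rough DVI bandwidth arithmetic -- an approximation, not a real GTF calculation.

SINGLE_LINK_LIMIT_HZ = 165e6   # single-link DVI pixel clock ceiling
BLANKING_OVERHEAD = 0.40       # roughly what GTF blanking adds at these modes

def pixel_clock_hz(h, v, refresh_hz):
    """Estimated pixel clock needed to drive h x v at the given refresh rate."""
    return h * v * (1 + BLANKING_OVERHEAD) * refresh_hz

def max_refresh_hz(h, v):
    """Estimated highest refresh rate one DVI link can carry at h x v."""
    return SINGLE_LINK_LIMIT_HZ / (h * v * (1 + BLANKING_OVERHEAD))

print(pixel_clock_hz(1600, 1200, 60) / 1e6)  # ~161 MHz -- right at the limit
print(max_refresh_hz(3840, 2400))            # ~13 Hz on a single link
```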

ADD: Personally, I think the DVI connector, with its roughly 30 pins, was a bad idea; it really should have been built with 50, maybe in an external-SCSI-type connector. Sure, it would have been big, but at least it would have had room for future expansion.
 
Dual-link DVI supports higher than 1600x1200 at 60 Hz.

More importantly, if you click the above article, it apparently has nothing to do with DVI at all - rather, it's because the compositing engine chip has no frame buffer.
 
Yes, that was also the explanation in the nVidia presentation. ATi should really do something about that, or they'll be shooting themselves in the foot. Personally, if they do release a dual-GPU solution limited to 1600x1200, then they should really try to make up for it with quality. Is there still a refresh rate limitation at resolutions under 1600x1200? I don't think there is, but I want to be sure.
 
^ Precisely. But you have to admit CRTs are getting harder and harder to find... at least the decent ones. P95Fs are gone with the wind. All that's around now are average shadow-mask CRTs. LCD is the future, hate it or love it, but I'm not ready to jump ship yet, and I dislike some things about it like the 60 Hz refresh limit, slight ghosting even on expensive models, and the fixed resolution.
 
ShuttleLuv said:
^ Precisely. But you have to admit CRTs are getting harder and harder to find... at least the decent ones. P95Fs are gone with the wind. All that's around now are average shadow-mask CRTs. LCD is the future, hate it or love it, but I'm not ready to jump ship yet, and I dislike some things about it like the 60 Hz refresh limit, slight ghosting even on expensive models, and the fixed resolution.

Odd, but you seem to have skipped some things I found most annoying. I have a new laptop with a 15" TFT and I dislike it for different reasons.

1. Not REAL black! The backlit nature gives it a built-in glossiness: RGB (1,1,1) is actually DARKER than RGB (0,0,0). Granted, it's barely noticeable, but my experience with photo editing made me very sensitive to color variations.

2. The colors of LCDs vary depending on what angle you're looking from, especially on large screens; it's already very noticeable on a fifteen-incher. Combined with #1, certain angles make for images that look like they have inverted backgrounds. How the heck does Gabe cope with it?

Imagine watching The Matrix: you lean back, and suddenly all the really dark areas turn dark gray! Or image editing in Photoshop, where the top color doesn't match the bottom because your head is viewing them from slightly different angles (my eyeballs are aligned with the top of the screen at arm's length).

Do professional artists actually use these for their work?! How the heck do they calibrate them?! I used to work part-time as a photo editor for a studio, and it takes great pains to get a new CRT monitor calibrated to spec. How do you calibrate an angle-dependent monitor?

If LCD does become the standard, I certainly hope they at least make the colors stable!

Sorry for the rant; I've been setting up this laptop and I'm seriously getting annoyed by the shifting colors every time I move my head to check the other PC.
 
As an old graphic designer and animator, I probably won't ever convert to LCDs until black is black and colors and shades don't change depending on your viewing angle.
Good point, Sly.
I'll stick with my Sony Trinitron CRTs until LCDs can compete.

Luff for the 85lb CRT!!!
 
ShuttleLuv said:
LCD is the future, hate it or love it

QFT! I personally still use CRTs both at work and at home, but it's safe to assume CRTs will soon no longer have a significant market share. And I have to admit that while I, too, used to repeat this "black is black" stuff, I was seriously amazed by how much better my friend's 2405FPW looked than my CRT :p

...pretended it didn't, though, and complained about its washed-out blacks :D
 
wizzackr said:
QFT! I personally still use CRTs both at work and at home, but it's safe to assume CRTs will soon no longer have a significant market share. And I have to admit that while I, too, used to repeat this "black is black" stuff, I was seriously amazed by how much better my friend's 2405FPW looked than my CRT :p

...pretended it didn't, though, and complained about its washed-out blacks :D
Try using professional Sony CRT's.
 
It always seemed like Crossfire was cobbled together (IMO).

ATI should have just swallowed hard and adopted NV's SLI approach to multi-GPU rendering.
 
CrimandEvil said:
It always seemed like Crossfire was cobbled together (IMO).

ATI should have just swallowed hard and adopted NV's SLI approach to multi-GPU rendering.

Yeah, especially the bit about using field-programmable chips instead of a real piece of silicon... CrossFire looks thrown together in a big way... which makes it actually kind of impressive that they're getting it working within a year of SLI...
 
eno-on said:
Try using professional Sony CRT's.

Mitsubishi Diamond PRO 2070SB here, dual Sony Trinitron E530 at work.

CrimandEvil said:
It always seemed like Crossfire was cobbled together (IMO).

ATI should have just swallowed hard and adopted NV's SLI approach to multi-GPU rendering.

Exactly my thoughts, but then again - didn't ATI base their multi-GPU approach on proven tech previously used by Evans & Sutherland and the like? I always thought they had developed this tech before nVidia, so theoretically they can't have borked it that badly. I don't really know, though; it's all speculation...
 
wizzackr said:
didn't ATI base their multi-GPU approach on proven tech previously used by Evans & Sutherland and the like? I always thought they had developed this tech before nVidia, so theoretically they can't have borked it that badly. I don't really know, though; it's all speculation...
It sort of reminds me of the Voodoo2 days, the way the cards are linked outside the case.
 
I think ATI seriously FK'd up the whole concept and their implementation. The entire CrossFire concept could have been implemented OUTSIDE the PC.

With two DVI inputs and a USB connection, the "blackbox" could be told what "mode" of SLI-ness to operate in; it would take the data coming out of the separate DVI connections and put them together into a single signal, which would then drive both a DVI and a VGA connector. The blackbox would have a frame buffer and could offer extreme flexibility as to which two cards output the image portions, the only constraint being that they have DVI outputs.

Not that you would WANT a 9200 and an X850 as your two cards, but "technically" they would work together, and certainly any two current X850s would just plain work... none of this "master" card BS. Two "half" images, one from each card, are simply stitched back together in a frame buffer in the black box and sent out the display connector.

EVERYTHING else would be handled by ATI's driver. For example, if you're going to run 1600x1200 with AFR, the correct data would just flow out the connectors, using any DVI tricks needed to help bandwidth, and the blackbox would be the key piece making high resolutions at high refresh rates possible.
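A purely hypothetical sketch of what such a blackbox would do with the two incoming streams; the function names and the numpy arrays standing in for DVI inputs are mine for illustration, not anything ATI actually built:

```python
import numpy as np

WIDTH, HEIGHT = 1600, 1200  # resolution from the example above

def stitch_split_frame(top_half: np.ndarray, bottom_half: np.ndarray) -> np.ndarray:
    """Split-frame mode: recombine card A's half and card B's half
    in the box's own frame buffer before scan-out."""
    assert top_half.shape == bottom_half.shape == (HEIGHT // 2, WIDTH, 3)
    frame = np.empty((HEIGHT, WIDTH, 3), dtype=top_half.dtype)
    frame[:HEIGHT // 2] = top_half     # upper half from DVI input A
    frame[HEIGHT // 2:] = bottom_half  # lower half from DVI input B
    return frame

def alternate_frames(frame_a: np.ndarray, frame_b: np.ndarray, n: int) -> np.ndarray:
    """AFR mode: the box just forwards whole frames from each card in turn."""
    return frame_a if n % 2 == 0 else frame_b

# Dummy half-frames standing in for the two DVI inputs
top = np.zeros((HEIGHT // 2, WIDTH, 3), dtype=np.uint8)
bottom = np.full((HEIGHT // 2, WIDTH, 3), 255, dtype=np.uint8)
print(stitch_split_frame(top, bottom).shape)  # (1200, 1600, 3)
```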

I just think trying to slap SLI/CrossFire onto the video card itself leads to needless compromises in the capabilities of both. It's like making extra work for themselves.

Stick to making the video card as powerful an engine as possible, and let the blackbox concentrate on doing the crossfire.

I suppose this begs the question: HOW MUCH would you be willing to pay for a blackbox that implemented the CrossFire/SLI-ness hardware using any two off-the-shelf ATI cards with DVI outputs?

Two $400 X850XTs would then be ready to rock and roll right now, no need to buy "special" cards.

I'd pay $200-300 for such a box, since it would then be usable with future video card generations; it's not a one-generation purchase the way the X850 is vs. the X1800, etc.

Thoughts?
 