CrossFire 16x12 60Hz limit, big deal?

Does the potential resolution limitation of CrossFire bug ya?

  • Don't care

    Votes: 111 44.4%
  • SLI for me now!

    Votes: 131 52.4%
  • I'm getting CrossFire.

    Votes: 19 7.6%
  • Can't run higher than 1024x768

    Votes: 13 5.2%

  • Total voters
    250
  • Poll closed.
It's a huge drawback for ATi if it's true... and might make me an Nvidia fan for life -_-''
 
:eek: Wow ... if it's at Rage3d it must be serious.

I don't believe it. That's just incredibly stupid.
 
Zinn said:
:eek: Wow ... if it's at Rage3d it must be serious.

I don't believe it. That's just incredibly stupid.

One of these cards can play almost every game with every option at 1600x1200, so what's the use of getting two if you can't play at a res higher than that?
 
Topweasel said:
One of these cards can play almost every game with every option at 1600x1200, so what's the use of getting two if you can't play at a res higher than that?

The X800 series takes a pretty big dump in BF2, probably the most popular game right now, with high settings and 1600x1200.
http://anandtech.com/video/showdoc.aspx?i=2496&p=3

The X800 series isn't really (sanely) looked at as a high-end solution right now anyway. CrossFire will be important with the X1800 series.
 
So we get 1600x1200 with very high AF/AA, but what's the point?!?! We want a nice widescreen resolution!!!! 1920x1200, please.
 
Hmmm... on second thought, this could explain why every pre-release "benchmark" of CrossFire is done at 1280x1024 :p

But if it's true I would say "wow... just wow," even though it annoys me when other people say that. It would really be sad.

No way. I can't believe ATI would burn that much R&D money on an SLI competitor that doesn't work. It can't be true. I just can't believe it... no way :eek:
 
This chip could probably be upgraded by the manufacturer. Maybe you can overclock the Silicon Image chip beyond 165MHz? :p
 
On NV 6800 cards there are SiI 164 chips with a max res of 1600x1200, and I'm sure there are people on this board running 6800 cards with 2405s @ 1920x1200.
 
I don't care, and I'll tell you why... I've never even considered ATi in the first place. What losers. :p Really though... that is kinda weird. :confused:
 
It will be the recommended max resolution. The reason: CrossFire uses the digital part of the DVI link (the part that allows you to display above 1600x1200) to send signals between the two cards. If that part of the signal is already used on the port, how can it also be used to send more information to a monitor?
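The bandwidth argument above can be sanity-checked with rough arithmetic. A minimal sketch, assuming a single-link TMDS transmitter (like the SiI 164-class chips mentioned elsewhere in the thread) capped at a 165 MHz pixel clock, and a crude ~25% blanking overhead on top of the active pixels (real VESA timings vary per mode, so treat the numbers as ballpark):

```python
# Rough pixel-clock check for a single-link DVI (TMDS) connection.
# Assumptions (mine, not from the thread): a 165 MHz pixel-clock ceiling
# and ~25% blanking overhead; actual VESA GTF/CVT timings differ per mode.

SINGLE_LINK_MHZ = 165.0
BLANKING_OVERHEAD = 1.25  # crude approximation of horizontal + vertical blanking

def required_clock_mhz(width, height, refresh_hz, overhead=BLANKING_OVERHEAD):
    """Approximate TMDS pixel clock a display mode needs, in MHz."""
    return width * height * refresh_hz * overhead / 1e6

def fits_single_link(width, height, refresh_hz):
    """True if the mode fits under the single-link 165 MHz ceiling."""
    return required_clock_mhz(width, height, refresh_hz) <= SINGLE_LINK_MHZ

for w, h, hz in [(1600, 1200, 60), (1600, 1200, 85), (1920, 1200, 60)]:
    clk = required_clock_mhz(w, h, hz)
    verdict = "fits" if clk <= SINGLE_LINK_MHZ else "exceeds"
    print(f"{w}x{h}@{hz}Hz needs ~{clk:.0f} MHz -> {verdict} single link")
```

With those assumptions, 1600x1200@60Hz squeaks under the 165 MHz ceiling, while 1600x1200@85Hz and 1920x1200@60Hz both blow past it, which lines up with the rumored 1600x1200/60Hz cap.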
 
1600x1200 isn't a bad resolution, but 60Hz is a deal killer. I'd rather use a braille display than be forced to watch a CRT monitor at 60Hz.
 
If it's true, look at it this way: you can still run 1024x768 for everyone's favorite game, 3DMark05. :p
 
I personally would find this shocking...maybe they were early boards...or maybe it's a driver bug...or or or...blah...this just can't be true...

If it is true...ATi just shot themselves in the foot...I can't imagine anyone would pony up for a CrossFire system with this kind of a limitation...
 
^eMpTy^ said:
I personally would find this shocking...maybe they were early boards...or maybe it's a driver bug...or or or...blah...this just can't be true...

If it is true...ATi just shot themselves in the foot...I can't imagine anyone would pony up for a CrossFire system with this kind of a limitation...


Nope, it's not because it's an early board or a bug; it's because of the dongle.
 
ATI is proving to be more and more incompetent as time progresses... if this is true, not only will it make all the money invested in CrossFire a complete waste, it will completely deface ATI. ATI has already lost a lot of credibility; this may be a fatal blow that gimps ATI till she dies.

With ATI, it's just been disappointment after disappointment.
 
Ummm, CrossFire is a high-end technology, and this sort of technology makes up a very small percentage of overall sales. When SLI/CrossFire becomes more mainstream, I'm sure they will have the bugs worked out. The only people affected by this news will be the enthusiasts.
 
Tiburon said:
Ummm, CrossFire is a high-end technology, and this sort of technology makes up a very small percentage of overall sales. When SLI/CrossFire becomes more mainstream, I'm sure they will have the bugs worked out. The only people affected by this news will be the enthusiasts.

That's nonsense. Companies live for the hype created by their upper-end stuff.

And when the heck would CrossFire or SLI be mainstream? It is almost NEVER a good idea to run SLI with less than the highest card available, because performance is always more easily achieved by simply upgrading to a better card.
 
I was writing that in response to the idea that this would be the end of ATI. Which I thought was nonsense...
 
Kueller said:
1600x1200 isn't a bad resolution, but 60Hz is a deal killer. I'd rather use a braille display than be forced to watch a CRT monitor at 60Hz.

qft
 
Russ said:
That's nonsense. Companies live for the hype created by their upper-end stuff.

You didn't read my post within the context it was meant for. I can't stand knee-jerk reactions from people who half-read a post.

And when the heck would CrossFire or SLI be mainstream? It is almost NEVER a good idea to run SLI with less than the highest card available, because performance is always more easily achieved by simply upgrading to a better card.

I said as it gets MORE mainstream... meaning a greater percentage of market share. For the love of god, man, read the entire post first, please, and understand the context before flaming it up.
 
Tiburon said:
You didn't read my post within the context it was meant for. I can't stand knee-jerk reactions from people who half-read a post.

I said as it gets MORE mainstream... meaning a greater percentage of market share. For the love of god, man, read the entire post first, please, and understand the context before flaming it up.

Eh, I got you. I just don't think what you said makes sense. This will be an issue from day one if it is true. IMHO the same type of people will buy this from its inception, so it'll have the same "market share" essentially. It can be an issue to people before they buy it.

I realize you probably agree with most of what I'm saying.....


So let's see, I agree with you in that it won't end ATI, but they will have to get a good solution out there sometime. There are certainly companies that don't make good high end solutions (Matrox for example), but does anyone buy them?
 
Bah. That sucks for sure. My Dell 2005FPW rocks... fuck being stuck at 1600x1200 max. That's just horrible.

ATi, you keep on losing :(

Time to sell back my 35 Shares of ATi
 
It's only for the X8 cards, so who cares?

Who in their right mind would buy a CrossFire mobo and another X8 card when they can just get the 7800, or the R520 when it comes out?
 
Kueller said:
1600x1200 isn't bad resolution, but 60Hz is a deal killer. I'd rather just use a braille display than be forced to watch a CRT monitor at 60Hz.


2 things,

First, it's a rumored limitation of the X8 series only; according to the rumor, this does not affect the coming R5 line. So the wording is somewhat biased against people who skim rather than read; it makes them think CrossFire as a whole does this, which isn't true.

The next thing is the Hz limitation. A DVI output is topped out at 60Hz; this is fact, it cannot produce more than this due to the technology. Since the dongle relies on DVI outputs, this is where the 60Hz factor comes from. It does not mean you won't see a difference over 60; many places have already proved this. Nvidia, for one, way back during the Xbox release, showed a very noticeable visual difference on a TV with its refresh capped: with the core producing twice the refresh cap of the TV, you still see a very big difference. So if your monitor is limited to 60, 80, 100, etc. Hz, due to DVI or the monitor's refresh rate, you can still notice the higher frame rates because it will look better and play smoother. Hope this helps people.
 
Doesn't really bug me... seeing as how I only have one monitor, and I would never run a res higher than 1600x1200 if I had CrossFire anyway.
 
Bo_Bice said:
Shell out major $ for a flickering screen courtesy of 60Hz? Negative!



The DVI output of a 7800GTX is capped at 60Hz refresh (separate from the visual differences FPS can cause), SLI or not; email a card manufacturer if you like. Have a good day :)
 
Shifra said:
The DVI output of a 7800GTX is capped at 60Hz refresh (separate from the visual differences FPS can cause), SLI or not; email a card manufacturer if you like. Have a good day :)

How do you know this? I've been running a 7800 GT at 85Hz at anything lower than 1600x1200; at 1600x1200 and higher it's 60Hz only because that's my CRT monitor's max. I'm using a DVI-to-analog adapter, but that doesn't change the fact that the DVI connector doesn't have the Hz limitation; the limitation is my monitor, depending on the res.
 
Shifra said:
2 things,

First, it's a rumored limitation of the X8 series only; according to the rumor, this does not affect the coming R5 line. So the wording is somewhat biased against people who skim rather than read; it makes them think CrossFire as a whole does this, which isn't true.

The next thing is the Hz limitation. A DVI output is topped out at 60Hz; this is fact, it cannot produce more than this due to the technology. Since the dongle relies on DVI outputs, this is where the 60Hz factor comes from. It does not mean you won't see a difference over 60; many places have already proved this. Nvidia, for one, way back during the Xbox release, showed a very noticeable visual difference on a TV with its refresh capped: with the core producing twice the refresh cap of the TV, you still see a very big difference. So if your monitor is limited to 60, 80, 100, etc. Hz, due to DVI or the monitor's refresh rate, you can still notice the higher frame rates because it will look better and play smoother. Hope this helps people.

It's not a rumored limitation for the X8 only; there is also a problem with the way the compositing chip works.
 
Sounds like they're complying with LCD standards and fvcking the CRT owners... :rolleyes:
 