AMD's ATI Hybrid CrossFire Sneak Peek Exclusive @ [H]

Has ATI solved the annoying screen flickering when enabling CrossFire after Windows has loaded?

Does not flicker as much, but the GUI is still shit when it comes to being intuitive and giving the user a feeling of confidence in the product. We discussed this at length with many AMD employees in the last month.
 
To me, the biggest deal is having this technology applied to laptops with low-end dedicated GPUs and integrated GPUs...

Also, if ATI can stay on top of this technology ahead of NVIDIA, it would probably raise ATI's value within AMD, and there's a lot to expect from that...
 
So my question is, has the [H] thought about how they are going to test hybrid systems in the future, where you will no longer be able to "decouple" the item being tested? I guess I'm more concerned that the big boys are going to start making their "systems" in such a way that mix-and-match components will not work out for the enthusiast the way they do today.

You know, the fact is that we can hardly test multi-core CPUs now in a worthy way. As for testing hybrid systems that could not be "decoupled," I really don't see that impacting the enthusiast user, at least not anytime soon. If they could actually make it work that well and scale that well, it might not be a bad thing. :)
 
So this is VERY interesting for the moms and pops out there who buy a mid-level PC for their home and have a kid grow into gaming while they own it. They can pop minimal $$ for a basic card, CF it, and get decent performance.
HOWEVER, I am very doubtful of the enthusiast possibilities. In the example in the article, the IGP is paired with a low-end dedicated card, which is often not MUCH better than an integrated chipset. I can definitely see how you could get 1.5-1.9X the performance there.
But what happens when you pair this setup with a 3850/3870 class card? The respective "horsepower ratio" of the IGP to the dedicated card goes from ~1:1 (or close) to a much more one-sided ratio favoring the dedicated card. If you are using a card with 512MB of VRAM and you add the 16 or 32MB of the IGP, will that make any real difference (versus adding the same to a 256MB card)? With a higher end GPU on the dedicated card, will adding the extra, much smaller, weaker GPU make any difference? Seems like a case of EXTREMELY quick diminishing returns as you increase the capabilities of the dedicated card.

I am not 100% familiar with the way CrossFire delegates work between cards, but it seems to me that the only way this would work very well is if the IGP were given its OWN set of tasks that the RS780 intelligently assigns according to the capabilities of the onboard chip. This, as opposed to 'pooling' the work and just having both graphics chips work on all of it.
Otherwise, it seems like on higher end systems with higher end dedicated cards, the IGP could become a bottleneck instead of a boost.
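To make that "own set of tasks" idea concrete, here's a toy sketch (Python, everything here invented by me, nothing to do with how AMD's driver actually divides work) of a scissor-style split weighted by each chip's throughput:

[code]
# Toy sketch of a capability-weighted scissor split. All numbers are
# made up; this is NOT how the CrossFire driver actually works.

def split_scanlines(frame_height, dedicated_gflops, igp_gflops):
    """Give each GPU a horizontal slice of the frame proportional to
    its estimated throughput."""
    igp_share = igp_gflops / (dedicated_gflops + igp_gflops)
    igp_lines = int(frame_height * igp_share)
    return frame_height - igp_lines, igp_lines

# Near-equal pairing (IGP + low-end card): a meaningful split.
print(split_scanlines(768, dedicated_gflops=40, igp_gflops=30))   # (439, 329)

# Pair the IGP with a 3870-class card and its slice shrinks to a sliver,
# which is exactly the diminishing-returns problem described above.
print(split_scanlines(768, dedicated_gflops=500, igp_gflops=30))  # (725, 43)
[/code]

Unless the scheduler can hand the IGP cheap, self-contained work instead of a thin strip of pixels, the coordination overhead could easily eat a slice that small.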

Uh....I don't see Hybrid coming to enthusiast class systems ever as it is presented here. I do, however, see Hybrid benefits like controlling power consumption of high end 3D cards, as we explained specifically in the article. You might read that article again to get a better understanding of what we are truly discussing here.
 
Well, I'm glad AMD is doing this. It can be VERY good for them if/when the mainstream market picks it up.
 
I wonder if any discussion has gone into offloading physics computation to the onboard GPU. I guess if you're running a $50 graphics card, using that horsepower for graphics makes more sense, but for the enthusiast market, where you're expected to spend $250+ on a graphics card, a driver that lets you use the built-in GPU for physics computations would be pretty hot.

I swear I heard talk about ATI looking at something like this a year or so ago, where they were talking about a CrossFire setup with a pair of X1900s doing graphics and an X1600 series card doing physics computations.
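For what it's worth, the plumbing wouldn't have to be exotic. Here's a hypothetical sketch of running simulation on the small chip while the big card renders; the device classes are pure stand-ins and no such API exists in ATI's drivers:

[code]
# Hypothetical sketch of physics offload to a secondary GPU. The device
# classes below are invented stand-ins, not any real driver API.

from concurrent.futures import ThreadPoolExecutor

class StubDevice:
    def __init__(self, name): self.name = name
    def simulate(self, scene): return f"{self.name}: physics for {scene}"
    def draw(self, scene):     return f"{self.name}: frame for {scene}"

class FrameScheduler:
    def __init__(self, render_dev, physics_dev):
        self.render_dev, self.physics_dev = render_dev, physics_dev
        self.pool = ThreadPoolExecutor(max_workers=1)

    def tick(self, scene):
        # Kick physics to the secondary chip, render on the primary in
        # parallel, then collect both results for the next frame.
        physics = self.pool.submit(self.physics_dev.simulate, scene)
        frame = self.render_dev.draw(scene)
        return frame, physics.result()

sched = FrameScheduler(StubDevice("X1900"), StubDevice("X1600"))
print(sched.tick("frame_42"))
[/code]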

Hehe, now that AMD owns ATI and Intel owns Havok, I don't think you will be seeing GPU physics from ATI any time soon, and likely not from Intel either in the form of a GPU.
 
Very interesting technology. The questions in my mind are:

1) Will this work with my shiny new 3870, or only with future video cards? I'd have to get a new motherboard regardless. Also, will that increase performance?

2) Seriously, would it kill them to stick a 64MB GDDR4 chip on the motherboard for onboard graphics? It can't cost that much.

3) I think the future of AMD could very well hinge on this technology. If they get Dell and HP to support it, it will propagate into the market very well, and this technology will be useful. Suddenly you have new AMD and ATI technology in millions of homes. It could also spur Phenom sales.

4) Could you run CrossFire X and Hybrid CrossFire in the same system?

Looks good AMD, keep it up.
 
How much of a performance increase, if any, would you anticipate seeing if a 2900 XT were used instead of an RV620?
 
How much of a performance increase, if any, would you anticipate seeing if a 2900 XT were used instead of an RV620?

That's been addressed, and from what I could gather, that isn't/won't be what HCF is about. So far it seems to be specifically a low-end GPU/IGP combination concept.
 
Very interesting technology. The questions in my mind are:

1) Will this work with my shiny new 3870, or only with future video cards? I'd have to get a new motherboard regardless. Also, will that increase performance?

2) Seriously, would it kill them to stick a 64MB GDDR4 chip on the motherboard for onboard graphics? It can't cost that much.

3) I think the future of AMD could very well hinge on this technology. If they get Dell and HP to support it, it will propagate into the market very well, and this technology will be useful. Suddenly you have new AMD and ATI technology in millions of homes. It could also spur Phenom sales.

4) Could you run CrossFire X and Hybrid CrossFire in the same system?

Looks good AMD, keep it up.

1. No.
2. This is a low-end product.
3. Very much, and when Fusion comes into play this will be a big deal.
4. Currently, no.
 
^^^

Kyle, I understand this is a low-end part, but just as we've been overclocking low-end CPUs for years....

IF a MB manufacturer were to offer an IGP on their current enthusiast boards, and IF the price premium were minor (say $20), would it be possible to Xfire like this with a 3850 and the IGP? And if possible, would it be worth the $20 investment?

Me.
 
I really do like the idea. I hope we can see, in the very near future, motherboards using the IGP for desktop apps only, completely powering down the main GPU and only bringing it to life in a game, etc... That is a pretty powerful combo that NVIDIA or Intel cannot even touch atm. Good job AMD. ;)
 
Very interesting technology. The questions in my mind are:

1) Will this work with my shiny new 3870, or only with future video cards? I'd have to get a new motherboard regardless. Also, will that increase performance?

2) Seriously, would it kill them to stick a 64MB GDDR4 chip on the motherboard for onboard graphics? It can't cost that much.

3) I think the future of AMD could very well hinge on this technology. If they get Dell and HP to support it, it will propagate into the market very well, and this technology will be useful. Suddenly you have new AMD and ATI technology in millions of homes. It could also spur Phenom sales.

4) Could you run CrossFire X and Hybrid CrossFire in the same system?

Looks good AMD, keep it up.
Only reason would be lower power consumption. Remember, this is a budget low-end setup, and to see that kind of performance in Crysis of all games is fantastic.
 
So I take it this is limited to the RV620? How about low-end cards from the last generation?
 
It's obvious to me now that NVIDIA and AMD/ATI are going in different directions with their research. After viewing some video interviews with AMD/ATI talking about the future of their company, they always talk about developing around what's going to give the "average user" something to increase performance where they need it. Making a new CPU that can open MS Word in 0.4 seconds instead of 0.5 is not what they're after... they're after performance for the masses, I guess.

I really like the direction AMD/ATI is going with HCF and the HD 3800 series. They're going after performance FOR THE MONEY, not balls-out, top-dog performance at any price. And I think this is where the industry needs to go. People are SICK of having to pay $300-400 for a video card to run the latest games at higher settings and resolutions! Not that AMD/ATI's latest offerings are really going to let you run the latest games at the highest settings... but you get closer for the money than with NVIDIA.

I would wager that AMD/ATI can make a lot more money with HCF than NVIDIA ever will with the 8800 Ultra...
 
I like it. I can see that it isn't honestly for us, but it is a great idea. I'd love to use CrossFire in my current system, which is a "bridge" system (DDR and DDR2, AGP and PCI-E), although I don't think anyone is thinking about that.


The good in this comes when someone wants to squeeze some extra life out of their OEM system. This can be included in the chipsets and drivers with no real extra cost, I'm assuming, and will provide a substantial upgrade to the system.
 
^^^

Kyle, I understand this is a low-end part, but just as we've been overclocking low-end CPUs for years....

IF a MB manufacturer were to offer an IGP on their current enthusiast boards, and IF the price premium were minor (say $20), would it be possible to Xfire like this with a 3850 and the IGP? And if possible, would it be worth the $20 investment?

Me.


My initial thought is, "No."
 
I do like your take on it as far as the trickle-down benefits to enthusiasts.

The rest of it makes me want to barf. Just improve the onboard graphics or educate buyers better. Selling someone a computer and then coming back and saying, well, if you spend another $50 you can play games at low resolutions and mediocre visual quality strikes me as the bass-ackwards way of doing it. Uncle Bob should have had knowledgeable sales help to point out that a business-class worker-drone machine was not made to play games, and most certainly not high-end ones. Do we not have mATX boards now with no onboard graphics and a PCI-e 16x slot? (A rhetorical question; of course we do, finally.)

It also makes me both sad and angry that such a crappy marketing move is needed for a company to spend the time to get their software right. Why not do it because it is the right thing to do? God forbid we should take the time and money to educate the consumer beyond "bling ding dong, Intel inside". Everyone is falling over one another to sell this crap at Walmart and Kmart to noobs and reap millions. I guess I should buy some more Intel stock, or Walmart.

I rank this right up there with finding out the spare tire for a new car is optional and an additional charge.

Edit: I do like this idea, however. This makes good sense:

What I would truly like to see is an integrated IGP that will run all my desktop and video applications while I have a single or double high end video gaming card configuration powered down. I don’t run a multi-card configuration now for two simple reasons. First, I don’t want 200 watts worth of idling 3D cards sitting under my desk all day. Second, I don’t want to deal with the multi-monitor situation that continually forces me to reboot when toggling between multi-monitor desktop use and gaming on a single monitor. I want enthusiast level Hybrid that fully turns off (or incredibly close to it) my high end 3D gaming card when I am not using it. I also don’t want to juggle reboots every time I want to get a gaming session on.
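That quoted scenario boils down to a two-state power policy. A minimal sketch of the logic, with an invented GPU interface since nothing like this exists in Catalyst today:

[code]
# Minimal sketch of the enthusiast Hybrid power policy quoted above:
# IGP drives the desktop, discrete card wakes only for 3D. The GPU
# interface here is invented purely for illustration.

class StubGPU:
    def power_up(self):   print("discrete card: clocks/fans/VRAM up")
    def power_down(self): print("discrete card: near-zero idle draw")

class HybridPowerPolicy:
    def __init__(self, discrete_gpu):
        self.gpu, self.awake = discrete_gpu, False

    def poll(self, app_is_3d):
        if app_is_3d and not self.awake:
            self.gpu.power_up()       # wake for the game, no reboot
            self.awake = True
        elif not app_is_3d and self.awake:
            self.gpu.power_down()     # back to IGP-only for the desktop
            self.awake = False

policy = HybridPowerPolicy(StubGPU())
policy.poll(app_is_3d=True)   # launch a game
policy.poll(app_is_3d=False)  # quit to desktop
[/code]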
 
This is pretty cool, but I'm not optimistic that this will be implemented mainstream. I would think that the 'cows' who buy low-end systems probably don't care about games, and this technology, while inexpensive, would go way over their heads and seem like an unnecessary cost. Never mind that gaming is basically now the domain of consoles, with us PC users getting the scraps.
 
I honestly think this is a feature the community is going to grab ahold of very quickly, and hopefully soon enough we'll be seeing the driver modders out there editing the drivers so Hybrid CrossFire will work with the higher end cards. Even with a higher end card, the onboard GPU is still horsepower that's not being used, and if the right programming is done it could still benefit higher end cards, even if it is only a marginal performance increase.

What I'd love to see is onboard AGEIA PhysX coupled with the RV core and usable with the Hybrid technology. Now that would rock.

I think the biggest factor here is that lower end machines with the tech available onboard will give people who are just getting into gaming a chance to increase performance without going broke trying to keep up with the enthusiasts. It also gives us parents who only have a certain amount of cash hope for the future, as we can buy a cheap PC, slap in a few more things at a reasonable price, and have a solid platform for the kids to play on.

It's a great option for someone with a bit of PC knowledge who sees that low-priced PC on sale at Staples with a capable onboard GPU and a slot available... they can buy that PC, slap a card in it, and away you go. And we ALL know that onboard video is the mainstream at the big stores.
 
Interesting stuff. It doesn't seem that revolutionary other than the fact that they were able to optimize it so well with bargain-basement parts... Even real CrossFire setups (or SLI for that matter) aren't often optimized so well for new games (read: Crysis). :rolleyes:

" the Hybrid CrossFire would come from the builder enabled since it now supports multiple monitors (not a feature we tested). "

That bit is particularly good to hear. It's kind of ironic that if they ever implement the kind of power optimizations Kyle describes, we'd almost be going full circle back to a config similar to the early 3dfx days. :p (At its most basic level, anyway.)

Anyway, this is the direction both AMD and NVIDIA should be taking CF/SLI, not the other extreme with Tri-SLI and QuadCFX etc.: power optimizations, improvements so even low-end dual-GPU setups have some merit, simpler configuration, and so on.

This will make it a lot more profitable in the long run, so it's good for them and us.
 
I do like your take on it as far as the trickle-down benefits to enthusiasts.

The rest of it makes me want to barf. Just improve the onboard graphics or educate buyers better. Selling someone a computer and then coming back and saying, well, if you spend another $50 you can play games at low resolutions and mediocre visual quality strikes me as the bass-ackwards way of doing it. Uncle Bob should have had knowledgeable sales help to point out that a business-class worker-drone machine was not made to play games, and most certainly not high-end ones. Do we not have mATX boards now with no onboard graphics and a PCI-e 16x slot? (A rhetorical question; of course we do, finally.)

It also makes me both sad and angry that such a crappy marketing move is needed for a company to spend the time to get their software right. Why not do it because it is the right thing to do? God forbid we should take the time and money to educate the consumer beyond "bling ding dong, Intel inside". Everyone is falling over one another to sell this crap at Walmart and Kmart to noobs and reap millions. I guess I should buy some more Intel stock, or Walmart.

I rank this right up there with finding out the spare tire for a new car is optional and an additional charge.

Edit: I do like this idea, however. This makes good sense:
1024x768 medium Crysis is mediocre?

Not for most consumers.
 
1024x768 medium Crysis is mediocre?

Not for most consumers.

As it stands now, I don't think most average consumers are even thinking of playing Crysis, 'specially not after they hear the system requirements. :p One of the reasons we enjoy PC gaming (the hardware!) is the reason many average consumers have since been turned off by it and moved on to console gaming.

I'm not sure HCF has any hope of turning back that trend, but it can't hurt either. Increased competition (and thus much cheaper cards) is the only thing that's really gonna reverse it.
 
Good work and an interesting read.:)

BTW, on a side note, if you want to play "Crysis", try texture, object, shader, and water on high. That gives the most IQ return IMO; try the rest on medium. I tried HD 3850 256MB CrossFire recently and was able to run the game like that @ both 1440x and 1680x with 4xAF. Naturally, if you have older hardware, just set everything to medium and call it a day. Or try the settings already named on medium and the rest on low. HD 3850 256MB CrossFire actually ripped right through "Crysis" using all medium settings @ 1680x with 4xAF. The combination of high and medium I already outlined was also perfectly playable. I dropped a post in the ATi section of this board that links to the results if you are interested.

The only drawback I have found, in use and in reading feedback from others with HD 3850 cards, is that they are voltage limited, so overclockers won't be able to break 770MHz without a vmod. Mine will do 760 all day and all night. Overdrive limits them to 730, though. If you want higher, use RivaTuner. Of course, the 256MB cards can't deal with AA in very demanding titles @ elevated resolutions.

I was kinda shocked a single overclocked HD 3850 256MB card was able to play "The Witcher" well @ 1680x, all high, with 16xAF.
 
They still have some fire in them yet! Perhaps that's why they skipped out on doing high-end cards; they are looking at multiple-GPU solutions for the masses that work and are worth it.
 
Hmm, about the idea of powering down a discrete GPU when you're not using the horsepower: where do you connect the monitors? I'd think the best idea would be to connect to the onboard GPU, and then use Hybrid CrossFire to make that the one that performs output.
In fact, depending on the speed of the IGP compared to the discrete video card, it might make the most sense to have the offboard GPU perform 100% of the rendering to eliminate any overhead, and then merely pass the completed frames to the onboard GPU for output.
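The overhead of that passthrough would mostly be one framebuffer copy per frame. A rough sketch plus the back-of-envelope bandwidth (interfaces invented for illustration, not any real driver API):

[code]
# Sketch of the passthrough idea: discrete card renders everything, the
# IGP just receives finished frames and drives the monitor.

def present(scene, discrete_gpu, igp):
    frame = discrete_gpu.render(scene)  # 100% of shading off-board
    igp.blit(frame)                     # one copy across the bus per frame
    igp.scanout()                       # IGP owns the physical output

# Cost of the copy at 1680x1050, 32-bit color, 60 fps:
bytes_per_sec = 1680 * 1050 * 4 * 60
print(f"{bytes_per_sec / 1e6:.0f} MB/s")  # ~423 MB/s, a small slice of PCIe x16
[/code]

So at least on paper, the copy shouldn't be the bottleneck.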
 
I wouldn't mind buying a laptop with this, especially since the 8600M and HD 2600M are the most common and affordable "high" mobile GPUs being put into laptops currently.
 
I like the idea, although it reminds me of one ATI has already talked about: a hybrid mobile chipset with an IGP for on-the-go use (much lower power consumption) and a discrete graphics part that gets enabled only when the laptop is plugged into the mains.

This is the best of both worlds, assuming the software could seamlessly transition between IGP-only on the go and Hybrid CrossFire when plugged in. Your idea to bring this to the desktop is also a good one, for long-term efficiency. After all, you can already run Vista's Aero on Intel's IGP, so why not just make it the only thing running until a game is loaded?
 
What I like about this is it might increase the installed base of gaming-capable PCs, and that in turn might get more game companies to think about PC gaming as something other than an afterthought.
 
I agree with what some others are saying. Even though this is currently geared toward lower end users, if the IGP could pump out even a free 5% increase in performance, it would be worth enabling. How many of us scrounge around for the best drivers and overclock those extra few MHz just to get that extra 5% FPS?

Or even better, if they could divide the load between two unequal cards, then let me simply add my shiny new $300 card on top of my old card so I don't have to throw out the card I bought just 6 months ago. That would be nice. They could throw some clever marketing name at that capability, like "Hyper Upgradeable" or some silly thing.
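Even in the best case, though, the math on unequal pairings is sobering. Assuming perfect scaling with zero overhead (wildly optimistic, and with made-up numbers), the old card just adds its relative throughput on top:

[code]
# Best-case arithmetic for pairing unequal cards, assuming perfect
# scaling with zero overhead (it never is). Numbers are made up.

old_card, new_card = 1.0, 2.5   # relative performance of each GPU
ideal = (old_card + new_card) / new_card
print(f"ideal combined speedup vs. new card alone: {ideal:.2f}x")  # 1.40x
[/code]

Any driver overhead eats into that 40%, and the gap only widens as the new card gets faster.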
 
I agree with you 100%. This is a GREAT idea, or, ideas really; the two key points from my perspective:

1) Integrated graphics that are actually decent (this one isn't QUITE there, but almost)
-and-
2) Having 1 or 2 powerful GPU video cards sitting idle in your case, completely asleep until it's time to unleash them like a couple of wild beasts to rip through your 3D games. WITHOUT having to reboot, etc.

I've expressed before that I think it's RIDICULOUS that today's mid- to high-end 3D cards can't switch to a low-power (less than 50 watt) mode for regular desktop use.

When I put an 8800GTS 320 in my PC, my ROOM GOT HOTTER, really. It's almost like having a hair dryer just sit there and run continuously. I like to leave my PC on 24/7, but not anymore since I gave up my 7900GT and put in an 8800GT; it just burns through too many watts.

Come on, it's well past time that modern video cards had a wide range of power-saving options. Wtf?!?
 
Very good article! Concise, with just enough of an intriguing new idea in video processing to justify intelligent discussion of how best for AMD to deliver it. Kudos, Kyle.
 
2002 was roughly the year that all motherboards started having integrated audio in their chipsets. Around the same time, Ethernet NICs were also being integrated.

Shortly afterwards, NVIDIA became immensely popular with its integrated GF2 MX graphics, and then the powerful GF4 MX440 in its NF2 chipsets. The latter was so powerful that it could beat a GeForce2 Ultra, a card that had sold for $500 when released less than 2 years beforehand!

Then NVIDIA never did anything like that again with integrated graphics. I think that is because too many people bought motherboards with an IGP that powerful and did not feel any need to buy video cards. That was bad for NVIDIA's GPU market, where the big money lies.

Now, in spite of the several years that have passed, I think motherboards should finally start having an integrated 2-D graphics chip (like the 2-D video cards of the 1990s). Remember when Voodoo2s were 3-D add-on cards used alongside a 2-D video card? With current 65nm/45nm processes, we can make dual 400MHz RAMDACs on a chip smaller than a voltage regulator that accesses a small partition of system memory (with DDR2 getting so cheap these days and 8GB due to become mainstream in a few years; see the quick arithmetic below). That will be enough to drive two monitors at high resolutions in 2-D applications. Then our 3-D cards can truly sit completely idle until a 3-D application is started.

No matter how high-end the motherboard is, it should at least have integrated 2-D video output. That would be handy when a graphics card fails or when we try to flash our video BIOS and something goes wrong. Basic audio is a must-have on all motherboards, even on today's $300+ enthusiast mobos, so why not 2-D video? If there were a motherboard sporting that feature, I'd buy it in a heartbeat, along with 4GB of RAM. There is also the potential for fewer compatibility issues, as the motherboard would be using basic 2-D drivers to display everything while booting and loading up Windows.
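The memory arithmetic on that 2-D partition checks out, by the way; a desktop framebuffer is a rounding error next to 4-8GB of system RAM:

[code]
# Rough arithmetic for the 2-D framebuffer partition suggested above.

width, height, bpp = 2560, 1600, 4          # 32-bit color, 4 bytes/pixel
per_monitor = width * height * bpp / 2**20
print(f"{per_monitor:.0f} MB per monitor")            # ~16 MB
print(f"{2 * per_monitor:.0f} MB for dual monitors")  # ~31 MB out of 8 GB
[/code]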
 
Kyle,

Any news on when ATI will offer fixed-aspect-ratio scaling in their drivers? I personally don't even view them as viable versus NVIDIA until this is offered. I know there are quite a few others besides me wondering when they will get their act together.

Overall, great article tho. I look forward to seeing this out in the wild.
 
So ATI can make Hybrid CF work great in games that have no profiles for it, but not regular CF... this makes no sense :rolleyes:. Great for the future, though.
 
Be nice if it would work on a laptop, lol. Then we would have a better gaming experience on the go. ;)
 