MSI Big Bang First Look @ [H]

Hey CJ1, as you obviously have some inside knowledge on this, wonder if you could answer this for me... I read an article a while ago on Fudzilla stating that the Hydra chip has to be run in one of two modes: a graphics mode or a GPGPU mode.
Was just wondering how that would work in a game where it would obviously have to be set to graphics mode but where, for instance, Nvidia's PhysX was also running on the GPU... would that require the Hydra chip to be set to GPGPU mode? Will the Hydra chip allow for GPU physics of whatever flavour, and forthcoming GPU tasks like AI?
 
Let me get one thing straight here first. I'm foremost posting here as a tech-enthusiast and not as an MSI employee. Yes, I happen to work for the company, but I have been active in the online tech community for nearly a decade even before I joined the force (I just was not that active on HardOCP, but still lurking!). So I know quite well how things work and can definitely understand your reluctance to trust anything. Hopefully we can do something about it. :)

Before we talk about this tech, lets have a look at the Status Quo.

1) Stability - SLI & Crossfire are currently very stable, problem-free solutions for multi-GPU. Any proposed replacement must meet this bar.
2) Scaling - The biggest complaint about SLI & Crossfire is scaling. These days, though, scaling in most games is 70% or above. Yes, in older games scaling sucks, but a single GPU can easily max out older games, so it's a moot point. Almost all new games scale better than 75-80%. A replacement has to scale as well or better.
3) Cost - Currently there is no extra cost associated with SLI or Crossfire, as anyone looking for a multi-GPU setup is going to be pairing it with a P55 or X58 mobo. The proposed solution must have no increase in cost over a standard mobo, or if it does, it must pay for itself with better scaling.
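For clarity, "scaling" here means how much extra performance the second GPU contributes relative to one card running alone. A quick sketch of that calculation (the FPS numbers are made up purely for illustration):

```python
def multi_gpu_scaling(single_gpu_fps, dual_gpu_fps):
    """Second card's contribution as a fraction of one full GPU's performance."""
    return (dual_gpu_fps - single_gpu_fps) / single_gpu_fps

# Hypothetical example: 60 fps on one card, 105 fps on two.
print(multi_gpu_scaling(60, 105))  # 0.75, i.e. 75% scaling
```

So a game that jumps from 60 fps to 105 fps with a second card is "75% scaling" in the sense used above; perfect (linear) scaling would be 120 fps, or 100%.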

Now lets look at this proposed solution:
1) Stability - Currently, Lucid has not tested, verified, or given its stamp of approval to many of the most demanding and newest games. I couldn't care less about running multi-GPUs in Age of Empires III. This technology MUST be as robust as SLI or Crossfire for it to even have a chance. With the limited number of approved games I am very worried on this point. I'm sure we will get more information as reviews come out; hopefully they can put my worries to rest.
2) Scaling - The first three previews all said the same thing: scaling will be on par with SLI/Crossfire. Unless this has improved, I see no reason for the technology.
3) Cost - Seeing the price of the previous Big Bang mobo, this is going to miss the "value" market of people trying to reuse old video cards. Without the scaling, it is going to miss the "high-end" market, because those people are going to (right or wrong, but probably right) buy an EVGA Classified.

When this was first announced I said they couldn't do it with perfect scaling. Now that it's here, it is clear they haven't. Without significantly better scaling in high-end games this technology has no market. With any cost associated with it, it has no market for the value consumer. This will end up going the way of the PhysX card and the KillerNIC, quickly becoming the BOSE of mobos.
 
I understand that the scaling was worse when using dissimilar cards and best when the cards are identical...wonder what the percentage scaling is with identical cards?
 
Vengence, I really wish you would shut the fuck up and give it a rest already. We all know you're not one bit thrilled with this new tech; at least wait until a full and final review of the shipping hardware is completed, and then you're welcome to spout off all you want about it. The tech is new, early, and unreleased; it may need some time to fully mature. Were the early days of SLI/Crossfire so grand? And in the end, it just may completely suck ass, but right now we don't know that yet, and I for one am damn tired of hearing you run off at the mouth about it. Ahhhhh...

Oh, when we do get the final review, if it does wind up sucking ass, don't worry, it's well known: you called it first. Hell, I think I'll even send you a blue fucking ribbon.
 
First review of the Fuzion board from Guru3D: http://www.guru3d.com/article/msi-big-bang-fuzion-lucid-hydra-review-test/ Predictably: a great board like the Trinergy, but they consider Hydra a "nice gimmick" until game support is developed to the SLI/Crossfire standard, and of course LucidLogix have a long way to go to achieve that.

Ignoring Hydra, the board looks very expensive and is missing USB 3.0 & SATA 6 Gbps.

Wow... color me not impressed. You might send this to Kyle so he can put the link up on the front page. (Unless of course he has his own review coming soon)
 
At least it actually worked. Though this is prolly as far as we will see it go. X mode is the only thing it really has of interest, and that has too many tradeoffs to be considered truly useful in any serious way.
 
At least it actually worked. Though this is prolly as far as we will see it go. X mode is the only thing it has of interest and that has too many tradeoffs to be considered truly useful in any serious way.

It was originally hyped to do "near linear scaling", but it can't beat Crossfire. X-mode requires massive driver support which isn't there. And who buys a $389 motherboard to reuse a graphics card? Much better off selling your old graphics card for $100+, saving $200 on this motherboard, and putting the $300 towards a better graphics card.

As far as a $400 mobo goes, I'd rather buy a P55 Classified for only $339. Better mobo and $50 cheaper.
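To spell out the arithmetic in the post above (the dollar figures are the poster's rough estimates, not quoted street prices; the $189 "standard board" is an assumed stand-in for a board roughly $200 cheaper than the Fuzion):

```python
# Poster's rough numbers, not actual street prices.
fuzion_board = 389        # MSI Big Bang Fuzion
standard_board = 189      # assumed board roughly $200 cheaper
old_card_resale = 100     # sell the old card instead of reusing it

# Money freed up for a single faster graphics card instead:
gpu_budget = (fuzion_board - standard_board) + old_card_resale
print(gpu_budget)  # 300
```

The point being: skipping Hydra and selling the old card nets roughly $300 toward one stronger GPU.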
 
To be honest, we have not wasted time even playing with ours since I looked at all the literature. It just seemed as though it was a waste of money and I did not feel like spending the resources on giving this mobo press.

It has been VERY odd with MSI and this Lucid product. MSI has NEVER spoken to us on the record about this motherboard. They have never told us anything good about it. The board was shipped to us from overseas with a very long complicated NDA that we refused to sign. It looks like a total waste of time and energy.

That all said, I still think the GD65 and GD80 were two of the best motherboards built this last year. Kudos to MSI on that. This Lucid stuff looks to be a total waste of time and effort. After all the claims that have been made, neither Lucid nor MSI have given any performance metrics... What does that tell you?

Guru3D says it all, little else needs to be said.

We definitely like what Lucid is trying to accomplish, but we also have to acknowledge that we ran into quite a lot of problems, compatibility issues and sheer limitations. Where it works it can work well, but in the end, in this day and age, the combination of two similar Radeons in Crossfire or two NVIDIA GeForce cards in SLI will simply make much more sense, as you will not have to deal with the profile support and compatibility issues.

Now bear in mind, it also took both ATI and NVIDIA a year or two to stabilize their drivers for SLI and Crossfire. Multi-GPU support is complex, very complex and that is exactly what will be haunting Lucid.

Some practical issues and things that raise questions:

What if you have a Radeon card and a GeForce card and want to select an NVIDIA AA mode... well you can't.
What if you want to enjoy PhysX in X-mode? - well you can't, as the minute the NVIDIA driver sees an ATI card, it will disable it.
Got a GeForce and Radeon card in X-mode? You can't access the primary graphics card control panel e.g. in our case we could not load up the NVIDIA control panel.
What about DirectX 11 games? Well currently there's no support.
Is there CPU overhead due to Hydra? -- Yes.
You want to run the latest Forceware driver yet use Hydra? You can't as the Hydra driver needs to support it.
 
Hey Kyle, did ya send this raw turkey back to MSI yet? I agree on the time and resources, as it can't be used with any X2, 59xx or GTX 295. But I do like the Big Bang Trinergy, of course; it seems to hold a good amount of promise. :D

Why the trinergy over the P55 Classified?
 
Why the trinergy over the P55 Classified?

Let's see: it has an NF200 chip, it has a 16x and two 8x PCI-E slots, and its overclocking is good too from what I've read. The P55 Classified is 16x and 4x, or at best 8x, 8x, 4x. So which is better for your video card needs? For me it's the Trinergy; you do what you want.
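For reference, here's a back-of-the-envelope sketch of peak bandwidth per PCIe 2.0 slot width. PCIe 2.0 runs at 5 GT/s per lane with 8b/10b encoding, which works out to 500 MB/s per lane per direction:

```python
PER_LANE_MB_S = 500  # PCIe 2.0: 5 GT/s * 8/10 encoding / 8 bits per byte

def slot_bandwidth_gb_s(lanes):
    """Peak one-direction bandwidth in GB/s for a PCIe 2.0 slot."""
    return lanes * PER_LANE_MB_S / 1000

for lanes in (4, 8, 16):
    print(f"x{lanes}: {slot_bandwidth_gb_s(lanes)} GB/s")
# x4: 2.0 GB/s, x8: 4.0 GB/s, x16: 8.0 GB/s
```

So the argument above boils down to whether a given card can actually saturate 4 GB/s (x8) versus 2 GB/s (x4), which for cards of that era was rarely the case.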
 
This technology doesn't seem to make that much sense in the high-end market. It would be more practical for the value market, where users on a budget would want to reuse their old cards to squeeze out a little more performance.
 
Let's see: it has an NF200 chip, it has a 16x and two 8x PCI-E slots, and its overclocking is good too from what I've read. The P55 Classified is 16x and 4x, or at best 8x, 8x, 4x. So which is better for your video card needs? For me it's the Trinergy; you do what you want.

Wow, a little defensive? Just a question. Sheesh. If you don't mind me asking, what video card do you have that can max out an 8x PCI-E 2.0 slot?
 
Wow, a little defensive? Just a question. Sheesh. If you don't mind me asking, what video card do you have that can max out an 8x PCI-E 2.0 slot?

I don't know if a GTX 295 can do that (I have a BFG), but I'm looking at the future, as cards come and go and something new is always getting faster. :D
 
I don't know if a GTX 295 can do that (I have a BFG), but I'm looking at the future, as cards come and go and something new is always getting faster. :D

Faster is better, but unsaturated bandwidth + NF200 is slower than unsaturated bandwidth alone. When the Classified came out on the X58 it had NF200s; then they re-released it without NF200s. They dropped the NF200s due to feedback from the enthusiast community and published tests showing the NF200s added latency and slightly lower frame rates, plus added power draw (and of course heat dissipation) and another failure point. They released the P55 Classified without the NF200s because they couldn't sell the X58 Classified with the NF200s after they released one without them. :(

Things are getting faster and maybe we'll see limitations. For right now, if it were me I'd spend the $50 somewhere else in my build, but to each their own. :)
 
Faster is better, but unsaturated bandwidth + NF200 is slower than unsaturated bandwidth alone. When the Classified came out on the X58 it had NF200s; then they re-released it without NF200s. They dropped the NF200s due to feedback from the enthusiast community and published tests showing the NF200s added latency and slightly lower frame rates, plus added power draw (and of course heat dissipation) and another failure point. They released the P55 Classified without the NF200s because they couldn't sell the X58 Classified with the NF200s after they released one without them. :(

Things are getting faster and maybe we'll see limitations. For right now, if it were me I'd spend the $50 somewhere else in my build, but to each their own. :)

Oh I see, Good point, Thanks. :eek:
 