So I just hooked up a 2GB 5850 to my laptop...

Man, this gives me some ideas for a couple of parts I have laying around. I wonder if I could rig up my Thermaltake 250W Purepower Express (5.25" bay video card power supply) with my X1900XT that's collecting dust to my Vostro 1400. I'll have to figure out if the Purepower can be made to work without being hooked up to a regular power supply. I could probably squeeze the card and PS into an enclosure that would maybe fit in my laptop bag.
Awesome find, OP.
 

That would be cool. Play the original FEAR again in all its glory! :D
 

That won't work; the HD5870 requires one 8-pin and one 6-pin PCIe auxiliary power connector. Using that adapter might blow up the small dedicated power supply in the ViDock if it can't handle anything much more than the rated 225W of power draw.

lloose said:
ViDock 4 Plus adds a second 2 x 3 pin power connector to accommodate graphics cards that require up to 225W.

An HD5850 can exceed that, as can many other graphics cards. You're also limited to the maximum physical dimensions of the ViDock. There are going to be graphics cards that simply will not fit inside, especially those with aftermarket cooling fitted to make them quieter.

A ViDock is between $200 and $279 (depending on which version you get). The PE4L-EC2C (ExpressCard to PCIe) adapter employed in this thread is $55, and can handle cards up to 75W without any additional hardware. So, if you wanted to use something like a low-power 9800GT ($89) with your laptop, it would cost you $145 using the DIY solution, or $290 using the ViDock 3.

In this scenario, using the ViDock doubles the price of this upgrade. You're effectively paying an extra $145 for a metal enclosure. I'll jigsaw some holes out of a $5 project box from RadioShack if it'll save me that much money.

For larger graphics cards that need their own power source, add a 450W Dedicated Graphics Power Supply for $79. That, plus the PE4L-EC2C, plus a project box to put it in, comes to $140, and you'll be able to fit graphics cards far in excess of what even the $279 ViDock 4 Plus will work with.
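To put the comparison in one place, here's the arithmetic as a quick Python sketch. All prices are the ones quoted above (2010 dollars); the exact sums come out a few dollars under the totals in the post, which round to the nearest $5.

# Cost comparison of the DIY eGPU route vs. the ViDock, using the
# prices quoted above. Totals here are exact; the post rounds to
# the nearest $5.

diy_low_power = {
    "PE4L-EC2C adapter": 55,   # handles cards up to 75W on slot power alone
    "9800GT": 89,
}
vidock3 = {
    "ViDock 3": 200,
    "9800GT": 89,
}
diy_high_power = {
    "PE4L-EC2C adapter": 55,
    "450W dedicated graphics PSU": 79,
    "RadioShack project box": 5,   # plus some jigsaw work
}

for name, parts in [("DIY + 9800GT", diy_low_power),
                    ("ViDock 3 + 9800GT", vidock3),
                    ("DIY, high-power cards", diy_high_power)]:
    print(f"{name}: ${sum(parts.values())}")
# DIY + 9800GT: $144
# ViDock 3 + 9800GT: $289
# DIY, high-power cards: $139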
 
That won't work; the HD5850 requires one 8-pin and one 6-pin PCIe auxiliary power connector. Using that adapter might blow up the small dedicated power supply in the ViDock if it can't handle anything much more than the rated 225W of power draw.

Uh, it will work. That adapter has TWO 6+2 pin connectors, not to mention 5850's don't use an 8-pin at all. Oh, and a 5850 also has a max consumption of 170 watts, well within spec for the power supply, so it will, in fact, work flawlessly.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150477&cm_re=hd_5850-_-14-150-477-_-Product

Refer to specs
 
Whoops, sorry. Small typo. I meant the HD5870, not the HD5850. Typo fixed.

I know the HD5850 has 6-pin + 6-pin, but you posted a 6-pin-to-8-pin adapter (which would be useless with an HD5850), so I jumped up to the first model with 6-pin + 8-pin.

Yes, I know AMD's specs list 170W for the HD5850 and 188W for the HD5870, but they are quite capable of drawing substantially more power than that when fully loaded. This [H] review showed a reference HD5850 drawing 312W of power under load. I would be seriously concerned about a ViDock power supply failure using cards that require one of the 6-pin connectors to be adapted to 8-pin.

I suppose you could slap one of those dedicated graphics power supplies I linked onto a ViDock to handle large graphics cards without fear, but that's just making an already-overpriced adapter (compared to the DIY solution) even more expensive.
 
Wrong again on all counts.

I did not post a 6-pin to 8-pin adaptor, I posted a 6-pin to dual 6+2 pin adaptor. The "+2" is break-away; you can use it or not use it, it doesn't matter.
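For anyone who wants the break-away logic spelled out, here's a tiny sketch of it. The plug/socket labels are mine, purely for illustration, not from any spec sheet.

# Sketch of the 6+2 ("break-away 8-pin") logic: one plug covers either
# a 6-pin or an 8-pin socket on the card, because the +2 tail can be
# left dangling. Names are illustrative labels only.

def plug_satisfies(plug, socket):
    """True if a PSU-side plug can power a card-side socket."""
    if plug == "6+2":
        return socket in ("6-pin", "8-pin")  # break-away covers both
    return plug == socket                     # otherwise an exact match

def card_is_powerable(plugs, sockets):
    """Every socket on the card needs its own plug; prefer exact matches."""
    available = list(plugs)
    for socket in sockets:
        match = next((p for p in available if p == socket), None) \
             or next((p for p in available if plug_satisfies(p, socket)), None)
        if match is None:
            return False
        available.remove(match)
    return True

# The dual 6+2 adaptor vs. a 6+6 card (HD5850) and a 6+8 card (2GB HD5870):
print(card_is_powerable(["6+2", "6+2"], ["6-pin", "6-pin"]))  # True
print(card_is_powerable(["6+2", "6+2"], ["6-pin", "8-pin"]))  # True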

Other than the 2GB models, all the 5870's I've seen are using two 6-pin connectors, not that it matters though, because the adaptor will work regardless of connector configuration.

The 5870 uses 188 watts at full tilt, still well under the 225 watts. The [H] review is stating total system power consumption, not the video card alone.

Seriously, do you even read before posting?

Again, it will work flawlessly. Did you make another typo and mean the 5970 instead?
 
I did not post a 6-pin to 8-pin adaptor, I posted a 6-pin to dual 6+2 pin adaptor. The "+2" is break-away; you can use it or not use it, it doesn't matter.
That still makes it an 8-pin adapter. Optional or not, 6 plus 2 equals 8. Newegg even notes as much in the item description.

I still don't understand why you'd post such an adapter/splitter when, as has been mentioned, the ViDock 4 Plus already has two 6-pin aux power connectors.
You don't need a splitter at all, and you're adamant that you aren't going to use the extra 2 pins...so wtf use would it be?

Other than the 2GB models, all the 5870's I've seen are using two 6-pin connectors
We're limited to a PCIe 1x link here. The more RAM on the video card, the better. This was discussed at length earlier in the thread. If you were going to use an HD5870, the 2GB model would be the one you'd want to use for optimal performance with this kind of setup.

The 5870 uses 188 watts at full tilt, still well under the 225 watts. The [H] review is stating total system power consumption, not the video card alone.

Are you sure? They seemed quite clear when they said "The PCS+ HD5850 drew 388 Watts at load"

Not "A computer with an HD5850," just "an HD5850." If they did mean the entire system, then it's incredibly poorly worded.

Again, it will work flawlessly.
Again, why would you want an overpriced ViDock? Even if it does work, you're spending a LOT of extra money over the DIY solution, and all you get for it is a metal box to go around the card (which restricts your heatsink choices, no less).

You might even be able to get better performance out of the DIY solution. In some configurations, it can be attached to an ExpressCard slot and an internal MiniPCIe slot to create a PCIe 2x slot. The ViDock is restricted to ExpressCard only, and so is stuck at PCIe 1x.
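For anyone wondering what 1x vs. 2x actually buys you, the raw numbers work out like this. These are the standard PCIe 1.x figures; the x2 line assumes the ExpressCard + mini-PCIe combination really does bond into a 2-lane link as described above.

# Payload bandwidth for PCIe 1.x links: 2.5 GT/s per lane with 8b/10b
# encoding, i.e. about 250 MB/s of usable data per lane, per direction.

TRANSFERS_PER_SEC = 2.5e9   # PCIe 1.x signaling rate per lane
ENCODING = 8 / 10           # 8b/10b: 10 line bits carry 8 data bits

def pcie1_bandwidth_mb_s(lanes):
    payload_bits = TRANSFERS_PER_SEC * lanes * ENCODING
    return payload_bits / 8 / 1e6   # bits -> bytes -> megabytes

print(pcie1_bandwidth_mb_s(1))    # 250.0  (ExpressCard only, e.g. ViDock)
print(pcie1_bandwidth_mb_s(2))    # 500.0  (ExpressCard + mini-PCIe combined)
print(pcie1_bandwidth_mb_s(16))   # 4000.0 (desktop x16 slot, for contrast)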
 
Are you sure? They seemed quite clear when they said "The PCS+ HD5850 drew 388 Watts at load"

There's NO way it pulled that much wattage alone. NO WAY.

Again, why would you want an overpriced ViDock? Even if it does work, you're spending a LOT of extra money over the DIY solution, and all you get for it is a metal box to go around the card.

Why do people want pc cases? I mean, just having the shit laying around your desk is a good way to save money.
 
Why do people want pc cases? I mean, just having the shit laying around your desk is a good way to save money.

I did not say you should leave the parts lying around. On the contrary, I already posted a link to some $5 project boxes from RadioShack that would do nicely for enclosing the DIY adapter and a graphics card.
 
I did not say you should leave the parts lying around. On the contrary, I already posted a link to some $5 project boxes from RadioShack that would do nicely for enclosing the DIY adapter and a graphics card.

Where's the OP's power supply? I'm sure that's WONDERFUL looking. Oh wait, he hid it behind his desk because it probably looks like shit.
 
Where's the OP's power supply? I'm sure that's WONDERFUL looking. Oh wait, he hid it behind his desk because it probably looks like shit.

Which, like I said, is where the project box comes in. Mount the adapter, video card, and (if the video card you're using is large enough to need one) power supply inside.

He reclaimed an old ATX PSU, which saved him more money but will also require a larger project box for mounting the assembly. That's why I also recommended and linked to a bay-PSU which is much smaller.
 
That still makes it an 8-pin adapter. Optional or not, 6 plus 2 equals 8. Newegg even notes as much in the item description.

I still don't understand why you'd post such an adapter/splitter when, as has been mentioned, the ViDock 4 Plus already has two 6-pin aux power connectors.
You don't need a splitter at all, and you're adamant that you aren't going to use the extra 2 pins...so wtf use would it be?

Are you sure? They seemed quite clear when they said "The PCS+ HD5850 drew 388 Watts at load"

Not "A computer with an HD5850," just "an HD5850." If they did mean the entire system, then it's incredibly poorly worded.
The "+2" means it's optional, whether or not you want to call it an 8 pin or not doesn't change the fact that it will work. If you have a card with two 6 pins you simply leave the "+2" disconnected. If you have one with a 6 and an 8 you connect ONE of the "+2" and leave the other disconnected. Whether you use a 1GB card that has dual 6 pins or a 2GB card which using an 8 and 6 doesn't matter, it would work for EITHER one. Your replies on this matter tell me you either have no idea how these adaptor work or are desperately trying to cling to the notion that it wont work. Whatever the case is, you're wrong, about every single piece of info you tried to argue with me about.

Am I sure about the [H] power consumption figures? I sure am; I actually read the article instead of jumping to the conclusion page. There is a graphic that clearly states "SYSTEM wattage," and there is additional info there that allows you to deduce the consumption of the cards alone. Since you fail at reading, I'll let you go back and find it yourself.
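(For anyone following along, the usual way to back a card-only number out of a "system wattage" figure is to subtract an idle baseline and correct for PSU efficiency. A rough sketch, with made-up placeholder numbers rather than figures from either review:)

# Estimate card-only draw from wall ("system wattage") measurements.
# Wall watts are measured upstream of the PSU, so the load delta gets
# scaled by PSU efficiency. Every number below is a placeholder.

def estimate_card_watts(system_load_w, system_idle_w,
                        psu_efficiency=0.85, card_idle_w=25):
    delta_wall = system_load_w - system_idle_w  # extra wall draw under GPU load
    delta_dc = delta_wall * psu_efficiency      # what actually reached the card
    return delta_dc + card_idle_w               # add back the card's idle draw

print(estimate_card_watts(388, 220))  # ~167.8 W for these hypothetical readings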

Your over-zealousness to tell someone they're wrong isn't doing you any favors here.
 
According to this Xbit review, the HD 5870 uses 233W at full load by itself:
http://www.xbitlabs.com/articles/video/display/gpu-power-consumption-2010_3.html#sect0

I think their methodology of measuring graphics card power draw is fairly accurate.

Well, I suppose if you're getting the ViDock to run the OCCT stress test you'll need other power supply options. Tests like OCCT and FurMark are designed to be extreme (not real-world) examples of power consumption and heat. Both the [H] review and Xbit Labs show power consumption even in the most demanding games to be within the 188-watt rating that AMD specs out for the cards.
 
Why the huge fight over this? The ViDock is better under most circumstances, just more expensive.
If you already have a micro case and a high-enough-rated PSU laying around, you are a few minutes' worth of modding and assembling away from having a very functional and neat setup like this on the cheap.

Even without having the case and PSU already laying around, the homebrew is a lot cheaper: $25-40 for a micro ATX case, $55 for the PE4H, and another $40-50 for a PSU that could run any card you wanted. At the high end that still only comes out to $145.
 
Which, like I said, is where the project box comes in. Mount the adapter, video card, and (if the video card you're using is large enough to need one) power supply inside.

He reclaimed an old ATX PSU, which saved him more money but will also require a larger project box for mounting the assembly. That's why I also recommended and linked to a bay-PSU which is much smaller.

Which brings me back around to my first point: ViDock did it first, and the cost difference between the two is so minimal that it wouldn't be worth the trouble.
 
hmmm...

I might actually try this..

The adapter is only $55, which is like, equal to 4 hours of work for me (not a lot :)), and even if I didn't want to use my mom's 4670 (which I easily could), I can grab a GeForce 9 series off the FS/FT forum for less than $75...

And then I KNOW I have a Cooler Master Real Power Pro 750-watt power supply laying in my garage. :D
And I can put that 24" monitor to good use again :)
AND if I finally get the back room of my garage in October (which SHOULD be happening), I'll have a way to run video to an LCD TV, which I can't do right now because my laptop only has VGA out...

I mean, my laptop isn't anything special, but I know it can run Left 4 Dead, and hell, even Half-Life 2 at medium settings would be a welcome change. I'd really love to upgrade, but right now (or this summer, for that matter) has NOT been a good time to do that, too much fun :)
 
Which brings me back around to my first point: ViDock did it first, and the cost difference between the two is so minimal that it wouldn't be worth the trouble.

The cost difference is hardly minimal. In some configurations, using a ViDock would cost twice as much as the DIY solution. I posted an example of this a little ways back.

Your replies on this matter tell me you either have no idea how these adaptors work or are desperately trying to cling to the notion that it won't work. Whatever the case is, you're wrong about every single piece of info you tried to argue with me about.

No, I know very well how the adapter works, but (as I mentioned previously) the ViDock 4 Plus already has two 6-pin aux power connectors...

You glossed over the fact that there's no need for that adapter/splitter AT ALL unless you need to split one of the two available 6-pin connectors to supply a card with three PCIe aux power connectors (I'm not aware of any that require this). I'm still trying to figure out why you even posted it. The only other thing you could use it for is adapting one of the 6-pin connectors to 8-pin, but then you would only need a straight-through adapter and not a splitter...

According to this Xbit review, the HD 5870 uses 233W at full load by itself:
http://www.xbitlabs.com/articles/video/display/gpu-power-consumption-2010_3.html#sect0

I think their methodology of measuring graphics card power draw is fairly accurate.
Even if that is Furmark, it does show that the 5870 is capable of taking the ViDock beyond the rated specs of its small power supply. I wouldn't trust it for long periods of time.


ON AN UNRELATED NOTE:
Has anyone tried hooking two of these adapters up to a laptop with an AMD chipset to see if CrossFire works?
 
When you say "that won't work" and then continue to imply that it won't fit because it's 8-pin, then you clearly don't know how it works. I posted it as a reply to someone that claimed the dock wouldn't work with a 5850. You would still need the adaptor if you had a card that required an 8-pin, an example that you yourself brought up. You could do it with just an adaptor and not a splitter, but that doesn't change the fact that it would work.

You're absolutely right about one thing, I did gloss over the fact that the dock has 2 connectors. But guess what that changes about the debate you and I have? Absolutely nothing. It would still work, it would fit, and power consumption is not a factor. Any way you try and justify your position, the end result is that you were wrong; get over it.
 
When you say "that won't work" and then continue to imply that it won't fit because it's 8-pin, then you clearly don't know how it works.
Where did I say it wouldn't fit? I never gave any indication that it wouldn't socket into the plugs, only that using such an adapter/splitter would either be pointless, or would allow you to plug in a card that is very likely capable of blowing up the ViDock.

I posted it as a reply to someone that claimed the dock wouldn't work with a 5850
Except the dock will work with a 5850, since the dock has two 6-pin connectors already. No adapter needed.

you would still need the adaptor if you had a card that required an 8-pin, an example that you yourself brought up. You could do it with just an adaptor and not a splitter, but that doesn't change the fact that it would work
Yes, but you linked a splitter for a reason (because you didn't check the specs on the ViDock 4 Plus closely enough). If your original intention was to use it as a simple 6-pin to 8-pin adapter I would hope your first thought would be to get the proper adapter for the job and not a splitter.

You're absolutely right about one thing, I did gloss over the fact that the dock has 2 connectors. But guess what that changes about the debate you and I have? Absolutely nothing. It would still work, it would fit, and power consumption is not a factor.
It'll fit the plug, but there's still no reason to use a Y-splitter for anything related to the ViDock 4 Plus.

And power consumption isn't a factor? You're saying that like it's an absolute across all cases? Tell that to the ViDock when it blows up due to over-current when someone tries to install a GTX480 in it.

Any way you try and justify your position, the end result is that you were wrong; get over it.
Not really. The Y-splitter is pointless no matter how you slice it. You failed to read the ViDock specs and posted the wrong adapter. It happens. Now can we please get on with this thread?
 
You said it won't work and it will, period. You said it wouldn't fit and it will, end of story.

As far as power consumption not being a factor, we were talking about 5850 and 5870's, not 480's and you know that. Nice try though.

Absolutely we can move on, now that the facts are clear, unless you choose to continue to bend the facts and add other factors that weren't even part of the original discussion.

As far as saying "it won't fit," you didn't; you implied that it wouldn't by saying it "won't work," and the reason you gave is because it's an 8-pin. I tried to explain to you how the +2 works and your response was that "it's still an 8-pin." The adaptor may not be needed at all, but that doesn't change the fact that it will fit, something you're still refusing to admit.

If you weren't wrong, then explain why "it won't work?" Don't tell me it's not needed or necessary; I want to know flat out why it will not work. Thanks.

What about the system vs GPU power consumption? Were you not wrong about that as well? (just in case you forgot, we were referring to 5850's and 5870's)
 
You said it wouldn't fit and it will, end of story.
Like I told you in my previous post, I never said it wouldn't physically plug in.

As far as power consumption not being a factor, we were talking about 5850 and 5870's, not 480's and you know that. Nice try though.
You said it as an absolute, not in reference to a specific card. Besides that, the HD5870 can draw more power than the ViDock is rated for (the Xbit review shows as much). Even if we do scale it back to the 5870, you still run the risk of issues with power consumption.

Absolutely we can move on, now that the facts are clear, unless you choose to continue to bend the facts and add other factors that weren't even part of the original discussion.
Just as soon as you stop trying to put words in my mouth that I never said ;)

As far as saying "it won't fit," you didn't; you implied that it wouldn't by saying it "won't work," and the reason you gave is because it's an 8-pin. I tried to explain to you how the +2 works and your response was that "it's still an 8-pin." The adaptor may not be needed at all, but that doesn't change the fact that it will fit, something you're still refusing to admit.

I implied no such thing. I said, and I quote: "That won't work; the HD5870 requires one 8-pin and one 6-pin PCIe auxiliary power connector. Using that adapter might blow up the small dedicated power supply in the ViDock if it can't handle anything much more than the rated 225W of power draw."

All of that was correct.
1. The 2GB HD5870 has a 6-pin and an 8-pin power connector (I pointed this fact out to convey the extra power the card is capable of drawing).
2. When fully loaded, the 5870 can draw more power than the ViDock is rated for.
3. Installing a card that draws more power than the ViDock is rated for may very well burn out the ViDock.

Nowhere does it say the adapter won't physically fit, just that it's quite easy to blow up the ViDock with cards that require it.

If you weren't wrong, then explain why "it won't work?" Don't tell me it's not needed or necessary; I want to know flat out why it will not work. Thanks.
Erm, already said why. It can blow up the ViDock. I'd call that "Not working."

I wouldn't want to spend $300 on a ViDock that can be destroyed simply by running FurMark on it. I suppose you could also gimp the graphics card to prevent it from ever being fully loaded, but I wouldn't want to spend $300 on a ViDock that requires a $300+ graphics card to be gimped in order to eliminate the possibility of burning out the ViDock.

What about the system vs GPU power consumption? Were you not wrong about that as well? (just in case you forgot, we were referring to 5850's and 5870's)
I read it exactly as [H] wrote it. It's unfortunate that they misrepresented the information on the conclusion page by saying that it was the cards themselves drawing that much power.

Moot point, however, as all I was trying to prove is that these cards can draw more power than the ViDock is rated for. The Xbit review confirms this.
 
lol, blow up the ViDock? The adaptor uses 0 power by itself, guy, and we were talking about 58xx cards, which we already know will work. Use the cards as they were intended and power consumption is a non-issue and never will be. The whole idea behind OCCT and FurMark type tests is that they're NOT real-world; that's why people use them for ultimate stress tests that they know will NEVER happen in a real-world scenario. Power consumption was under 188 watts in every game at every setting tested in both the Xbit and [H] reviews. Why would you even want to run FurMark on it? The only thing the Xbit review confirms is that you can overload the power supply if you use the cards in ways they were not intended.

I read the [H] review and I did not find it to be a misrepresentation at all. I did not find where they said "the cards by themselves"; can you? Or is it more likely that you're the one misrepresenting to try and back yourself up? Unless you can show where they said that, I'll go with the latter.

Again, nice try, but "it won't work" is still an incorrect statement, even a dozen posts later. The only thing you've been correct about is that it isn't necessary; that much is true. But not necessary and not going to work aren't the same thing.

Unless you've got something NEW to add, I'm done here. I don't want to crap on this thread any further and piss off the powers that be.
 
lol, blow up the ViDock? The adaptor uses 0 power by itself, guy, and we were talking about 58xx cards, which we already know will work.

They'll work until you fully load them. Then you're over the rated limit of the ViDock and run the risk of burning it up.

Use the cards as they were intended and power consumption is a non-issue and never will be.
[snip]
The only thing the Xbit review confirms is that you can overload the power supply if you use the cards in ways they were not intended.

Last time I checked, loading the cards was intended usage (be it games or number crunching). Sorry, but I can't go along with that unless you can show me where Furmark or OCCT voids the warranty on the card (as "unintended usage" would).

I read the [H] review and I did not find it to be a misrepresentation at all. I did not find where they said "the cards by themselves"; can you? Or is it more likely that you're the one misrepresenting to try and back yourself up? Unless you can show where they said that, I'll go with the latter.

I already quoted them once, I'll do it again. This is exactly what they said on the conclusion page:
"The PCS+ HD5870 under load drew 422 Watts where as a reference Radeon HD 5870 drew 338 Watts"

They said the cards draw that much power there, not the cards and the system. Like I said before, if they meant the cards AND the system, then that is worded extremely poorly.

Again, nice try, but "it won't work" is still an incorrect statement, even a dozen posts later. The only thing you've been correct about is that it isn't necessary; that much is true. But not necessary and not going to work aren't the same thing.
No, my original assessment is perfectly correct. Using a 5870 with a ViDock (with that adapter of yours or otherwise) won't work.

It "wont work" with the ViDock in the same way a 400w PSU "wont work" in a system with GTX480 SLI. It might boot, it might even work fine in 2D mode, but as soon as you get the 480's going, something is going to ether shut down or burn out.
See, those cards (the 480's in this example) physically plugged in, they powered up, but they ultimately won't work. A 5870 in a ViDock is the same way, though you'll get a little bit farther with it before the ViDock finally gives up and packs in.

Cards like the HD 5870 have been shown to draw more power than the ViDock is rated to supply. This means that cards such as the HD 5870 can trip the over-current protection in the ViDock (possibly damaging the dock's Power Supply and the video card attached to it). Using a ViDock with such a large graphics card is an under-engineered solution and a ticking time bomb. End of story.
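The whole disagreement boils down to a one-line headroom check. The 225W rating is the ViDock 4 Plus spec quoted earlier and the 233W peak is the Xbit FurMark figure cited above; the 5% safety margin is my own arbitrary cushion, not anything from either spec.

# Headroom check: rated dock supply vs. measured peak card draw.

DOCK_RATING_W = 225

def headroom_w(card_peak_w, rating_w=DOCK_RATING_W, margin=0.05):
    """Negative headroom means the card can push the dock past its rating."""
    return rating_w * (1 - margin) - card_peak_w

for card, peak in [("HD5850 (AMD spec)", 170),
                   ("HD5870 (AMD spec)", 188),
                   ("HD5870 (Xbit, FurMark)", 233)]:
    print(f"{card}: {headroom_w(peak):+.1f} W")
# HD5850 (AMD spec): +43.8 W
# HD5870 (AMD spec): +25.8 W
# HD5870 (Xbit, FurMark): -19.2 W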

Unless you've got something NEW to add, I'm done here
Well, I did ask whether or not using two of these with an AMD laptop would allow CrossFire (but that got lost in all this noise). I'd still like to know how that works.
 
So, nothing new to say. If you had read the actual article, the part that confused you would have been put into perspective. I don't think you can blame the editors for assuming, oh I don't know... that you'd actually READ the article? That's like trying to write a 5-page book report based on the 1-2 paragraph summary on the back of the book when you were in grade school and then blaming the book for your failure to comprehend.

So it will work until you run a completely meaningless stress test. I can agree with that.

Oh, and as far as CrossFire: if it's even possible, I would think the lack of bandwidth would severely limit the benefits of multiple GPUs.
 
If you had read the actual article, the part that confused you would have been put into perspective. I don't think you can blame the editors for assuming, oh I don't know... that you'd actually READ the article?
I'm not confused. They quite clearly say "X card draws this much" when they supposedly mean "X card, and the system it's installed in, draws this much."

I can blame them for not writing it consistently. One part wildly contradicts the other.

That's like trying to write a 5-page book report based on the 1-2 paragraph summary on the back of the book when you were in grade school and then blaming the book for your failure to comprehend.
More like "20,000 leagues under the sea" suddenly becoming "1,000 leagues under the sea" and acting like 20,000 never happened. Or an owners manual saying a monitors maximum refresh rate is 60Hz in the manual but 120Hz on the back-page summary.

So it will work until you run a completely meaningless stress test. I can agree with that.
I'm sure there are other things besides OCCT and FurMark that will get power usage up that high. That is beside the point, however. The hardware should be able to handle a worst-case scenario; a ViDock will not. I wouldn't gamble $600 (or more) worth of hardware on a time bomb like that.

ATi themselves obviously built their cards to handle a worst-case scenario. Intended or not (and there's no evidence that this isn't intended usage), they don't overheat or have their VRMs explode while running FurMark when they're installed in a case with correct airflow.

Oh, and as far as CrossFire: if it's even possible, I would think the lack of bandwidth would severely limit the benefits of multiple GPUs.
I figured as much, though I imagine it would result in some seriously weird performance curves. Would still be interesting to see how it reacts, especially with 2GB graphics cards.

The DIY solution allows you to add mini-PCIe slots together in order to give the card a higher-bandwidth slot. I wonder if adding more bandwidth to one card would help more than attempting to CrossFire two cards...
 
Well, none of the games pushed power consumption past 188 watts, and they had some demanding ones between the two reviews. Meaningful number crunching like F@H actually uses less power than a demanding game. Video decoding uses hardly anything over idle on the newer cards. I haven't checked encoding yet, but I would bet it's somewhere between F@H and gaming in terms of power usage. So what else is there?

Barring moderate-to-heavy overclocking, I'm pretty sure that fuzzy doughnut (and its variants) is going to be the only thing that may push the ViDock power supply past spec when paired with a 58xx card.
 
So is there any definitive answer as to which cards are PROVEN to work with a ViDock 4 Plus? It's much more expensive, but it's also a much more portable and convenient solution than building your own, with its separate PSU and no enclosure.
 
It'd be better if you could find a way to use the existing laptop screen and make some kind of case for the external video card and external power supply so that you can travel with it... wait a minute, then it would be like carrying around a small desktop. I still like this regardless, if you don't need to take the gaming power with you. If you do business in other countries from time to time, or travel constantly, a gaming laptop would still be the better solution. I mean, you don't want to be carrying around a big 5850 video card and 500-watt PSU with your laptop wherever you go :)
 
FYI: $300 on a ViDock, $300 on a 5850, $100 on a PSU... man, that is getting way too close to the cost of a very good desktop. I understand you want one PC to do everything, but a little external USB drive can solve that issue. Then you just transfer those files to your laptop when needed. That solves the need for a ViDock, and then you have an extra desktop in case your laptop gets fried.
 
This would be great for using your laptop for a 3D gaming setup with a projector... a GTX280 and a 3D projector... not having to build a PC to do it seems nice...
 
Hi, I have a Lenovo ThinkPad T400 with:
PE4H 2.0a with EC2C ExpressCard adapter 16x
MSI Cyclone GTX 460 graphics card
Corsair CMPSU-400CX 400-Watt CX Series 80 Plus Certified Power Supply
HDMI mini (C) to HDMI (A) regular 10 ft cable

OK, so the problem is, I try to put the EC2C into the ExpressCard slot on my T400 and it doesn't fit. The slot is in the bottom right corner of the laptop, and I don't know how to put the card in.

Please help! Trying to play FFXIV!!!!!
Any suggestions on what I need to complete this setup, or how to insert the EC2C into my ThinkPad T400, would help! Thank you.
 
OK, so the problem is, I try to put the EC2C into the ExpressCard slot on my T400 and it doesn't fit. The slot is in the bottom right corner of the laptop, and I don't know how to put the card in.


Sorry can't help you...


Back on subject. This is a cool mod.

Notebook makers should take note and build a full PCIe 16x connector into a notebook. This would be a great option. Do it with a Lucid chip so that onboard graphics could render in parallel with whatever card is connected.
 
This is a good workaround to do some gaming with a machine you would never think of doing such a thing with. I wish I had known about this earlier, before I purchased an entire new rig for gaming, as it would have stretched out the life of my M1730 (which I believe would work loads better and further enhance the effect of this mod). Also, there looks to be a newer version than the one the original poster used that can do better bandwidth. I would love to see some testing done with different cards to see where the price/performance of doing this outweighs just buying a new rig for gaming.

For a quick gaming fix this is good. For serious gaming it won't do. But nice mod.
 