AGEIA in Dell XPS 600, damn!

ReaveN

Look at the red picture...

lpmx.jpg


Now look at the best part of this photo...

ageia.jpg


Has anybody seen this on another website?

hahaha
 
Those things must run pretty hot. I was expecting the first couple of cards to just be passively cooled.

Why is it a shock that someone has put a PhysX card in an XPS 600?

It's the fact that it's an actual PhysX card...none have been available to consumers, and this picture (fingers crossed) suggests they're coming to market soon. I hope Dell puts it into most of their desktops - if it's in the rigs of Dell users, it becomes a hardware standard, e.g. implemented in many games (hardware support).
 
tornadotsunamilife said:
Those things must run pretty hot. I was expecting the first couple of cards to just be passively cooled.



It's the fact that it's an actual PhysX card...none have been available to consumers, and this picture (fingers crossed) suggests they're coming to market soon. I hope Dell puts it into most of their desktops - if it's in the rigs of Dell users, it becomes a hardware standard, e.g. implemented in many games (hardware support).

QFT

If Dell starts shipping PCs with Ageia cards, things are looking good :D

Terra...
 
Terra said:
QFT

If Dell starts shipping PCs with Ageia cards, things are looking good :D

Terra...

QFT

This is what Ageia needs to make software companies offer support for this kind of thing. It will only help push graphics to new levels.
 
tornadotsunamilife said:
Those things must run pretty hot. I was expecting the first couple of cards to just be passively cooled.



It's the fact that it's an actual PhysX card...none have been available to consumers, and this picture (fingers crossed) suggests they're coming to market soon. I hope Dell puts it into most of their desktops - if it's in the rigs of Dell users, it becomes a hardware standard, e.g. implemented in many games (hardware support).

Ohhh...I guess Dell just didn't want to shock everyone with that tidbit while they were debuting a dual-SLI rig.
Sorry, but my money's on someone stabbing their review-sample PhysX card into an "off the shelf" XPS 600, because I can't imagine #1 that this card wouldn't have appeared in their "no holds barred" XPS Renegade, and #2 that they wouldn't have announced it.
 
OMFG DROOLS

If this is half as neat as what they showed on G4, then WOW.

4 GPUs and a physics card

:happy tear:
 
Naldo said:
When do those PhysX things come out?

SUPPOSEDLY they were going to be out Q4 of 2005, but since that has passed, I don't know. Maybe they are imminent?
 
Sweet. Who's producing these new cards? Wasn't BFG the first to sign on as a partner, followed a few months later by ASUS (as pictured)? I haven't heard of anyone else planning on manufacturing these, though.
 
They showed off demos and the reference chipsets in August '05 at the Science Center here in St. Louis (it was a LAN). It's definitely coming out very soon. I hope the best for AGEIA because all of the stuff they showed was amazing. Think of games like Far Cry where a wave from the ocean comes in, crashes into a bunch of objects, and makes them react accordingly. The rep they had at the Science Center was talking about all the wonders the cards will do for AI.
 
Those dual GTX cards look like two single GTX cards superglued together.
 
Swishbish33 said:
SUPPOSEDLY they were going to be out Q4 of 2005, but since that has passed, I don't know. Maybe they are imminent?

The hardware has been complete since Q4 2005, but isn't being sold because no available game can take advantage of it. They said they'll start selling in Q1 or Q2, when the first PhysX titles reach the shelves.

Hopefully this extra time allowed them to work out any fab issues, and these things will be cheap and plentiful.

And while that looks like the Asus PhysX card, that isn't the Quad-SLI rig, as explained here:
http://www.anandtech.com/tradeshows/showdoc.aspx?i=2664&p=2
 
I really hope Dell doesn't get all the good gaming hardware first. It won't make anyone happy.
 
STR said:
The hardware has been complete since Q4 2005, but isn't being sold because no available game can take advantage of it. They said they'll start selling in Q1 or Q2, when the first PhysX titles reach the shelves.

Supposedly City of Villains and Bet on Soldier will take advantage of the card... Are those games not really supporting it?

http://www.ageia.com/products/games.html
 
That sure looks a lot like the ASUS engineering sample PhysX card, hmmm...

- - - - - - -

TekSomniaK said:
They showed off demos and the reference chipsets in August '05 at the Science Center here in St. Louis (it was a LAN).
Can you tell us more about the AGEIA demo, please? In detail, if possible :)
 
Obi_Kwiet said:
I really hope Dell doesn't get all the good gaming hardware first. It won't make anyone happy.

Why not...

Dell gets these things like the Ageia PPU and the 7800 GTX 512 Quad and makes them well known, and it also helps developers get a foot in the door. The more Dell sells, the better it looks for the rest of us, because prices can come down faster once these things hit the mainstream.
 
Dell gets them first. OK. A marketing and money ploy.

Screw the enthusiasts as early in the game as possible. :rolleyes:

I'd rather see the enthusiasts (us) get a taste first, then have it go to the mainstream users and cookie-cutter PC makers. Kinda makes it look like they care about the money and not the actual benefits of the product. "We want money, not fun."
 
Of course they care about money. It's probably a good move to go the money route first so they can support themselves, and then branch out later to the enthusiast market or support it more heavily than they could otherwise.
 
tornadotsunamilife said:
It's the fact that it's an actual PhysX card...none have been available to consumers, and this picture (fingers crossed) suggests they're coming to market soon. I hope Dell puts it into most of their desktops - if it's in the rigs of Dell users, it becomes a hardware standard, e.g. implemented in many games (hardware support).


If that's the case, I'm glad Dell isn't the standard. The local shop in town stopped selling systems with only 128MB of memory many months before Dell did. I guess we should all get ready for 0-90 day warranty computers then as well: Dell 90 Day System

Dell also refuses to use AMD Processors, and at the moment they're the best for games. Give it another cycle refresh and that may change, but, if I'm buying a gaming rig I'm not going to hobble it with an Intel processor.

If I was buying a video editing rig, I'd buy Intel, otherwise bring on the Athlon 64!
 
My comment was based on the fact that a lot of people buy Dell computers. If game developers see this, they can create more games with hardware support, and it will then become a standard (much like graphics cards today).
 
tornadotsunamilife said:
My comment was based on the fact that a lot of people buy Dell computers. If game developers see this, they can create more games with hardware support, and it will then become a standard (much like graphics cards today).

I despise Dell and other makers "Walmart-ing" computers. Yes, it's nice to be able to bring the price down, but at the same time, the sacrifice in quality over the last couple of years has been pretty depressing. Unless you're buying a high-end workstation, gaming rig, or server, you can pretty much be assured your system will be dead within 1-2 years.

I remember when the local dealer stopped offering 5-year warranties. Dell was down to 1 year at that point (with more sliding to come); you can't guarantee a system for 5 years at the Dell price. It's not just Dell, don't get me wrong, but at least Snapper had the nuts to tell Walmart, "No, we're going to continue with quality," and thus removed themselves rather than be pressured to sell for less. I wonder how long until someone tells Michael Dell, "Nope, we're not going to be associated with bargain junk."
 
Inglix_the_Mad said:
Dell also refuses to use AMD Processors, and at the moment they're the best for games. Give it another cycle refresh and that may change, but, if I'm buying a gaming rig I'm not going to hobble it with an Intel processor.

If I was buying a video editing rig, I'd buy Intel, otherwise bring on the Athlon 64!
http://accessories.us.dell.com/sna/...=&prEnd=&InStock=&refurbished=&fe=&mnf=116&k=

I don't know if they have them in their systems, but this is sure showing some promise.
 
sc4r4b said:
Supposedly City of Villains and Bet on Soldier will take advantage of the card... Are those games not really supporting it?

http://www.ageia.com/products/games.html

I may not work for Cryptic or NC, but I can confirm that City of Villains will support the Ageia PhysX. (This has been confirmed by Cryptic and NC on two occasions, which I can't be bothered to link to right now.) I have a strong suspicion that a lot of the work is already done, because Issue 6 (AKA the City of Villains release) introduced what they euphemistically called "dual core" support.
Yes, I'm a CoH/CoV addict. I play on dual Athlon MPs mostly, though I have also tested on dual Opterons. There is absolutely no question that threads have been broken out in the game, which is an obvious key step for PhysX support. It happily eats both MP 1600s alive, and there is a marked performance difference between single and dual processor. I'm not talking bursts, either; I'm talking a 15 FPS average difference across the board. All conditions: walking through Skyway or taking on Neuron with a bunch of elec blasters and storm summoners.
In order to use the PhysX, you need to feed it its own thread to work on. The easiest way to break out threads is to just do it. Meaning: thread everything so you can scream the "dual core" buzzword, and more easily separate the Ageia threads when the time comes.
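
To illustrate the kind of split being described, here is a minimal C++ sketch of physics ticking on its own thread while rendering stays on the main thread. The function names (stepSimulation, renderFrame) are hypothetical placeholders, not Cryptic's code or the AGEIA SDK.

#include <atomic>
#include <chrono>
#include <thread>

std::atomic<bool> running{true};

// Physics gets its own thread: it ticks at a fixed rate and would hand each
// step to the PPU / physics library, independent of the render loop.
void physicsLoop() {
    while (running) {
        // stepSimulation(1.0f / 60.0f);  // hypothetical call into the physics layer
        std::this_thread::sleep_for(std::chrono::milliseconds(16)); // ~60 Hz tick
    }
}

int main() {
    std::thread physics(physicsLoop);   // break the physics work out...
    for (int frame = 0; frame < 600; ++frame) {
        // renderFrame();               // ...while rendering stays on the main thread
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
    running = false;                    // signal the physics thread to stop
    physics.join();
    return 0;
}

Once physics lives on its own thread like this, handing that thread's work to a PPU (or a second CPU core) doesn't disturb the render loop.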

As far as the hardware itself goes, there was a quiet announcement a few weeks back that it'd been pushed to 1Q06 and that we should expect a PCI-interface version initially. That's the mass-production release, mind you. My understanding is that the silicon has taped out and they're tweaking the reference boards. PCI-Express parts come in 2Q06, though I have a suspicion that may be moved around depending on demand. (e.g. if the demand isn't there for PCIe parts, they may move it back to 3Q.)

The funny thing to me is something that I doubt anyone here will get; in essence, gaming is finally going the route OpenGL's been on for years and years, with a separate physics processor. The reason a 3DLabs Wildcat 5110G was smoking every game card out there, and still does, in OpenGL games is because of a part called a "geometry processor." It's not unlike the PhysX. (Half-Life on a single AMD T-Bird 1GHz with a 5110G will never drop below the engine frame limit. I've tested this.)
All a geometry processor does is perform geometry calculations. Very, very quickly. All the PhysX does is perform physics calculations. Very, very quickly. The similarity is obvious. Both reduce or eliminate general processing load for a very specific task. The PhysX is both a relief and a disappointment to me.

On the one hand, the PhysX is a step that has been needed for a long time. On the other hand, I would have rather seen nVidia and ATI learn the folly of their ways and start working on separate geometry processors. (Which, by the by, allows you to produce significantly cooler-running cards at power consumption lower than the current line.) But at the same time, I have to recognize the fact that ATI and nVidia, had they taken that route, would have produced two completely different processors, each with its own proprietary set of instructions, presenting developers with yet another Occam's Razor.
The reason the Wildcat's geometry processor (AKA the GAMMA) works so well is because it's an OpenGL part. Not DirectX; the 5110G barely supports 60% of DX8. Pure OpenGL, which is standardized. You can feed the same instructions to any card that properly follows OpenGL, and it will always behave the same. This is, as we all know, absolutely not the case with ATI and nVidia. Frequently painfully so. Even though they "support DXn" whatever, they have hundreds of little cheats and tricks, specific orders, and entire paths that developers have to use in order to get the best performance out of their cards.

That's why the PhysX is such a relief. It may be proprietary to an extent, but the simple fact is that it's one manufacturer, who has no interest in the eight-way war of the GPUs and VPUs. All they do is the physics. You have to support their product separately, but it has no dependence on anyone save themselves. It doesn't matter if you're using an nVidia or an ATI, it doesn't matter if you're Intel or AMD. It's just a physics processor. I only have to worry about whether or not software supports it. And I don't see that changing any time in the foreseeable future, unless ATI and nVidia decide to enter the fray. (Gods, I hope not.)
 
AreEss said:
The funny thing to me is something that I doubt anyone here will get;
This part confused me.

Are you saying: The concept that amuses me, others may not understand: etc, etc, etc...?

Or: Physics cards are funny because no one will buy them, but, in addition, because: etc, etc, etc....?
 
Fantastic, more companies accommodating the casuals. It's such crap that they can get stuff as easily as or easier than people who actually know/give a damn about the stuff.
 
Yeah, most of the people who buy Dells probably don't even look at game settings and can't tell the difference between low and high texture settings.
 
Project_2501 said:
Fantastic, more companies accommodating the casuals. It's such crap that they can get stuff as easily as or easier than people who actually know/give a damn about the stuff.

As much as I dislike Dell, they have a reputation for boosting the popularity of things that would normally be unpopular (e.g. the Pentium 4).

At least PhysX is something that would have widespread enthusiast support.
 
Woot for AGEIA. Definitely getting one the day they come out.
 
I can't wait for the physics card; however, I have no open PCI slots to put one in. *whimper*

*waves fist at A8N32* :mad: :mad: :mad:
 
AreEss said:
That's why the PhysX is such a relief. It may be proprietary to an extent, but the simple fact is that it's one manufacturer, who has no interest in the eight-way war of the GPUs and VPUs. All they do is the physics. You have to support their product separately, but it has no dependence on anyone save themselves. It doesn't matter if you're using an nVidia or an ATI, it doesn't matter if you're Intel or AMD. It's just a physics processor. I only have to worry about whether or not software supports it. And I don't see that changing any time in the foreseeable future, unless ATI and nVidia decide to enter the fray. (Gods, I hope not.)


ATI has said that they can use their GPUs to run the Havok FX engine. :eek: I believe nVidia could do the same...

http://www.ggl.com/news.php?NewsId=1595
http://www.gamespot.com/news/6136639.html
 
sc4r4b said:
ATI has said that they can use their GPUs to run the Havok FX engine. :eek: I believe nVidia could do the same...

http://www.ggl.com/news.php?NewsId=1595
http://www.gamespot.com/news/6136639.html


old news
its still no were near as fast as a discrect device just for it
and do you realy want you vid card loaded down with some thing else when you cant even max out the game as it is look at games like FEAR that bring all but top end SLi setups to there knees
 
Elios said:
old news
its still no were near as fast as a discrect device just for it
and do you realy want you vid card loaded down with some thing else when you cant even max out the game as it is look at games like FEAR that bring all but top end SLi setups to there knees

:rolleyes:
 
sc4r4b said:

Got any arguments to back up your rolling eyes? :)
A dedicated PPU will be faster at physics than a GPU, just like a dedicated GPU is better for graphics than a CPU...like it or not...

Terra...
 
texuspete00 said:
^ What, that's not true? :confused:

I sure think so.

That's not why I'm rolling my eyes... I was simply showing someone else other possibilities/alternatives, and this guy comes in with "old news," an attitude, and horrible grammar.
 