How fast is your PPU?

BossNoodleKaboodle said:
Three add-in cards isn't a problem as long as it's not bottlenecked.

Hmm... what's the max throughput on an x4 PCI-E card?
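For reference, a back-of-the-envelope answer: first-generation PCIe signals at 2.5 GT/s per lane, and 8b/10b encoding leaves about 250 MB/s of payload per lane in each direction, so an x4 link gets roughly 1 GB/s each way before protocol overhead:

```python
# Rough per-direction bandwidth of a first-generation PCIe link.
# PCIe 1.x runs at 2.5 GT/s per lane; 8b/10b encoding leaves 2.0 Gbit/s
# of payload, i.e. 250 MB/s per lane per direction.
def pcie1_bandwidth_mb_s(lanes):
    per_lane_mb_s = 250  # MB/s per lane per direction (PCIe 1.x)
    return lanes * per_lane_mb_s

print(pcie1_bandwidth_mb_s(4))   # x4 link: 1000 MB/s each direction
print(pcie1_bandwidth_mb_s(16))  # x16 link: 4000 MB/s each direction
# For comparison, plain 32-bit/33 MHz PCI peaks at ~133 MB/s, shared by
# every device on the bus.
```

Either way, far more than a physics card is likely to need for submitting body states each frame.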
 
It sounds good to split the tasks onto multiple processors, but I wonder if AI would be a better first step. I suppose graphics and environments are what sell games to the average user, though. And btw, it's Tim Sweeney.
 
One or the other, it doesn't matter.

Freeing up the physics allows more CPU time for the AI... and vice versa.
 
http://www.theinquirer.net/?article=21681

WE WROTE ABOUT the "physics processing unit" yesterday and today can reveal more and answer some of the many questions that we got from your mails. You can read our original story here

Ubisoft, Sega and Epic have announced support for the processor.

All of these companies have been working with AGEIA for quite some time using its Novodex engine, and they said that their next-generation titles are going to be powered by this architecture. Some of the companies have worked with AGEIA for the past fourteen months, we are told.

At first we expect that some of the game levels are going to be optimised for the PPU, and you won't be able to play them if you don't have this card. Some online titles will use this architecture as well, and as I said, it definitely looks cool and can change the feel of gameplay. The company will bundle some smaller, shorter games and cool demos as soon as the card hits retail.

We managed to snap a picture of the prototype card with both PCI and PCIe ports, using a real chip and with 128MB of GDDR3 memory. The PhysX chip is made by TSMC and it's quite a large piece of silicon. We expect to see these cards in action soon. It's all about waiting for the game titles, but you should be able to buy cards by the end of the year at the latest. Here is the picture, but the retail prodcut will have PCI or PCIe interface and the printed circuit board will end up much smaller. µ

Joy.....have to spend even more money to play games now :mad:
 
defiant said:
http://www.theinquirer.net/?article=21681



Joy.....have to spend even more money to play games now :mad:

I'm wondering if there isn't just a little marketing spin going on there. Sites seem to be interpreting it as these companies supporting the PhysX PPU, but I'm not seeing "Tim Sweeney: our next game will need a PPU". It could also simply refer to the Novodex physics middleware, which doesn't require the PPU but can apparently be accelerated by it.

I mean, let's be clear here: there's a difference between "supporting" and "exploiting". Really using the PPU would result in a whole other type of game environment, one that simply can't be done with only a regular CPU/GPU setup. Until PPUs are shipping in mid-range Dell systems, no sane game developer is going to waste time/money writing code to really exploit them, just like 64-bit CPUs (still waiting...).
 
black_b[ ]x said:
It sounds good to split the tasks onto multiple processors, but I wonder if AI would be a better first step. I suppose graphics and environments are what sell games to the average user, though. And btw, it's Tim Sweeney.

The more you offload to other (co)processors, the more cycles are left for the main CPU to do AI.
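The arithmetic behind that point is worth spelling out. Using made-up but plausible per-frame costs (nothing here is measured):

```python
# Illustrative frame-time budget at 60 fps (numbers are invented for the
# example, not measurements). Offloading physics to a coprocessor frees
# that slice of the budget for AI and game logic.
FRAME_BUDGET_MS = 1000.0 / 60.0  # ~16.7 ms per frame

def cpu_ms_left_for_ai(render_ms, physics_ms, physics_offloaded):
    used = render_ms + (0.0 if physics_offloaded else physics_ms)
    return FRAME_BUDGET_MS - used

# Hypothetical costs: 8 ms of CPU-side render/submit work, 6 ms of physics.
print(round(cpu_ms_left_for_ai(8.0, 6.0, physics_offloaded=False), 1))  # 2.7
print(round(cpu_ms_left_for_ai(8.0, 6.0, physics_offloaded=True), 1))   # 8.7
```

With those (invented) numbers, offloading triples the time left over for AI each frame.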
 
TheTrebleKing said:
Ray-tracing cards (yes, another card that you lot probably haven't heard of ;) )

Interesting. Link please.

And could you guys also give links on how PPUs work? With 3D cards you send the mesh, the textures, etc. to the GPU and it returns the rendered scene. How does it work with PPUs?

I can see how this'll help with current games. In Doom 3, I tried spawning a couple of dozen sentries and killed them. The framerate is actually pretty decent strafing around the pile with all of them onscreen at once. Now try throwing a grenade to get them all to move: your FPS will drop to 1 until they turn static. This is a physics issue, right?
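AGEIA hasn't published the actual interface, but conceptually it's probably like a software physics engine's step loop: the game submits rigid-body state and forces, the engine (or the PPU, in hardware) integrates a timestep, and updated transforms come back for rendering. A toy sketch with invented names and naive explicit-Euler integration:

```python
# Toy rigid-body step illustrating the likely data flow: the game submits
# body state + forces, the physics engine (software or PPU) integrates one
# timestep, and updated positions come back for rendering. Names and the
# explicit-Euler integration are illustrative, not AGEIA's actual API.
class Body:
    def __init__(self, pos, vel, mass):
        self.pos, self.vel, self.mass = list(pos), list(vel), mass

def step(bodies, gravity=(0.0, -9.81, 0.0), dt=1.0 / 60.0):
    for b in bodies:
        for i in range(3):
            b.vel[i] += gravity[i] * dt   # accumulate forces
            b.pos[i] += b.vel[i] * dt     # integrate position
    return [b.pos for b in bodies]        # transforms back to the renderer

crate = Body(pos=(0.0, 10.0, 0.0), vel=(0.0, 0.0, 0.0), mass=5.0)
for _ in range(60):                       # simulate one second
    transforms = step([crate])
print(round(crate.pos[1], 2))             # 5.01: fallen about 5 m
```

The point of a PPU would be running that inner loop, plus collision detection, for thousands of bodies at once instead of a handful.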
 
Rather than a PPU, why not just add a general-purpose programmable processor as an add-on card (PCI-E)? Not quite the same as our normal CPUs, but more DSP/vector-like. Even if the performance wasn't as good as a dedicated PPU's, I'd be more than willing to take a performance hit for the ability to dynamically program the processing unit. Imagine the capabilities then! You could just have separate "cores" you upload to the card to change its functionality: MPEG-2 encode core! MP3 encode core! Physics core! FFT core! Audio effects (reverb, echo, flanger, delay, etc.)!
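The "uploadable core" idea maps naturally onto a kernel registry on the host side. A sketch of how that might look, with the card simulated in software and all names invented:

```python
# Sketch of the "swappable core" idea: the host registers processing
# kernels and uploads whichever one the current workload needs. Here the
# "card" is simulated in software; names are invented for illustration.
class Coprocessor:
    def __init__(self):
        self.cores = {}
        self.active = None

    def register_core(self, name, fn):
        self.cores[name] = fn

    def load_core(self, name):        # stand-in for uploading firmware
        self.active = self.cores[name]

    def run(self, data):
        return self.active(data)

card = Coprocessor()
card.register_core("sort", lambda xs: sorted(xs))       # placeholder kernel
card.register_core("physics", lambda xs: [x * 2 for x in xs])
card.load_core("physics")
print(card.run([1, 2, 3]))  # [2, 4, 6]
```

The catch, and probably why AGEIA went fixed-function, is that a general-purpose part rarely matches dedicated silicon on performance per watt for any single workload.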
 
Stray thoughts popping into my head:

Offloading more stuff from the CPU has been on the cards for a while, and I can't see the big boys, ATI and Nvidia, sitting still and letting a newbie in. If this PhysX system has been public (i.e., via collaboration with developers) for nearly a year and a half, then you can bet it's only news to us.

I seem to recall some indications of major behind-the-scenes shuffling in both ATI's and Nvidia's roadmaps a while back, ditching future products with no clear indication of alternatives and such.

Add to that the rumours/news of Nvidia's next "so-major-that-they've-dropped-the-NV-prefix" architecture revision, and I'm wondering if the PhysX PPU is really going to have the field all to itself...
 
I think this is the next inevitable step that physics "rendering" was going to take. What came before 3D graphics cards? Commercial 3D graphics engines. What came before physics cards? Commercial physics engines.

I agree that this thing is going to have to be sub-$100 to get off the ground, but with GDDR3 on board, I don't see that happening.
 
It sounds kind of cool, but only if the PPU is <$150. I'd rather spend $400 on a new video card than watch a bunch of bricks fly all over the place or watch pointless super rag dolls fall to the floor. Improved lighting and graphics in general will do a lot more for gaming than super-duper Hollywood-style explosions and death animations.
 
Now, if it were software-upgradable, that'd be another story entirely. If I only had to buy one every few years, with software upgrades, then I'd be willing to pay >$100.
 
kick@ss said:
It sounds kind of cool, but only if the PPU is <$150. I'd rather spend $400 on a new video card than watch a bunch of bricks fly all over the place or watch pointless super rag dolls fall to the floor. Improved lighting and graphics in general will do a lot more for gaming than super-duper Hollywood-style explosions and death animations.

I understand you, but I don't agree. I think with current IQ and the kind of realism this thing could bring, the PPU would do a lot more than just add FPS in games.

Just imagine being able to shoot through a door and watch the door slowly break into pieces, leaving holes that you can see through.

Shooting an RPG and making a real hole in a wall.

Having "real" footsteps in terrain.

Nades that make holes in the terrain and actually blow up stuff, including body parts (kinda gore... just pointing out some of the capabilities). Or actual fragments flying around from a frag grenade.

Grass, bushes and trees that react to wind, bodies, bullets, other objects, etc.

Making bullets react to distance, wind, materials, etc.

Blowing a helmet off an enemy's head.

Hitting an enemy in a given part of the body and simulating the real effects of the hit.

And those are just some of the things this could help accomplish in a first-person shooter like America's Army. I don't even want to speculate on the realism of sports games, racing sims, flight sims, etc. etc. etc. x1000.

Then there's the fact that the CPU is free to work on AI and other tasks.

IMHO I would put this on my priority list over a one-generation-newer video card and 10 more FPS in Doom 3...
 
IMHO I would put this on my priority list over a one-generation-newer video card and 10 more FPS in Doom 3...

I definitely agree with that!
I don't know if any of you took a look at this one:
http://www.gamers-depot.com/interviews/agiea/001.htm
but the pics at the bottom have me wondering if they are in fact real. There is a connector at the top? SLI? Haha, but seriously, I'm wondering if this is going to be a connector to the mobo, which would mean specialized mobos and/or chipsets or even video cards... = more money to upgrade more stuff...
 
videogamer323 said:
I definitely agree with that!
I don't know if any of you took a look at this one:
http://www.gamers-depot.com/interviews/agiea/001.htm
but the pics at the bottom have me wondering if they are in fact real. There is a connector at the top? SLI? Haha, but seriously, I'm wondering if this is going to be a connector to the mobo, which would mean specialized mobos and/or chipsets or even video cards... = more money to upgrade more stuff...

It is probably a connector to allow 'SLI'-type operations, or to connect to a video card.

Seriously, I see this kind of tech being absorbed by video card vendors and incorporated into vid cards within a couple of generations. This just screams of a company that wants its IP bought up.
 
EvilAngel said:
I understand you, but I don't agree. I think with current IQ and the kind of realism this thing could bring, the PPU would do a lot more than just add FPS in games.

Just imagine being able to shoot through a door and watch the door slowly break into pieces, leaving holes that you can see through.

Shooting an RPG and making a real hole in a wall.

Having "real" footsteps in terrain.

Nades that make holes in the terrain and actually blow up stuff, including body parts (kinda gore... just pointing out some of the capabilities). Or actual fragments flying around from a frag grenade.

Grass, bushes and trees that react to wind, bodies, bullets, other objects, etc.

Making bullets react to distance, wind, materials, etc.

Blowing a helmet off an enemy's head.

Hitting an enemy in a given part of the body and simulating the real effects of the hit.

And those are just some of the things this could help accomplish in a first-person shooter like America's Army. I don't even want to speculate on the realism of sports games, racing sims, flight sims, etc. etc. etc. x1000.

Then there's the fact that the CPU is free to work on AI and other tasks.

IMHO I would put this on my priority list over a one-generation-newer video card and 10 more FPS in Doom 3...
A LOT of these things would require significantly more work from the GPU, in addition to work by the PPU. I think you're kind of stretching the limits of what a PPU would be able to do. It will (basically) be able to make things fall and interact with forces. While that sounds nice, the accompanying visual effects will require much work from the GPU.
 
videogamer323 said:
but the pics at the bottom have me wondering if they are in fact real. There is a connector at the top? SLI? Haha, but seriously, I'm wondering if this is going to be a connector to the mobo, which would mean specialized mobos and/or chipsets or even video cards... = more money to upgrade more stuff...

As the article states, this particular demo card was created with both PCIe and PCI connectors on it. On the top of the card in that pic is the PCIe connector.


Really though, for those complaining about having to purchase another card, it sounds to me like this will be an "accelerator", whereby if you have a PPU, those types of operations will be offloaded to that unit and the realism in the game will be greatly enhanced. It doesn't sound like you will HAVE to purchase one to play the next-gen games, but there will be a massive difference if you do have one. I could be wrong on that, but that was what I got out of it.


Regarding the separation of processors, this seems to be the common thread in next-generation consoles. Why not do it on the PC as well? Rumor has it Xbox Next will have at least three Power processor cores running at 3 GHz or more, probably dedicated to specific tasks. That is in addition to the R520 (or similar) GPU core. Sony's Cell processor will also consist of several Power cores dedicated to different tasks. There seems to be a limit to how far we can push existing processors (e.g. the P4 hitting the wall at 3.8 GHz), so increasing processing power further will likely require delegating tasks to other units in addition to multiple cores on the CPU.
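The "dedicated cores for dedicated tasks" model can be sketched on today's hardware with ordinary threads; everything below (the task names, the three-worker split) is illustrative, not how any console actually schedules work:

```python
# Sketch of the "dedicated cores for dedicated tasks" model the consoles
# were rumoured to use: each subsystem runs on its own worker, and the
# main loop just gathers results. Tasks and names are illustrative.
from concurrent.futures import ThreadPoolExecutor

def simulate_physics(state): return state + ["physics done"]
def run_ai(state):           return state + ["ai done"]
def mix_audio(state):        return state + ["audio done"]

def frame(state):
    with ThreadPoolExecutor(max_workers=3) as pool:
        # Each task gets its own copy of the inputs it needs; in a real
        # engine these would be carefully partitioned to avoid sharing.
        futures = [pool.submit(task, list(state))
                   for task in (simulate_physics, run_ai, mix_audio)]
        return [f.result()[-1] for f in futures]

print(frame(["frame 0"]))  # ['physics done', 'ai done', 'audio done']
```

The hard part in practice is the partitioning comment above: the subsystems all want to read and write the same world state.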


As for Brent's comment about placing the PPU on the video card PCB, is that even possible? With vid cards already maxing out what power can be delivered to them via an x16 PCIe slot (not to mention AGP), could they stand having another piece of silicon added to the same PCB, requiring 25+ watts more power? It seems like vid cards will only grow in their thirst for power, and it might be more sensible to run this PPU on a separate x1 PCIe or standard PCI slot. I can see nVidia and/or ATi possibly getting into this tech and marketing their own PPUs, but I don't know if I can see them integrating it with vid card architectures.
 
What is it that you mean by AI in games? It looks like the description of the physics engine meets this need.

Or are you talking about real thinking entities, like the Steven Spielberg movie?


Please forgive my noobieness (with that special card another user was mentioning, you could simulate my noobies giggling and bouncing :D )
 
HighTest said:
What is it that you mean by AI in games? It looks like the description of the physics engine meets this need.

Or are you talking about real thinking entities, like the Steven Spielberg movie?


Please forgive my noobieness (with that special card another user was mentioning, you could simulate my noobies giggling and bouncing :D )

AI with regard to games usually refers to the decision-making and other autonomous actions of entities run by the computer, as opposed to a human player.
 
videogamer323 said:
I definitely agree with that!
I don't know if any of you took a look at this one:
http://www.gamers-depot.com/interviews/agiea/001.htm
but the pics at the bottom have me wondering if they are in fact real. There is a connector at the top? SLI? Haha, but seriously, I'm wondering if this is going to be a connector to the mobo, which would mean specialized mobos and/or chipsets or even video cards... = more money to upgrade more stuff...

In another interview it was explained that their card is both PCIe (PCI Express x4) and PCI compatible. So you see the PCI connector on the bottom (with reduced bandwidth) and also a PCIe connector on top; just flip the board over.

Sweet for those that don't have PCIe systems yet but want to keep the card.
 
HighTest said:
In another interview it was explained that their card is both PCIe (PCI Express x4) and PCI compatible. So you see the PCI connector on the bottom (with reduced bandwidth) and also a PCIe connector on top; just flip the board over.

Sweet for those that don't have PCIe systems yet but want to keep the card.

Wait... you might just be right! The connector on the top looks like a PCI-E x1 connector.
 
He is right... it states it in the article. But he's also wrong at the same time :confused:
We managed to snap a picture of the prototype card with both PCI and PCIe ports

AND someone else in this thread said that too...

anyway...

Here is another quote from the second article on this (the one with the picture):

but the retail prodcut will have PCI or PCIe interface and the printed circuit board will end up much smaller.

(Spelling mistake courtesy of the Inq.)

The key word in that statement is "or": PCI OR PCI-E.

So yes, the prototype has both connectors, but no, the retail product will have one or the other...
 
It makes sense that a reference board would have both connectors to suit various developers' needs.

Why be pedantic and point out a spelling error by the Inq? It has gotten to the point where if I don't see a spelling error in one of their pieces, I think it's fake. Horrible editors!
 
EvilAngel said:
I understand you, but I don't agree. I think with current IQ and the kind of realism this thing could bring, the PPU would do a lot more than just add FPS in games.

Just imagine being able to shoot through a door and watch the door slowly break into pieces, leaving holes that you can see through.

Shooting an RPG and making a real hole in a wall.

Having "real" footsteps in terrain.

Nades that make holes in the terrain and actually blow up stuff, including body parts (kinda gore... just pointing out some of the capabilities). Or actual fragments flying around from a frag grenade.

Grass, bushes and trees that react to wind, bodies, bullets, other objects, etc.

Making bullets react to distance, wind, materials, etc.

Blowing a helmet off an enemy's head.

Hitting an enemy in a given part of the body and simulating the real effects of the hit.

And those are just some of the things this could help accomplish in a first-person shooter like America's Army. I don't even want to speculate on the realism of sports games, racing sims, flight sims, etc. etc. etc. x1000.

Then there's the fact that the CPU is free to work on AI and other tasks.

IMHO I would put this on my priority list over a one-generation-newer video card and 10 more FPS in Doom 3...

lol, it's a physics processor. Do you really think that all of this could be implemented easily just by having the card in your system? It would require a lot of work dedicated to a specific hardware device that most people won't own. And we all know that software developers have a lot of free time to add in features for a small percentile of the market... This tech won't work without being on the graphics card, simple as that.
 
ThomasE66 said:
It is probably a connector to allow 'SLI'-type operations, or to connect to a video card.

Seriously, I see this kind of tech being absorbed by video card vendors and incorporated into vid cards within a couple of generations. This just screams of a company that wants its IP bought up.


Nah, it's a PCI/PCI-E mockup... the top connector is the x1 slot. In reality it will have one or the other, not both, unfortunately. This is the kind of thing that will make PCI-E worthwhile, IMO.

And yes, I would pay for it. I'm not going to need a new GPU for a few more generations yet, so I'd be willing to upgrade with a PPU in the meantime to enhance the physics effects. As someone above me said, the realism of games will increase exponentially! Just imagine a REALLY fully modifiable environment, where trees fall over when you blow a nade next to them, where a car leaves real ruts behind it, and just... CRAZY effects that will totally change the nature of the game.

As has been pointed out countless times, there's more to games than graphics. There's more to games than graphics and a storyline... it's the total immersion that makes games with good storylines great. Now just imagine a game with a great storyline, good graphics, good SOUND and good PHYSICS... it suddenly becomes much more in-depth and immersive than one with just good graphics and a great storyline but mediocre physics (Source... I consider Source good now, but nowhere near its full potential) and mediocre or poor sound.
 
Activate: AMD said:
Nah, it's a PCI/PCI-E mockup... the top connector is the x1 slot. In reality it will have one or the other, not both, unfortunately. This is the kind of thing that will make PCI-E worthwhile, IMO.

And yes, I would pay for it. I'm not going to need a new GPU for a few more generations yet, so I'd be willing to upgrade with a PPU in the meantime to enhance the physics effects. As someone above me said, the realism of games will increase exponentially! Just imagine a REALLY fully modifiable environment, where trees fall over when you blow a nade next to them, where a car leaves real ruts behind it, and just... CRAZY effects that will totally change the nature of the game.

As has been pointed out countless times, there's more to games than graphics. There's more to games than graphics and a storyline... it's the total immersion that makes games with good storylines great. Now just imagine a game with a great storyline, good graphics, good SOUND and good PHYSICS... it suddenly becomes much more in-depth and immersive than one with just good graphics and a great storyline but mediocre physics (Source... I consider Source good now, but nowhere near its full potential) and mediocre or poor sound.

See my subsequent post; I realised that it was a PCI and PCI-E setup.
 
Dumb question maybe, but do current GPUs have raytracing abilities at all? Are they only for specialized cards? Do the IBM and SGI cards that cost $25,000 and have only a PCI bus do such things? Is that why they use those kinds of POWER4+ workstations with those cards for massive simulation, because other cards like the Quadro can't do it?
 
BossNoodleKaboodle said:
Dumb question maybe, but do current GPUs have raytracing abilities at all? Are they only for specialized cards? Do the IBM and SGI cards that cost $25,000 and have only a PCI bus do such things? Is that why they use those kinds of POWER4+ workstations with those cards for massive simulation, because other cards like the Quadro can't do it?

There are projects out there that use current state-of-the-art GPUs to do raytracing far faster than a CPU can.

Check out http://www.gpgpu.org/ for a list of projects that use the GPU for things other than realtime graphics.
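For a feel of why raytracing suits this kind of offload: every ray is independent, so the same small intersection kernel runs over huge batches of data, which is exactly what GPUs are good at. The kernel itself, ray vs. sphere, shown here on the CPU with arbitrary example geometry:

```python
# Minimal ray-vs-sphere intersection test: the data-parallel kernel at
# the heart of a raytracer. Each ray is independent, which is why the
# GPGPU projects can map this onto graphics hardware so effectively.
import math

def ray_hits_sphere(origin, direction, center, radius):
    # Solve |origin + t*direction - center|^2 = radius^2 for t >= 0.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return False                  # ray misses the sphere entirely
    t1 = (-b - math.sqrt(disc)) / (2 * a)
    t2 = (-b + math.sqrt(disc)) / (2 * a)
    return t1 >= 0 or t2 >= 0        # hit must be in front of the origin

# Ray straight down the z axis toward a sphere at z = 5:
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # True
print(ray_hits_sphere((0, 0, 0), (0, 1, 0), (0, 0, 5), 1.0))  # False
```

A real renderer fires one of these per pixel per bounce, millions of times a frame, hence the interest in massively parallel hardware.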
 
Some of you bring up a valid point.

I also wonder how much, if anything, the GPU will have to do in order for the PPU to demonstrate its maximum ability.
 
The Inq said:
Big guys like Gabe Novell, the developer of Half Life 2...

Dang, The Inq needs better journalists. Someone who is a bit more in touch with things.

By the way, what the heck is "marchitecture"?
 
Jason711 said:
Some of you bring up a valid point.

I also wonder how much, if anything, the GPU will have to do in order for the PPU to demonstrate its maximum ability.

The GPU will just have to render the scene. Presumably the PPU will provide the physics, and the CPU will be the glue between the PPU and the GPU.

Whew...too many PU acronyms flying around.
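That division of labour suggests a frame loop roughly like the following; this is a sketch with invented names, with plain functions standing in for the PPU and GPU, not any real engine's API:

```python
# Sketch of the division of labour being described: the CPU runs AI and
# orchestrates the frame, a physics unit (here a plain function standing
# in for a PPU) advances the simulation, and the GPU-side renderer just
# draws the result. All names are invented for illustration.
def ppu_step(world):                 # would be offloaded to the PPU
    return {name: (x, y - 1) for name, (x, y) in world.items()}

def cpu_ai(world):                   # stays on the CPU
    return ["chase player" for _ in world]

def gpu_render(world, orders):       # stays on the GPU
    return f"drew {len(world)} objects, {len(orders)} AI orders"

def frame(world):
    new_world = ppu_step(world)      # 1. physics advances the world
    orders = cpu_ai(new_world)       # 2. CPU decides what entities do
    return new_world, gpu_render(new_world, orders)  # 3. GPU draws it

world = {"crate": (0, 10), "barrel": (3, 10)}
world, out = frame(world)
print(out)  # drew 2 objects, 2 AI orders
```

In practice the three stages would be pipelined so each unit works on a different frame at once, rather than waiting in sequence.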
 
^^ That's the Cliffs Notes version.

Also, don't forget the CPU has to handle the AI too.
 
4b5eN+EE said:
^^ That's the Cliffs Notes version.

Also, don't forget the CPU has to handle the AI too.

Yes, but isn't that the point of a separate physics accelerator, to allow the CPU to spend more time on AI and less on physics?
 
Lord of Shadows said:
lol, its a physics processor, do you really think that all of this could be implemented easily by just having the card in your system? It would require alot of work dedicated to a specific hardware device that most people wont own. And we all know that software developers have alot of free time to add in features for a small percentile of the market... This tech wont work without being on the graphics card, simple as that.

Easily? I never said that. But for the first time it would be possible.

Actually, what you say would only be valid if most people don't own it.

This thing [the PPU] only needs to be picked up by one console maker; then the kind of stuff the console could do would force PC game designers to add those features in order to keep the PC at or above the level of such console(s).

I really hope that happens. True realism is needed more than more FPS right now.
 
ThomasE66 said:
Yes, but isn't that the point of a separate physics accelerator, to allow the CPU to spend more time on AI and less on physics?
Yes.

But what he was saying was that the GPU renders the game and the PPU "renders" (or whatever you'd call it) the physics, so what's left to do the AI? The CPU. That's what I was saying... he left that out; I was simply adding it in.
 
ThomasE66 said:
The GPU will just have to render the scene. Presumable the PPU will provide the physics, and the CPU will be the glue between the PPU and the GPU.

Whew...too many PU acronyms flying around.

I understand that...

However, isn't it plausible that the more you can do with physics, the more you will have to do graphically?

For example: more things flying around from explosions, wind blowing through the grass... that's more rendering.
 