[techeye] reasons for kepler delays? Release for 2nd half '12?

xoleras

2[H]4U
Joined
Oct 11, 2011
Messages
3,551
Today's AMD Radeon 7950 release saw the embargo lifted on a funny little saga which you can read about here. But another hell-on-earth yarn from the rumour mill is even more interesting. TechEye has heard Nvidia has been asking hacks why the reviews have been so good.

Nvidia absolutely denies the rumour which suggests representatives have been asking journalists to file a short report on what is exactly so great about the 7950. Whether or not Nvidia is canvassing reviewers, there may be reasons the Green Goblin would like hacks to do its homework for it.

More industry whispers suggest the reason Nvidia's upcoming hope, the GK104, is late to the table is all the redesigns it has had to go through. Parts designed for 32nm had to be redesigned for 28nm, and along the way there were challenges with packaging and interconnects. Some industry watchers suggest that Nvidia gave up a lot of space on its chip trying to buff up Kepler by bringing Ageia to the hardware.

But the murmurs suggest Nvidia has been dedicating a lot of resources to get physics and fluid dynamics operating properly, which has so far, allegedly, taken half of its gaming engineers and six months to get right.

One industry watcher said to TechEye the company is in "holy s**t" mode - having been confident that the GK104 would fight off and trounce the competition, but the timing is out of whack. When Nvidia does get its high-end Kepler chip out in the second half of the year, the competition is going to be ready with something else.

There are no doubts that full fluid dynamics is going to wow the crowds - on demos specifically catering to fluid dynamics. Writing code for games to get the performance right, though, is trickier. While Nvidia's team is working overtime, its rivals just may be able to clean up.

"Their SDK is great," someone familiar with the matter said to TechEye. "But it was foolish of them to try putting Ageia hardware onto a GPU. The amount of engineering work you have to do to make that useful in games - you get one game every six months that gives you the benefits of that."

Pretty interesting if true. The Ageia part - I wonder if that is the reason for the delays?
 
As I posted in your other thread...

Maybe, but I know that PhysX had a huge performance cost on a single GTX 580 and was unusable in games like Metro 2033 unless you had SLI. Even Batman: AC had a huge performance penalty with PhysX.

Maybe they added additional logic to the chip for further on-die physics improvements? Also, if you read the article, it seems to differentiate Kepler with "improved" dynamic physics. This would explain their CUDA 4.1 SDK and such.
 
Sounds plausible to me...look how Fermi was paper launched months before it was released and yet nothing about Kepler so far. I think this time AMD will have an answer to Kepler even if it's faster than the 7970.
 
Makes ya wonder if those 2300-shader GCN pics were true... and AMD is waiting for Kepler!
 
Kepler itself is a year late and from what I've seen still on a bottom>top release schedule, so reading into "why" is going to be difficult, and inherently more difficult due to the way Nvidia does things. Where AMD delays are actual delays (and delays and delays), Nvidia delays tend to be delays with a 'it was meant to be released at this date all along' note attached.

Nvidia absolutely denies the rumour which suggests representatives have been asking journalists to file a short report on what is exactly so great about the 7950

w0t? Of course they do. Couldn't they just read the reviews themselves? This doesn't even make any sense...
 
Hmm, very interesting...

The "holy s**t mode" seems in line with the little bit I've heard, at any rate.

The rest... makes for a good read, at any rate. Hardware physics would make sense - Nvidia would be borrowing a page from ATI there...
 
Interesting to hear about them trying to add "ageia hardware" to Kepler.

We went from having PhysX cards with dedicated hardware to Nvidia preaching that it could be done better via Cuda, but now we're going back to dedicated hardware?

It would be cool if they supported the older PhysX cards again as a result (since I have one), but that's probably wishful thinking.
 
PhysX can be done on a 7970, and there are no more demanding games coming out for a long time.

When Nvidia does get its high-end Kepler chip out in the second half of the year, the competition is going to be ready with something else
What is the point in waiting to see what Kepler has to offer? It's going to be a long time to wait when you could get a 7970 almost a month ago and max games out.

No point in buying Kepler because we should wait to see what the 8970 has to offer, LoL - or a new console to change the game up.
 
Really dumb. Hardware physics is next to worthless. If the consoles don't support it (no next gen console is even looking Nvidia's way) it won't be implemented. Game timelines are tight already, there's no way anyone's going to add crap that a handful of cards implement without a ton of cash put in their pockets.

We'll have a handful of Nvidia "The Way It's Meant to Be Played" games which half-ass this stuff so that it has no real impact on gameplay.
 
No point in buying Kepler because we should wait to see what the 8970 has to offer, LoL - or a new console to change the game up.

Not true - they could price Kepler cards very attractively. Then they would certainly be worth it.

there is no way anyone can say there's "no point in buying kepler" until it comes out and we have [H]ard data on its performance.
 
Really dumb. Hardware physics is next to worthless. If the consoles don't support it (no next gen console is even looking Nvidia's way) it won't be implemented. Game timelines are tight already, there's no way anyone's going to add crap that a handful of cards implement without a ton of cash put in their pockets.

We'll have a handful of Nvidia "The Way It's Meant to Be Played" games which half-ass this stuff so that it has no real impact on gameplay.

You never know - new consoles could include a larger GPU and a smaller CPU in order to do things like physics on the GPU, in which case Nvidia's advantage in this area could pay off. Of course, consoles would use something like OpenCL, but I'm sure the performance tweaks Nvidia is doing would translate nicely from CUDA to OpenCL.
 
I didn't want to wait that long. I love my games maxed super smooth in Eyefinity. Skyrim is awesome maxed with tweaks and 8xAA through CCC at 5760x1080.
 
Really dumb. Hardware physics is next to worthless. If the consoles don't support it (no next gen console is even looking Nvidia's way) it won't be implemented. Game timelines are tight already, there's no way anyone's going to add crap that a handful of cards implement without a ton of cash put in their pockets.

We'll have a handful of Nvidia "The Way It's Meant to Be Played" games which half-ass this stuff so that it has no real impact on gameplay.

Which is why you have to view this article skeptically. Nvidia is stubborn, but they aren't completely stupid, so I can't really see them spending the time and effort to integrate Ageia physx into card hardware. Getting away from that was kind of the whole point of CUDA (well, not the point, but a nice side-effect).

Surely they can't believe that hardware physx on-chip is going to be a selling feature. :(
 
Really dumb. Hardware physics is next to worthless. If the consoles don't support it (no next gen console is even looking Nvidia's way) it won't be implemented. Game timelines are tight already, there's no way anyone's going to add crap that a handful of cards implement without a ton of cash put in their pockets.

We'll have a handful of Nvidia "The Way It's Meant to Be Played" games which half-ass this stuff so that it has no real impact on gameplay.

Both the Xbox 360 and PS3 support PhysX. Not sure next-gen consoles will - too early to know.
 
The real kicker is the new SemiAccurate article about how optimized the GK104 is for certain code, and whether or not Nvidia is going to go out of its way with embedded programmers and cash support to the gaming companies so as to sabotage AMD.
 
PhysX is bad marketing for PC games. Every single person I know that owns an AMD card does not pay for any game that has PhysX. They say if they can't run it they should get a discount, so they perform the discount themselves.
 
PhysX is bad marketing for PC games. Every single person I know that owns an AMD card does not pay for any game that has PhysX. They say if they can't run it they should get a discount, so they perform the discount themselves.

Nice to see you're back on the AMD marketing troll wagon...
 
PhysX is bad marketing for PC games. Every single person I know that owns an AMD card does not pay for any game that has PhysX. They say if they can't run it they should get a discount, so they perform the discount themselves.

That's pretty short-sighted when you can buy a PhysX card for about $30.00 that would be more than sufficient for most games.
 
Physics is cool; PhysX is lame.

This kinda sucks. I am waiting for Kepler so prices will become a bit more reasonable on Tahiti, or maybe I'll get a Kepler if they are that much better. But by then the next-gen/refresh Radeons will be getting close, making me feel like waiting again.

I liked it better when both companies released their new lines closer to one another.
 
That's pretty short-sighted when you can buy a PhysX card for about $30.00 that would be more than sufficient for most games.

Not needed with a 7970 now. It can easily handle PhysX. Thank god for that - Alice: Madness Returns looks awesome in Eyefinity with max PhysX.
 
Not needed with a 7970 now. It can easily handle PhysX. Thank god for that - Alice: Madness Returns looks awesome in Eyefinity with max PhysX.

WTF?

Sonny boy, I'm pretty sure that THAT has jack squat to do with the 7970... if it isn't green, it isn't running it on the GPU.
 
Not needed with a 7970 now. It can easily handle PhysX. Thank god for that - Alice: Madness Returns looks awesome in Eyefinity with max PhysX.



Really? So then why is it that every single person you know that owns an AMD card does not pay for any game that has PhysX?
 
Physics is cool; PhysX is lame.

This kinda sucks. I am waiting for Kepler so prices will become a bit more reasonable on Tahiti, or maybe I'll get a Kepler if they are that much better. But by then the next-gen/refresh Radeons will be getting close, making me feel like waiting again.

I liked it better when both companies released their new lines closer to one another.
Just bite the bullet and get a 7970 and enjoy maxed games right now. It's so awesome. Remember, this is all about enjoying games. Who gives a flying fuck which company is trying to outdo the other, or the pathetic fanboyism shit that whiners cry because someone else has something better than them and they feel stuck with a brand that is not as good at the time.

WHEN KEPLER COMES OUT, SELL YOUR 7970 AND GET KEPLER
 
I JUST FINISHED PLAYING ALICE MADNESS RETURNS EYEFINITY MAXED PHYSX!!!

Same - just played Mirror's Edge maxed PhysX with 16xAA, 60 FPS locked at 2560x1600 on a 7970... with no extra Nvidia card. Either Sandy Bridge-E is doing the job or the 7970 is that fast, no idea...
 
Your comments are seriously laughable. I think you just like "listening to yourself talk", as it were.
 
... and whether or not NVidia is going to go out of its way with embedded programmers and cash support to the gaming companies so as to sabotage AMD.

Is that a question? Of course they are. That's what they do, and if AMD could afford it, they'd do it also.

Unless something changes, and it would be big news if it did, then AMD cards can't do Physx on the GPU. Period. Why is this even being discussed?
 
Sounds plausible to me...look how Fermi was paper launched months before it was released and yet nothing about Kepler so far. I think this time AMD will have an answer to Kepler even if it's faster than the 7970.

Could be the opposite. Often when companies have nothing, they make up for it with bluster, but when they do have something, they are very quiet. An example would be the GeForce 8800. They said -nothing- about it before it came out, despite rumors that it would be an old-style split-pipeline card (ATi had already said theirs would be unified). Then they dropped the 8800 out of the blue.

I have no idea what is going on, but remember that silence doesn't have to mean "We got nothing," it can mean "We got something really cool and don't want it leaking early."
 