AMD Hawaiian Islands Details - Faster Than Titan: 1020MHz, 512-bit memory

It's still a surprise, albeit a pleasant one. Really looking forward to seeing how well these cards scale to stupid resolutions across various arrays of displays.
 
Nvidia still waves the PhysX flag in front of PC gamers, but I have yet to play a game that used it for anything worthwhile. I don't know how important it will be in the future, though - don't the PS4 and XBone have PhysX hardware acceleration? So might future console ports need it?

Does OpenGL still perform better on Nvidia, or is that just a rumor at this point?
 

The PS4 and XB One have AMD hardware. PhysX is an nVidia exclusive. There will be zero PhysX functionality. That's not to say they won't be capable of the same effects; it just won't be called PhysX.
 

Although I think PhysX is a big joke, I thought I read a while back that Nvidia will support PhysX on the PS4?

Hmmm, might have to research that one... been a while.

Edit: Yep, I did read about it.

http://www.techradar.com/us/news/ga...ffer-physx-support-on-amd-powered-ps4-1136063
 
Being a POS is what made that card flop :). The follow-up wasn't much better either, really.

But increasing the bus width? They'd only do that if they were increasing the performance potential considerably, as that's expensive every way you look at it.

Maybe they got a deal on some cheap 5GHz RAM?
 
Hmm, interesting... Had not read that before. Thanks for the link.

The thing I wonder about is that Sony hadn't released any info on the CPU/GPU of the PS4 until the summer.

Nvidia made the announcement about the PS4 in March. I wonder if Nvidia will still support PhysX on the PS4.

And if they can support it on the PS4, I don't see why they can't support the XBone.

Either way, interesting times ahead!
 

If they can support it on the PS4, I don't see why they can't support it on AMD GPUs period, lol. But we all know it's not a matter of what they can do, but what they WILL do. It would make sense for nVidia to allow AT LEAST one of the consoles to use PhysX, particularly if they want to see more widespread adoption for it, both in general and in PC ports.
 
Do you guys think the driver update for Eyefinity CrossFire frame times will be in the 290X?
 

If AMD were smart, they would build hardware frame pacing into the card like the GTX 690. It worked so well for Nvidia that I think it would be a mistake not to.

That way they wouldn't have to keep updating drivers for certain games to work. It would work out of the box.
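For what it's worth, the core idea behind frame pacing, whether it lives in hardware like the 690's frame metering or in the driver, is just to hold a finished frame back until the target delta has elapsed instead of presenting it the instant it's ready. A toy sketch of the concept (purely illustrative; `render_frame` and `present_frame` are hypothetical stand-ins, not how the actual hardware works):

```python
import time

TARGET_FRAME_TIME = 1 / 60  # aim for even ~16.7 ms deltas

def paced_loop(render_frame, present_frame):
    """Toy frame-pacing loop: delay presentation to even out frame times."""
    last_present = time.perf_counter()
    while True:
        frame = render_frame()  # may finish early or late
        # Hold the finished frame until the target delta has elapsed,
        # rather than presenting it immediately (the micro-stutter case).
        remaining = TARGET_FRAME_TIME - (time.perf_counter() - last_present)
        if remaining > 0:
            time.sleep(remaining)
        present_frame(frame)
        last_present = time.perf_counter()
```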

Anyway, my $0.02.
 
A lot of games use relatively simple CPU-based PhysX, not the craziness you see in BL2 or the Batman games. That's likely what the PS4 supports.
 
The thing I wonder about is that Sony hadn't released any info on the CPU/GPU of the PS4 until the summer.

Nvidia made the announcement about the PS4 in March. I wonder if Nvidia will still support PhysX on the PS4.

And if they can support it on the PS4, I don't see why they can't support the XBone.

Either way, interesting times ahead!

If they did, it would be CPU-accelerated PhysX instead of GPU, which shouldn't be an issue for the PS4 or Xbox to handle, given that the Xbox 360 had zero issues with the CPU-accelerated Havok engine in BFBC1/2 and BF3.

If AMD were smart, they would build hardware frame pacing into the card like the GTX 690. It worked so well for Nvidia that I think it would be a mistake not to.

That way they wouldn't have to keep updating drivers for certain games to work. It would work out of the box.

Anyway, my $0.02.

Let's hope they do.
 
If they did, it would be CPU-accelerated PhysX instead of GPU, which shouldn't be an issue for the PS4 or Xbox to handle, given that the Xbox 360 had zero issues with the CPU-accelerated Havok engine in BFBC1/2 and BF3.

I am talking about Nvidia's PhysX, not Havok. I know Havok can run on CPUs with no issues, but we all know Nvidia purposely doesn't want PhysX (their PhysX) to run on a CPU, so they can say it can only be run on Nvidia hardware.

Well, what Nvidia hardware is in the PS4? ...My point is I don't think Nvidia would allow PhysX to run on the PS4, since it would prove that you don't need an nVidia card to run PhysX (even though we already know it will).

Anyway
 
Or save die space as well as power consumption while allowing for larger DRAM capacities.

How do you save die space and power consumption by adding additional lanes to RAM, which require tons of transistors?

So far we've had the 2900XT and GTX 280 using a 512-bit bus.

Neither of those was a small chip, and definitely neither of them can be described as low power consumption :D
 
Supposedly Havok is working on OpenCL acceleration, which would allow it to function on both vendors' cards.
 
How do you save die space and power consumption by adding additional lanes to RAM, which require tons of transistors?

So far we've had the 2900XT and GTX 280 using a 512-bit bus.

Neither of those was a small chip, and definitely neither of them can be described as low power consumption :D

Because Dave Baumann stated that Pitcairn's PHY is ~50% the size of Tahiti's PHY, per 128 bits.

So you save die space by simplifying the memory controller and driving the GDDR5 at a lower speed, and you save power because of that same simpler memory controller and the lower speed/voltage of the GDDR5, although having more GDDR5 ICs probably negates some/most of those savings.

Power consumption doesn't scale linearly with voltage/clocks.
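To put toy numbers on that: dynamic power goes roughly as C*V^2*f, with capacitance growing with bus width, so a wider bus at lower clocks and voltage can land at similar power while delivering more bandwidth. A back-of-envelope sketch (all figures are made-up illustrations, not real Tahiti/Hawaii specs):

```python
# Rough model: relative dynamic power ~ width * V^2 * f,
# relative bandwidth ~ width * f. All numbers invented for illustration.

def rel_power(width_bits, clock, voltage):
    return width_bits * voltage**2 * clock

def rel_bandwidth(width_bits, clock):
    return width_bits * clock

narrow = (384, 1.50, 1.6)   # hypothetical 384-bit bus, faster/hotter GDDR5
wide   = (512, 1.25, 1.5)   # hypothetical 512-bit bus, slower/cooler GDDR5

for name, (w, f, v) in (("384-bit fast", narrow), ("512-bit slow", wide)):
    print(f"{name}: power={rel_power(w, f, v):.0f}  bandwidth={rel_bandwidth(w, f):.0f}")
```

With those made-up figures, the wide/slow configuration gets roughly 11% more bandwidth for slightly less dynamic power, which is the whole argument in miniature.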
 
A pleasant surprise, I must say. The question becomes: how legit is the source?
 
Should be interesting to see how GPGPU performance compares...

This is actually really exciting - I hope they get support for Intel and AMD iGPUs out there too. HD 4000+ looks to be actually useful for this sort of work, especially if you could get an Iris Pro setup in your desktop with a 14nm Intel CPU next year, or whatever AMD eventually updates their APUs with (supposing they're even remotely useful for gaming). It would also help for all of those systems with dual-core Intel CPUs and discrete graphics, I'd think.
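On the iGPU point: whether a given box can join in mostly comes down to what its drivers expose through OpenCL. A quick way to check, sketched with the pyopencl bindings (assuming they're installed; any installed vendor runtime will populate the list):

```python
import pyopencl as cl  # pip install pyopencl

# List every OpenCL platform and device the installed drivers expose.
# An Intel iGPU (e.g. HD 4000) and a discrete AMD/Nvidia card show up
# as separate devices, possibly under separate platforms.
for platform in cl.get_platforms():
    print(f"Platform: {platform.name}")
    for device in platform.get_devices():
        kind = cl.device_type.to_string(device.type)
        print(f"  [{kind}] {device.name} ({device.global_mem_size // 2**20} MB)")
```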
 
Do reviewers already have these cards?
How long after the big poopie event Wednesday can we expect to see some reviews?

And how many months before they are available at a decent price? lol
 

A decent price appears to be right when they are released. If the performance is what the early leaks suggest, it's a much better deal than the Titan and the 780.
 

Better than a Titan??? Really? One or two FPS in any given benchmark, and less RAM.
I'd say it can run frame for frame as fast as the Titan, but nothing else.
Lower price, yes.

Whether this card is "better" than a 780 is what I think you'd need to address.
It looks to me like they may be a little bit better, and if the card has 4 GB of RAM, then it's a good buy.

It doesn't "destroy" the Titan like I've been hearing, but it certainly can match a 780 and has more RAM, if you believe these early leaked specs. But then it should, given the development cycle. The price IS better, and that's nice. :D
 

I said much better deal, and I'd stand by that statement.
 
I will get one of these or a 780. Waiting on reviews.

The thing that gets me is that when the 8800 came out, Nvidia could not brag enough about more memory, and the 384-bit bus made AMD look like a loser, and it did.

Now they are selling less memory on a narrower bus and asking BIG bucks. Not sure about the new AMD cards, but the memory used on the new Nvidia cards is the lowest of the low.
 
I am talking about Nvidia's PhysX, not Havok. I know Havok can run on CPUs with no issues, but we all know Nvidia purposely doesn't want PhysX (their PhysX) to run on a CPU, so they can say it can only be run on Nvidia hardware.

Well, what Nvidia hardware is in the PS4? ...My point is I don't think Nvidia would allow PhysX to run on the PS4, since it would prove that you don't need an nVidia card to run PhysX (even though we already know it will).

Anyway

We already know it doesn't need nVidia hardware. nVidia allows PhysX to run on the CPU in Borderlands 2. The caveat is that it's gimped to a single thread.

As far as OpenCL is concerned, I'll believe it when I see it.
 
These numbers are great and all, but I kind of don't care if it's a $600 card.

Does AMD make more money from a $300 graphics card or a $600 graphics card? (Total profit, not per-unit profit.)

Unless there are that many people willing to drop $600-plus on a top-end GPU, it seems to me it's the wrong market to focus on primarily. I really hope they don't forget the upper mid tier in the $300 GPU price range.
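To make the total-versus-per-unit point concrete, here's the arithmetic with completely made-up margins and volumes (not AMD's actual numbers):

```python
# Hypothetical illustration: total profit = units sold * margin per unit.
cards = {
    "$600 halo card": {"margin": 150, "units": 50_000},
    "$300 mid-tier":  {"margin": 60,  "units": 500_000},
}

for name, c in cards.items():
    total = c["margin"] * c["units"]
    print(f"{name}: {c['units']:,} units x ${c['margin']}/unit = ${total:,}")
```

If the mid-tier part really moves ten times the volume, it wins on total profit even at well under half the per-unit margin; the open question is whether those invented volumes resemble reality.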
 
I am seriously considering getting one of these to replace my 6950 x2 CrossFire setup. It would save power at idle and might even be quieter, let alone faster. (What I have is pretty fast already, but this would be even better.) Is it correct that this will be out in October?
 
Gotta love all these bullshit predictions every time a new graphics card is released. The funny thing is nearly every one is wrong (especially the ones with 'Leaked Benchmarks' in them).

It will be about the same speed as the GTX 780 but cost $100-200 less, just like always, forcing Nvidia to lower the price of their 780.

GG SSDD.
 
Depends on whether they both come from the same die or not =P

Yup, there are a ton of factors that come into play here, but I'd expect that AMD and Nvidia make most of their margins with the Quadro/Tesla/FirePro cards, while they make most of their volume with the smaller-die parts that barely outrun APUs. The net product-level profit probably favors the Pro cards, though.
 
Gotta love all these bullshit predictions every time a new graphics card is released. The funny thing is nearly every one is wrong (especially the ones with 'Leaked Benchmarks' in them).

It will be about the same speed as the GTX 780 but cost $100-200 less, just like always, forcing Nvidia to lower the price of their 780.

GG SSDD.

How do you know this? Or are you just speculating?
 

Everything in this thread is speculation. But what he's saying does make a lot of sense - hell, if AMD is able to prove out of the gate that they have taken care of all of the frame-pacing issues - ALL of them, for every configuration, including 4K - then they could easily price these as high as or higher than competitive Nvidia parts.

'Could' being the operative word here - we don't know what AMD's strategy is, or where they really are with their technology and software, and hell, those are probably still fluid, given that they're probably struggling to get their drivers straight and that Nvidia is (quite rightly) flinging mud in their eye.
 
As much as I don't want to downplay the speculated extraordinary performance of the R9 290X, there will be no card as legendary as the 9700 Pro. It's pretty much in its own tier. It completely wiped the floor. The innovation in that card was simply tremendous; it was the world's first chip to feature 8 pixel pipelines, as well as to fully support the AGP 8x bus standard. Under normal conditions it beat Nvidia's then-current offering (the Ti4600) by anywhere from 40-100% when AF/AA was enabled. Doubling the performance of the competitor's card would be godlike in this day and age.

Except for the 9800 non-Pro flashed to a 9800 Pro. It offered the same performance for half the price :) Essentially the two cards were the same; the non-Pro just used a different, down-clocked and undervolted BIOS that could be flashed with the Pro BIOS. :)
 