AMD Fury series coming soon.

Well, FHD just means 1080p, so while it shows the chip has processing power roughly similar to the Titan X, it doesn't really relieve the concern that a flagship card won't be future-proof (or even current-proof) for the high-end systems it might go in, i.e. multi-monitor, 4K, etc.
 
I know what heat output is. What you're describing is not heat output but heat management =].

Again, you know exactly what is implied when anyone uses the term "heat output" to describe the means of heat absorption and dissipation resulting in a measured operating temperature. It's nothing new amongst enthusiasts in the CPU or GPU worlds. It really shouldn't be garnering such high levels of debate or argument. But if you want to settle on something so petty when picking your battles... please enlighten the whole class as to why it's such an extraordinary issue with you.
 
Again, you don't make sense at all. I am saying that if the technology or process had allowed AMD to add more memory, they would have easily added more.

Why do you think the 390X has 8GB whereas the Fury has 4GB? It has nothing to do with AMD's performance goals. By their own admission, they couldn't do it, which is absolutely fine for now.

That isn't much of an argument. It can apply to anything. Technical (and financial) limitations are always there, for any technology. I'm sure Nvidia would like the Titan X to have 32 GB of memory. They can't.
 
I just don't see how a 390X 8GB would suddenly zoom right past both Fury and the 980 Ti at 8K. Sure, it's hitting the VRAM wall, but the 390X is consistently 75% of the 980 Ti at every resolution, and I mean 75% ±0.5% consistent. Yet it becomes 3x the 980 Ti's performance at 8K?

Unless 6GB vs 8GB is such a limitation that at 8K it becomes completely vram capacity driven and GPU horsepower no longer matters.
You have to realize that by the time the VRAM runs out, the card is swapping data to/from system memory or, in the worst case, a HDD. GDDR5 to GPU runs at about 228 GB/s, but copying data from system RAM to GDDR5 can only do about 31.5 GB/s, or even slower if it's hitting a HDD. So in the best case, once the VRAM runs out of data to feed the GPU, the GPU is being fed at the speed of the system bus at best.
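A quick back-of-the-envelope sketch of that bandwidth cliff (the GDDR5 and system-bus figures are the approximate ones quoted above; the HDD rate is an added assumption for illustration):

```python
# Rough effective-bandwidth comparison once a GPU overflows its VRAM.
# Figures are the approximate ones quoted above; the HDD rate is an
# added assumption for illustration.

GDDR5_GBPS = 228.0   # GPU <-> local GDDR5
PCIE_GBPS = 31.5     # system RAM -> GDDR5 over the system bus
HDD_GBPS = 0.15      # worst case: paging textures off a hard drive

def frame_time_ms(data_gb, bandwidth_gbps):
    """Milliseconds needed to move data_gb gigabytes at bandwidth_gbps GB/s."""
    return data_gb / bandwidth_gbps * 1000.0

# Time to stream 1 GB of texture data through each path:
for name, bw in [("GDDR5", GDDR5_GBPS),
                 ("system bus", PCIE_GBPS),
                 ("HDD", HDD_GBPS)]:
    print(f"{name:>10}: {frame_time_ms(1.0, bw):8.1f} ms")
```

The roughly 7x gap between local GDDR5 and the system bus is why overflowing VRAM shows up as stutter rather than a uniform slowdown.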
 
What do you think was holding them back? Theoretically, they could have gone with a larger organic interposer that has more layers and could be made to, at least, 45mmx42mm. They could have gone with a more expensive silicon interposer with more layers and gone with a custom base die, if they really wanted to. They decided to go with the cheaper passive silicon interposer at the reticle limit with a minimal amount of layers, most are expecting 4 layers or less.

David Kanter has specifically stated that there isn't any limitation to using more than 4 stacks.

I also believe that Joe Macri hinted or alluded to the same thing, even though he didn't come right out and say it.

What do you expect them to say? "4 isn't enough but it will have to do"? Why are they planning for 8 in their next series if 4 is enough? 4 is fine, but a year later games are suddenly going to need 8? You're the most ridiculous fanboy.
 
BOOM!

(two images of the Fury X and its cooler)
 
What do you expect them to say? "4 isn't enough but it will have to do"? Why are they planning for 8 in their next series if 4 is enough? 4 is fine, but a year later games are suddenly going to need 8? You're the most ridiculous fanboy.

I do believe there is an immediate limitation, this being the first use of HBM. But I don't believe that 4GB isn't enough. I think the highest I've hit so far is 3.2GB with a highly modded Skyrim. Of course there are some recent new games that seem to push the limits that I don't have. I do have Witcher 3 running full Ultra with VSR @1800p and I think it sits around 2.2GB. I don't run higher than 2x MSAA/SSAA with VSR @1800p; I don't see the need. Same with 4x MSAA at 4K: maybe if you're running a 50" display and sitting 3 ft away, but then would your eyesight still be good enough to tell?
 
You have to realize that by the time the VRAM runs out, the card is swapping data to/from system memory or, in the worst case, a HDD. GDDR5 to GPU runs at about 228 GB/s, but copying data from system RAM to GDDR5 can only do about 31.5 GB/s, or even slower if it's hitting a HDD. So in the best case, once the VRAM runs out of data to feed the GPU, the GPU is being fed at the speed of the system bus at best.

I understand that. I'm just going from my own experience of hitting the VRAM wall in certain games: the stuttering only occurred at certain points or when I looked around rapidly, and was not a universal occurrence.

I guess the takeaway here is that at 8K the entire benchmark run needs >6GB vram to run properly, so anything less than that and it's choked the entire time.

FULL COVER PLATE! I WAS RIGHT!

Nice, thanks for sharing. The CLC has an industrial look to it lol
 
HAHAHA HOLY FUCK DAY9 just dropped the damn briefcase. LMAO

I'm out of the loop, what is this "day9" that you speak of? (been running around like a headless chicken at work today, in fact still in the lab as I type this...)
 
I'm out of the loop, what is this "day9" that you speak of? (been running around like a headless chicken at work today, in fact still in the lab as I type this...)

He is a popular caster for SC2, or was, and is now just a well-known caster in general.
 
Richard Huddy gave the host of the PC Gaming Show a briefcase with a mystery object inside.
He picked it up and slammed it down on the desk. It had a dual Fiji PCB inside.
 
I feel like the transition from tube to copper pipe could easily leak...

Depends on what is going on under that tube. You can see the fitting under the edge of the tube, so it doesn't look super thick, but I wonder if it's epoxied or otherwise affixed beyond a big pipe wedged inside a small tube that stretches.
 
I feel like the transition from tube to copper pipe could easily leak...

No more than any other connector/fitting. That actually looks much better and more secure than most custom H2O loops' fittings.
Is it a "weak point?" Yep.
Will it be a problem? Not likely.
 
What do you expect them to say? "4 isn't enough but it will have to do"? Why are they planning for 8 in their next series if 4 is enough? 4 is fine, but a year later games are suddenly going to need 8? You're the most ridiculous fanboy.

I still don't get the 4GB issue. Yes, it is nice to over-spec everything, but I still have not seen a list of games that just shut down on the millions of 2, 3, 3.5, and 4GB cards out there still in use. I know I did not look that hard, but I found a few reviews with 4 and 8GB cards and none of the games showed any difference. If you plan on playing games at triple-monitor 8K resolutions, 8GB of onboard graphics memory may be an issue. Even so, won't these new games that need 4GB+ of VRAM let you choose lower-res textures or something so the game still works?
 
People keep forgetting that with DirectX 12 and Vulkan, a pair of 4GB video cards can effectively become 8GB, as there is no need for frame-buffer mirroring anymore.
 
People keep forgetting that with DirectX 12 and Vulkan a pair of 4GB video cards become 8GB as there is no need for frame buffer mirroring anymore.

Exactly. The future is DX12/W10. Looking forward to the new update.
 
The PC gaming show has been going for nearly 2.5 hours... Maaaaaan...
Officially the LONGEST conference at E3.
 
The PC gaming show has been going for nearly 2.5 hours... Maaaaaan...
Officially the LONGEST conference at E3.

Isn't it supposed to be 24hours?
Glad I didn't take the day off of work to drive to LA to go to AMD's E3. We got beer at work today.
 
People keep forgetting that with DirectX 12 and Vulkan a pair of 4GB video cards become 8GB as there is no need for frame buffer mirroring anymore.

Don't the games have to support DX12 first?
 
You know, I am impressed by the Furys for sure. To me, the card I want to know about is the Nano.

So if my calculations are right (and I'm probably wrong), it will use 125-135W and be faster than the 290X? And be half the size?

It is the card that blew my mind... impressive.

Not getting any of the new cards, but....Fury X is no joke of a video card.
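For what it's worth, if the rough numbers above hold (125-135 W for the Nano, and roughly 290 W typical board power for the 290X, which is an assumption on my part), even just matching 290X performance would be a big perf-per-watt jump:

```python
# Back-of-the-envelope perf/W estimate (all inputs are guesses, not specs).
R9_290X_WATTS = 290.0   # commonly cited typical 290X board power (assumption)
NANO_WATTS = 130.0      # midpoint of the 125-135 W guess above
PERF_RATIO = 1.0        # assume the Nano only *matches* 290X performance

gain = (PERF_RATIO / NANO_WATTS) / (1.0 / R9_290X_WATTS)
print(f"perf/W vs 290X: {gain:.2f}x")  # ~2.23x even at equal speed
```

And if it's actually faster than the 290X as guessed, the ratio only goes up.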
 
Found a different list.
Pretty sure it is due to Amkor handling the packaging and assembly.

So we still don't know exactly who fabbed the Fiji die.
Interposer = UMC
HBM = Hynix
Packaging and assembly = Amkor
GPU die = ???
 