Fiji has only 4 GB VRAM per GPU

FUDzilla...yeah, great solid source right there. :p

AMD will probably have varying VRAM options for us to choose from. 4GB for the 390/390X, then perhaps an 8GB Gigahertz Edition (speculating big time...).

Or, to quickly regain the ground they have lost, just bring out 8GB cards right off the bat and not screw around with the Nvidia mindset of "offer a little for a lot of cost now, then offer more for even higher cost soon after."
 
Well kinda explains the water cooling. But still, this seems more like rumor than concrete fact here. I assumed that the "up to 8GB" came from the eventual release of a card with HBM2 which would allow for the doubled memory size.
 
I guess the $700+ price tag goes out the window.

// edit, They're suggesting the 390X is a dual-GPU card. Lost me.
$700 price tag back in the window.

Does that mean single GPU Fiji will be 380/X @ $300-$400? Shit, I'm confused.
 
Well, I think Robert Hallock tweeted something about lower latency when using SFR on a single dual-GPU card. That would eliminate the PCI Express bus latency that comes into play when using SFR across two cards, which makes sense since DX12 will pool memory resources instead of mirroring them the way DX11 does.

In DX11 if you have (2) 4GB GPUs and SFR them, then you still only have 4GB of VRAM to utilize. In DX12 if you have (2) 4GB GPUs and SFR them, then you have 8GB of VRAM to use. 2 Titan X in SLi would be seen as a single 24GB card to the system if you want to look at extremes.

This is one of the tenets of Mantle.
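The pooling arithmetic from the post above can be sketched like this (Python, purely illustrative; the helper name is made up, and the pooled behavior assumes DX12-style explicit multi-adapter memory pooling actually pans out as rumored):

```python
# Hypothetical model of usable VRAM under the two multi-GPU schemes
# described above. The "pooled" path assumes DX12 explicit multi-adapter
# pooling works as rumored; nothing here reflects a real driver.

def usable_vram_gb(gpus_gb, pooled):
    # DX11 AFR/SFR mirrors every resource on each GPU, so usable
    # capacity is limited to a single card's VRAM; a pooled model
    # sums the VRAM of all GPUs instead.
    return sum(gpus_gb) if pooled else min(gpus_gb)

print(usable_vram_gb([4, 4], pooled=False))   # DX11: 4 GB usable
print(usable_vram_gb([4, 4], pooled=True))    # DX12: 8 GB usable
print(usable_vram_gb([12, 12], pooled=True))  # 2x Titan X: 24 GB
```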
 
I guess the $700+ price tag goes out the window.

// edit, They're suggesting the 390X is a dual-GPU card. Lost me.
$700 price tag back in the window.

Does that mean single GPU Fiji will be 380/X @ $300-$400? Shit, I'm confused.

If they want to put any pressure on Nvidia it needs to be at least price competitive with the 970 (and at least close to the 980 Ti in performance), so that's a pretty good guess on price.
 
Why would anyone buy this over Nvidia's cards if it only has 4 GB of VRAM?

You lose out on Nvidia's superior drivers, graphics options, etc., and for what? Absolutely nothing.

The GTX 980 Ti will have 6 GB of VRAM so if this thing only has 4 GB then AMD is finished. At least with 8 GB they could be somewhat competitive.
 
Why would anyone buy this over Nvidia's cards if it only has 4 GB of VRAM?

You lose out on Nvidia's superior drivers, graphics options, etc., and for what? Absolutely nothing.

The GTX 980 Ti will have 6 GB of VRAM so if this thing only has 4 GB then AMD is finished. At least with 8 GB they could be somewhat competitive.
Without knowing the prices or performance of these cards, this is all FUD.
You've got this skewed image inside your brain and you're projecting it onto reality.

8 GB, 6 GB, 4 GB. Means nothing to me and the majority of PC gamers, we're still at 1080p bro. I'd prefer 4 GB since it saves me the most cash, especially with HBM if that ends up being the case.
 
Why would anyone buy this over Nvidia's cards if it only has 4 GB of VRAM?

You lose out on Nvidia's superior drivers, graphics options, etc., and for what? Absolutely nothing.

The GTX 980 Ti will have 6 GB of VRAM so if this thing only has 4 GB then AMD is finished. At least with 8 GB they could be somewhat competitive.

I'm fine with 4GB at 1440 and with AMD -- depending on the price. Last go round the Titan and Ti were pretty pricey. Same thing seems to be happening this time around too -- the Titan X is $1000. I do see the value in it as you could have the performance of two GPUs in a single card if you overclock the poop out of it, but I don't have the budget for that. Something around $650-700 with that sort of performance would be reasonable (Nvidia or AMD).
 
Why would anyone buy this over Nvidia's cards if it only has 4 GB of VRAM?

You lose out on Nvidia's superior drivers, graphics options, etc., and for what? Absolutely nothing.

The GTX 980 Ti will have 6 GB of VRAM so if this thing only has 4 GB then AMD is finished. At least with 8 GB they could be somewhat competitive.

For one reason alone... the Ntrolls/fanboys, whatever you want to call them, have been extremely idiotic lately. I usually switch back and forth between green and red. Now I think I will stick to red from now on.
 
I'm fine with 4GB at 1440

Okay, but here's what HardOCP is seeing for VRAM usage at 2560x1440 (on the next page, 4K VRAM usage approaches but doesn't exceed 8GB... yet):

[HardOCP chart: VRAM usage at 2560x1440]
 
Why would anyone buy this over Nvidia's cards if it only has 4 GB of VRAM?

You lose out on Nvidia's superior drivers, graphics options, etc., and for what? Absolutely nothing.

The GTX 980 Ti will have 6 GB of VRAM so if this thing only has 4 GB then AMD is finished. At least with 8 GB they could be somewhat competitive.

Why would anyone buy a 970 with its faux 4GB of VRAM? Oh right... no one knew.
 
I guess the $700+ price tag goes out the window.

// edit, They're suggesting the 390X is a dual-GPU card. Lost me.
$700 price tag back in the window.

Does that mean single GPU Fiji will be 380/X @ $300-$400? Shit, I'm confused.

There is no way a full Fiji with 4GB HBM will go for $400, let alone $300, not unless Nvidia releases the 980 Ti for $500, which would never happen unprovoked by AMD.

Calling total BS on this article.
 
Okay, but here's what HardOCP is seeing for VRAM usage at 2560x1440 (on the next page, 4K VRAM usage approaches but doesn't exceed 8GB... yet):
Doesn't it fill VRAM to capacity similar to RAM prefetching?
I mean just because it says 4 GB doesn't mean it's actually utilizing 4 GB.

Most of the games I play flatline at 3000 MB from start to finish but it doesn't cause any performance issues. I just assume the game is actually using 2~2.5 GB.
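That "fills to capacity" hunch can be mimicked with a toy LRU model (Python; all sizes, counts, and names are made up): a driver that keeps textures resident until VRAM is full will report near-capacity usage even though the per-frame working set is much smaller.

```python
from collections import OrderedDict

# Toy model of a driver keeping textures resident until VRAM is full:
# reported usage climbs to capacity even though the working set is
# smaller. Sizes and texture counts are invented for illustration.

class TextureCache:
    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.resident = OrderedDict()  # texture id -> size in MB

    def used(self):
        return sum(self.resident.values())

    def touch(self, tex_id, size_mb):
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)   # mark recently used
            return
        # Evict least-recently-used textures only when out of space.
        while self.used() + size_mb > self.capacity and self.resident:
            self.resident.popitem(last=False)
        self.resident[tex_id] = size_mb

cache = TextureCache(capacity_mb=4096)
for frame in range(100):
    for tex in range(40):                 # ~2.5 GB working set per frame
        cache.touch(("level1", tex), 64)
    cache.touch(("streamed", frame), 64)  # streamed-in textures pile up
print(cache.used())  # reports 4096 MB despite a ~2.5 GB working set
```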
 
Okay, but here's what HardOCP is seeing for VRAM usage at 2560x1440 (on the next page, 4K VRAM usage approaches but doesn't exceed 8GB... yet):

[HardOCP chart: VRAM usage at 2560x1440]

It's important to distinguish between a game actually needing that much vram, vs it just going bonkers with the vram because of shitty optimization.

If you look at the Dying Light framerate graph, you'll see the 980 does not have any unexpected dips in min framerate, which would be a telltale sign of textures being thrashed in and out of system memory. This is how I conclude that Dying Light (and quite possibly all the games HardOCP tested) doesn't actually *require* 4GB at 1440p.
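That "telltale dip" test can be written as a crude heuristic (Python; the spike threshold is arbitrary, just to illustrate the idea of flagging frametimes far above the running average):

```python
# Rough heuristic for the telltale sign described above: if individual
# frametimes spike far above the average, textures are probably being
# swapped over the bus. The 3x factor is an arbitrary, invented threshold.

def thrashing_suspected(frametimes_ms, spike_factor=3.0):
    avg = sum(frametimes_ms) / len(frametimes_ms)
    return any(t > avg * spike_factor for t in frametimes_ms)

smooth = [16.7] * 60                       # steady ~60 fps
stutter = [16.7] * 57 + [120, 95, 110]     # periodic huge spikes

print(thrashing_suspected(smooth))   # False
print(thrashing_suspected(stutter))  # True
```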
 
Without knowing the prices or performance of these cards, this is all FUD.
You've got this skewed image inside your brain and you're projecting it onto reality.

8 GB, 6 GB, 4 GB. Means nothing to me and the majority of PC gamers, we're still at 1080p bro. I'd prefer 4 GB since it saves me the most cash, especially with HBM if that ends up being the case.

If you are buying a flagship card and not running most of your games at 4K then you are wasting your money.

AMD's and Nvidia's flagships have sufficient power to drive 4K resolutions, and thanks to both vendors officially building downsampling support into their drivers last year, there are no longer any excuses for not PC gaming at 4K, even on a 1080p screen.

Nvidia DSR @ 4K on a 1080p screen looks absolutely gorgeous. It is one of the most important image quality enhancements you can make short of actually getting a native 4K display. I have GTX 980s that I drive games at 4K with, and I can tell you I am constantly hitting that 4 GB VRAM limit. The framerates are not the problem; the cards can push good framerates in 4K most of the time. It's the stuttering and lag spikes caused by the cards running out of VRAM and having to load in new textures in the middle of gameplay that are the problem.

So that leaves VRAM as the #1 bottleneck PC gamers will be facing, because even gamers on a 1080p display should still be rendering their games at 4K.
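Back-of-the-envelope math on why rendering internally at 4K quadruples render-target memory (Python; the bytes-per-pixel figure and the 2x overhead factor are assumed round numbers for illustration, not measured values):

```python
# Rough VRAM cost of render targets at different internal resolutions
# (RGBA8 color + 32-bit depth = 8 bytes/pixel, times an assumed 2x for
# intermediate buffers). These are invented round numbers, not measurements.

def render_target_mb(width, height, bytes_per_pixel=8, overhead=2.0):
    return width * height * bytes_per_pixel * overhead / 2**20

print(round(render_target_mb(1920, 1080)))  # native 1080p: ~32 MB
print(round(render_target_mb(3840, 2160)))  # DSR @ 4K: ~127 MB, 4x the cost
```

Render targets are only part of the picture (textures dominate), but the 4x scaling is why a card that is comfortable at 1080p can run out of headroom when downsampling from 4K.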
 
Well Xizer the game developers need to fix the text size issues. Most of the time I find that games I downsample also render the in game text illegible. I'm one of the weirdos that likes to turn on Japanese voices and English subtitles for maximum enjoyment.
 
It's important to distinguish between a game actually needing that much vram, vs it just going bonkers with the vram because of shitty optimization.

If you look at the Dying Light framerate graph, you'll see the 980 does not have any unexpected dips in min framerate, which would be a telltale sign of textures being thrashed in and out of system memory. This is how I conclude that Dying Light (and quite possibly all the games HardOCP tested) doesn't actually *require* 4GB at 1440p.

Also worth pointing out that using 8XMSAA at 1440P is a little excessive.
 
Fudzilla was the original rumormonger for 8GB. What's your point?

Point? Seriously? You just made it...first they say 8GB, now they say 4GB. So which is it? Could have 16GB for all we know. Hell, maybe 32GB. Should we place bets on 64GB?

Fact of the matter is, FUDzilla is nothing but a click-bait site that spreads crap by means of thriving off of people's fear, uncertainty, and doubt (FUD). No one on the outside knows anything about the R9 series until AMD officially releases that info and the NDA expires.
 
/takes note of usage of "only 4GB per GPU" as a negative thing on March 27, 2015

*Sniff*, what progress we've made.
 
4 GB for a flagship is pretty disappointing.
290X had it 2 years ago.

HBM is wasted on such low capacity. I think AMD knows this.
 
Why would anyone buy a 970 with its faux 4GB of VRAM? Oh right... no one knew.

But now people do know, and 970s are still selling like hot cakes; more 970s have probably sold in the past few months than AMD will sell GPUs this year.
 
A 970 is very cheap...a 390x won't be.
I can't even imagine how AMD will compete with the 970. I mean, the 290X doesn't... It actually has 4 GB of real VRAM and that's about it.

Rebranding Hawaii isn't going to do them any favors. Nobody even likes the 290X.
 
I can't even imagine how AMD will compete with the 970. I mean, the 290X doesn't... It actually has 4 GB of real VRAM and that's about it.

Rebranding Hawaii isn't going to do them any favors. Nobody even likes the 290X.

They are already winning at DX12. That's where it really counts. And 4GB + 4GB configurations will be seen as 8GB of VRAM in DX12; 2-way Titan X SLi will be seen as 24GB of VRAM, for example. I don't see the issue personally if the card is dual-GPU. I guess you can still complain about older DX11 games still seeing 4GB of VRAM, but Unreal 4, Unity 5, and more have already upgraded to DX12. And since AMD owns the console market, and DX12 is 100% coming to Xbox One, expect newer games to support DX12. Hints are that we will see the first DX12 game at the end of the year.

http://www.pcper.com/reviews/Graphics-Cards/3DMark-API-Overhead-Feature-Test-Early-DX12-Performance

http://community.amd.com/community/...mance-in-new-3dmark-api-overhead-feature-test
 
I like your optimism; I hadn't considered DX12 yet... mostly because by the time it's being implemented we will have the 400 series and Pascal on the market, so it's only useful for people who buy new Fijis and keep them for several years.

Multi-GPU still has "issues", especially with AMD; they take months to release CF profiles. It's just a hassle. Although I didn't realize we were giving the Fudzilla article any credit. The 390X is going to be a single-GPU card... It's fun to entertain other ideas, but let's remember there is a line between fantasy and reality.
 
The way that DX12 is hinted to work, Nvidia and AMD cards will be able to be combined to render scenes together. The OS uses them as workers for a common goal. So it doesn't see Crossfire or SLi; it balances the load across the workers it has access to.

Remember, that's what's rumored about how DX12 works. If it doesn't pan out that way, don't shoot me. To me this sounds like the end of CrossfireX and SLi as we know it today.
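If the "pool of workers" idea pans out, the scheduler would look something like splitting each frame across mismatched GPUs in proportion to their speed. A toy sketch (Python; the speed weights are invented, and this ignores everything a real driver would have to handle):

```python
# Sketch of the "workers for a common goal" idea: split a frame's
# scanlines across mismatched GPUs in proportion to relative speed,
# the way an SFR-style explicit multi-adapter scheme might.

def split_frame(height, gpu_speeds):
    total = sum(gpu_speeds)
    rows, start = [], 0
    for i, speed in enumerate(gpu_speeds):
        # Last GPU takes the remainder so every scanline is covered.
        if i == len(gpu_speeds) - 1:
            end = height
        else:
            end = start + round(height * speed / total)
        rows.append((start, end))
        start = end
    return rows

# e.g. a slower card weighted 1.0 next to a faster one weighted 1.4
print(split_frame(2160, [1.0, 1.4]))  # [(0, 900), (900, 2160)]
```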
 
If the next gen high end AMD GPU's are dual GPU cards only, then that pretty much guarantees I will never buy one.

My minimum requirements for buying a next gen AMD video card are as follows:
1.) Single GPU
2.) Must beat my 2013 Titan by a noticeable margin
3.) Mixed resolution/mixed orientation Eyefinity as promised by the Omega drivers.

If they fail to deliver on any one of these, I'm out.

I had high hopes for a next gen AMD GPU, but if they fail to deliver on any of the above, I'll go Nvidia again this time.

My philosophy is single strong GPU or bust. There are just too many problems with multi-GPU setups from both vendors, but especially AMD and their always-late Crossfire profiles.

Even when profiles exist and everything is working as it's supposed to, SLI/Crossfire is never as smooth and playable as a single GPU.

DX12 may help with this, but I doubt it will come even close to the hype and rumors.
 
The way that DX12 is hinted to work, Nvidia and AMD cards will be able to be combined to render scenes together. The OS uses them as workers for a common goal. So it doesn't see Crossfire or SLi; it balances the load across the workers it has access to.

Remember, that's what's rumored about how DX12 works. If it doesn't pan out that way, don't shoot me. To me this sounds like the end of CrossfireX and SLi as we know it today.

Someone has to program the game to work like this.

Can you imagine Ubisoft doing this level of optimization? :D
 
Someone has to program the game to work like this.

Can you imagine Ubisoft doing this level of optimization? :D

And the most significant piece of the equation is the display driver being coded to allow it. Not going to happen.
 
And the most significant piece of the equation is the display driver being coded to allow it. Not going to happen.
AMD would allow it. Nvidia wouldn't.
We'll see headlines about it within the next year. AMD will put it in their drivers then make a press release blaming Nvidia for dodging.
 
AMD would allow it. Nvidia wouldn't.
We'll see headlines about it within the next year. AMD will put it in their drivers then make a press release blaming Nvidia for dodging.

Depends on how well the R9 competes. If it is going to be a "Bulldozer", then I can see AMD being the first to support that. If it comes out kicking ass and taking names, then it makes sense to lock it down.
 