Sooo no new cards this year eh?

rinaldo00

What happened to the 8GB 290X that was due in Nov and the 8GB cards from Nvidia to compete with it?

Is there really nothing new for Christmas sales?
 
The latest edition of MaximumPC has a review of the Sapphire 8GB card. 8GB of RAM does NOTHING, not even at 4K...
 
The latest edition of MaximumPC has a review of the Sapphire 8GB card. 8GB of RAM does NOTHING, not even at 4K...

Tom's seems to indicate that the 8GB version is an improvement for higher than 1080p resolutions: http://www.tomshardware.com/reviews/sapphire-vapor-x-r9-290x-8gb,3977-6.html

From their conclusion:

In general I tell graphics card buyers that the amount of RAM is overrated as a graphics card attribute, and that's still true. But if you plan on playing games at triple-monitor or 4K resolutions, 8GB of onboard graphics memory becomes far more compelling. Based on this comparison alone, Sapphire's Vapor-X R9 290X 8GB card looks like a good idea for high-resolution duty.

Of course the charts make the 970 look even more appealing, especially since the price increase to the 290X 8GB doesn't equate to a performance increase of the same magnitude. The 970 actually does better at 4K than the 290X 8GB in a few scenarios.
 
8GB on a 290/X or a 970/980 is mostly useless. When you are gaming at 4K the extra RAM will help, but most likely you're going to run out of GPU power. Now if you take 8GB with a 290/X or a 970/980 at 4K and run CF/SLI, that will make the 8GB more useful. Without the GPU power, the RAM is useless.
 
8GB on a 290/X or a 970/980 is mostly useless. When you are gaming at 4K the extra RAM will help, but most likely you're going to run out of GPU power. Now if you take 8GB with a 290/X or a 970/980 at 4K and run CF/SLI, that will make the 8GB more useful. Without the GPU power, the RAM is useless.

Yeah, I can't wait to see what the 980 Ti or Titan variant delivers. Also the 390X or whatever they end up naming these things. I'm quite happy with the 970 for right now, though I had been hoping they would've dropped the 960 in time for the holidays. It would have made a nice present and should be a solid performer for a secondary rig.
 
Aren't the consoles based on last-gen video cards? I don't think either AMD or Nvidia is in any rush to make something new when the current gen won't be fully utilized by game devs anytime soon, if ever. Well, unless we get another Crysis-like game, which is pretty doubtful considering that companies aren't willing to invest much in innovation for the sake of it.
 
290X/980 can't even run all current games at 1080p@60fps at Ultra settings, so we need those cards ASAP.

And 1440p/2160p as well as 120+ fps gaming can easily consume any amount of gpu power you throw at it.
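
For a rough sense of scale, here's some back-of-the-envelope pixel math (illustrative only; real GPU cost doesn't scale perfectly linearly with pixel count):

# Rough illustration: how raw pixel throughput scales with resolution and refresh rate.
# Back-of-the-envelope only; actual GPU load doesn't scale perfectly with pixel count.
targets = {
    "1080p @ 60 fps":  (1920, 1080, 60),
    "1440p @ 120 fps": (2560, 1440, 120),
    "2160p @ 60 fps":  (3840, 2160, 60),
    "2160p @ 120 fps": (3840, 2160, 120),
}
baseline = 1920 * 1080 * 60  # pixels per second at 1080p/60

for name, (w, h, fps) in targets.items():
    print(f"{name}: {w * h * fps / baseline:.1f}x the pixel rate of 1080p @ 60 fps")

4K at 120Hz is pushing roughly 8x the pixels per second of 1080p/60, which is why it eats whatever GPU you throw at it.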
 
Tom's seems to indicate that the 8GB version is an improvement for higher than 1080p resolutions: http://www.tomshardware.com/reviews/sapphire-vapor-x-r9-290x-8gb,3977-6.html

From their conclusion:

In general I tell graphics card buyers that the amount of RAM is overrated as a graphics card attribute, and that's still true. But if you plan on playing games at triple-monitor or 4K resolutions, 8GB of onboard graphics memory becomes far more compelling. Based on this comparison alone, Sapphire's Vapor-X R9 290X 8GB card looks like a good idea for high-resolution duty.

Of course the charts make the 970 look even more appealing, especially since the price increase to the 290X 8GB doesn't equate to a performance increase of the same magnitude. The 970 actually does better at 4K than the 290X 8GB in a few scenarios.

That comparison is total BS by Tom's. They make a disclaimer that they match the core clock and memory clock between the reference card and the Vapor-X, but make no attempt to compensate for the throttling nature of the 290X reference fan profile.

Anyone who has ever had experience with the reference 290/290X should know that the default fan profile will lead to throttling and affect gameplay. A simple trip to the AMD Catalyst Control Center and adjusting the fan profile will remove all throttling.
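
If anyone wants to sanity-check whether their reference card is actually throttling during a run, a quick sketch like this could chew through a monitoring log (the CSV layout and column name are assumptions for illustration, not any specific tool's real export format; adjust to whatever GPU-Z/Afterburner actually writes for you):

# Hypothetical sketch: count how often the core clock dips below the advertised boost clock.
# Assumes a CSV with a "core_clock_mhz" column -- an illustrative format, not a real tool's export.
import csv

BOOST_CLOCK_MHZ = 1000    # the 290X's advertised "up to" clock
THROTTLE_MARGIN = 0.95    # treat anything under 95% of boost as throttled

def throttle_report(path):
    throttled = total = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            total += 1
            if float(row["core_clock_mhz"]) < BOOST_CLOCK_MHZ * THROTTLE_MARGIN:
                throttled += 1
    if total:
        print(f"Throttled samples: {throttled}/{total} ({100 * throttled / total:.0f}%)")

throttle_report("gpu_log.csv")  # hypothetical log file name

Run it on a log from the stock fan profile and again after bumping the fan curve in Catalyst Control Center and the difference should be obvious.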
 
Aren't the consoles based on last-gen video cards? I don't think either AMD or Nvidia is in any rush to make something new when the current gen won't be fully utilized by game devs anytime soon, if ever. Well, unless we get another Crysis-like game, which is pretty doubtful considering that companies aren't willing to invest much in innovation for the sake of it.

Consoles, for the vast majority of their life, are behind PCs... That's never stopped AMD or nVidia from releasing new cards.
 
The extra ram only helps if the bandwidth is there to support it. :D
 
I wouldn't expect anything new until AMD's new cards hit, which rumors point to a February launch from what I've heard. Of course, who knows how accurate those rumors are.
 
No way we're going to see anything this year. No one is working from now until the 5th.

I'm waiting for the Titan 2 or 980 Ti or whatever to come out. Even at that I'm waiting for the 300 series from AMD. I'll never run anything but Intel + NV for the foreseeable future, but whatever AMD comes out with should result in some sort of price drop from NV.

An 8GB 980 seems kinda pointless. It lacks the power to drive framerates at resolutions high enough to need that much VRAM. I think we'll probably see a 990 dual-GPU card. Power consumption of the 980 seems to make it a no-brainer to slap a second one on the same PCB. But I'm sure it'll only have 2x 4GB of VRAM, and then it'll be a conundrum picking something that's 50% more powerful than a Titan 2 but with half the VRAM per GPU.
 
8GB on a 290/X or a 970/980 is mostly useless. When you are gaming at 4K the extra RAM will help, but most likely you're going to run out of GPU power. Now if you take 8GB with a 290/X or a 970/980 at 4K and run CF/SLI, that will make the 8GB more useful. Without the GPU power, the RAM is useless.

Good answer. People seem to keep forgetting that some users enjoy running SLI/CF or even more cards in the rig, and in those cases having the extra memory could help especially at 4K.
 
I don't mind waiting on the new AMD cards as long as I am waiting on a 144Hz, 1440p FreeSync monitor to go with them..... and I am assuming the new cards will be out before we get any such panels. The dual 290's I picked up on 11/7/2013 for $760 shipped have aged phenomenally well at 1440p and still keep most games I play pegged at a solid, V-sync'd, 110FPS. So as it stands I will be buying my next GPUs, AMD or nVidia, based solely on what FreeSync or G-sync monitor I eventually go with come Spring/Summer 2015 (hopefully...)!
 
The problem with the 4GB vs 8GB comparisons out there at the moment is the same across the board: they are all testing in the one configuration where you are far more likely to be core-bound than VRAM-bound, a single-GPU setup.

The more accurate way of testing the effect of 4GB vs 8GB would have been Xfire setups, because VRAM isn't added in SLI/Xfire, but you are increasing your effective core power. If the Xfired 4GB and 8GB cards perform the same, then it can be assumed that the extra VRAM really does nothing for those GPUs. Those who want the higher-VRAM cards are generally those who run multi-GPU setups anyway.
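
To put rough numbers on it (purely illustrative assumptions, since AFR CrossFire/SLI mirrors the frame data on each card rather than pooling it, and real scaling is game-dependent):

# Illustrative only: usable VRAM stays at one card's worth in AFR CF/SLI,
# while core throughput roughly scales with card count.
# The 0.8 scaling factor per extra card is an assumption, not a measurement.
def multi_gpu(vram_per_card_gb, n_cards, scaling_per_extra_card=0.8):
    usable_vram = vram_per_card_gb                        # mirrored, not pooled
    relative_core_power = 1 + (n_cards - 1) * scaling_per_extra_card
    return usable_vram, relative_core_power

for vram, n in [(4, 1), (8, 1), (4, 2), (8, 2)]:
    v, p = multi_gpu(vram, n)
    print(f"{n}x {vram}GB: ~{v}GB usable VRAM, ~{p:.1f}x core power")

In other words, the 8GB card only gets interesting once the core side has been roughly doubled.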
 
That comparison is total BS by Tom's. They make a disclaimer that they match the core clock and memory clock between the reference card and the Vapor-X, but make no attempt to compensate for the throttling nature of the 290X reference fan profile.

Anyone who has ever had experience with the reference 290/290X should know that the default fan profile will lead to throttling and affect gameplay. A simple trip to the AMD Catalyst Control Center and adjusting the fan profile will remove all throttling.

So then the 8GB card really has even less of a performance lead over the 4GB version at 4K resolution?
 
With only 11 days left and no major press conferences scheduled by either camp, there's no room for a new card release until late Jan/early Feb at the earliest.
 
With only 11 days left and no major press conferences scheduled by either camp, there's no room for a new card release until late Jan/early Feb at the earliest.

That's what I'd bet too. Probably nothing exciting until February. Hopefully we'll see Bermuda XT and GM200 in Q1 '15.
 
I don't mind waiting on the new AMD cards as long as I am waiting on a 144Hz, 1440p FreeSync monitor to go with them..... and I am assuming the new cards will be out before we get any such panels. The dual 290's I picked up on 11/7/2013 for $760 shipped have aged phenomenally well at 1440p and still keep most games I play pegged at a solid, V-sync'd, 110FPS. So as it stands I will be buying my next GPUs, AMD or nVidia, based solely on what FreeSync or G-sync monitor I eventually go with come Spring/Summer 2015 (hopefully...)!

This is basically what I am going to be doing as well. I hope Nvidia supports freesync or gets some better Gsync panels out there or I will look around at AMD.

That said they really need to get on the ball with more Gsync/freesync monitors.
 
I was hoping I would have the 8GB 980s before the GTA V release (end of Jan). If not, I think I will grab 2x 970s instead.
 
I see everyone talking about the Titan 2, but wouldn't the professional Quadro line have to be rolled out first? If so, have the Quadro/Maxwell versions been released?
 
Something tells me nVidia is waiting for AMD to release the 390X first before making their next move.
 
The latest edition of MaximumPC has a review of the Sapphire 8GB card. 8GB of RAM does NOTHING, not even at 4K...

No surprise there. GPU tech needs to catch up to run 4K; it's not just a matter of having more VRAM available.
 
8GB on a 290/X or a 970/980 is mostly useless. When you are gaming at 4K the extra RAM will help, but most likely you're going to run out of GPU power. Now if you take 8GB with a 290/X or a 970/980 at 4K and run CF/SLI, that will make the 8GB more useful. Without the GPU power, the RAM is useless.

This is worth repeating over and over again.
 
Kitguru also did some Xfire and SLI comparisons (dual-GPU setups); they threw in 980 SLI as well as a Titan Z just for the sake of comparison.

http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-290x-vapor-x-8gb-cf-review/

Bizarrely enough, at 4K, while the 290X 8GB beats the 295X2 by a few percent in fps (never more than 10%), 980 SLI consistently beats it by the same margin, only losing to it in one game they tested.

Now, it could very well be that those games tend not to be VRAM hungry, but it does seem that the 290X in dual Xfire still can't use the 8GB of VRAM effectively, even at 4K. So somewhere in there the 290X simply isn't hitting the VRAM cap (especially when compared to the similar 295X2).
 
Kitguru also did some Xfire and SLI comparisons (dual-GPU setups); they threw in 980 SLI as well as a Titan Z just for the sake of comparison.

http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-290x-vapor-x-8gb-cf-review/

Bizarrely enough, at 4K, while the 290X 8GB beats the 295X2 by a few percent in fps (never more than 10%), 980 SLI consistently beats it by the same margin, only losing to it in one game they tested.

Now, it could very well be that those games tend not to be VRAM hungry, but it does seem that the 290X in dual Xfire still can't use the 8GB of VRAM effectively, even at 4K. So somewhere in there the 290X simply isn't hitting the VRAM cap (especially when compared to the similar 295X2).

Nice link. Kinda amusing that the Titan Z, and quad SLI in general, seems to be a complete waste of money. 3x the expense for 0-20% better performance than a pair of 980s.

I'm more worried about games that are coming out in 2015 and 2016. Seems like the trend is to give zero fucks about optimizing console ports and to release games that require more and more video memory even though the graphics quality is pretty much the same as stuff that's already released. In theory I like the idea that the Titan 2 or 985 or 980 Ti or whatever in SLI will be able to hold down 60 FPS @ 4K and have enough video memory to be relevant in 2 years. My 2GB 680s feel downright antiquated nowadays just by video memory alone.
 
I was running a Titan before, which I sold in favor of a 295X2 when it was on sale. Trying to play DA:I, I ended up having to run LOWER settings than on a single Titan because it ran out of memory 20-30 seconds into the game and would crash (couldn't do maxed textures and couldn't do 4xMSAA). I sold the 295X2 and got a single 290X 8GB, and now I can use Mantle and fade-touched textures; I'm seeing up to 6.4GB of VRAM use from that. To me the little extra expense of the 8GB card has been worth it, and I can also run more 4K textures in Skyrim.

This is what I was getting, mind you this is only at 1080p:

[Attached screenshot: yaymantle.jpg]
 
Mantle loads more data into VRAM so the GPU can do more of the brunt of the work, right? Or did I completely misunderstand how Mantle works?
 
Mantle loads more data into VRAM so the GPU can do more of the brunt of the work, right? Or did I completely misunderstand how Mantle works?


As far as I know, Mantle VRAM loads are higher than under DX11 (I remember reading something about 500MB in one of the games, but I'm not positive on that). However, Dragon Age looks better under Mantle, so I wanted to use that option (things like foliage and such). I'm not saying you need a boatload of VRAM for 99% of the games out there, but I found it beneficial for my uses. Now I can run DA:I completely maxed out in every setting at 1080p and locked at 60fps, which I'm happy with.
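
If anyone wants to check the DX11 vs Mantle VRAM difference on their own system, a minimal sketch like this would do it, assuming you log one VRAM reading (in MB) per line during each run (the file names and format here are made up; adapt to whatever your monitoring tool exports):

# Minimal sketch: compare peak reported VRAM between two logged runs.
# Assumes each log file is one VRAM reading in MB per line -- a made-up format.
def peak_vram_mb(path):
    with open(path) as f:
        return max(float(line) for line in f if line.strip())

dx11 = peak_vram_mb("dai_dx11_vram.log")      # hypothetical file names
mantle = peak_vram_mb("dai_mantle_vram.log")
print(f"DX11 peak:   {dx11:.0f} MB")
print(f"Mantle peak: {mantle:.0f} MB")
print(f"Difference:  {mantle - dx11:+.0f} MB")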
 