NVIDIA to launch its GeForce GTX 880 next month, at under $500

No, nobody actually buys these ugly non-reference cards. I don't know why they even bother :rolleyes:
 
Is there someone who buys such an ugly card?
Never understood the point of non-reference cards.
From my experience, the after-market cooling on non-reference cards has usually resulted in lower temps and better overclocking potential. There is a trade-off: reference cards employ a blower-style cooler, which dumps a good portion of the heat outside the case, vs. non-reference cards, which tend to use a dual-fan, open cooling method. Most of the time, the latter option results in lower temps. I think. :)
 
As widely reported on the World Wide Web, there has been an unprecedented run on sticky number "9"s. Every supplier of sticky numbers reports large stockpiles of 1 through 8 with moderate levels of 0. However, 9 is running low. The inventory raid seems to have started about a week ago.

Check with your local sticky number supplier: see how high the spot price of number 9 has risen when compared to the summer average.

There can be only one reason: NVidia retailers furiously slapping sticky 9's over the erroneous 8.

Call this: Rumor Confirmed!!

;)
 
Yeah, and since 9 is the square of 3...that also confirms Half-Life 3!!!! Woot...REJOICE MY BROTHERS AND SISTERS!
 
Lmfao. And here comes L4D3 and Counter-Strike WTO (world takeover) and Portal 3 and, and!

Gotta hit up godaddy for the www.sticky9.com (probably a porn site already though) :D

Oh and 900 series to stay on topic......

Edit: sticky9 is actually a sticker/magnet etc website, WTF!
 
Those who know all know that the x80 will be 15-20% faster than a 780 Ti, with the same TDP, and have an MSRP of $499.99 - you know, just to get the thread back on track ;)
 
Huge Nvidia GAME24 event scheduled for Sept 18th.

NVIDIA announced Game24, a worldwide, multinational event spanning 24 hours, celebrating PC gaming. Scheduled for September 18, 2014, the event will be held across seven locations around the world, and live streams of each will be broadcast worldwide.


This sorta reminds me of how Intel had their big OC competition event on the same day Devil's Canyon was officially released. 900-series def has to release on or before this event.
 
@AFD: Yeah, I saw that tweet from Nvidia and I feel like the new card has to come out on that day.
 
Yep, that is the announcement date.

PS: 19th was made up, as the card was never scheduled to be released on that date.

Enjoy the (Gamescom-like) livestream :D
 
According to SweC, the event is, amongst other things, for the 900-series launch, and the cards will be on retail shelves at the same time.

På samma dag ska nämligen de första prestandainriktade grafikkorten ur den nya generationen Maxwell hitta till butikshyllorna.
- Translated: On the same day, the first performance-oriented graphics cards from the new Maxwell generation will find their way to store shelves.

Source
 
Are they really going to release a card that's faster than the 780 Ti and for a somewhat reasonable price? ...I thought for sure performance would fall between the 780 and the 780 Ti...if so, it might be time to upgrade my GTX 580
 
From my experience, the after-market cooling on non-reference cards has usually resulted in lower temps and better overclocking potential.
Depends on what part of the card you're looking at.

Traditional dual-fan aftermarket coolers, even in a best-case scenario (open-air, not in a case), will actually tend to produce higher VRM temps than reference. This can be detrimental to overall lifespan and hamper more extreme overclocks. After all, hotter voltage regulators are more likely to produce noisy power. Anything that uses a simple base-plate to cool the VRMs, like EVGA's ACX cooler, is highly suspect.

Sure, the core might look nice and cool, but that's hardly the whole story. I've seen multiple instances of this problem, double-checked with an IR temp gun. Anyone who's attempted to swap to a NZXT G10 bracket + AIO water cooler will have run into this as well (even with the dedicated 92mm fan blowing directly onto the VRMs, they still get far toastier than the reference cooler).
 
Given the current prices on the 780 Ti, it's almost guaranteed that the 980 is going to be faster and <=$500. There is a 780 Ti for $530 on Newegg today. I don't see them releasing a slower card for only $30 less. That's ignoring the fact that 780 Ti prices will probably drop further once the 980 is officially announced/released.
 
So is the 980 still going to be the mid-range card like the GTX 680 was? ...Technically it will be the high end, but the 20nm cards will be the true high-end parts, right?
 
4GB VRAM isn't enough IMO.

I have a hard time finding games that come close to 3GB at 4K with FXAA (which is far better quality than lower res with MSAA, even). Reviews back this up as well... 4GB is way more than enough for the foreseeable future.
 
So is the 980 still going to be the mid-range card like the GTX 680 was? ...Technically it will be the high end, but the 20nm cards will be the true high-end parts, right?

Lol, only if you consider the 7970 and GTX 680 midrange when they came out, but they were the top end and remained so for a good while. You can't judge today's products as low end just because faster ones come later haha.
 
I have a hard time finding games that come close to 3GB at 4K with FXAA (which is far better quality than lower res with MSAA, even). Reviews back this up as well... 4GB is way more than enough for the foreseeable future.

Depends on what goes on with games for the new consoles. They do have 5-6GB of RAM available to games (video and system combined). So in theory they can start to have larger textures, which could need more VRAM. Will this happen? Dunno. It isn't something I'd worry overly much about, but it is a potential situation.
 
Watch Dogs uses more than 3GB VRAM

Yes, as I said I have a hard time finding them. One or two or even five games in several years doesn't mean a trend :p, especially when the chief example is even admitted by the publisher to have been poorly optimized for PC. :)

Depends on what goes on with games for the new consoles. They do have 5-6GB of RAM available to games (video and system combined). So in theory they can start to have larger textures, which could need more VRAM. Will this happen? Dunno. It isn't something I'd worry overly much about, but it is a potential situation.

Most games I've tried use between 1.8 and 2.5GB of VRAM, including BF4, ESO with a texture tweak to enable top-resolution levels of detail, Crysis 3, Far Cry 3, Dirt Showdown/Dirt 3, Witcher 2, and other games I play fairly often that are relatively demanding. Without using MSAA it's hard to push over 3GB with FXAA @ 4K :).
 
Why not use MSAA if you can?
 
Because at 4K it basically does nothing except chew up performance, while only affecting some edges instead of all of them like FXAA does (and with less blur, thanks to the larger pixel sample count the algorithm can work with at such a high resolution)... it just isn't needed or even beneficial at 4K. 1080p? Sure. 1680x1050 from a decade ago? Sure. 1440p? 2x MSAA + FXAA = awesome. But 4K 3840x2160? Nope. FXAA is gorgeous and works in everything, unlike MSAA.
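To put rough numbers on the resolution argument (a quick Python sketch of my own, nothing authoritative):

```python
# Pixel counts behind the "MSAA cost scales with resolution" point.
def pixels(w, h):
    return w * h

base = pixels(1920, 1080)  # 1080p reference
for name, (w, h) in {"1680x1050": (1680, 1050),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    px = pixels(w, h)
    print(f"{name}: {px:,} px = {px / base:.2f}x 1080p")

# 4x MSAA at 4K has to resolve roughly 33M coverage samples per frame,
# while FXAA stays a single full-screen pass over ~8.3M pixels.
print(f"4x MSAA samples at 4K: {4 * pixels(3840, 2160):,}")
```

That 4x pixel density at 4K is exactly why the post-process pass has so much more edge information to work with there than at 1080p.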
 
FXAA looks like smeared mayonnaise on my screen. Objects lose all detail in motion when it is on. Last time I enabled FXAA by accident in BF4, I went 2-29 before I figured out what was wrong. When FXAA is on I see at least 2 blurry copies of objects in motion. I get an instant headache from this, naturally. Thus I never enable it if I can avoid it.

If MSAA is too much of a hit, then I just downsample.
 
At 1080p I agree it looks bad. At 4K it's a different story... Is your sig accurate that you still run 1080p? If so, the post isn't about your panel; it was in reply to someone asking about 4K. As explained, with 4x the pixels to work with, the algorithm looks quite good at 4K res.
 
FXAA looks like smeared mayonnaise on my screen. Objects lose all detail in motion when it is on. Last time I enabled FXAA by accident in BF4, I went 2-29 before I figured out what was wrong. When FXAA is on I see at least 2 blurry copies of objects in motion. I get an instant headache from this, naturally. Thus I never enable it if I can avoid it.

If MSAA is too much of a hit, then I just downsample.
No idea what you're on about with the motion-blur-double-image thing. Never seen FXAA do that.

General blurriness is a result of using FXAA at too low a resolution, though. Honestly, FXAA is best used in combination with SSAA, as it can be applied before down-sampling occurs to improve the effectiveness of SSAA without the MASSIVE performance hit of higher SSAA levels.
 
Then you must have never used FXAA. It is one blurry fest in certain games.

Like BF4 as an example. I turn off FXAA right away, and it improves the IQ quite a bit.
 
Use it all the time. I've never seen it cause temporal artifacts (double-images on moving objects) like the other poster described.
 
Welp, when numerous websites state that FXAA makes the IQ worse in BF4, I guess you are alone.

Not saying it sucks in all games, just a lot of them
 
I think Tera also used FXAA, or Guild Wars 2; one of them did, and it made the image blurry and annoying.

BF3 also.
 