HardOCP looking into the 970 3.5GB issue?

It's really quite amusing to me how much drama this issue is causing, mostly because it's entirely NVIDIA's fault. The design actually allows them to better utilize the GM204 GPUs that don't bin well enough for the 980. Rather than having to disable the entire ROP/L2 cache block for the 970, they were able to partially enable the extra resources. But by marketing it as a full 4GB / 64 ROP card, they've shot themselves in the foot, because no one expected the last 512MB to work more like a cache than fully accessible VRAM.

I'm pretty sure someone in marketing is having a very bad day @ NVIDIA right now.

My thoughts too. They had the opportunity to put a positive marketing spin on this, call it "TurboCache" or whatever. But instead they tried to sweep it under the rug, and now it's blown up in their face.

The saddest part to me, though, is that at the end of the day, it's unlikely nVidia's sales will be impacted much, if at all, if the attitudes in this thread are any indication.
 
Where are all the naysayers and the "but it doesn't affect anything!" crowd now??

That guy's tests still fail to show anything except a normal performance drop from increasing settings. The only thing we haven't yet confirmed to be a non-issue at this point is FCAT frametime data... the post you're sarcastically linking to there doesn't actually say anything different from what nvidia themselves have said so far. I don't know if you even own any GTX 970 cards; if not, you have no stake in this regardless and ought to go post about something relevant to you, but if you do, then sarcastically pointing at info we've already established is completely useless and distracts from the real issue at hand.

Still waiting for some actual objective FCAT data and testing. If the only real difference is that they put the wrong specs, yes I'd be annoyed at the false advertisement, but I couldn't truly be mad since I would be getting exactly what I had expected to in terms of performance and what was shown in reviews. Let's see what happens.
 
The saddest part to me, though, is that at the end of the day, it's unlikely nVidia's sales will be impacted much, if at all, if the attitudes in this thread are any indication.

Because at the end of the day the 970 still performs as well or better than AMD's top card, while using less power and fully supporting DX12. The benchmarks still haven't changed and the value of the card is still the same as it was when it launched.
 
It's time to relabel those boxes GTX 970 3.5GB

When you buy a product that says 4GB of RAM, you are thinking it's all contiguous and usable. However, in this situation it's broken into 3.5GB of contiguous RAM, plus 512MB of RAM at a slower speed.

It's a 3.5GB+512MB card.
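As a rough mental model (purely illustrative, and not a claim about how NVIDIA's driver actually behaves), the split people are describing works like a two-segment allocator where the last 512MB only gets touched once the fast 3.5GB pool is exhausted:

Code:
# Illustrative sketch only: a toy model of a "3.5GB fast + 0.5GB slow" VRAM split.
# The segment sizes and the spill-over policy are assumptions for illustration,
# not NVIDIA's actual driver heuristics.

FAST_POOL_MB = 3584   # 3.5GB addressed at full bandwidth
SLOW_POOL_MB = 512    # remaining 0.5GB on the partially disabled partition

class SegmentedVram:
    def __init__(self):
        self.fast_used = 0
        self.slow_used = 0

    def alloc(self, size_mb):
        """Place an allocation in the fast pool if it fits, else spill to the slow pool."""
        if self.fast_used + size_mb <= FAST_POOL_MB:
            self.fast_used += size_mb
            return "fast"
        if self.slow_used + size_mb <= SLOW_POOL_MB:
            self.slow_used += size_mb
            return "slow"
        raise MemoryError("out of VRAM")

vram = SegmentedVram()
for i, size in enumerate([1024, 1024, 1024, 512, 256, 128]):
    print(f"allocation {i}: {size} MB -> {vram.alloc(size)} segment")
# The first ~3.5GB lands in the fast segment; anything past that spills into
# the slow segment, which is where the stutter reports come from.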

<VULTURE MODE>I want some money back. I own a GTX980, but if the GTX970's limitations had been known from the start, they could have priced the 980 line down a bit. </VULTURE MODE> ;) :D
 
Because at the end of the day the 970 still performs as well or better than AMD's top card, while using less power and fully supporting DX12. The benchmarks still haven't changed and the value of the card is still the same as it was when it launched.

This... what else are we going to buy, the heat-pumping/slower-performing/lacking-features/poorly-oc'ing/sleazy-company R9 290 card from AMD instead? That's just going from "meh" to "oh hell, no!" :p.
 
the value of the card is still the same as it was when it launched.
Not necessarily. If you bought two of these with the intention of playing at high resolutions / downsampling, you very well might be bottlenecked now by the lack of VRAM. For single card users this isn't as big of a problem because they are going to hit a GPU wall before the VRAM wall, but that's not necessarily the case in SLI.
 
Because at the end of the day the 970 still performs as well or better than AMD's top card, while using less power and fully supporting DX12. The benchmarks still haven't changed and the value of the card is still the same as it was when it launched.

This... what else are we going to buy, the heat-pumping/slower-performing/lacking-features/poorly-oc'ing/sleazy-company R9 290 card from AMD instead? That's just going from "meh" to "oh hell, no!" :p.

Power consumption is a non-issue, MFAA is still a joke, don't get me started on PhysX, and AMD now has VSR as well. About the only argument that holds water is heat. Depending on what games you run and what resolutions you run them at, the 970 does NOT perform as well as even the 290. At 1440p or above the 970 falls behind.

And through all that, you failed to mention the most important part: the 970 costs more. What you said would've been a lot more compelling had the 970 cost the same or been cheaper, but no, the 970 is anywhere from $50 to $100 more expensive than the 290.

Oh but I guess you're right in that the 290 and 290X do lack the segmented vram feature :D

EDIT: Before I get accused of being on AMD's payroll, let it be known that I bought 2 Gigabyte 970s 2 weeks after launch date.
 
Power consumption is a non-issue, MFAA is still a joke, don't get me started on PhysX, and AMD now has VSR as well. About the only argument that holds water is heat. Depending on what games you run and what resolutions you run them at, the 970 does NOT perform as well as even the 290. At 1440p or above the 970 falls behind.

And through all that, you failed to mention the most important part: the 970 costs more. What you said would've been a lot more compelling had the 970 cost the same or been cheaper, but no, the 970 is anywhere from $50 to $100 more expensive.

Oh but I guess you're right in that the 290 and 290X do lack the segmented vram feature :D
Really? Here in the latest review, the 970 is ahead of the 290 at 1440 and is even slightly ahead at 4k. Plus the 970 has better overall overclocking headroom without sending power consumption way up.

http://www.techpowerup.com/reviews/Palit/GeForce_GTX_960_Super_JetStream/29.html
 
I wonder if this was an intentional performance bottleneck to discourage people from using this card in scenarios where Nvidia would rather you buy the 980 (SLI 4K, etc.).

*tinfoil hat on* You sell people a "4GB" card knowing most will never hit that, but don't want it to be too good for those who would, knowing it could hurt your flagship model sales.
 
DSR vs VSR isn't a fair comparison. VSR is very limited in supported hardware and resolutions compared to DSR. The Hawaii cards are limited to 3200x1800. Only Tonga can do 4k, and it doesn't really have the power or VRAM to drive 4k. None of them can do 4x VSR on a 2560x1440 or 3440x1440 display, for example.
 
I love how people cannot help but bash AMD in an 'Nvidia made mistakes' thread ;)

Agreed. It's the same people that bash AMD in every other thread; it really gets tiring.

I guarantee that if AMD was in the same situation as Nvidia is in right now, the same people who are defending Nvidia would be on a witch hunt for AMD.
 
Power consumption is a non-issue, MFAA is still a joke, don't get me started on PhysX, and AMD now has VSR as well. About the only argument that holds water is heat. Depending on what games you run and what resolutions you run them at, the 970 does NOT perform as well as even the 290. At 1440p or above the 970 falls behind.

And through all that, you failed to mention the most important part: the 970 costs more. What you said would've been a lot more compelling had the 970 cost the same or been cheaper, but no, the 970 is anywhere from $50 to $100 more expensive than the 290.

Oh but I guess you're right in that the 290 and 290X do lack the segmented vram feature :D

EDIT: Before I get accused of being on AMD's payroll, let it be known that I bought 2 Gigabyte 970s 2 weeks after launch date.

Saying power consumption is a non-issue is equivalent to saying "the cost of the card is not an issue"... because if you add up the power usage over 3 years of gaming / watching videos etc., it adds another £50-£100+ onto the price of the card...
 
This... what else are we going to buy, the heat-pumping/slower-performing/lacking-features/poorly-oc'ing/sleazy-company R9 290 card from AMD instead? That's just going from "meh" to "oh hell, no!" :p.

Truly a non-biased stance on the subject.
 
Really? Here in the latest review, the 970 is ahead of the 290 at 1440 and is even slightly ahead at 4k.

http://www.techpowerup.com/reviews/Palit/GeForce_GTX_960_Super_JetStream/29.html

Like I said, it depends on the game, but currently I can think of 3 off the top of my head:

Shadow of Mordor
Civ: Beyond Earth
Ryse: Son of Rome

Anyway, I was refuting Prime1's statement that the 970 performs as well or better than AMD's current flagship (290X) across the board, because it certainly does not.

Plus the 970 has better overall overclocking headroom without sending power consumption way up.

This I agree with.
 
Because if you add up the power usage over 3 years of gaming / watching videos etc., it adds another £50-£100+ onto the price of the card...
That's ~$4 a month in a worst-case scenario. Does that honestly sway your purchasing decisions?
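For a rough sanity check of that number (every figure below is an assumption chosen for illustration: a ~100W load-power delta between a 290 and a 970, a few hours of gaming per day, and roughly £0.15 per kWh):

Code:
# Back-of-envelope power-cost comparison; every input here is an assumed
# illustrative figure, not a measured one.
power_delta_w = 100    # extra draw of an R9 290 vs a GTX 970 under load (assumed)
hours_per_day = 4      # gaming / video hours per day (assumed)
price_per_kwh = 0.15   # GBP per kWh, rough UK price (assumed)

kwh_per_year = power_delta_w / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year extra -> ~GBP {cost_per_year:.0f}/year, "
      f"GBP {3 * cost_per_year:.0f} over 3 years (~GBP {cost_per_year / 12:.2f}/month)")
# ~146 kWh/year -> roughly 22 per year, 66 over 3 years, i.e. a couple of pounds a month.

Depending on hours played and local electricity rates, that lands somewhere between the "$4 a month" and the "£50-£100 over 3 years" figures being thrown around.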
 
Power consumption isn't a consideration for me , but heat is.

If two AMD cards add more heat to the room than two Nvidia cards, Nvidia is what I'll pick every time.

With 100-degree temperatures and humidity for days or weeks at a time in the summer, the AC already runs non-stop. I'll go for the cooler top-end card overall, whoever makes it.
 
One thing people need to realize is that the performance is still there. The only problem is that once you go over 3.5GB you will have some issues, depending on the game.

The thing that is total bullshit is being lied to. They didn't lie about the performance, it is still a great fucking card; they lied about the specs for over 4 months. That to me is total bullshit, and people who bought the cards have a right to be pissed off.

But saying the card sucks and people want refunds might be a stretch. I think they should offer 970 owners something to make up for it. Maybe a good free game? But they should really do something. If it was 15 or even 30 days after launch I could understand. But four months, with the specs on the Nvidia website and plastered across every other major hardware site, and not once did Nvidia catch the mistake?

Either way, no need to downplay the GTX 970; it really is still a great card.
 
^thanks for being the voice of reason, very well worded

Power consumption isn't a consideration for me , but heat is.

If two AMD cards add more heat to the room than two Nvidia cards, Nvidia is what I'll pick every time.

With 100-degree temperatures and humidity for days or weeks at a time in the summer, the AC already runs non-stop. I'll go for the cooler top-end card overall, whoever makes it.

Yes, heat is actually the SOLE reason why I skipped the 290/290X for the 970. Well, that and at the time (2 weeks after the Maxwell launch) we hadn't seen these crazy price cuts on AMD cards yet, so the 290 and 290X cost the same or actually slightly more, IIRC.
 
That's ~$4 a month in a worst-case scenario. Does that honestly sway your purchasing decisions?

Well, it is relevant to the price of the card... so yes, it does make a difference. Also, more power = more heat, and electricity is more expensive in the UK.

If you add power consumption to the selling price, the 290 costs more than the GTX 970 over 2-3 years.
 
Agreed. It's the same people that bash AMD in every other thread; it really gets tiring.

I guarantee that if AMD was in the same situation as Nvidia is in right now, the same people who are defending Nvidia would be on a witch hunt for AMD.
Or the people that say they will never buy Nvidia cards again; when the next AMD card has some kind of issue, they'll be here saying they'll never buy AMD again!
 
If any of you 970 owners are ready to toss your card(s) in the trash, please donate them to me. :p
 
That guy's tests still fail to show anything except a normal performance drop from increasing settings. The only thing we haven't yet confirmed to be a non-issue at this point is FCAT frametime data... the post you're sarcastically linking to there doesn't actually say anything different from what nvidia themselves have said so far. I don't know if you even own any GTX 970 cards; if not, you have no stake in this regardless and ought to go post about something relevant to you, but if you do, then sarcastically pointing at info we've already established is completely useless and distracts from the real issue at hand.

Still waiting for some actual objective FCAT data and testing. If the only real difference is that they put the wrong specs, yes I'd be annoyed at the false advertisement, but I couldn't truly be mad since I would be getting exactly what I had expected to in terms of performance and what was shown in reviews. Let's see what happens.

Reddit tester said:
•BATTLEFIELD 4

VRAM usage | Min FPS | Avg FPS | Max FPS | Settings
2.8GB | 69 | 90.253 | 135 | 100% resolution scale
3.3-3.4GB | 38 | 46.014 | 52 | 160% resolution scale
3.5-3.6GB | 17 | 36.629 | 55 | 165% resolution scale

This was tested using maximum settings with 0x FXAA, max FOV, and 0x motion blur.
EDIT: It seems a lot of people are missing what I did with BF4. I cranked up the resolution scale to purposely induce the Vram related stuttering. No one plays at 165%, it was simply to demonstrate that it could happen in BF4 as well.

At 3.3 to 3.4gb Vram usage, the game ran smoothly. The FPS was expectedly low due to the INSANE resolution scale I had to apply to raise the Vram usage 600mb, but it was still playable. I even killed some tanks, and I'm not very good at that.

ABOVE the 3.5gb threshold was a nightmare. Again, massive stuttering and freezing came into play. The FPS is not representative of the experience. Frametimes were awful (I use Frostbite 3's built in graphs to monitor) and spiking everywhere.

Granted he didn't actually show any FCAT data, but look at the minimum framerates, especially for SoM.
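For context on why people keep asking for frametime/FCAT data rather than just min and average FPS: an average can look fine while individual frames spike badly. A minimal sketch of the kind of summary a frametime capture enables, using made-up sample numbers:

Code:
# Illustrative frametime summary; the sample data is invented to show how an
# average-FPS figure can hide stutter that percentile frametimes expose.
frametimes_ms = [16.7] * 97 + [120.0] * 3   # mostly smooth, with a few big hitches

avg_fps = 1000 / (sum(frametimes_ms) / len(frametimes_ms))
p99 = sorted(frametimes_ms)[int(0.99 * len(frametimes_ms))]
hitches = sum(1 for t in frametimes_ms if t > 33.3)   # frames slower than ~30 FPS

print(f"average: {avg_fps:.0f} FPS")               # ~51 FPS, looks playable on paper
print(f"99th percentile frametime: {p99:.0f} ms")  # exposes the 100ms+ spikes
print(f"frames over 33.3 ms: {hitches}")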
 
AMD stock rising is due to speculation regarding BLX IC Design buying them out.
 
First frametimes results coming in, from PCGamesHardware.de:

[Four frametime graphs from PCGamesHardware.de comparing the GTX 970 and GTX 980]
Google Translate:
However, once the memory above 3.5 gigabytes is genuinely needed and the driver can no longer avoid it (in Ultra HD with 4x MSAA and "High" textures, usage reaches up to 3,980 MiB), it becomes clear that tricks are required to expose the full four gigabytes. The frametimes are far more uneven than on the GTX 980 (which, incidentally, can allocate about 70 MiB more), and this shows not only in the graphs; sensitive users can notice it in a direct in-game comparison.
Away from "normal" benchmarks, the differences between the GTX 970 and 980 are clearer than the previously known specifications would imply, at least in our benchmarks. While the driver's behavior and heuristics may produce good application-specific results, a stale aftertaste remains as to whether one stutter or another in this border area would have been avoided with the configuration Nvidia originally announced.


EDIT: Note that the 980 has been downclocked to better simulate the 970's computing power, making the comparison even more relevant to memory subsystem differences and effects, which were supposed to be the same between the cards.
 
First frametimes results coming in, from PCGamesHardware.de:

[frametime graphs and Google Translate excerpt quoted above]
Interesting, looks like there's no particular adverse effect due to the memory segmentation... we got what we expected when looking at launch reviews. Thanks for the link!

Granted he didn't actually show any FCAT data, but look at the minimum framerates, especially for SoM.

All that guy showed was a single "min" number, and no frametime graphs or data at all. What's your point exactly? We have actual reviewers testing it now and proving that whole speculation wrong, already at this point too :).
 
Interesting, looks like there's no particular adverse effect due to the memory segmentation... we got what we expected when looking at launch reviews. Thanks for the link!

What?!?! It is evident right there in the graphs and stated to be obvious during gameplay! Is your fanboy distortion field really that strong? :rolleyes:
 
Interesting, looks like there's no particular adverse effect due to the memory segmentation... we got what we expected when looking at launch reviews. Thanks for the link!

Uh, you do realize they concluded the exact opposite of what you just posted, right?
 
Interesting, looks like there's no particular adverse effect due to the memory segmentation... we got what we expected when looking at launch reviews. Thanks for the link!

All that guy showed was a single "min" number, and no frametime graphs or data at all. What's your point exactly? We have actual reviewers testing it now and proving that whole speculation wrong, already at this point too :).

Ummm you may want to read what the actual reviewers said again, and pay closer attention to the graphs. I mean seriously:

Google Translate:
However, once the memory above 3.5 gigabytes is genuinely needed and the driver can no longer avoid it (in Ultra HD with 4x MSAA and "High" textures, usage reaches up to 3,980 MiB), it becomes clear that tricks are required to expose the full four gigabytes. The frametimes are far more uneven than on the GTX 980 (which, incidentally, can allocate about 70 MiB more), and this shows not only in the graphs; sensitive users can notice it in a direct in-game comparison.

Quote:
Away from "normal" benchmarks, the differences between the GTX 970 and 980 are clearer than the previously known specifications would imply, at least in our benchmarks. While the driver's behavior and heuristics may produce good application-specific results, a stale aftertaste remains as to whether one stutter or another in this border area would have been avoided with the configuration Nvidia originally announced.

But I guess then we're going to argue over whether Google Translate is worth its salt and if anything got lost in translation blah blah. In case we're headed that way, no thanks.
 
There has to be a way to fill up VRAM without changing rendering settings; please tell me there is. I know changing settings is the easy way, but it's such a lazy benchmark attempt because the variables aren't controlled.

ugh i give up on the world

just smh
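For what it's worth, one way to occupy a fixed amount of VRAM without touching any rendering settings would be to have a separate process pin memory on the GPU while the game runs. A minimal sketch, assuming a CUDA-capable card with PyTorch installed (the 1024 MiB figure is arbitrary, and this is not how any of the reviewers tested):

Code:
# Hypothetical helper: hold a fixed chunk of VRAM so a game running alongside
# gets pushed toward the 3.5GB boundary without changing its own settings.
# Assumes PyTorch with CUDA support; all sizes here are arbitrary examples.
import torch

def pin_vram(mebibytes=1024):
    """Allocate and touch `mebibytes` of device memory; keep the returned tensor alive."""
    buf = torch.empty(mebibytes * 1024 * 1024, dtype=torch.uint8, device="cuda")
    buf.fill_(0xAB)   # touch it so the memory is actually resident
    return buf

if __name__ == "__main__":
    held = pin_vram(1024)
    print(f"Holding {held.numel() // (1024 * 1024)} MiB of VRAM; press Enter to release.")
    input()   # dropping the reference on exit frees the allocation

Whether the driver places that allocation (or evicts the game's own buffers) into the slow segment is still up to NVIDIA's heuristics, so it isn't a perfectly controlled test either, but at least the rendering workload stays constant.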
 
Ummm you may want to read what the actual reviewers said again, and pay closer attention to the graphs. I mean seriously:



But I guess then we're going to argue over whether Google Translate is worth its salt and if anything got lost in translation blah blah. In case we're headed that way, no thanks.

Gonna just quote my reply to you from the other thread:

Well shit, I totally misread that graph set before :eek: :mad:. Guess my 4k monitor wasn't at fault after all... which sucks, because I already sent it back for a refund and had gotten a killer deal on it originally.
 
Yeah, the >3.5GB graph shows frametimes increasing to over 100ms average. That's pretty awful. You would definitely feel that in game.
 
I find it hard to believe that no one at NVIDIA ever noticed this in testing the card pre-release.
 
I find it hard to believe that no one at NVIDIA ever noticed this in testing the card pre-release.

I don't buy for a second, frankly, that it wasn't noticed or was a "misprint". I would have definitely kept that monitor if I had known it was the cards and not the monitor... loved the picture quality it gave :mad: . But there wasn't enough time left on the return policy to chance it and there wasn't clear data out yet, so I sent it back.
 