Starrbuck · 2[H]4U · Joined: Jun 12, 2005 · Messages: 2,981
I didn't sell my 970s and leave NVIDIA so much for the performance (although I did see issues at 4K) as because I can't stand being lied to.
The game choosing to use what's available is directly related to the quality of programming that went into the graphical engine. So yes, I can blame the game for using what's available despite not needing to. I don't see half of my Steam library of 180-some-odd games maxing out my VRAM just for shits and giggles; the other half I haven't launched since I bought them, so I can't confirm or deny.
It's still pretty shit what nVidia did/is doing, but that only accentuates and perpetuates a problem which has been going on for years now: the PC gaming/enthusiast community is getting the 10-foot pole shoved up its ass from all different angles.
I wanna know what, if anything, Nvidia is going to do...Come on!! Tell MEEEEEE
How exactly would that help? You'd have the exact same problems if the extra 512MB was shut off. You still have to turn the settings down until you're under 3.5GB of VRAM even if they disable that RAM in a BIOS update. So just keep your VRAM usage below 3.5GB and you'll have no issues.

If Nvidia doesn't release a driver/firmware update to make the 970 a 3.5GB card, is there any chance a third party could?
And no, it doesn't matter that those were unrealistically high settings for BF4. The article is proof enough that there will be huge issues when the 970's RAM is filled, and that is something to expect more and more as newer games come out.
There are already games out there that show the very same thing at 1080p.
I don't see how that's lazy programming or the sign of a bad port
How exactly would that help? You'd have the exact same problems if the extra 512MB was shut off. You still have to turn the settings down until you are under 3.5GB of VRAM even if they disable that RAM in a BIOS update. So just keep your RAM usage below 3.5GB and you'll have no issues.
Because the card advertises that it has 4GB of RAM, games try to use it. This exposes the game to the slow 0.5GB. It would be ridiculous to tweak everything to avoid it; you would still encounter the issue at times. If the card advertised only the faster 3.5GB, that wouldn't happen.
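To make that concrete, here's a toy sketch of the mechanism being described. The numbers and the 90% "pool fraction" heuristic are invented for illustration; this is not any real engine's or driver's code, just the general pattern of an engine sizing its texture streaming pool from whatever VRAM total the driver reports:

```python
# Purely illustrative: a hypothetical engine heuristic, not real game code.
FAST_SEGMENT_MB = 3584  # the 970's full-speed memory partition
POOL_FRACTION = 0.9     # assumed "leave some headroom" heuristic

def streaming_pool_mb(reported_vram_mb: int) -> int:
    """Pool size an engine might pick based on the advertised VRAM total."""
    return int(reported_vram_mb * POOL_FRACTION)

def spills_into_slow_segment(reported_vram_mb: int) -> bool:
    """Does that pool overflow the fast 3.5GB into the slow 0.5GB?"""
    return streaming_pool_mb(reported_vram_mb) > FAST_SEGMENT_MB

print(spills_into_slow_segment(4096))  # True: advertising 4GB invites the spill
print(spills_into_slow_segment(3584))  # False: advertising 3.5GB avoids it
```

Under this (assumed) heuristic, the engine only lands in the slow segment because the driver told it 4GB exists, which is the poster's point.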
It's a few dozen lines of well-written code to have the architecture/subsystem seek out available resources and use what's not in use. It's a few hundred or a few thousand lines of well-written code to actually use a smaller amount of resources more efficiently. It's a night-and-day difference.
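The contrast being drawn can be sketched in a few lines. The function names and numbers below are hypothetical, invented only to show the two strategies side by side: grab everything available versus budget from the actual working set:

```python
# Hypothetical sketch of the two strategies; not any real engine's API.
def greedy_budget(available_mb: int) -> int:
    """'Use whatever's there': trivial to write, fills VRAM whether needed or not."""
    return available_mb

def fitted_budget(asset_sizes_mb: list, headroom_mb: int = 512) -> int:
    """'Use exactly what's needed': budget from the real working set plus headroom."""
    return sum(asset_sizes_mb) + headroom_mb

# A scene whose assets genuinely need ~2.2GB:
scene = [900, 700, 400, 200]
print(greedy_budget(4096))   # 4096: burns the whole card for the same image
print(fitted_budget(scene))  # 2712: same quality, the rest kept in reserve
```

The greedy version really is a one-liner, while a real fitted budget means tracking every asset's footprint, which is where the "few hundred or few thousand lines" comes from.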
It's like blind firing a bazooka saying "eh we'll get something from the blast/shrapnel out of it even if we miss" versus pinpoint double tapping or burst fire action one target at a time. Both get the job done but the burst fire is a much cleaner and preferred method when you might wanna keep that bazooka (VRAM) in reserve for when the tanks come rolling in.
It's like programming a motor controller with just proportional control versus throwing in multiple feedback signals via integral and derivative controllers. You could never hit your mark dead on and would always be compensating, resulting in something barely capable of a sawing action, or you could be Johnny-on-the-spot to the sixty-fourth of an inch and do open-heart surgery remotely. The second option costs more up-front resources, but you get so much extra headroom to solve upcoming problems should the need arise.
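The control analogy holds up numerically: under proportional-only control of a plant with a constant load, the output settles short of the setpoint by load/Kp (steady-state error), and adding an integral term drives that error to zero. A minimal simulation, with an assumed first-order plant and made-up gains, purely to illustrate:

```python
def simulate(kp, ki, steps=20000, dt=0.001, setpoint=1.0, load=0.5):
    """Simulate a first-order plant x' = u - load under PI control (Euler steps)."""
    x, integral = 0.0, 0.0
    for _ in range(steps):
        error = setpoint - x
        integral += error * dt
        u = kp * error + ki * integral  # PI control law
        x += (u - load) * dt
    return x

p_only = simulate(kp=4.0, ki=0.0)  # settles near setpoint - load/kp = 0.875
pi     = simulate(kp=4.0, ki=8.0)  # integral term removes the offset, settles near 1.0
print(p_only, pi)
```

With proportional control alone the controller must keep a nonzero error just to hold against the load, which is the "always compensating" behavior described above; the integral term accumulates that history and cancels it.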
I urge you, go load up Crysis 3 or Guild Wars 2 and slowly bump the settings and compare quality to VRAM use. Then go play something like MechWarrior Online and find a happy place graphics wise. Then turn on DX11 and cry, cry like you did coming out of the womb trying to figure out how to get that 50% of your instantly lost FPS back and how many settings you have to tank to do it. "Well the game still works...it doesn't crash...so that's success on the DX11 implementation right?" Wrong, it's so unoptimized that what little improvements it could offer are severely drowned out by the huge and unnecessary performance hit because some code monkey didn't wanna earn his keep that day and instead browsed Reddit all day at work.
Well, isn't that a nice graph. And no, it doesn't matter those were unrealistically high settings for BF4. The article is proof enough that there will be huge issues when 970's RAM is filled and that is something to be more and more expected as newer games come out. There are already games out there that show the very same thing at 1080p.
If one must replace their 970s, I would at the very least choose a Titan, 780 6GB, or 290X 8GB version. Going for another 4GB card is, IMHO, foolhardy at this point, as one has already been told (in an in-your-face kind of way) that 3.5GB isn't enough.
That is crazy....no fucking wonder I had horrible stuttering in DA:I and BF4 with 4k DSR going....
Jesus.....the 970's in SLI suck for 4k gaming period....
Those cards aren't using anything close to the same FOV.
Nvidia actually had the balls to raise prices on the 980 after all this 970 controversy...it was $599.99 yesterday, now it's $626.99
http://www.amazon.com/Gigabyte-GV-N...phics/dp/B00NH2D5DC/ref=pd_rhf_gw_p_img_1#Ask
you play bf4 at 25 fps?
The issue being discussed isn't games that require more VRAM, which could be an optimization issue; it's games that are designed to use more than they need if it's available. That's more optimization, not less, and it would increase performance if the 970 reported 3.5GB or if the 4GB was fully usable.
Your posts remind me of those by another screen name both in style and substance, I hope you're not using multiple accounts...
That's incorrect by the very definition of optimize: make the best or most effective use of (a situation, opportunity, or resource). At best you could call the bold/underlined/italicized statement inefficient, but certainly not optimized. There is nothing effective about using 4GB of VRAM to render the same quality image if it really only takes 2-2.5GB. Again, it's a few dozen lines of code to tell the game to use all available resources without enhancing the preset image quality, or a few hundred/thousand lines of code to use exactly what is needed and nothing more, keeping the additional resources for overhead. The latter certainly is not optimized from a development/business standpoint, as it takes employees' time which costs Mr. CEO thousands of extra dollars, but from a programming perspective it is.
Bursor & Fisher to look into 970 issues
Might be a dud or even an ad, but at least it's hope (that the lawyers will get millions and GTX 970 owners will get a check for $2.27).
NV runs MSRP/MAP pricing.
Blame Amazon / people returning fully working cards.
The card is just as fast as it was when it was reviewed.
ITT: a bunch of entitled kids.
Entitled to get a card which actually has the hardware listed on the specification sheet/marketing material/reviews?
I hope so.
Agree with you 100% here. If Nvidia advertised it as such before purchase and changed it without ever advertising the change until people found out by themselves after purchase, then that's clearly false advertising, a bait and switch.
I mean, for a lot of people who buy this card it won't matter, though down the line it might. But for those who bought this hoping to keep it for a while or use it for high-res gaming, it seems it does matter to them. Personally, I think there is some shady stuff going on at Nvidia.
Sorry, but if you dump your 970s now and turn around and buy 980s, you're a sucker.
http://www.hardocp.com/article/2014...x_directcu_ii_video_card_review/#.VNDVzS7ClWQ
Not seeing a 'bait and switch'; you got what you paid for.
Unless you can prove EVERY reviewer got a golden card...
Agreed 100%. You'll basically be rewarding nVidia. The best option I've seen is returning the 970's and going with a 290x. That will hurt nVidia where it counts - their bank account.
So, would an MSI Lightning R9 290X be a better replacement for the GTX 970?