> You still have no hard basis for making that claim, if you're referring to video RAM requirements again.

You may be right for BF4 (I'll say that you'll be dialing back a slider or two not related to GPU rendering performance), but you're dead wrong for fully next-gen games built exclusively for incoming consoles and the PC.
Because there's another game out there with fewer bugs that provides the same or better gameplay? We've been playing BF3 successfully since the first public beta. It hasn't been smooth, I'll give you that, but you can hardly characterize the game as being any more 'bug filled' than any other AAA-title, and none exist that match BF3's depth and scope.
How did you like Skyrim, by the way?
> You may be right for BF4 (I'll say that you'll be dialing back a slider or two not related to GPU rendering performance), but you're dead wrong for fully next-gen games built exclusively for incoming consoles and the PC.
For me, at 1600p, 2GB was just barely enough, but I knew that these cards would last for the two years I needed them. And yeah, I'll be dialing back settings until I replace them- but I'm already doing that to keep BF3 at competitive frame-rates.
You still have no hard basis for making that claim, if you're referring to video RAM requirements again.
You're making an assumption about how developers will allocate RAM on next-gen consoles, and then, based on the worst-case scenario derived from that assumption, you're making a second assumption: that said allocation will impact the PC side of things in such a way as to require inordinate quantities of video RAM.
Making predictions based on information like that isn't much better than consulting tarot cards or tea leaves.
At 1600p you are turning things down? What are your settings? I've got everything turned up with no problems. Well, for me anyway; it's pegged at 60fps+ to match vsync. (I'm assuming 60fps+ is your target too @ 60Hz.)
You're not wrong- but obviously that's not how I feel about it.
Essentially, the prediction is based on history- a lot of it. We've had the same consoles for eight years, and the consoles before those were similarly resource constrained relative to the PCs at the time.
So it's a prediction, of course. It's certainly possible that developers won't make use of that VRAM, and it's certainly possible that if they do, the PCIe bus will be fast enough on modern PCs to stream that data from main memory without a performance hit.
But that's not what I expect, because that's not what's happened before- anytime we've had a large jump in computing and memory resources, developers have raced to make good use of it. Sometimes it happens slowly, like the jump to 8GB of RAM as standard for PCs in a crashing RAM market and the ever-expanding CPU core counts of top shelf CPUs, but we're already seeing developers make use of that- even for games.
It might take a bit of time- maybe even a year. But since we typically buy hardware to last two years or more, and recommend hardware the same way, it's real hard not to recognize that developers will have unprecedented resources for their games on these new consoles, resources that PCs just don't have, outside of those very few running GPUs with 6GB of VRAM. And memory is cheap; it's why these consoles have 8GB of RAM in the first place, and it's why we should have more on our GPUs.
> Essentially, the prediction is based on history- a lot of it. We've had the same consoles for eight years, and the consoles before those were similarly resource constrained relative to the PCs at the time.

These new consoles aren't any less memory constrained; they just have the option of stealing some from places where they weren't allowed to steal it before. Given how weak the GPU in the Xbox One is, I don't foresee the need to steal more than 2GB for graphical assets.
> It might take a bit of time- maybe even a year. But since we typically buy hardware to last two years or more, and recommend hardware the same way, it's real hard not to recognize that developers will have unprecedented resources for their games on these new consoles, resources that PCs just don't have, outside of those very few running GPUs with 6GB of VRAM. And memory is cheap; it's why these consoles have 8GB of RAM in the first place, and it's why we should have more on our GPUs.

You keep saying memory is cheap, when it really, really isn't...
How do you intend to use more than around 2GB of RAM when that data literally cannot all make it to the GPU quickly enough?
Let's go by the numbers: assuming you want to maintain 60 FPS, that gives you 1/60th of a second to run all the data needed for each frame through the GPU.
The Xbox One has 68 GB/s of total memory bandwidth, shared between the CPU and the GPU. Even if you dedicated 100% of memory bandwidth to the GPU (and choked out the CPU entirely), that's a grand total of 1.13 GB that can make it through the GPU per frame.
1.13GB, theoretical max, per-frame. In reality, the CPU will be using some of that bandwidth constantly, so it'll more likely be less than 1GB of data that can be crunched by the GPU every frame.
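For anyone who wants to check that arithmetic, here's a minimal sketch using the post's own figures (the 68 GB/s number and the 100%-to-GPU best case both come from the post above):

```python
# Per-frame memory bandwidth budget, using the figures from the post above.
# Best case: assumes the GPU gets 100% of the shared bus (no CPU contention).

XBOX_ONE_BANDWIDTH_GBPS = 68.0  # total shared memory bandwidth, GB/s
TARGET_FPS = 60                 # target frame rate

frame_time_s = 1.0 / TARGET_FPS
per_frame_budget_gb = XBOX_ONE_BANDWIDTH_GBPS * frame_time_s

print(f"Frame time: {frame_time_s * 1000:.2f} ms")                          # 16.67 ms
print(f"Max data through the GPU per frame: {per_frame_budget_gb:.2f} GB")  # ~1.13 GB
```

Since the CPU shares the same bus in practice, the real per-frame figure sits somewhere below that theoretical 1.13 GB, which is the point being made above.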
BF3 does a lot of caching; the memory it uses isn't necessarily what it needs for stutter/lag-free performance. It's not unreasonable to assume BF4 does the same. Someone over at the AnandTech forums explained how the engine works; I'll see if I can locate the post.
Hmmm, looks like at 1080p you will be JUST good enough with 2GB with no AA on. With 4xAA you go over 2GB.
http://www.bf4blog.com/battlefield-4-alpha-gpu-and-cpu-benchmarks/
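As a rough illustration of why AA pushes memory use up, here's a back-of-the-envelope sketch of render-target sizes at 1080p (assumes 4 bytes per pixel, e.g. RGBA8; textures and geometry, which dominate real VRAM usage, are ignored):

```python
# Back-of-the-envelope render-target sizes at 1080p.
# Assumes 4 bytes per pixel (e.g., RGBA8); ignores textures and geometry,
# which account for most real-world VRAM usage.
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4

def target_mib(samples: int = 1) -> float:
    """Size of one render target in MiB at the given MSAA sample count."""
    return WIDTH * HEIGHT * BYTES_PER_PIXEL * samples / 2**20

print(f"One target, no AA:   {target_mib(1):.1f} MiB")  # ~7.9 MiB
print(f"One target, 4x MSAA: {target_mib(4):.1f} MiB")  # ~31.6 MiB
# 4x MSAA multiplies every multisampled target (color, depth, G-buffers),
# which is how AA can tip a card that was "just good enough" at 2GB over budget.
```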
BF3 ran like shit.
They should fix their shitty netcode before they hand out requirements.
> I'll bet that with next gen games the only resolutions that will require more than 2GB of VRAM will be triple monitor and 4K.

Again, though, terms like "need" and "require" are both simultaneously too strict and too fuzzy to be worthwhile to use in these contexts. You may not need more memory, but you may still want more, because it can still be beneficial.
> So you're running framerates of a 780, but with the GPU lag of a 670.

Please define "GPU lag".
All the math I've seen says that input lag and FPS are not independent. In other words, increasing FPS has the inherent effect of decreasing input lag.
Keep in mind that a 670 SLI setup, while it may perform a bit faster than, say, a single 780 in certain situations, will not be better than a 780 in its own contribution to the whole input-lag chain from keypress to eye. SLI often does not improve lag at all over a single card. So you're running the framerates of a 780, but with the GPU lag of a 670. (It's possible to have 60fps with the input lag of 30fps.) It all depends on how the game handles SLI.
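A sketch of the alternate-frame-rendering scenario described above, with a hypothetical frame time (the 33.3 ms figure is just an illustrative stand-in for a single slower card, not a measured number):

```python
# Alternate-frame rendering: two GPUs take turns, so frames arrive twice as
# often, but each frame still took one full single-GPU frame time to render.
single_gpu_frame_ms = 33.3  # hypothetical single-card frame time (~30 FPS)

afr_fps = 2 * (1000.0 / single_gpu_frame_ms)  # frames delivered per second
afr_render_latency_ms = single_gpu_frame_ms   # age of each frame when shown

print(f"AFR throughput: {afr_fps:.0f} FPS")                       # ~60 FPS
print(f"Per-frame render latency: {afr_render_latency_ms} ms")    # still ~33 ms
# i.e., "60fps with the input lag of 30fps", as described above.
```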
There is no question that more of any resource on a computer is better than less. But the question really being asked is whether or not increased VRAM results in increased performance, and in almost EVERY case the answer is no; this is especially true of the average PC gamer who games at 1080p and 60Hz.
The Xbox One uses DDR3; the PS4 and GPUs use GDDR5. The PS4 will have about 170.6 GB/s, basically 3 times as much. But the Xbox One has that 32MB of eSRAM that will be used for frame buffers, which clocks in at 102 GB/s. So the PS4 will be actively using 3 gigs of VRAM per frame at 60 FPS.
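The same per-frame division applied to the figures quoted in this post (170.6 GB/s and 102 GB/s are the poster's numbers; this is a sketch, not a measurement):

```python
# Same per-frame arithmetic applied to the bandwidth figures quoted above.
PS4_GDDR5_GBPS = 170.6    # PS4 main memory bandwidth, as quoted
XBOX_ESRAM_GBPS = 102.0   # Xbox One 32MB eSRAM bandwidth, as quoted
TARGET_FPS = 60

print(f"PS4 per-frame budget:   {PS4_GDDR5_GBPS / TARGET_FPS:.2f} GB")   # ~2.84 GB
print(f"eSRAM per-frame budget: {XBOX_ESRAM_GBPS / TARGET_FPS:.2f} GB")  # ~1.70 GB
# The "3 gigs of VRAM per frame" claim above is this ~2.84 GB figure rounded up.
```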
You can keep coming up with corner cases; I can keep spelling it out line by line.

To address the 'memory is cheap' debate, remember that this generation was supposed to only have 4GB- which is about average for a PC with no discrete GPU- but got upgraded to eight solely on the stupid cheap pricing of RAM this last year or two. So yeah, memory is fricken' cheap.

Now, like Skyrim, you have the option to turn that stuff off- so those of us, like myself, with GPUs that have less than 6GB of RAM will be fine by lowering settings.

In almost every case we have today, the answer is most definitely no. If we were talking about today, you'd be exactly right! But we're not. We don't have consoles on the shelf that have 8GB of RAM today, with over 4GB of that memory available to games today. That's later this year.

> If we were talking about today, you'd be exactly right! But we're not.

There you go with that same false assumption. You're still assuming that current games are not using PCs to their fullest simply because a console version was also developed. Even when we're talking about games that are able to swamp a Titan, that's still not good enough for them to count as an example to you.
> Irrelevant. He's attempting to point out how games have to be limited in their scope in order to accommodate consoles.

They do have a limited scope- that's pretty easy to understand. While many, many things can be scaled, the basic run-times of games that are actually responsible for a game's experience have to be very nearly the same. It's a small part of a game's code and resulting executable, to be sure, but the point is that they'll have to be different to really take advantage of the next-gen consoles. Maybe developers are willing to sacrifice some of that experience (and I don't mean graphics/sound/physics) for some games to get them running on the current-gen stuff. That's pretty reasonable, I think.
The Xbox One is the lowest common denominator. He wants a baseline, the Xbox One is it.
And, if IdiotInCharge is correct, the PS4 will be hamstrung because games will have to be developed to operate within the Xbox One's specs.
The PS4 will, no doubt, be better able to utilize its RAM. It has a GPU with more compute units and many times more bandwidth... but by IdiotInCharge's logic, it's in the same boat as PCs.
> You can keep coming up with corner cases

I've been hitting you with facts, benchmarks, and some pretty damning evidence.
A rundown of 30 games tested with the same GPU and 2GB vs. 4GB of video RAM was pretty clear-cut. This included games that even next-gen consoles have no hope of running at the settings that the PCs they tested were capable of.
A GPU as weak as the one in the Xbox One actively using more than 2GB of video RAM is a corner case, as far as I'm concerned.
> I can keep spelling it out line by line.

You mean ignoring all the evidence to the contrary, without actually giving a valid reason for doing so?
Simply saying a game "isn't next gen so it doesn't count" because it doesn't meet some arbitrary cut-off date you've decided on is not acceptable.
> remember that this generation was supposed to only have 4GB... but got upgraded to eight solely on the stupid cheap pricing of RAM this last year or two.

They were upgraded to double-density chips. I mentioned this exact possibility.
You get additional capacity, but no additional bandwidth. Higher density chips also tend to have worse timings than their low-density counterparts, leading to lower overall performance from the RAM.
They took the easy way out, and it shows. The Xbox One could have really used some extra bandwidth.
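The peak-rate formula makes the capacity-vs-bandwidth point concrete. The DDR3-2133 speed and 256-bit bus below are the Xbox One's widely reported specs, so treat this as a sketch under those assumptions:

```python
# Peak DRAM bandwidth = transfer rate x bus width. Chip density (and thus
# total capacity) does not appear anywhere in the formula.
def peak_bandwidth_gbps(mt_per_s: float, bus_width_bits: int) -> float:
    return mt_per_s * 1e6 * (bus_width_bits / 8) / 1e9

# Xbox One: DDR3-2133 on a 256-bit bus (widely reported figures)
print(f"{peak_bandwidth_gbps(2133, 256):.1f} GB/s")  # ~68.3 GB/s
# Swapping in double-density chips takes 4GB -> 8GB, but neither input to
# this function changes, so the 68 GB/s figure stays exactly where it was.
```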
> So yeah, memory is fricken' cheap.

Nope, not if you want to take full advantage of it. I already explained why, but you're just going to continue to ignore me...
> Now, like Skyrim, you have the option to turn that stuff off- so those of us, like myself, with GPUs that have less than 6GB of RAM will be fine by lowering settings.

I know for a fact Skyrim does not require anywhere near 6GB of video RAM, even with a ridiculous number of graphical enhancements installed.
Know why? It's based on DirectX 9 and uses a 32-bit executable. One of the biggest issues with DX9 is how it causes system RAM usage to bloat in concert with graphical complexity. Loading up Skyrim with too many mods will cause it to overflow 4GB of system RAM (crashing the game) before you're anywhere near 6GB of video RAM in use.
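A quick illustration of that address-space wall (the 2 GiB default / large-address-aware detail is general 32-bit Windows behavior, not something specific to Skyrim's binary):

```python
# A 32-bit process can address at most 2^32 bytes, full stop.
total_address_space_gib = 2**32 / 2**30
print(f"32-bit address space: {total_address_space_gib:.0f} GiB")  # 4 GiB

# On Windows, a 32-bit process normally gets 2 GiB of that (up to 4 GiB if
# built large-address-aware on a 64-bit OS), so a heavily modded DX9 game
# exhausts its own address space long before 6GB of video RAM is relevant.
```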
> There you go with that same false assumption. You're still assuming that current games are not using PCs to their fullest simply because a console version was also developed. Even when we're talking about games that are able to swamp a Titan, that's still not good enough for them to count as an example to you.

I talked above about trying to apply current benchmarks- it doesn't really make sense unless you go out of your way to find those corner cases that can reasonably reflect the situations that game developers will be working with when developing for these new consoles. Sure wish that wasn't the case; that'd make this easier!
You're effectively arguing that your argument is untestable because all tests disagree with you...
I can't take you seriously until you stop making that claim.
I can't wait to see how much VRAM Witcher 3 will use on maximum settings; I sense this game will be the litmus test of next-gen VRAM usage.
Hell, I was cranking Witcher 2 down considerably; Ultra shaders really were something. Wonder how Titans are doing on that game.
So CPU speed, and everything else involved in the per-frame rendering process, has a huge effect here. This is also one of the reasons why we advocate much higher performance CPUs than most benchmark houses would have the masses believe was 'enough', by the way.

Even with the 4.5GHz 2500k in my system below, I find the CPU to be lacking in 'intense' situations in BF3. GPUs are fine, as I calibrate my in-game settings for the worst-case scenario, but the CPU still can't keep up, and I find that my system still gets a little 'laggier'. All other things being equal, it means that my opponents have the upper hand.

Now, there's a whole metric shit ton of other variables here, from individual play-styles to input maps to the network's performance at that very moment, but when we build a 'system' to accomplish a goal, we try to minimize or eliminate every limitation we can.