sabrewolf732
Supreme [H]ardness
Joined: Dec 6, 2004
Messages: 4,778
The benchmarks look good.
Yes, it looks far better than one would expect from just a couple hundred MHz increase in core speed. Guess Vega 64 was really bandwidth starved.
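The bandwidth jump is much bigger than the clock bump, which fits that theory. Quick back-of-the-envelope from the public specs (numbers from memory, so treat them as approximate):

```python
# Memory bandwidth in GB/s = (bus width in bits / 8) * data rate in Gbps per pin.
def bandwidth_gb_s(bus_width_bits, gbps_per_pin):
    return bus_width_bits / 8 * gbps_per_pin

vega64 = bandwidth_gb_s(2048, 1.89)      # 2 HBM2 stacks -> ~484 GB/s
radeon_vii = bandwidth_gb_s(4096, 2.00)  # 4 HBM2 stacks -> 1024 GB/s

print(f"Vega 64:    {vega64:.0f} GB/s")
print(f"Radeon VII: {radeon_vii:.0f} GB/s ({radeon_vii / vega64:.2f}x)")
```

That's roughly a 2.1x bandwidth increase for only about a 200 MHz core clock bump.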
I've never seen 12 GB of VRAM used with BFV at 4K on my setup.
Brent and Kyle already ran into 8 GB VRAM limitations on the 2080 and 2070 in BF5, causing stutter and sporadic low frame rates.
HDR uses 10-bit light maps, textures, etc., which adds to VRAM usage. In BF5, up to 12 GB of VRAM can be seen in use keeping more assets ready for smooth gameplay, and that is without RTX.
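Rough back-of-the-envelope on why HDR render targets cost more (my own numbers, assuming the usual FP16 targets for HDR, not anything measured from the game):

```python
# Render-target cost at 4K: HDR pipelines typically render to 16-bit-per-channel
# (RGBA16F) targets instead of 8-bit-per-channel (RGBA8) ones.
width, height = 3840, 2160
pixels = width * height

sdr_mib = pixels * 4 / 2**20  # RGBA8: 4 bytes/pixel -> ~32 MiB
hdr_mib = pixels * 8 / 2**20  # RGBA16F: 8 bytes/pixel -> ~63 MiB

print(f"SDR target: {sdr_mib:.0f} MiB, HDR target: {hdr_mib:.0f} MiB")
# Multiply by the number of intermediate targets an engine keeps around
# (G-buffer layers, post-processing chain), plus any higher-precision assets.
```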
The 6 GB RTX 2060 is a bad joke and, as far as I'm concerned, DOA, with another $100 added on top. Seriously, will it be able to really use RTX in anything? The RAM is what will make this card pointless and a disappointment: Fury 2 all over again.
As for FreeSync, there are many happy folks using it. My FreeSync 2 monitor works flawlessly and is HDR. There is another aspect that perturbs me: HDR on the 1080 Ti makes the desktop look like utter crap. The same monitor on the Vega 64 with the HDR desktop correctly converts SDR content over, videos etc., yet you see zero analysis or review of this aspect.
I will see how VRR works with Nvidia.
I've never seen 12 GB of VRAM used with BFV at 4K on my setup.
Ok
I've never seen 12 GB of VRAM used with BFV at 4K on my setup.
I haven't seen any benchmarks go past 7 GB of usage.
Wow. That makes the 2080 look pretty silly for that game at 4K. Even the 2080 Ti might struggle. I was looking around at benchmarks, but didn’t think to check [H]. Thanks. Makes AMD’s decision to go with 16 GB look pretty good right now.
The decision might be good for professional or content-creation situations, but it really seems like massive overkill for gaming. 12 GB would have made a lot more sense and would likely have allowed them to lower the price a little.
It would, if they had DXR.
? It's using over 8 GB of VRAM in DX12 with no DXR at 4K.
You wouldn't use DX12 in BF:V unless you're using DXR.
Huh? Really? Man, I don't wanna argue, but that is absurd. It's, like, your opinion, not a fact.
https://www.pcgamesn.com/battlefield-5-pc-performance-analysis#nn-graph-sheet-ultra-settings-dx12
Interesting, and directly at odds with HardOCP's findings that enabling DX12 incurs an immediate performance drop.
The percentages are off... e.g., Strange Brigade: 60.9 -> 88.9 is more than 28%.
The 28.47 number is the same for the game right above; probably a copy/paste error.
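For what it's worth, checking that math with the figures quoted above:

```python
# Percent increase from 60.9 fps to 88.9 fps.
low, high = 60.9, 88.9
print(f"{(high - low) / low * 100:.1f}%")  # ~46.0%, well above the listed 28.47%
```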
The Frostbite engine has been slower in DX12 for years. I don't know why that is, it doesn't make sense, but it's a fact.
Depends upon your system; my Ryzen 2700/Vega FE system is faster in DX12 than DX11 in BF5, plus DX11 seems to be more erratic in frame rates. Nvidia also has better DX11 support, so AMD users just need to test which one works best for them.
Interesting, and directly at odds with HardOCP's findings that enabling DX12 incurs an immediate performance drop.
That was with current Intel/Nvidia hardware, drivers, and OS updates; that does not mean other systems with different hardware/configurations will have the same behavior or findings.
Wow. That makes the 2080 look pretty silly for that game at 4K. Even the 2080 Ti might struggle. I was looking around at benchmarks, but didn't think to check [H]. Thanks. Makes AMD's decision to go with 16 GB look pretty good right now.
VRAM used and VRAM needed are two different things, of course. Games will cache as much as possible to memory even if that doesn't actually improve real-world performance.
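A toy sketch of that caching behavior (hypothetical names and sizes, nothing measured from BF5): a streamer keeps textures resident until there's memory pressure, so "used" climbs toward the card's budget even when each frame only touches a fraction of it.

```python
# Toy model: textures stay cached in VRAM until eviction is forced, so reported
# "usage" approaches the budget regardless of the per-frame working set.
from collections import OrderedDict

class TextureCache:
    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.used = 0
        self.cache = OrderedDict()  # texture id -> size in MB, in LRU order

    def request(self, tex_id, size_mb):
        if tex_id in self.cache:
            self.cache.move_to_end(tex_id)  # hit: mark as recently used
            return
        while self.used + size_mb > self.budget:
            _, evicted_mb = self.cache.popitem(last=False)  # evict LRU texture
            self.used -= evicted_mb
        self.cache[tex_id] = size_mb
        self.used += size_mb

vram = TextureCache(budget_mb=11_000)       # e.g. an 11 GB card
for frame in range(1000):
    base = frame * 4                        # player moves, new assets stream in
    for tex_id in range(base, base + 40):   # ~2.5 GB actually touched per frame
        vram.request(tex_id, size_mb=64)

print(f"VRAM 'used': {vram.used} MB of {vram.budget} MB budget")  # near budget
```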
Depends upon your system; my Ryzen 2700/Vega FE system is faster in DX12 than DX11 in BF5, plus DX11 seems to be more erratic in frame rates. Nvidia also has better DX11 support, so AMD users just need to test which one works best for them.
Agreed.
Also, different aspects of any game can differ between the two APIs: low-content scenes (few objects, shaders, etc.) may excel under DX11's low-latency single-threaded submission, with a single thread controlling the draw calls, while DX12 will excel at highly complex scenes with tens of thousands of objects, using the combined strength of multiple cores. See the sketch below.
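A conceptual sketch of that threading difference (not real D3D code; record() is a hypothetical stand-in for the CPU cost of encoding draw calls):

```python
# Conceptual only: DX11-style single-threaded submission vs. DX12-style
# parallel command-list recording.
from concurrent.futures import ThreadPoolExecutor

def record(batch):
    # stand-in for validating state and encoding one batch of draw calls
    return [f"draw(object_{i})" for i in batch]

batches = [range(i * 10_000, (i + 1) * 10_000) for i in range(8)]

# DX11 model: one immediate context; a single thread feeds the GPU,
# so draw-call-heavy scenes bottleneck on that one core.
dx11_stream = [cmd for batch in batches for cmd in record(batch)]

# DX12 model: each worker records its own command list in parallel,
# and the finished lists are submitted to the queue together.
with ThreadPoolExecutor(max_workers=8) as pool:
    dx12_lists = list(pool.map(record, batches))

print(len(dx11_stream), sum(len(lst) for lst in dx12_lists))  # same total work
```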
The decision might be good for professional or content-creation situations, but it really seems like massive overkill for gaming. 12 GB would have made a lot more sense and would likely have allowed them to lower the price a little.
If AMD could cut the cost, wouldn't they have done that?
With AMD's poor DX11 support, I can see why DX12 might seem attractive in Frostbite, but in DX12 it's still slower than Nvidia's DX11.
And in the RE2 demo, DX11 was far and away better than DX12 for me; at least one other person mentioned it in the thread too.
With AMD's poor DX11 support, I can see why DX12 might seem attractive in Frostbite, but in DX12 it's still slower than Nvidia's DX11.
lol, or Nvidia DX12 just plain sucks. It is what it is. I have two videos processing that compare DX11 and DX12, short and sweet. They will be placed in this thread once done, if all is OK (should be the third video post):
Given that it's a game-to-game thing for DX12, not really. But AMD's DX11 (and DX10) drivers have struggled since their inception, of that there is no doubt.
Yes and yes, but sometimes no. For example, AMD's drivers support Far Cry 5 very well; it works near perfectly with CFX too. When AMD puts the effort in, they can deliver some good DX11 performance and efficiency.
Buildzoid put up a video talking about the card. Pretty good rundown of the card and some speculation on why the price is where it's at. As usual with Buildzoid he rambles at times, but he still does a good job providing a bit of a different take on the card compared to a lot of the tech press.
2: I really doubt AMD ever planned this as a data-center-only chip. AMD is all about using a chip in as many applications as possible.
So this is 60 CUs with 16 GB; I would love to see a 50 CU/8 GB version for $475.
Brand tattoos are just about the dumbest thing on the planet.