It's basically Skyrim with a fallout total conversion done, plus all the console port fun we've come to know in this industry.
Yes, everything is just a reskin of everything.
You don't.
No. It never was and never will be. At best that's misdirection in attempting to explain why AMD cards perform so poorly. You wanna know what the real problem is?
nVidia: Drivers released on launch day
AMD: Nothing yet
After all these years of slow driver releases and driver issues, AMD still can't get it right.
On a different note, thanks [H] for the review. I'm running it at 2560x1440 with everything on Ultra on my 980 Ti and it's awesome. I went with TAA and didn't bother with FXAA - after reading the review I'm going to have a look at FXAA tonight.
Remove them altogether and you have a DX9 title that may be dated by even 2012 standards.
Zarathustra[H] said: ...but then I took a nuke to the knee.
Not everyone is happy with ~60 FPS. Many appreciate a sharper and more fluid picture.
There is one issue that you failed to mention: we had never seen a game where an AMD card is 30-40% slower than a similarly priced nVidia card except when it has nVidia Gimpworks in it, and even then the graphics at least kinda offset the performance hit. But a game that looks like it was released in 2012 and runs miserably on AMD hardware only shows one thing: shoddy coding done on purpose to harm AMD hardware.
This is delusional. You really think Bethesda set out to piss off 20% of their customers? Why would they do that? That's just bad business. Maybe instead you should be asking AMD why they haven't bothered to release a game driver.
The grasping at straws and conspiracy theory is really getting ridiculous.
According to some, Godrays uses heavy tessellation in a fashion that often leaves the AMD tessellator underutilized; it may be a good idea to use AMD's CCC to dial it back to 16x and see how it performs. Something similar happened with Witcher 3 and Geralt's hair. It's frustrating that every time a game is released under the TWIMTBP banner it comes riddled with performance issues and bugs, while a game that uses none of that, or uses the AMD banner, runs even better on GeForce hardware.
Are you really fucking serious? I mean... seriously? Really?
Ok, so Bethesda deliberately set out to make their current flagship game play terribly on AMD cards, because fuck AMD and everyone who uses their cards. Yes, they might be paying customers of Bethesda and Bethesda would be shooting themselves in the face repeatedly by doing this, but fuck them all.
Seriously dude, lay off the whatever it is that you're taking that's causing the delusional paranoia. Reality is beckoning and it tastes good.
TI, probably not. GTX 980 though, sounds like you might need that to get 1440p to run well.
Dumb question... to use FXAA does it also need to be turned "on" under the nvidia control panel? Or is that just an override for programs that don't have built-in FXAA as an option?
Jonestown wasn't in the US.
You my friend need an education.
Staying on topic. Why not ask amd for a refund if you can list 10 plus games you are consistently getting fucked in. Maybe it's time to realize there's a common problem with one company.
Your use of "lame work" and stuff like that tells me you have reached not only Jonestown-level AMD Kool-Aid drinking but also a dash of Westboro to boot.
You really think a company will purposely fuck their market share?
wth are you talking about? You sound like a child speaking of things you can't even understand. You think high FPS rates are just about having 2x Titan X's water cooled? You think input lag is just about FPS? LOLin' really bad...
So you are concerned about vsync?.. So you never played any Fallout before, or New Vegas, or Skyrim? They have always used the same physics-tied-to-refresh-rate engine; that's why they use that refresh rate cap. It's not a 60fps cap as many say, the cap is the refresh rate of your monitor... Everything in those games has always been tied to refresh rate: motion, movement speed, physics, particles, etc. That's why the game behaves crazy when you disable vsync (.ini file) and the FPS goes all over the place...
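For reference, the .ini tweak being talked about is the same one Skyrim used, so treat the exact file and section as an assumption for FO4; it typically lives in Fallout4Prefs.ini:

```ini
[Display]
; 1 = vsync on (the default; keeps the engine's refresh-rate-tied physics behaving)
; 0 = vsync off (uncapped FPS, but expect the physics/movement glitches described above)
iPresentInterval=0
```

Setting it back to 1 restores the default capped behavior.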
On the other hand, thanks Brent for the preview, as always great... waiting for the full review.
...have always had to have vsync disabled in skyrim and new vegas to avoid objects jumping around for no reason.
Currently playing this at 1440p w/ a Phenom II X6 and a GTX 770 with no issues. Sub-60 in some areas but nothing too noticeable. Max settings besides low god rays.
Speaking of, what version of DX is FO4 using?

That's a surprisingly difficult question to answer. I haven't seen anything official that even mentions DirectX. The original engine would have been DX9, but they'd need at least DX10 for geometry shaders and/or tessellation. My guess would be DX11, as that's the most recent and widely supported version. However, the minimum requirements listed on Steam should all support DX12/Vulkan to the best of my knowledge. The game seems to run fine with mostly Ultra settings (a few rough areas aside) on a card even below the minimum recommended hardware.
As of writing this, SLI seems to be broken in Fallout 4. There is an interim fix; however, it works for some but not all. Simply use the NVIDIA Control Panel and change the SLI profile to "Alternate Frame Rendering 2."
The thing that gets me is the minimum system specs. Even though everyone is saying this could be a 2012 game, it definitely won't run on 2012 hardware. We tried a mobile FirePro card (3DMark equivalent to around an HD 7770 desktop) with low settings, and even at 720p the menu screen was very choppy. I can understand maybe needing a 7870 for, say, 1080p on low, but when an HD 5850 can't even run the game at 720p on low, there is no scaling taking place.
If I had the game I'd give a GTX 580 a shot, but chances are it probably won't run well even though there is no reason why it shouldn't.
That is Kyle holding up either my card or one of its sisters, furnished by ASUS. I'm the guy in the green t-shirt on the far right, just above the guy with the tan hat, right before I won it at the AMD [H]ard|OCP GamExperience on 7/16/11. That card does not meet the minimum spec for FO4, so I need to really, really upgrade my GPU LOL. Except for upgrading my system memory to 16GB and getting a new GPU, I think the rest of my system is good for now (I hope).
Why does this not happen in games that are more agnostic, like COD for example? Or even games that use the AMD banner, like BF4 or Tomb Raider or Ryse?