Some users have recently had their accounts hijacked. It appears the now-defunct EVGA forums were compromised, exposing passwords there, and many members were reusing the same password here. We suggest you UPDATE YOUR PASSWORD and TURN ON 2FA for your account here to further secure it. None of the compromised accounts had 2FA turned on.
Once you have enabled 2FA, your account will soon be updated to show a badge, letting other members know that you use 2FA to protect your account. This should be beneficial for everyone who uses FSFT.
Hardly. I work in IT, so I am very familiar with Linux. It is shit for anything besides tasks like server administration. It is certainly not an operating system for leisure activities like playing video games.
I would never be insane enough to try and bring that operating system into my...
Oh ffs. Now I get it. Stupid ASRock.
This was almost the perfect Z97 motherboard too. Now I have two useless ports when they could've just left it at a 10-SATA-port configuration like the Z87 Professional.
Guess I'll go throw two of these hard drives into the basement server.
So I had an ASRock Z87 Professional motherboard with 10 SATA ports that unfortunately just shit the bed.
Finding a replacement Z87 was too difficult, so I replaced it with a Z97 Professional.
Problem is, I was running ten hard drives in my machine using all 10 SATA ports on the...
AMD really should have used GDDR5 instead of HBM.
8 GB of GDDR5 instead of 4 GB of HBM would have made the product very competitive against the 980 Ti.
Just goes to show how poor AMD's engineering is these days.
SLI usually works when it's needed; it's only titles with low system requirements that lack SLI support, or the occasional AAA that is a poor port, like Dead Rising 3 or Batman: Arkham Knight.
It is as I expected. [H] confirms it. The 4 GB of VRAM on Fury X is crippling it:
http://www.hardocp.com/article/2015/06/24/amd_radeon_r9_fury_x_video_card_review/6#.VYuDv9pFuUk
This card should have had 8 GB of VRAM. If HBM couldn't handle it then AMD should have stuck with GDDR5 and held...
Dead Rising 3 is the other game I've played recently with no SLI support, even though it is a game that really needs it.
That said, SLI does work pretty well in almost every game I play that needs it. I mainly notice that indie titles with lower system requirements don't support SLI; all the AAAs with high...
I love Gameworks; I've never had an issue with a game using these features, and I consider it a positive when a title has them.
AMD fanboys are just in a perpetual state of butthurt that they don't get to enjoy this extra eye candy. If they don't get to enjoy it, they don't want anyone to enjoy...
1080p / 30 FPS.
This is "current gen gaming" according to WB.
That wasn't even "current gen gaming" six years ago! I was doing 1080p/60 FPS in every game back in 2009; now I've been doing 4K/60 FPS in every game for a year.
If you got this from a third-party reseller or through an Nvidia code and you can't get a Steam refund, at least write a negative review for it on its Steam page.
I want to see that 'Mostly Negative' turn into an 'Overwhelmingly Negative'
It runs like dogshit on 980 Ti's too.
But anyway, that's the entire purpose of Gameworks: to provide OPTIONAL, impressive eye-candy features that require significant GPU power, so that people who spent $650+ on their graphics cards can actually enjoy them.
The only people I see getting butthurt...
WB has historically had excellent PC releases so I don't understand what happened to them this year.
Ubisoft did a better job with Assassin's Creed Unity.
This game runs like shit on Nvidia and AMD hardware; get out AMD troll.
Quite frankly, Gameworks is the only good thing about this release. The smoke effects look amazing.
I don't understand why they would betray their PC audience like this; I thought this particular franchise sold millions of copies on PC. SteamSpy reports millions of copies sold for each of the "Arkham" games.
I made a video of it running on a machine with SLI 980 Ti's...
Actually, most people are ignorant of what these features do, but when you show them examples and poll them, most will respond negatively to them.
And there are plenty of people who agree with me that a game can be considered "maxed out" with that motion blur/DoF/CA shit turned...
Depth of field is shit, it looks like shit, and it has no business in video games. Why the hell would I want to turn that shit on and blur most of the objects on my screen? This cancer ruins console exclusive games because the peasant boxes don't allow you to turn it off like PC games do.
MORE...
Nope, they were maxed out. You can clearly see in the image quality menus that all options are set to maximum except for the retarded image-quality-ruining options like "depth of field," "film grain," and "motion blur." OF COURSE those were disabled; they should always be disabled. I don't consider a...
How many times do I have to repeat myself?
4K gaming is here, it's viable, and you can max out almost every game out there and maintain 60 fps.
I post videos giving definitive proof of this and they are ignored by butthurt AMD fanboys.
You lack reading comprehension; that's what I'm getting here.
The GTX 970's VRAM has never been acceptable for 4K gaming, nor has the card itself ever been acceptable as a 4K gaming card. Nobody has ever considered the GTX 970 an acceptable solution for 4K gaming. The GTX 970 has always been a...
Too much input lag is a problem, but anything under 40 ms is fine, so most televisions are fine as long as you turn on their game mode setting.
Add an adapter that stacks another 30 ms on top of that, though, and it becomes terrible and definitely noticeable.
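To spell out the arithmetic (the latency figures below are rough assumptions for illustration, not measurements of any particular TV or adapter): a TV in game mode sits under the 40 ms line on its own, but chain an adapter on top and the total blows well past it.

```python
# Rough input-lag arithmetic; all latency figures are assumed for illustration.
ACCEPTABLE_MS = 40        # rule-of-thumb ceiling for "feels fine"

tv_game_mode_ms = 25      # assumed latency of a typical TV with game mode on
adapter_ms = 30           # extra latency added by the adapter

for label, total_ms in [("TV alone", tv_game_mode_ms),
                        ("TV + adapter", tv_game_mode_ms + adapter_ms)]:
    verdict = "fine" if total_ms < ACCEPTABLE_MS else "too laggy"
    print(f"{label}: {total_ms} ms -> {verdict}")
```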
I made several videos of the latest games running on 980 Ti's as requested by posters here:
http://hardforum.com/showthread.php?t=1866004
As you can see, they hold 60 fps most of the time, as I said. You can also see that most new games exceed 4 GB of VRAM usage. Assassin's Creed...
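If you'd rather check VRAM usage on your own rig than take my word from the videos, something along these lines should do it (a rough sketch only; it assumes an Nvidia card with nvidia-smi on the PATH, and the one-second polling interval is arbitrary):

```python
# Rough sketch: poll VRAM usage once per second via nvidia-smi while a game runs.
# Assumes nvidia-smi is on the PATH; prints one value per GPU in MiB.
import subprocess
import time

def vram_used_mib():
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used", "--format=csv,noheader,nounits"],
        text=True,
    )
    return [int(line) for line in out.strip().splitlines()]

if __name__ == "__main__":
    try:
        while True:
            print("VRAM used (MiB per GPU):", vram_used_mib())
            time.sleep(1)
    except KeyboardInterrupt:
        pass
```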
So Fury X is faster than the 980 Ti at 4K, but the question is, does it also deliver stutter-free performance with that 4 GB of VRAM?
How does Fury X handle new textures having to be swapped in mid-gameplay because 4 GB cannot hold them all?
I have been monitoring how games perform on my GTX 980 Ti SLI setup. Nothing too special about it; the cards are on reference coolers and not overclocked at the moment, so there is headroom for even better performance (a rough way to crunch frame-time logs is sketched below).
There are a lot of people under the impression that 4K is still some futuristic...
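On the monitoring itself: if you capture per-frame times with something like FRAPS or PresentMon, a quick script can turn the log into an average FPS and a 1% low figure. This is just a sketch; the one-number-per-line milliseconds format is an assumption, so adjust it to whatever your capture tool actually writes.

```python
# Rough sketch: summarize a frame-time log into average FPS and 1% low FPS.
# Assumes the log is a text/CSV file with one frame time in milliseconds per line.
import csv
import sys

def load_frame_times_ms(path):
    with open(path, newline="") as f:
        return [float(row[0]) for row in csv.reader(f) if row]

def summarize(frame_times_ms):
    avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
    worst = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99):]  # slowest 1% of frames
    low_1pct_fps = 1000.0 / (sum(worst) / len(worst))
    return avg_fps, low_1pct_fps

if __name__ == "__main__":
    avg, lows = summarize(load_frame_times_ms(sys.argv[1]))
    print(f"Average FPS: {avg:.1f}   1% low FPS: {lows:.1f}")
```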
No shit, Sherlock. You don't think we use this cancerous standard by choice, do you?
If you want a display that wasn't designed for an ant (i.e., a TELEVISION), you need HDMI.
Let me know when they start including DisplayPort on reasonably sized displays and I'll be all in.
They don't exist.
I saw >60 FPS average framerates for 980 Ti 2-Way SLI in that table for every game except GTA V and The Witcher 3, and that's even WITH these guys stacking on retarded overkill settings (4x MSAA ON TOP OF running at 4K resolution? Are you serious? That's only something you do...
You mean that people interested in the latest, most expensive, fastest graphics card from a company are ahead of the curve when it comes to what display they are gaming on?
I am SHOCKED, truly SHOCKED by this revelation!
Hope you like Panasonic... and only Panasonic. Because they are...
Most people buying into 4K screens are buying a screen where the increased resolution actually matters -- a real 4K TV, not a TV for ants (a monitor). You don't have to be a neckbeard who crouches over a tiny display at a desk in mom's basement to be a "real gamer." A lot of us have living...
You are mistaken. I really do not like Nvidia. I want AMD to be successful so Nvidia's monopoly is broken. But I cannot bring myself to purchase an AMD product because they keep making so many stupid decisions. Even though I buy Nvidia I encourage others to buy AMD when possible. Most do not...
The problem with AMD's drivers isn't necessarily stability; it is the frequency at which they are updated (Nvidia often has SLI profiles out the day a game is released, whereas you have to wait for Crossfire profiles; additionally, games seem to support SLI better than Crossfire) and their...
This isn't a "little detail."
When a company's flagship GPU is marketed for 4K yet has insufficient VRAM and cannot output 4K @ 60 Hz, calling that a "little detail" is like calling a car that is missing its wheels a "minor issue."
There are WAAAAY more 4K TVs than there are 4K TVs for ants (monitors).
4K monitor usage is the niche scenario.
4K TVs are the new standard for PC gaming displays. Releasing a new GPU without HDMI 2.0 is incredibly stupid.