What's the advantage of the NVIDIA cards over the Radeon cards?

Discussion in 'AMD Flavor' started by adobian, Mar 16, 2017.

  1. tangoseal

    tangoseal [H]ardness Supreme

    Messages:
    4,622
    Joined:
    Dec 18, 2010
no...just check the forum thread where the Zen 1800X performs badly on NVIDIA. Then 480s were used and boom, insta 7700K performance.

You need to read and watch the video.
     
  2. Pusher of Buttons

    Pusher of Buttons [H]ard|Gawd

    Messages:
    1,192
    Joined:
    Dec 6, 2016
    It's really not that hard.

As everyone else has said, AMD has no real competition for NVIDIA above the 1060 right now. I'm hoping to hell that changes with Vega, but right now that's the reality.

The RX480 and 1060 trade blows. I have both. I notice some differences between them, and the machines they're in are somewhat similarly specced....the RX480 is in an i5 Haswell desktop and the 1060 in a Skylake i7 laptop....they both struggle a bit at max settings on my 2560x1080 monitor, but both are great 1080p cards.

Technology wise, AMD cards are big V8 muscle cars....they have a TON of raw processing power (hence their use in bitcoin mining) but they're not nearly as well optimized as NVIDIA cards. On paper the RX480 should blow the 1060 out of the water, it's closer to the 1070 in TFLOPS and has more and FASTER VRAM....but it's not effectively any better in actual usage. With that in mind....the RX480 has gotten progressively better as newer drivers and optimizations have come out, whereas the 1060 has been kind of in the same spot.
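The "on paper" gap above is easy to sanity-check with back-of-the-envelope math. A rough sketch (the shader counts and boost clocks below are reference-card specs; board-partner clocks vary):

```python
# Rough single-precision throughput: shaders * 2 ops/clock (FMA) * clock.
def tflops(shaders, boost_ghz):
    return shaders * 2 * boost_ghz / 1000

rx_480 = tflops(2304, 1.266)    # ~5.8 TFLOPS
gtx_1060 = tflops(1280, 1.708)  # ~4.4 TFLOPS
gtx_1070 = tflops(1920, 1.683)  # ~6.5 TFLOPS

# On paper the RX 480 really does sit closer to the 1070 than the 1060.
print(f"RX 480 {rx_480:.1f} / GTX 1060 {gtx_1060:.1f} / GTX 1070 {gtx_1070:.1f}")
```

Which is exactly the muscle-car point: a lot of theoretical throughput that doesn't always translate into frames.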

It's kind of the old hat with AMD vs Intel or NVIDIA....they can bring a lot of power to the race but they have trouble actually putting it to use. I'd actually argue that with Ryzen, and theoretically with Vega, AMD might actually have the better tech, but until it's properly supported that doesn't mean too much. Long story short....check the benchmarks that are relevant to you....buy the cheapest thing that does what you want it to do. There's not really enough difference to justify spending more money on either just for the brand...in my opinion of course. AMD stuff -seems- to have a leg up in DX12, and there's a good chance that Vega could be a monster in DX12 if they don't muck it up, but I wouldn't personally make that the deciding factor in this generation of cards.
     
    cybereality likes this.
  3. Wyodiver

    Wyodiver Gawd

    Messages:
    734
    Joined:
    Aug 15, 2004
  4. Pusher of Buttons

    Pusher of Buttons [H]ard|Gawd

    Messages:
    1,192
    Joined:
    Dec 6, 2016
I would have called you a loser with your low frame rates while rocking my 9700 Pro throwing 200 FPS in CS....of course I was young and drank a lot back then lol

    That whole series of ATI cards were beasts, though.
     
    Armenius likes this.
  5. NeoNemesis

    NeoNemesis [H]ard|Gawd

    Messages:
    1,831
    Joined:
    Mar 10, 2004
    Biggest advantage right now is that Nvidia cards seem to be having an easier time running BotW on CEMU.



    That was an amazing card. Had that for years until the fan died. Never felt that way about a video card again.
     
    Wyodiver likes this.
  6. lolfail9001

    lolfail9001 [H]ard|Gawd

    Messages:
    1,201
    Joined:
    May 31, 2016
    And check the forum too, and see that it happens in exactly 1 game, and only with something faster than 1070/980 Ti... So with no AMD card except crossfire configs.
     
  7. Wyodiver

    Wyodiver Gawd

    Messages:
    734
    Joined:
    Aug 15, 2004
NeoNemesis said:

    "That was an amazing card. Had that for years until the fan died. Never felt that way about a video card again."

That was my favorite card. But I can't forget the $300 credit card charge for my first 32MB GeForce card.
     
  8. bamavooHF

    bamavooHF n00bie

    Messages:
    25
    Joined:
    Mar 30, 2017
So I plan on staying at 1080p for a while. Should I grab a 1070 and dump my R9 390?
     
  9. rgMekanic

    rgMekanic [H]ard|Gawd

    Messages:
    2,000
    Joined:
    May 13, 2013
With Nvidia you can update Facebook after you log in to download drivers :)
     
    cybereality and Armenius like this.
  10. Archaea

    Archaea [H]ardness Supreme

    Messages:
    5,606
    Joined:
    Oct 19, 2004
buy a freesync monitor first. :)
you might consider a 34" 2560x1080 with freesync (flatscreen); one can be picked up for $300. Freesync will breathe new life into your 390, and you'll probably enjoy the new display.
     
    Nihilus1 and Maddness like this.
  11. bamavooHF

    bamavooHF n00bie

    Messages:
    25
    Joined:
    Mar 30, 2017
    I run a 32" 1080p now. That sounds like a good move. May have to look into those.
     
    Stitch1 likes this.
  12. WorldExclusive

    WorldExclusive [H]ardForum Junkie

    Messages:
    10,559
    Joined:
    Apr 26, 2009
    Nvidia's advantage?

    Putting product on shelves.
     
    Rizen, Armenius, Ripskin and 7 others like this.
  13. Hameeeedo

    Hameeeedo n00bie

    Messages:
    7
    Joined:
    May 27, 2016
    NVIDIA is better because:

    1-Better VR performance
    2-Better 3D support (3D vision)
3-Better exclusive image quality enhancements: PhysX, CUDA (WaterWorks), better shadows (VXAO, HFTS), forced HBAO+ through driver, TXAA, MFAA.
    4-Ansel Support
    5-Support for Adaptive V.Sync, Fast Sync, Adaptive Half Refresh Rate
    6-Better performance on complex geometry and Tessellation
    7-Better performance in DX11/OpenGL games and in CPU limited situations (such as using a low end CPU), their DX11/OpenGL driver overhead is small.
    8-Better power consumption
    9-Reliable performance in obscure and less famous games because NV can optimize massive numbers of titles due to their big R&D.
    Divinity Original Sin 2
    980Ti is 20% faster than FuryX @1080p (and 17% faster @1440p), 980 is almost as fast as FuryX @1080p!
    http://gamegpu.com/rpg/роллевые/divinity-original-sin-2-test-gpu

    Obduction
980Ti is 55% faster than FuryX @1080p (and 30% faster @1440p), the 780Ti delivers the same fps as FuryX @1080p!
    http://gamegpu.com/rpg/роллевые/obduction-test-gpu

    ABZU
    980Ti is 52% faster than FuryX @1080p, (and 30% faster @1440p), 980 is 17% faster than FuryX @1080p as well!
    http://gamegpu.com/action-/-fps-/-tps/abzu-test-gpu

    War Thunder
980Ti is 15% faster than FuryX @1080p; @1440p it is 25% faster. The 980 is nearly as fast as FuryX @1080p as well!
    http://gamegpu.com/mmorpg-/-онлайн-игры/war-thunder-1-59-test-gpu

    The Technomancer
980Ti is 25% faster than FuryX @1080p (and 17% faster @1440p)! The 980 is equal to it @1080p as well!
    http://gamegpu.com/rpg/роллевые/the-technomancer-test-gpu

    Firewatch
980Ti is 25% faster than FuryX @1080p and @1440p; the 980 is almost as fast @1080p as well!
    http://gamegpu.com/action-/-fps-/-tps/firewatch-test-gpu.html

    Dragons Dogma Dark Arisen
    980Ti is 24% faster than FuryX @4K and @1440p, 980 is as fast as it as well @1440p! (@1080p all cards are CPU limited).
    http://gamegpu.com/rpg/rollevye/dragons-dogma-dark-arisen-test-gpu.html

    Homeworld: Deserts of Kharak
    980Ti is 20% faster than FuryX @4k, though it drops to just 15% @1440p! (@1080p all cards are CPU limited).
    http://gamegpu.com/rts-/-strategii/homeworld-deserts-of-kharak-test-gpu.html

    Crossout
    980Ti is 46% faster than FuryX @1080p, (and 32% faster @1440p), even 980 is faster @1080p!
    http://gamegpu.com/mmorpg-/-онлайн-игры/crossout-test-gpu

    Conan Exile
980Ti is 45% faster than FuryX @1080p, and 28% faster @1440p! The 980 is as fast as FuryX at both resolutions!
    http://gamegpu.com/mmorpg-/-онлайн-игры/conan-exiles-test-gpu

    ARK Survival
980Ti is 25% faster than FuryX @1080p, the only resolution that matters.
    http://gamegpu.com/mmorpg-/-онлайн-игры/ark-survival-evolved-test-gpu

Styx: Shards of Darkness
    980Ti is 36% faster than FuryX @1080p, 34% faster @1440p, and 22% faster @4K! Even a regular 980 is almost as fast as FuryX @4K!
    http://gamegpu.com/rpg/роллевые/styx-shards-of-darkness-test-gpu

Ultimate Epic Battle Simulator
    980Ti is more than 60% faster than FuryX @1080p and 50% faster @1440p, the regular 980 is ahead of the FuryX!
    http://gamegpu.com/rts-/-стратегии/ultimate-epic-battle-simulator-test-gpu

    Escape from Tarkov
    980Ti is more than 35% faster than FuryX @1080p and 1440p! The regular 980 is slightly ahead of the FuryX as well!
    http://gamegpu.com/mmorpg-/-онлайн-игры/escape-from-tarkov-alpha-test-gpu

    OutLast 2
    980Ti is 44% faster than FuryX @1080p and 27% faster @1440p, even the 780Ti is ahead of the FuryX!
    http://gamegpu.com/action-/-fps-/-tps/outlast-2-test-gpu

    Inner Chains
    980Ti is 50% faster than FuryX @1080p and 37% faster @1440p, even the 970 is ahead of the FuryX!
    http://gamegpu.com/action-/-fps-/-tps/inner-chains-test-gpu-cpu

    Assassin's Creed Syndicate
    980Ti is 24% faster than FuryX @1080p! 21% faster @1440p! 980 is almost equally as fast!
    http://gamegpu.com/action-/-fps-/-tps/assassin-s-creed-syndicate-test-gpu-2015.html

    Mad Max
    980Ti is 23% faster than FuryX @1080p, 18% faster @1440p!
    http://gamegpu.com/action-/-fps-/-tps/mad-max-test-gpu-2015.html

    Call Of Duty Modern Warfare Remastered
980Ti is a whopping 72% faster than FuryX @1080p; even a 970 is faster than FuryX! @1440p the advantage collapses to 25%, and the regular 980 is equal to FuryX!
    http://gamegpu.com/action-/-fps-/-tps/call-of-duty-modern-warfare-remastered-test-gpu

    Battleborn
    980Ti is 30% faster than FuryX @1080p and 1440p, 980 is equally as fast as the FuryX!
    http://www.pcgameshardware.de/Battleborn-Spiel-54612/Specials/Benchmark-Review-1194406/

    Homefront: The Revolution
    980Ti is 34% faster than FuryX @1080p, and 23% faster @1440p
    http://www.overclock3d.net/reviews/gpu_displays/homefront_the_revolution_pc_performance_review/7
    http://www.pcgameshardware.de/Homefront-The-Revolution-Spiel-54406/Tests/Benchmarks-Test-1195960/

    Dead Rising
    GTX 1060 is 44% faster than RX 480 @4K! (@1080p and @1440p all cards are CPU limited).
    http://www.overclock3d.net/reviews/gpu_displays/dead_rising_pc_performance_review/6

    Ghost Recon Wildlands
    980Ti is 28% faster than FuryX @1080p and @1440p!
    http://gamegpu.com/action-/-fps-/-tps/ghost-recon-wildlands-test-gpu

    Forza Horizon 3
    980Ti is over 50% faster than FuryX @1080p! 40% faster @1440p, even a 980 and a 1060 are faster than FuryX here!
    http://gamegpu.com/racing-simulators-/-гонки/forza-horizon-3-test-gpu

    Mass Effect Andromeda
980Ti is 23~40% faster than FuryX @1080p, and 17~30% faster @1440p! FuryX is barely faster than a 480 or 1060.
    http://gamegpu.com/action-/-fps-/-tps/mass-effect-andromeda-test-gpu
    http://www.pcgameshardware.de/Mass-...55712/Specials/Mass-Effect-Andromeda-1223325/

    Anno2205
    980Ti is more than 30% faster than FuryX @1080p and 1440p!
    https://www.computerbase.de/2017-03/geforce-gtx-1080-ti-test/2/#diagramm-anno-2205-1920-1080
    https://www.techpowerup.com/reviews/Performance_Analysis/Anno_2205/3.html
    http://www.pcgameshardware.de/Anno-2205-Spiel-55714/Specials/Technik-Test-Benchmarks-1175781/
    http://www.guru3d.com/articles_pages/anno_2205_pc_graphics_performance_benchmark_review,7.html

    Dying Light
    980Ti is 40% faster than FuryX @1080p and 27% faster @1440p, the regular 980 is slightly ahead of the FuryX
    http://gamegpu.com/action-/-fps-/-tps/dying-light-the-following-test-gpu.html
    https://www.overclock3d.net/reviews...owing_pc_performance_review_-_amd_vs_nvidia/7

    AMD is currently slightly better under DX12 though.
     
    Last edited: May 20, 2017 at 7:39 PM
  14. Archaea

    Archaea [H]ardness Supreme

    Messages:
    5,606
    Joined:
    Oct 19, 2004
    And yet...

    a posse of Fury cards is to be feared. :)


    1440p
[IMG]

    4k
[IMG]
     
    Nihilus1 likes this.
  15. silent-circuit

    silent-circuit [H]ardForum Junkie

    Messages:
    16,177
    Joined:
    Sep 18, 2005
    No OC on those cards... Since the Nvidia cards will OC, and the Furies... Won't. Try again Archaea. ;)
     
  16. Bahanime

    Bahanime Limp Gawd

    Messages:
    366
    Joined:
    Sep 27, 2011
    Didn't [H] test this thoroughly already?

Kyle even expected the Fury X to flop due to its 4GB, but it did not. It kept pace with the Titan X Maxwell, which has 12GB. o_O

    I think at this point, people need to stop treating HBM as equal to GDDR in capacity, because that's not how it works with AMD's driver optimizations.
     
    Archaea likes this.
  17. Nihilus1

    Nihilus1 Limp Gawd

    Messages:
    272
    Joined:
    Jun 7, 2015
Overclocking in a dual setup is never that fun unless you have a rear blower. I can't imagine tri and quad setups would be too great due to all that extra heat.

    More people seem to like freesync.
     
  18. Bahanime

    Bahanime Limp Gawd

    Messages:
    366
    Joined:
    Sep 27, 2011
Two open air cards in SLI on an open bench test rig looks OK, but as soon as they're inside a closed case, those open-air coolers dump 250W each right on top of each other... ew.
     
  19. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    4,501
    Joined:
    Feb 22, 2012
    I had both. The Titan X Maxwell was in a different league than the Fury X. The VRAM is definitely limiting in some games. Never mind the build quality for the Fury X was garbage.

I just did a quick search, but here's BF4, which generally leans AMD, and the Fury X got crushed against an OC'd 980Ti (and even a stock one). Never mind VR performance...

    https://m.hardocp.com/article/2015/08/11/gigabyte_gtx_980_ti_g1_gaming_video_card_review/9

You really have to compare an OC'd Titan as well (essentially the same as the 980Ti above). By default the Titan X's profile is super conservative.
     
  20. Archaea

    Archaea [H]ardness Supreme

    Messages:
    5,606
    Joined:
    Oct 19, 2004
    Build quality is garbage? Huh?

I have an MSI 980ti at work that I installed on a forensic box, and I have a Fury X pair at home. I've not had my hands on a 1070 or 1080 card, but there's nothing "garbage" whatsoever about the Fury X construction that I've seen, compared to any card I've used in 20 years of GPU purchases. Very solid feel, very nice rubbery texture, nice LED logo, and arguably the best quality stock radiator fan currently available. It's not a partner quality issue either, because the Fury X cards are all identical save the sticker on the radiator fan. I've seen multiple pro reviews praise the build quality of the card as an example of how to do it right.

    But hey it's AMD, throw any random insult you can think of and let's see what sticks!!!

[IMG]
     
    Last edited: May 19, 2017
  21. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    4,501
    Joined:
    Feb 22, 2012
No.. there were widespread quality issues. Google Fury X fan whine or pump whine. I had to replace the fan on mine, and now the pump sounds like a cheap fish tank air pump.

    It would also heat the metal on my 250D to the point where it burns me. I resolved that by changing the temp curve.

    It's not seeing what "sticks", it's reality. I could go down my basement where this loud POS resides and video it.

Not too surprising overall. OEMs dropped from their normal 3 yr warranty to 1 year for this card. Some sites showed it getting hot. The red flags were there but curiosity got the better of me.
     
  22. Archaea

    Archaea [H]ardness Supreme

    Messages:
    5,606
    Joined:
    Oct 19, 2004
Sooo it has a three year warranty, and it can't be much over two years old based on release date, so send it in. Pump noise and coil whine are both covered under warranty. Or did you do something to the card that voided the warranty and potentially caused the issue yourself?
     
  23. Archaea

    Archaea [H]ardness Supreme

    Messages:
    5,606
    Joined:
    Oct 19, 2004
    Somehow I missed that you had a 32" already. You may find a 34" widescreen disappointingly short vertically coming from a 32". A 34" widescreen has the pixel height of a 27" 16:9.
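The height comparison above is simple geometry: the diagonal and aspect ratio give the physical height. A quick sketch to check it:

```python
import math

def panel_height(diagonal_in, aspect_w, aspect_h):
    """Physical panel height in inches, from diagonal and aspect ratio."""
    return diagonal_in * aspect_h / math.hypot(aspect_w, aspect_h)

h_34_ultrawide = panel_height(34, 21, 9)  # ~13.4"
h_27_16x9 = panel_height(27, 16, 9)       # ~13.2"
h_32_16x9 = panel_height(32, 16, 9)       # ~15.7"
```

So a 34" 21:9 is within a couple tenths of an inch of a 27" 16:9 in height, and over two inches shorter than a 32" 16:9.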
     
  24. TheLAWNoob

    TheLAWNoob Limp Gawd

    Messages:
    291
    Joined:
    Jan 10, 2016
    Sending your card in doesn't guarantee you a quiet card.

    If you warranty it now, you'd probably get a card that's in much worse condition than the one you send in.

    Or maybe they'll just send you a RX 580 and call it a day.
     
  25. Bahanime

    Bahanime Limp Gawd

    Messages:
    366
    Joined:
    Sep 27, 2011
BF4 never leaned AMD unless you ran it in Mantle mode. In DX11, BF4 has always performed better on NV GPUs.

    Battlefront DX11 did lean AMD though, and BF1 is neutral.
     
  26. Stitch1

    Stitch1 n00bie

    Messages:
    14
    Joined:
    Dec 12, 2016
    Or wait a few weeks and they might send you a Vega card.

I have owned a Fury X for all of about two weeks now. I recently sold off a 1070, then bought a 580, and then found a great deal on a Fury X. In my setup (2560x1080 w/FreeSync) the Fury X makes a lot of sense. I can max out all the games I play, turn off V-Sync, lose the lag and mouse issues that come with it, and everything plays smooth as butter. Yes, the 1070 was the more powerful card. However, I was locking it down with V-Sync, so most of its power was lost on me anyway. Yes, I took a bit of a loss when selling the 1070, but I gained more back by getting a Fury X for way cheap. Yes, it's used, but I have zero issues with it. It's dead silent and never gets hotter than about 56C.

Originally, I was going to upgrade to Vega. However, now that I have the Fury, I may wait it out until I upgrade my system again, or until games start to come out that I can no longer take full (noticeable) advantage of while playing within my FreeSync range. With the 1070 I would probably have upgraded sooner if games started dropping below V-Sync; set at 75Hz I was already experiencing that, though set at 60Hz it was still hanging in there just fine. *Side note to that: games like Witcher 3 at 75Hz would drop out of V-Sync and play like crap, but with FreeSync running it without V-Sync the FPS does jump around, yet it is not noticeable at all. Unless I have the FPS counter on the screen I wouldn't be any the wiser that it's changing all the time.

Now, I will say Nvidia does have units on shelves and they have a really nice lineup. But their dang G-Sync monitors are just too expensive to justify. I bought my LG 34" 2560x1080 monitor for $250 on Black Friday. At the same time, the closest G-Sync monitor I was looking at was around $800 at the same size and resolution. For that kind of price difference the choice was clear. At that same time the 32" HP Omen 1440p monitor was also on sale for less than $300. That was another really great deal.
     
    Archaea likes this.
  27. Sycraft

    Sycraft 2[H]4U

    Messages:
    3,932
    Joined:
    Nov 9, 2006
Also, something I haven't seen mentioned that isn't relevant to a lot of users but is worth noting: OpenGL performance, features, and stability. nVidia's DX and GL drivers are equally good and fast. AMD has always had trouble on the GL front for some reason. So if OpenGL is your thing, either because you are playing one of the few GL games, doing pro work, or running Linux, then nVidia really is the card for you. Of course if you don't use OpenGL, then it doesn't matter. Also, this doesn't seem to apply to Vulkan (though there is little enough of that it is hard to tell how it'll go), so it may not be an issue in the long run.
     
  28. Archaea

    Archaea [H]ardness Supreme

    Messages:
    5,606
    Joined:
    Oct 19, 2004
I have a PhD friend and tech nerd who is a Linux lover, and he says AMD is significantly better supported in Linux than Nvidia. I don't use Linux, I haven't researched it, I don't know; I'm just relaying what he said. He bought a Fury X of his own accord with that explanation. He doesn't game as much as he does productivity stuff, though.
     
    Last edited: May 20, 2017 at 9:45 AM
  29. NIZMOZ

    NIZMOZ Gawd

    Messages:
    950
    Joined:
    Oct 23, 2007
nvidia cards have always performed better than amd/ati cards. They've also had better driver support.
     
  30. Algrim

    Algrim Gawd

    Messages:
    681
    Joined:
    Jun 1, 2016
You really don't use OpenGL outside of games or rendering-based software. If you don't do either, you can use the default Linux driver on most distros without issue, regardless of whether it's an AMD, nVidia, AGP, or iGPU. I've been running Linux since before kernel 1.0, and it's become trivially easy to use the graphics stack for all but the most esoteric setups. (I don't have a Ph.D., but if you need one to run Linux you're doing it wrong. Nice appeal to authority, by the way...)
     
    Last edited: May 20, 2017 at 6:20 PM
    Factum and razor1 like this.
  31. SpeedyVV

    SpeedyVV [H]ardness Supreme

    Messages:
    4,240
    Joined:
    Sep 14, 2007
    This is really sad.

    Did the world stop being able to have an intelligent conversation?

    We are talking about video cards, something so objective.

    OP question should lead to n reasons. End of thread.

Yet it is like a bunch of little bitches arguing whether the White House is red or green!
     
  32. chenw

    chenw 2[H]4U

    Messages:
    3,246
    Joined:
    Oct 26, 2014
    I vote brown.
     
    Dayaks likes this.
  33. rgMekanic

    rgMekanic [H]ard|Gawd

    Messages:
    2,000
    Joined:
    May 13, 2013
    You forgot the /s
     
  34. crazycrave

    crazycrave [H]Lite

    Messages:
    83
    Joined:
    Mar 31, 2016
I've been around video cards since 2003 (MX 420). I ran my first SLI rig with 6800GS cards and next was EVGA 7900GTO SLI, as back then you upgraded every 4 to 6 months to keep resale value up. The list of cards I have owned would be long, but the 9700 Pro was the king once, and AMD blew our minds with the HD5870/HD7970/290X..

Now I feel AMD has given me the best bang for my buck over the years, as my oldest card, the HD7950, can still do 4K 60Hz via Club 3D, or I can run it in CrossFire with whatever I want, like a 7990, and it's still DX12 ready.. heck, I can fire my Eyefinity setup to life again and play iRacing..

The 290X is my gaming card for now, until the day comes that it can no longer handle today's games; as it is now, it's faster than the GTX 980 in most games. And I love Eyefinity because the card was designed for it, with the 512-bit memory bus bandwidth that really shows up when demand is placed at, say, 5760x1080.. I think AMD now has a driver team in place that can go back and still pull more performance out of GCN.. so my next planned upgrade is AMD Ryzen.

I am still waiting for Nvidia to show me how to SLI a 1060/70/80 together in working form. I have run both camps for years and this is the biggest issue with Nvidia. I also want to see head-to-head drivers, as I know Nvidia has been caught before running softer driver settings for benchmarks than what AMD runs under stock driver settings.
     
  35. sparks

    sparks 2[H]4U

    Messages:
    2,957
    Joined:
    Jun 19, 2004
    the biggest advantage nvidia has over amd is you can buy one.
     
    razor1, jologskyblues and KazeoHin like this.
  36. {NG}Fidel

    {NG}Fidel [H]ardness Supreme

    Messages:
    5,330
    Joined:
    Jan 17, 2005
Another area nvidia seems to shine is emulating old games. OpenGL obviously helps. I'm not talking about piracy either, but rather about playing old games you own well after the system died, after you wanted to clear space, or to get better visuals on old titles you own. Not saying AMD can't be used, though.

    Also I've owned both. Right now AMD has not earned my dollar but they have many times in the past.
     
  37. JustReason

    JustReason [H]ard|Gawd

    Messages:
    1,688
    Joined:
    Oct 31, 2015
Maybe I am missing something, but given the games are old, I am not sure how having Nvidia is necessarily a boon over AMD. Seriously, most of those titles have no need for extremely high framerates; most were not even fluid enough in motion that one would notice the difference between 30 fps and 60 fps, much less 200 fps. I play a host of old games, Wizardry 8 right now on an HD7770, and I'm not sure it matters whether it's AMD or Nvidia. Even my 290 plays them all just fine. Emulators for PS1/PS2, SNES, Sega and such, using graphics plugins for highly increased visuals, again run perfectly fine as well. Granted, I do cringe a bit when I see that TWIMTBP intro, but so far no game I have that's more than 5 years old exhibits any issues.

    Granted I haven't played everything, using every emulator known, so it is possible I am just missing that particular issue you speak of.
     
  38. Archaea

    Archaea [H]ardness Supreme

    Messages:
    5,606
    Joined:
    Oct 19, 2004
    nvidia always brings me coffee in the morning, meanwhile AMD wets the bed.

    Well that's how some of these posts read anyway. LOL.
     
  39. JRUHg

    JRUHg [H]Lite

    Messages:
    90
    Joined:
    Jan 5, 2016
nvidia's raw performance? g-sync + ulmb. :hungry:

amd's dual gpu. eyefinity.
     
  40. {NG}Fidel

    {NG}Fidel [H]ardness Supreme

    Messages:
    5,330
    Joined:
    Jan 17, 2005
Many emulators have better accuracy and performance on nvidia. I can get good results on my AMD card, but not as good as on my nvidia. This has been true generation after generation. OpenGL, man.