Nvidia Killer

Discussion in 'Video Cards' started by Stoly, Aug 9, 2019.

Thread Status:
Not open for further replies.
  1. funkydmunky

    funkydmunky 2[H]4U

    Messages:
    2,327
    Joined:
    Aug 28, 2008
A week per frame? So with that math it took roughly 3,229 years to render all 168,480 frames of the 117-minute movie.
    EDIT
    * I see it has been tackled already*
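The arithmetic above can be sanity-checked in a couple of lines. This is a quick sketch; the 24 fps frame rate and 117-minute runtime are the assumptions:

```python
# Back-of-the-envelope check of the render-time quip above.
# Assumes 24 fps and a 117-minute runtime.
MINUTES = 117
FPS = 24
WEEKS_PER_YEAR = 365.25 / 7  # ~52.18

frames = MINUTES * 60 * FPS       # total frames in the movie
years = frames / WEEKS_PER_YEAR   # at one render-week per frame

print(frames)        # 168480
print(round(years))  # 3229
```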
     
  2. 5150Joker

    5150Joker 2[H]4U

    Messages:
    3,090
    Joined:
    Aug 1, 2005
    I think we all know the answer to the above. ;)
     
  3. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    17,819
    Joined:
    Jan 28, 2014
    I don't know exactly what you're referring to, but both NVIDIA and AMD use delta color compression.
     
  4. MangoSeed

    MangoSeed Limp Gawd

    Messages:
    427
    Joined:
    Oct 15, 2014
    And it’s lossless.
     
  5. N4CR

    N4CR 2[H]4U

    Messages:
    3,784
    Joined:
    Oct 17, 2011
I never made a big fuss about async; I think you have me confused with someone else.

I want to play it when it's ready. Right now, three games plus expensive hardware with a lacking implementation isn't enough to make it worthwhile yet, as I already wrote. To each their own.
     
    Gamer X, kirbyrj and RamboZombie like this.
  6. RamboZombie

    RamboZombie [H]Lite

    Messages:
    90
    Joined:
    Jul 11, 2018

    Wonder which comes first..

    2077 or 3080..

3080 would be hilarious :) That would make it a very short-sighted decision to buy a first-gen RTX card for 2077. Keanu needs those sweet rays to shine bright!
     
    Last edited: Aug 13, 2019 at 3:08 AM
  7. Mylex

    Mylex Limp Gawd

    Messages:
    147
    Joined:
    Aug 30, 2018
LMAO, if they launch a 3080 before CP2077 I would die laughing. Unfortunately, if the performance is a large jump over a 2080 Ti, I would buy it, or the Ti version if they launch both.
     
    IdiotInCharge likes this.
  8. amenx

    amenx Limp Gawd

    Messages:
    304
    Joined:
    Dec 17, 2005
Isn't that the way it works for all GPUs? Bad timing has always been the curse of consumers on both sides. I wonder how R VII gamers felt when their cards went EOL just a few short months later, with the 5700 XT equaling them at a much reduced price.
     
    IdiotInCharge likes this.
  9. kirbyrj

    kirbyrj [H]ard as it Gets

    Messages:
    24,310
    Joined:
    Feb 1, 2005
The Radeon VII was the least gaming-focused of any gaming card released recently, as it's a rebadged Instinct card. Anyone who did their homework before buying would have known its strengths and weaknesses across various workloads, and already knew Navi was incoming within 5-6 months.

    It does have good value for certain workloads compared to workstation cards. For gaming, not so much compared to Navi.
     
    Maddness likes this.
  10. Gamer X

    Gamer X Limp Gawd

    Messages:
    270
    Joined:
    Jul 5, 2019
That is the exact reason the MI50/Radeon VII went EOL: Vega 20 at 7nm is not as powerful as Navi 10 in games, even if it is smaller.

That is all Turing is, too: a rebranded, burnt-out enterprise card... it is not a gamer card, like RDNA is.


Nvidia's Ampere has to be its own thing, or it will be a flop upon arrival if it's nothing other than a Jensen hand-me-down chip from the compute/server world. It would be a fail for gamers if that is what Nvidia has in store for Ampere. That is why "big Navi" won't need much more die space to beat the 2080 Ti in games.
     
    RamboZombie likes this.
  11. amenx

    amenx Limp Gawd

    Messages:
    304
    Joined:
    Dec 17, 2005
Look at the reviews of the R VII: nearly all were done from a gaming context. I'll bet more people bought them for gaming than for their other functions. AMD positioned it for gaming to begin with.
     
    Maddness and IdiotInCharge like this.
  12. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    17,819
    Joined:
    Jan 28, 2014
    6 + 1
1. Assetto Corsa Competizione
    2. Battlefield V
    3. Metro Exodus
    4. Quake II RTX
    5. Shadow of the Tomb Raider
    6. Stay in the Light
    +
    1. Wolfenstein: Youngblood (in future update)
    Control releases in two weeks with ray tracing at launch, making it 7 + 1.
     
    IdiotInCharge and Factum like this.
  13. RamboZombie

    RamboZombie [H]Lite

    Messages:
    90
    Joined:
    Jul 11, 2018
And how long has the RTX series been available? It's still a very meager showing...

Hoping that Control is fantastic; my RTX card really needs it to be!
     
  14. Snowdog

    Snowdog [H]ardForum Junkie

    Messages:
    8,979
    Joined:
    Apr 22, 2006
How many developers will want to jeopardize their schedules to kludge new technology into games already in progress?

And how long does it take to develop a new game, so you can plan from the beginning to include ray tracing?
     
  15. ChadD

    ChadD 2[H]4U

    Messages:
    4,067
    Joined:
    Feb 8, 2016
Well, as some claim, NV has been working on ray tracing for years and years, right?

They also have hands down the best developer support program.

So shouldn't those two things together mean that when Jensen held that card up... everyone in the industry should have had games and patches ready to drop?

As I see it there are only really a few possibilities.
1) NV had NOT been working on it for years and years... but shoehorned it in so their tensor cores wouldn't completely sit there looking stupid.
2) NV's software program isn't as good as we think. (I don't think that is true; NV has a great developer support apparatus.)
3) NV, MS, AMD and Sony have been talking about it... and either
A) developers were targeting a different date for go time (perhaps the next-gen console launches), or
B) developers don't really want ray tracing. There are still a lot of developers who believe light maps are superior artistically. Reflections are great, sure, but the control of a light map is often preferable over trying to bounce light everywhere to properly light a scene.

As cool as RT can be... IMO I don't think game developers are in love with the feature. Some games will use it, but it is hardly going to be a de facto thing in AAA games any time soon, if ever. In the right type of game it can and will be very cool... in a great number of other games it will always just look marginally better, require a ton of development time to set up properly, and run like shit compared to just using a simple light map.
     
    noko, Gamer X, funkydmunky and 3 others like this.
  16. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,291
    Joined:
    Jun 13, 2003
This is the case for every single large GPU AMD has ever released (meaning, since they bought ATI). Every single one has been compute-heavy and targeted commercial applications first. And every single one was worse for gaming for its die size and power draw than a comparable Nvidia part, and since the release of the GTX 680, Nvidia has built dedicated high-end gaming GPUs that have smoked AMD's compute-heavy large GPUs. They still do today.

Big Navi will literally be AMD's first. Ever.
     
    Factum likes this.
  17. ChadD

    ChadD 2[H]4U

    Messages:
    4,067
    Joined:
    Feb 8, 2016
You're joking, right?

Nvidia literally only released Volta as a compute card.

Turing is clearly not designed as a gaming-first card, unless you are really, really smoking the NV marketing materials. 90% of the Turing white papers read as a love letter to AI developers. Granular tensor cores? Yes, please.

I'm not saying you're wrong that AMD has been building compute-first cards... but man, so has NV. Or are you really going to argue that tensor cores have a serious game use? The first arch we have seen in years that isn't obviously trying to suck up to the AI industry is Navi. (Although AMD may go there with Navi+ / Navi 2.) lol
     
    Gamer X and RamboZombie like this.
  18. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,291
    Joined:
    Jun 13, 2003
    ...and then didn't release it as a consumer card, so your point is?

    -Fastest gaming cards available
    -More efficient than brand-spanking-new AMD parts that are using a smaller node
-While including industry-standard features that AMD is now a year behind on supporting

    So no, I'm not joking. And AMD shouldn't joke around either, as they're about to get pushed out of the market by Intel.
     
    Factum, Armenius and Thunderdolt like this.
  19. ChadD

    ChadD 2[H]4U

    Messages:
    4,067
    Joined:
    Feb 8, 2016
Well, I doubt that highly... if anyone should be worried about Intel, it's NV. Intel's real goal isn't games either. Intel already has major Xe supercomputer wins that could and should have (if you're the NV sales team) gone NV's way. NV has been making the majority of its actual margin from compute clusters for a long time now... and Intel is coming for that cheddar.

Intel is more likely to push NV out of supercomputers first... and AI not long after.

Anyone thinking Intel is getting into GPUs to sell gamers cards... is dreaming or smoking some real great stuff. Hopefully we get some interesting Intel consumer cards as a byproduct. What's funny is that if NV wants to stay in those markets it's probably going to have to find a way to work with AMD. (Or spend a lot more money on ARM development again.)
     
    Gamer X and RamboZombie like this.
  20. Face2Face

    Face2Face Limp Gawd

    Messages:
    372
    Joined:
    Jan 27, 2013
Turing looks a lot like a gaming GPU to me… strong concurrent FP32/INT32 performance, support for VRS and DXR… Also, Turing only has 2 FP64 units per SM, which makes it terrible for anything that loves double-precision floating-point performance.
     
    Last edited: Aug 13, 2019 at 3:19 PM
    IdiotInCharge, Factum and Armenius like this.
  21. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    17,819
    Joined:
    Jan 28, 2014
What do you think Xeon Phi is? It was developed to be a consumer graphics card first, before being repurposed as a many-core processor for HPC and other data-oriented applications. Intel has wanted to enter the discrete consumer graphics market for at least 15 years.
     
    IdiotInCharge likes this.
  22. ChadD

    ChadD 2[H]4U

    Messages:
    4,067
    Joined:
    Feb 8, 2016
I never said it was terrible at gaming... just that it was designed as much (and perhaps more so) to hit these goals
https://www.nvidia.com/en-us/data-center/tesla-t4/
than it was to provide gaming performance.

Volta would have made great consumer cards as well... NV just felt they could keep selling you Pascal for a few more years.

You want to know why games are not the first requirement of an NV GPU design? Google rents T4s for around $22 a day per card, so counting a few discounts they are making $500-600 per month per card. Think about that: T4s are basically 2070 Supers, and Google is renting them out making around $7k per year per card. (And before anyone says it: yes, you can rent them in a VM-type situation for as low as 30c an hour or so, but those are pooled and not dedicated... Google makes more on that.)

    https://cloud.google.com/blog/produ...ads-on-nvidias-t4-gpu-now-generally-available
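A quick sketch of the rental arithmetic at the quoted rate. The ~$22/day figure is taken from the post above, not official pricing; real GCP rates vary by region and commitment:

```python
# Per-card revenue at an assumed ~$22/day dedicated-rental rate.
# The rate is a figure quoted in the post, not official pricing.
RATE_PER_DAY = 22.0

per_month = RATE_PER_DAY * 30    # ~$660/month before discounts
per_year = RATE_PER_DAY * 365    # ~$8,030/year

print(round(per_month))  # 660
print(round(per_year))   # 8030
```

At that rate a dedicated card grosses on the order of $8k a year, which is why the per-year figure in the post is stated as thousands, not tens of thousands.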
     
    Last edited: Aug 13, 2019 at 8:23 PM
    Gamer X likes this.
  23. ChadD

    ChadD 2[H]4U

    Messages:
    4,067
    Joined:
    Feb 8, 2016
The Intel engineers on that project thought that if they could sell it as a consumer part, it would give them the funds to further develop it. There really was no non-x86 compute business in those days. The players know better now.
     
    Gamer X likes this.
  24. Thunderdolt

    Thunderdolt Limp Gawd

    Messages:
    224
    Joined:
    Oct 23, 2018
    The T4 is a dedicated enterprise card. It's only a 70W TDP card whereas the 2070 Super is 215W.

    If we follow your logic, nobody should buy a Ford to drive around because one time Ford made a race car which was "basically" the same as an SUV.

    More directly, nobody should buy AMD CPUs to play games because AMD's server CPUs are made out of the same style of silicon transistors.

    Ultimately, your line of reasoning is just plain silly anyway. The fastest gaming GPU on the market today is made by Nvidia. The second fastest gaming GPU on the market today is made by Nvidia. Third place is currently a tie between Nvidia and AMD's best ever GPU. Are games more fun to play at low frame rates if you tell yourself that, "well, it might run like hot garbage, but at least it isn't a SERVER part!!@!!23?" Would a rose by any other name not run faster on Nvidia?

The rumored "Nvidia Killer" is said to match the performance of an 18-month-old GPU. Not beat its performance, just match it. And not a current model, but one that will be 18 months old and discontinued at the time of the "Killer's" launch.
     
    Armenius, Ready4Dis, Algrim and 2 others like this.
  25. ChadD

    ChadD 2[H]4U

    Messages:
    4,067
    Joined:
    Feb 8, 2016
I never said no one should buy an NV card. Buy whichever you prefer and whichever you feel is best for your pocketbook/use case. I simply said something when someone claimed AMD has zero thought for gaming products; it's an odd line of reasoning when it's true of ALL GPU manufacturers, forever more. Also, yeah, the T4 and 2070 are using the exact same chip; the T4 just doesn't need outputs / analog converters etc. The same is true of AMD's Instinct cards: same chips as the Vegas, at much lower TDPs.

However, if we want to argue cars: when Ford designs a race car, they don't put that racing chassis in a minivan and then try to charge 3/4 of the race car price for the minivan. That's sort of what NV did with Turing. Ford may learn things building race cars that help them build better Mustangs... but a Mustang is NOT the same car with a different set of wheels. People buying $1200 GPUs with 1/4 of the die being tensor cores that are useless to them does drive the cost of gaming cards up. Of course, NV would probably have higher costs if they did design a nothing-but-gaming card anyway... the industry in general is in an odd place where building chips is so expensive that trying to make a one-chip-fits-all design is the cheapest option.

Chiplets are probably the answer to that. It sounds like within a few years all the major players, including NV, will be building chiplet-based parts. That should fix a lot of the issues with the cost of designing and taping out big, massive, monolithic one-size-fits-all chips. It should mean better gaming cards as well as better products for their other markets. The margin increases should mean that some real competition drives pricing down as well. I truly hope in a few years we all shake our heads remembering when top-end video cards were selling for more than top-end prosumer CPUs.
     
    Ready4Dis likes this.
  26. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    11,291
    Joined:
    Jun 13, 2003
    Yes to both. AMD GPUs go first.

    Their goal is profit.

    Gaming is one facet of the desktop market, but so is compute for content creation and inference and so on. Intel has to attack all of it to ship products and get development attention.
     
    Armenius likes this.
  27. Thunderdolt

    Thunderdolt Limp Gawd

    Messages:
    224
    Joined:
    Oct 23, 2018
    The benchmarks don't lie: Turing is, literally, the fastest gaming GPU platform in the history of time. But you're right - Nvidia also selling server parts makes that Best in History performance actually the worst. Fast is slow. Somehow. Maybe with some LSD.
     
  28. funkydmunky

    funkydmunky 2[H]4U

    Messages:
    2,327
    Joined:
    Aug 28, 2008
It is code-named "NV Killer", not "NV Matcher". The 5700 XT is being held back by its memory speed (artificially, it seems) and is knocking on the RTX 2070 Super's/2080's door. So where would that place the 5800 XT? And where would that place the "NV Killer" 5900 XT?
If it comes out 6 months from now, that will be a full two years after the RTX 2080 Ti. I don't understand why some find it so unfathomable that a brand-new gaming-oriented architecture would best the top bar. In fact it should be expected, or we are going backwards.
     
  29. Dayaks

    Dayaks [H]ardness Supreme

    Messages:
    6,928
    Joined:
    Feb 22, 2012
We can basically extrapolate from the 5700 XT.

What is all this "gaming-oriented architecture" nonsense lately? It's like someone put a marketing team on HardForum. So all the cards since the 7970 weren't gaming-oriented?
     
    Armenius and Ready4Dis like this.
  30. Thunderdolt

    Thunderdolt Limp Gawd

    Messages:
    224
    Joined:
    Oct 23, 2018
You're proving my point. That would place the 5900 XT at simply matching the 2080 Ti, but 18 months after it was released. During those same 18 months, you don't think Nvidia was working on anything? Best case, this puts the 5900 XT in a tie with the 2080 Ti for two months before the 3080 Ti comes out, at which point the 5900 XT will get crushed for the top spot and likely only match the performance of the 3080 or even 3070. In order for AMD to take the top spot in that timeframe, they can't simply update their current offerings; they need to launch a new architecture. Given they just launched a new one a few weeks ago, that simply isn't going to happen until Q1 2021 at the earliest.
     
  31. funkydmunky

    funkydmunky 2[H]4U

    Messages:
    2,327
    Joined:
    Aug 28, 2008
    AMD, out of necessity, has had to get the best of both worlds out of their one architecture.
     
  32. cjcox

    cjcox [H]ard|Gawd

    Messages:
    1,180
    Joined:
    Jun 7, 2004
    In a strange turn of events, AMD is going to release a new set of cards that they simply call "meh".

    "They're cards. Just so, so cards. They'll be priced competitively. The MEH 5600xt and MEH 5600. They'll produce heat, come with a blower cooling solution and consume a lot of power."
     
    Dayaks and Thunderdolt like this.
  33. Gamer X

    Gamer X Limp Gawd

    Messages:
    270
    Joined:
    Jul 5, 2019

Yes, it means 100% of the GPU was designed for gamers, not another industry.

RDNA is not a hand-me-down architecture/design from another GPU segment. As such, Turing's architecture, no matter how big, cannot hang with RDNA in games, because RDNA is more powerful and game-focused. This will become quite evident when "big Navi" and "bigger Navi" hit the market. Both the 5800 and 5900 will still be small GPU chips, because they don't suffer from transistor bloat.

Dr. Lisa Su made a point about this, and paused and repeated herself, to drive home the point that AMD is going in two different directions, using two different architectures. Sharing isn't fair to gamers, because games require different transistors than the business world does.



This is also noted by many well-respected people in the industry, and it is why RDNA is killing it (meaning it has turned the whole GPU market on its side), and this is only the first release of RDNA (Navi 10). There is still the 5600, 5800 & 5900 series to be released... all before AMD's competition will be able to respond to AMD's well-kept secret, RDNA...

Jensen didn't know (until it was too late) that AMD had developed a forward-thinking gaming architecture that nearly every developer and designer has gotten on board with. Including Xbox Scarlett and PlayStation 5.

    ;)
     
  34. Thunderdolt

    Thunderdolt Limp Gawd

    Messages:
    224
    Joined:
    Oct 23, 2018
It's funny that you say this. Literally every game runs at least 30% better on the 2080 Ti than it does on anything AMD has released or is scheduled to release in 2019.

    I'm sorry, but there really is nothing to discuss here. The frame rates don't lie. To say anything else is nothing more than fanboi BS.

    "Hand me down" tech? Doesn't matter. AMD is slower.
    "Transistor bloat?" Doesn't matter. AMD is slower.
    "RTX is wasted tech?" Doesn't matter. AMD is slower.
    "7nm arouses me." Doesn't matter. AMD is slower.
    "Nvidia is too expensive!" Doesn't matter. AMD is slower.
    "Consoles actually matter." LOL. AMD is slower. Go find a new forum.
    "Gamers require magical unicorn transistors" Doesn't matter. AMD is slower.

    Do you have anything else you'd like to say? Maybe you can come up with something that is more substantive than agreeing that AMD is slower?
     
  35. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,504
    Joined:
    Mar 22, 2008
    I feel like you are speaking to me personally...
     
    Armenius, noko, Dayaks and 2 others like this.
  36. Factum

    Factum [H]ard|Gawd

    Messages:
    1,863
    Joined:
    Dec 24, 2014

    Add "GrimStar":
     
    Armenius likes this.
  37. cybereality

    cybereality [H]ardness Supreme

    Messages:
    4,504
    Joined:
    Mar 22, 2008
    I honestly could not imagine a more unimpressive trailer for a game.
     
    Stoly, Gamer X, Armenius and 2 others like this.
  38. funkydmunky

    funkydmunky 2[H]4U

    Messages:
    2,327
    Joined:
    Aug 28, 2008
Well, that's because it is a tech demo and not really the game trailer. If it were, they would be excluding 90% of their market, which is not something they would want to do. Look how shitty it looks for you! LOL
If a company can stuff some RT into their game, no matter how bad or half-baked, it is a great marketing play, with people so desperate to play content that supports it. Reminds me of my old Matrox G400 with its spanking-new bump mapping. I literally bought shit games that had BM tacked on to show off my new card. By the time any good games came out, every card supported BM :(
     
  39. MangoSeed

    MangoSeed Limp Gawd

    Messages:
    427
    Joined:
    Oct 15, 2014
Chiplets are an intriguing idea. Theoretically you could move the ROPs, memory controllers and cache off to a separate on-package die, like AMD did with Zen 2. You would also need to move the "front end" that talks to the CPU. It'll be slower and use more power than one big fat chip, though.

Why should small CPU dies with single-digit performance increases cost more than GPUs 3-4x their size that substantially improve performance each generation?
     
  40. RamboZombie

    RamboZombie [H]Lite

    Messages:
    90
    Joined:
    Jul 11, 2018

And also, let's not forget: a trailer for an upcoming game doesn't add much weight to the list. Maybe it will when the game is released at some point; until then, it's a boring trailer on YouTube, and it plays the same on all cards :)
     
    Armenius likes this.