Planning my next build - when will nVidia support PCI-E 4?

Discussion in 'Video Cards' started by x509, Jan 6, 2020.

  1. x509

    x509 [H]ard|Gawd

    Messages:
    1,907
    Joined:
    Sep 20, 2009
    I'm planning my next build for "sometime" next year, when my schedule stops being so much business travel. :(

    I have pretty much decided on a Ryzen and X570 motherboard, as well as an nVidia 2060 or 2060 Super board (or the replacements for these components). But I can't seem to find any nVidia cards today that support PCI-E 4.0. Is it realistic to wait for one? I'm not a gamer, but I do a lot with Photoshop and Lightroom.
     
  2. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    12,891
    Joined:
    Jun 13, 2003
    Next generation, whenever that hits?

    There's no benefit with current GPUs. They're not close to maxing out PCIe 3.0 yet.
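    For a sense of the headroom involved, here's a rough back-of-the-envelope sketch (assuming the spec's 8 GT/s and 16 GT/s per-lane rates with 128b/130b encoding; the function name is just illustrative):

```python
# Approximate one-way PCIe bandwidth per generation and lane count.
# Gen 3 and gen 4 both use 128b/130b line encoding per the PCIe spec.
GT_PER_S = {3: 8.0, 4: 16.0}  # gigatransfers/second per lane
ENCODING = 128 / 130          # 128b/130b line-code efficiency

def bandwidth_gbs(gen: int, lanes: int) -> float:
    """Theoretical one-way bandwidth in GB/s for a PCIe link."""
    return GT_PER_S[gen] * ENCODING * lanes / 8  # 8 bits per byte

for gen in (3, 4):
    for lanes in (8, 16):
        print(f"PCIe {gen}.0 x{lanes}: ~{bandwidth_gbs(gen, lanes):.1f} GB/s")
```

    A 3.0 x16 slot already offers roughly 15.8 GB/s each way, which current cards rarely come close to saturating.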
     
  3. Hagrid

    Hagrid [H]ardForum Junkie

    Messages:
    8,617
    Joined:
    Nov 23, 2006
  4. HockeyJon

    HockeyJon [H]ard|Gawd

    Messages:
    1,129
    Joined:
    Dec 14, 2014
    It’s not going to make a difference for most people at this point. I wouldn’t even factor PCIe 4.0 into my decision making. You’re better off getting a higher-end B450 or X470 board than ponying up the extra cash for an X570 board and dealing with the whine of the little fan they put on the chipset. Put the money you saved towards a better video card, faster RAM, a nicer case, or whatever else you might want in the build that will actually make a difference in your user experience.
     
  5. doubletake

    doubletake Gawd

    Messages:
    592
    Joined:
    Apr 27, 2013
    If you want PCI-E 4.0 to give you tangible performance increases in PS or LR, it's not going to be from a video card, but from an SSD as scratch storage. Even then, that's only if you are actually working with project files large enough to somehow choke current 3.0 solutions.
     
    Armenius likes this.
  6. owcraftsman

    owcraftsman Gawd

    Messages:
    897
    Joined:
    Feb 3, 2007
    I think the better question is when anyone will produce a GPU that fully utilizes the bandwidth available to PCIe 3.0 x16, and that includes a 2080 Ti.
    Or when will Intel step up the number of available PCIe lanes from 16?
    Only now has the 2080 Ti been able to saturate x8 bandwidth, making x16 essential, so maybe the future is bright.

    For the record, HockeyJon, my X570 chipset fan at 5,000 RPM is totally inaudible. Even with stress testing, benchmarking, and gaming I've yet to hear it over and above everything else, and the board is still on the benchtop and hasn't found its way into a case yet, so it's not your daddy's chipset fan anymore. Can't vouch for other mobos, though (see sig).
    I upgraded from Z97 to X570 and couldn't be happier. If you are buying a new board now, X570 makes a lot of sense, but would I upgrade from a B450 or X470? Probably not.
     
    jmilcher, Armenius and HockeyJon like this.
  7. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    12,891
    Joined:
    Jun 13, 2003
    ...max per PCIe slot?
     
  8. owcraftsman

    owcraftsman Gawd

    Messages:
    897
    Joined:
    Feb 3, 2007
    max lanes available
     
  9. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    12,891
    Joined:
    Jun 13, 2003
    They're including quite a bit more than 16 ;)
     
    Dayaks and Armenius like this.
  10. HockeyJon

    HockeyJon [H]ard|Gawd

    Messages:
    1,129
    Joined:
    Dec 14, 2014
    Good to hear that you have a quiet fan, because that was a big reservation for me when I upgraded from my Z87 platform to Ryzen. The other was concern about fan failure, which I’m sure would be years in the future, but those fans can be difficult to replace and I tend to keep my systems for several years. I’m still a fan (no pun intended) of less active cooling where possible.
     
  11. Jamie Marsala

    Jamie Marsala Limp Gawd

    Messages:
    128
    Joined:
    Mar 9, 2016
    I have never even seen my fan spin once; apparently I don't heat the chip up enough to turn it on. I have never heard my PSU fan spin either, which according to reviews doesn't happen until more than 340 watts are in play, so I must not be using enough to need it. During benching the X570 fan never turns on. I can't say it never spins during gaming, since I can't see it while trying to play, but I certainly have never heard it.
     
    HockeyJon likes this.
  12. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    20,005
    Joined:
    Jan 28, 2014
    If you're planning on "sometime next year," then why even think about it at this point? We have a long way to go for 2021, and a lot will change by then.
     
  13. owcraftsman

    owcraftsman Gawd

    Messages:
    897
    Joined:
    Feb 3, 2007
    Maybe I should have said the max available to GPUs from the CPU, unless you have a PLX chip, but I thought the context was clear; it's common knowledge that only 16 lanes are dedicated to the GPU slots.
     
  14. Jandor

    Jandor Limp Gawd

    Messages:
    413
    Joined:
    Dec 30, 2018
    You'll need to wait about half a year for Nvidia's next generation or AMD's RDNA 2 to get PCIe 4.0. It might already be useful on a 2080 Ti if that card had PCIe 4.0 support, and it has proved useful when you're low on VRAM on an AMD graphics card, especially an RX 5500 with only 4 GB, which needs no more when used on an X570 with PCIe 4.0.
     
  15. Jandor

    Jandor Limp Gawd

    Messages:
    413
    Joined:
    Dec 30, 2018
    Also, mind that if next year is 2021, there will probably be an X670 chipset coming at the end of the year along with the new Zen 3 CPU line, which will be the last supported CPU generation on AM4 with DDR4.
     
  16. Algrim

    Algrim [H]ard|Gawd

    Messages:
    1,611
    Joined:
    Jun 1, 2016
    If you're going to wait until next year keep in mind that AMD may be releasing a new socket (which is what I'm waiting for before jumping on the Ryzen bandwagon).
     
    Armenius likes this.
  17. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    12,891
    Joined:
    Jun 13, 2003
    Well, it honestly isn't clear, as both Intel and AMD have been adding PCIe lanes to their CPUs and to their chipsets -- X570 is essentially AMD's version of a PLX PCIe switch, but total bandwidth is of course limited by the link to the CPU.

    As for the total number of GPUs... this isn't really that important for most, and to get a real PCIe x16 link to each GPU, you're still going to have to move up to an HEDT platform.
     
  18. Mode13

    Mode13 Gawd

    Messages:
    728
    Joined:
    Jun 11, 2018
    Also would like to throw in that I set my chipset fan profile to quiet and the thing is virtually never on. Even if I turn it on I can't hear it at all. (X570 Aorus Pro Wifi)

    I would only consider PCIE gen 4 in the short term if you want to go with balls to the wall NVME speeds. A 2080 Ti barely saturates PCIE 3.0 x8, so unless you have plans on multiple GPUs or running a top end single card 3 years from now, it's really pointless IMO.

    I bought X570 on launch day for the VRM, like almost everyone else, expecting to be able to overclock my shiny new Ryzen 3700X. It's senseless to even attempt to drive more than 120 watts into this CPU, so the VRM is virtually pointless for anything less than a 3900X or better (when compared to a solid X470 board). I'd like to say 3950X or better, but I don't have the personal experience to say for sure, and even that would run perfectly well on a top-end X470 board like the MSI X470 Gaming M7 AC.
     
  19. x509

    x509 [H]ard|Gawd

    Messages:
    1,907
    Joined:
    Sep 20, 2009
    Ah. Good point. My "real life" schedule and commitments are such that I might not be able to start my new build until sometime around May, June, or July. And I am not going to repeat the mistake I've made in the past, which is to buy parts and then not use them until months later. By that time, there were newer designs on the market.

    It's always true that with PCs, whatever you buy now will be superseded in a year, sometimes less. But there are times when the industry is on the cusp of a major change. The new AMD socket to me at least is such a time.
     
  20. Nenu

    Nenu [H]ardened

    Messages:
    19,091
    Joined:
    Apr 28, 2007
    I wish they had recorded game load time as well; this is where I would expect to see a difference going to slower speeds.
    I'm not so sure at higher speeds, because there is already preloading to avoid in-game latency.

    With a PCIe 4.0 SSD and graphics card, textures should load into graphics memory faster, making the game start a little sooner.
    Perhaps :)
     
  21. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    12,891
    Joined:
    Jun 13, 2003
    SATA SSD vs. SATA spinner: yup
    NVMe SSD vs. SATA SSD: a little
    PCIe 4.0 vs. PCIe 3.0: yeah, that's not even measurable

    The bigger issue isn't that having faster bus speeds available isn't useful or desirable; it's that almost nothing comes close to making full use of the last standard.

    Which is why planning around PCIe releases is a bit of a moot point. We're still waiting for software to catch up, really.
     
    PhaseNoise and Armenius like this.
  22. Lepardi

    Lepardi Limp Gawd

    Messages:
    228
    Joined:
    Nov 8, 2017
  23. Hagrid

    Hagrid [H]ardForum Junkie

    Messages:
    8,617
    Joined:
    Nov 23, 2006
    Are you sure? I see no link!
     
  24. noko

    noko [H]ardness Supreme

    Messages:
    4,553
    Joined:
    Apr 14, 2010
    On X570, if one is talking about a next-generation high-end or top-end GPU and using both x16 PCIe slots, then I would say PCIe 4.0 would be important, since the GPU will be running at x8. At PCIe 3.0 x16, as in leaving the second slot empty, it should not be a problem.
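    The arithmetic behind that: the per-lane rate doubles each PCIe generation, so dropping from x16 to x8 at the next generation is a wash. A quick sketch (illustrative names, spec per-lane rates):

```python
# Why PCIe 4.0 x8 is no slower than PCIe 3.0 x16: the per-lane
# transfer rate doubles each generation, offsetting the halved lanes.
def link_gbs(gt_per_s: float, lanes: int) -> float:
    return gt_per_s * (128 / 130) * lanes / 8  # 128b/130b encoding

gen3_x16 = link_gbs(8.0, 16)   # PCIe 3.0: 8 GT/s per lane
gen4_x8 = link_gbs(16.0, 8)    # PCIe 4.0: 16 GT/s per lane
print(round(gen3_x16, 2), round(gen4_x8, 2))  # both ~15.75 GB/s
```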
     
  25. x509

    x509 [H]ard|Gawd

    Messages:
    1,907
    Joined:
    Sep 20, 2009
    This thread has been very educational, which is why I like [H] so much, but as the OP I still don't know when/if in 2020 nVidia boards will support PCI-E 4.0.
     
  26. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    12,891
    Joined:
    Jun 13, 2003
    Essentially, whenever the next generation of Nvidia GPUs releases, they will have PCIe 4.0.

    So it's not if as much as when they'll release with PCIe 4.0, and with respect to GPU releases, it's all guesswork based on past history and so on.

    What we can say in that context is that we're due for an Nvidia release.
     
  27. PhaseNoise

    PhaseNoise 2[H]4U

    Messages:
    2,280
    Joined:
    May 11, 2005
    Yep. It's "easy" to add to a new design, and there's really no reason to push it to current-gen designs since it isn't a realistic bottleneck.
     
    Armenius and IdiotInCharge like this.
  28. x509

    x509 [H]ard|Gawd

    Messages:
    1,907
    Joined:
    Sep 20, 2009
    I agree that we won't see an RTX 2060 Super Rev 2 with PCI-E 4 support. But as sure as night follows day, there will be a replacement product that will have better specs, irregardless of PCI-E 3 or 4. My question is WHEN?
     
  29. PhaseNoise

    PhaseNoise 2[H]4U

    Messages:
    2,280
    Joined:
    May 11, 2005
    Sorry, didn't mean to sound dismissive.

    Fundamentally, we don't know. And I suspect NV doesn't have a solid date yet either. The fabs have to be in order and getting good yields. The simulations have to be running flawlessly. The driver team has to have support truly working (this is actually a huge part of a modern GPU delivery). Etc. Lots of different parts, and all have to come together. All of that information is closely guarded.

    All I would personally speculate is "sometime this year", given current information.

    I would just reiterate caution against worrying too much about the bus protocol directly. That in itself is not a good reason to buy or delay. A new chipset IS a good reason, and will drag behind it new signaling support. The bus protocol is going to be a tiny fraction of any performance concern. You can verify this today by running your card as x8 vs x16, and notice it's not much different.
     
    Armenius and IdiotInCharge like this.
  30. Algrim

    Algrim [H]ard|Gawd

    Messages:
    1,611
    Joined:
    Jun 1, 2016
    My next graphics card purchase will be when one of the GPU companies manages to release an HDMI 2.1 version. RTX? Don't care (but wouldn't turn down if the price is right); PCI-E 4? Again, don't care (but if it's offered, why not?).
     
    Armenius likes this.
  31. x509

    x509 [H]ard|Gawd

    Messages:
    1,907
    Joined:
    Sep 20, 2009
    Agreed, but successful companies like nVidia have experienced people running these projects. Of course, everyone steps on a banana peel once in a while.
    If nVidia doesn't release new product sometime this year, then there is something seriously wrong. Of course, "this year" could be "December 86th." LOL.
    Are you saying that AMD will replace the X570 chipset this year with something newer/faster? Any good intel yet?
     
  32. PhaseNoise

    PhaseNoise 2[H]4U

    Messages:
    2,280
    Joined:
    May 11, 2005
    Sorry, misspoke. By chipset I was really meaning “major gpu revision”.
    The 570 is fine. All you’d really add are pcie lanes, and the TR platform allows that should you need them. If you go 570, you’ll be set for a long time.

    I wouldn’t expect any banana peels, just juggling all the internal and external (marketing windows etc) issues. Makes it hard for them to predict, let alone us.
     
  33. Keljian

    Keljian Gawd

    Messages:
    784
    Joined:
    Nov 7, 2006
    There is absolutely no need for it at the moment.
     
    Armenius likes this.
  34. Nenu

    Nenu [H]ardened

    Messages:
    19,091
    Joined:
    Apr 28, 2007
    What is this word "irregardless", is it the opposite of regardless?
     
  35. Ready4Dis

    Ready4Dis Gawd

    Messages:
    629
    Joined:
    Nov 4, 2015
    It's synonymous, the words mean the same thing.
     
  36. Nenu

    Nenu [H]ardened

    Messages:
    19,091
    Joined:
    Apr 28, 2007
    Thanks, it doesn't read well because the "ir" part at the start is a negative, yet it isn't treated as a negative.
    According to the web it appears to have started in the 18th century. I guess someone wanted to use big words and cocked it up :)
     
    Armenius likes this.
  37. Armenius

    Armenius I Drive Myself to the [H]ospital

    Messages:
    20,005
    Joined:
    Jan 28, 2014
    I think it entered common parlance as a one-word reply. So instead of constructing a sentence fragment like "regardless of the outcome" you just say "irregardless," though I believe "regardless" by itself works equally well.
     
    Nenu and Algrim like this.
  38. IdiotInCharge

    IdiotInCharge [H]ardForum Junkie

    Messages:
    12,891
    Joined:
    Jun 13, 2003
    It's a double negative in a single word; while many languages tolerate or outright use double negatives, they're poor grammar in English.

    Granted, poor grammar (as well as profanity) can be used to make a point; however, the use of 'irregardless' tends to come across as uneducated.
     
  39. cjcox

    cjcox [H]ard|Gawd

    Messages:
    1,386
    Joined:
    Jun 7, 2004
    I'm wondering if I'm the only one who sees PCIe 4.0 as the "one to skip" version and thinks PCIe 5 might be the next big target. Does anyone else feel this way?
     
  40. MangoSeed

    MangoSeed Gawd

    Messages:
    617
    Joined:
    Oct 15, 2014
    Holy shit please don’t ever repeat that. You’re contributing to the downfall of human intelligence.

    Irregardless has never been and should never be a real word in the English language.
     
    x509 likes this.