PCI-E 4.0 is coming!!!!!

I sometimes wonder if there are people who never buy anything because they're always "holding off" for the next best thing. Just buy whatever and enjoy it.

Exactly. I bought a 2500k and P67 motherboard because I knew that PCIe 2.0 would last for a very long time. We've barely begun to saturate 8x PCIe 2.0 with a GTX 1080, and even then it's just a few percent.

My 16x slot should have several more years of life.
 
The article I read yesterday said that they were arguing over PCI-E 5.0 already. :)
 
The article I read yesterday said that they were arguing over PCI-E 5.0 already. :)

I didn't think we'd be able to push that much bandwidth over a single lane of copper. Not for cheap anyway.

We'll have to wait and see. The 3.0 transition took no time at all because roughly 20% of the bandwidth gain came from changing the line encoding (8b/10b to 128b/130b) rather than the clock. From here on they'll just have to keep raising the data rates.
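If anyone wants to sanity-check that, here's a quick back-of-the-envelope sketch (Python, using the published per-lane transfer rates and encodings; real-world throughput is lower after protocol overhead):

```python
# Rough per-lane PCI-E payload bandwidth from transfer rate and line encoding.
gens = {
    "2.0": (5.0, 8 / 10),      # 5 GT/s, 8b/10b encoding
    "3.0": (8.0, 128 / 130),   # 8 GT/s, 128b/130b encoding
    "4.0": (16.0, 128 / 130),  # 16 GT/s (proposed), 128b/130b encoding
}

for gen, (gt_s, eff) in gens.items():
    gb_s = gt_s * eff / 8  # GT/s * encoding efficiency = Gb/s of payload; /8 -> GB/s
    print(f"PCI-E {gen}: ~{gb_s:.2f} GB/s per lane, ~{gb_s * 16:.1f} GB/s at x16")
```

Going from 2.0 to 3.0, the encoding change covers roughly a fifth of the doubling and the 5 to 8 GT/s clock bump covers the rest, which is why 4.0 has to come entirely from the data rate.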
 
Vega did show a difference between PCIe 3.0 8x and 16x for multiGPU. But since mGPU is dying off... Yeah 3.0 should last forever.
 
AFAIK we barely utilize 2.0 to its full potential; 3.0 is overkill and mostly used for PCI-E SSDs and the like. WTF will 4.0 do?

Newer Thunderbolt adaptors, Optane based storage, etc.

Might also improve power consumption/delivery and better support for external devices.
 
Vega did show a difference between PCIe 3.0 8x and 16x for multiGPU. But since mGPU is dying off... Yeah 3.0 should last forever.
I was under the impression that there are more mGPU setups than ever before, and may see a further increase if VR actually takes off. If you have a 4K LCD per eye for example, no single GPU can handle it, and it would make sense to have one GPU per LCD.

NVIDIA VRWorks™

LiquidVR™ | Immersive Virtual Reality Technology | AMD

Both AMD and NVIDIA are pimping mGPU dedicated card per eye for VR.
 
I was under the impression that there are more mGPU setups than ever before, and may see a further increase if VR actually takes off. If you have a 4K LCD per eye for example, no single GPU can handle it, and it would make sense to have one GPU per LCD.

NVIDIA VRWorks™

LiquidVR™ | Immersive Virtual Reality Technology | AMD

Both AMD and NVIDIA are pimping mGPU dedicated card per eye for VR.

That's great that they are.... AMD can't even get single cards right in VR half the time. I'm not going to hold my breath.

Given mGPU is reliant on fantastic frame times (or you literally vomit), I have some skepticism. I'm more interested in nVidia's Simultaneous Multi-Projection. Doesn't SMP make one GPU per eye antiquated?
 
That's great that they are.... AMD can't even get single cards right in VR half the time. I'm not going to hold my breath.

Given mGPU is reliant on fantastic frame times (or you literally vomit), I have some skepticism. I'm more interested in nVidia's Simultaneous Multi-Projection. Doesn't SMP make one GPU per eye antiquated?
I don't know, but if both major players are saying dedicated GPU per eye is the way to go, there must be something behind it.
VR SLI provides increased performance for virtual reality apps where multiple GPUs can be assigned a specific eye to dramatically accelerate stereo rendering.
What we know for sure is that VR relies on very low latency, very high framerates, and high resolution, and having a dedicated powerful card per eye makes sense. Personally, I'm hoping it's not a fad like 3D, because I'm really digging VR on my Samsung Gear VR and just wish it had the horsepower for "real" games.
 
I don't know, but if both major players are saying dedicated GPU per eye is the way to go, there must be something behind it.

What we know for sure is that VR relies on very low latency, very high framerates, and high resolution, and having a dedicated powerful card per eye makes sense. Personally, I'm hoping it's not a fad like 3D, because I'm really digging VR on my Samsung Gear VR and just wish it had the horsepower for "real" games.

From the videos I've watched of people playing VR games, I don't think it's a fad. I think at the least, it'll be adopted into amusement parks / arcades (GameWorks, etc.) / training. I can already see many applications in the sports training field. It's also being helped along by the boom in 360 video equipment.

There's a new FPS/MilSim game that is VR based. Not too bad graphics wise and interesting to watch how well the Vive mirrors the player stance. I think it might take a few years to really grab hold, but it's here to stay IMHO.
 
FWIW - I was running two GTX 970s, and later GTX 980 Tis, in SLI on an X79 board a while back at PCI-E 2.0, pushing 4K. Then I used the forcegen3.exe hack to force PCI-E 3.0 and it added a pretty significant boost. Granted, I was running these in a 16x/8x config due to slot spacing (Intel DX79SI), but the ability to flip to 3.0 with that "hack" was more than just noticeable while gaming at 4K with SLI.
 
As even this very minimal article mentions, 4.0 and 5.0 are not about video cards, which may never exceed 3.0 bandwidth before the video card industry dies, becomes irrelevant, or metamorphoses into something else. It's about M.2 and U.2, and storage solutions that will exceed PCI-E 3.0 speeds.

You could view it as an interesting indirect confirmation that Intel and Micron really do have something going on with 3D XPoint and Optane, because what else would need gobs and gobs of bandwidth like this?
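Rough numbers on why storage is the pressure point, assuming drives keep getting the usual x4 link (these are spec ceilings, not real-world throughput):

```python
# Payload ceiling for an x4 NVMe link at PCI-E 3.0 vs 4.0 (128b/130b encoding).
lane_3_0 = 8.0 * (128 / 130) / 8    # ~0.98 GB/s per lane
lane_4_0 = 16.0 * (128 / 130) / 8   # ~1.97 GB/s per lane

print(f"x4 PCI-E 3.0: ~{4 * lane_3_0:.1f} GB/s")  # ~3.9 GB/s, already within reach of fast NAND drives
print(f"x4 PCI-E 4.0: ~{4 * lane_4_0:.1f} GB/s")  # ~7.9 GB/s, headroom for XPoint-class devices
```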
 
It'll be interesting to re-examine the importance of PCIe speeds in the next few years with the impact of low-level APIs and the shifting of more compute workload over to GPUs, especially for non-rendering work (e.g. AI, physics).
 
He loves Nvidia. Don't throw your countrymen under the bus. ;)
What happens when you use AMD

Lol, great thread, some hilarious posts - still, OP seems too excited for PCI-E; graphics cards are happy as it is, at least as far as gaming is concerned. In the spirit of the thread you linked to, I have to say maybe PCI-E 4.0 will raise the current limit on the slot so at least RX 480 users will have something to look forward to ;)
 
And hyper PCI-E card clips!
Each motherboard comes with a card extractor.
 
That power has to come from somewhere. It means more power pins sticking out of the motherboard.
 
The proposal of a dedicated PCI-E external plug/cable for attaching portable/external peripherals directly to 4x lanes of PCI-E 4.0 makes me moist.
 
OP seems a little too excited about PCI-E 4.0

It's fightingfi, what do you expect? He's been here posting useless articles in the wrong sections of the forum for 8 years. I'll give him credit though, at least it wasn't a 5-year-old article this time.


PCI-E 4 would allow GTX 10-series cards to run without an auxiliary power connector, so the wattage increase is probably more important than the speed increase.

The only thing I worry about is that high-end boards get even more expensive due to the overhead of supporting the higher power limits. But I don't see the power connectors on GPUs going anywhere any time soon, since GPU manufacturers are still going to have to make the cards backwards compatible.
 
I sometimes wonder if there are people who never buy anything because they're always "holding off" for the next best thing. Just buy whatever and enjoy it.
I am still waiting on HBM3 and 5nm processors. My Athlon64 3500 and Radeon 9700 Pro are still good enough. I only use that as my excuse for not having upgrade funds.
 
It's fightingfi, what do you expect? He's been here posting useless articles in the wrong sections of the forum for 8 years. I'll give him credit though, at least it wasn't a 5-year-old article this time.




The only thing I worry about is that high-end boards get even more expensive due to the overhead of supporting the higher power limits. But I don't see the power connectors on GPUs going anywhere any time soon, since GPU manufacturers are still going to have to make the cards backwards compatible.

Plus it's not like the extra power is coming through the 24-pin; you're still going to need at least two power connectors per PCI-E slot if you're going to allow 300W on all of them.
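To put rough numbers on that (using today's usual power budgets; the 300W-per-slot figure is just the rumored proposal):

```python
# Where 300W per slot would have to come from, given today's connector budgets.
SLOT_W = 75        # current PCI-E slot limit
SIX_PIN_W = 75     # 6-pin PCI-E connector
EIGHT_PIN_W = 150  # 8-pin PCI-E connector

target = 300
shortfall = target - SLOT_W
print(f"{shortfall}W has to come from somewhere beyond today's slot budget")  # 225W
print(f"e.g. 8-pin + 6-pin = {EIGHT_PIN_W + SIX_PIN_W}W of extra feed per slot")  # 225W
```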
 
I am still waiting on HBM3 and 5nm processors

nobody.jpg


Tick, tock.
 