Well, when DDR5 basically didn't have enough bandwidth (and more importantly random access), why should DDR4 be better?

I have no idea what that has to do with the post I made or the one I replied to.
The official Intel roadmap is the source:
Product Introduction Date: 27 May – 7 June 2013 (Day/Time TBD)
Sales and Advertising Dates (to end users):
Advertising: NO forms of advertising or promotions to end users allowed (includes print/web) until 2 June
Sales, Shipments: NO sales/shipments to end users until 2 June
There is no DDR5. There is only GDDR5, which is based on the DDR3 specification with a DDR4-ish feature or two (namely the 8-bit prefetch buffer) and conforms to the GDDR5 specification as outlined by JEDEC.
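As a rough illustration of what that prefetch buffer buys you: the prefetch depth is what ties the external per-pin data rate to the much slower DRAM core clock. A minimal sketch, with purely illustrative speed grades rather than numbers pulled from any datasheet:

[code]
# Rough relation between prefetch depth, core clock and per-pin data rate.
# Illustrative numbers only; see the JEDEC DDR4/GDDR5 specs for real timings.

def data_rate_mtps(core_clock_mhz: float, prefetch: int) -> float:
    """Per-pin data rate in MT/s, assuming the interface moves
    'prefetch' bits per core-clock cycle on each data pin."""
    return core_clock_mhz * prefetch

print(data_rate_mtps(266.67, 8))  # DDR4-2133-style part: ~267 MHz core x 8n prefetch -> ~2133 MT/s
print(data_rate_mtps(750, 8))     # GDDR5-6000-style part: 750 MHz core x 8n prefetch -> 6000 MT/s
[/code]

The headline MT/s figures keep climbing mostly because more bits are fetched per (still slow) core cycle, not because the DRAM arrays themselves got much faster.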
Didn't the same happen when we moved from DDR2 to DDR3? Latencies went up, but speeds and bandwidth jumped a lot, and power consumption went down.

Yes, I meant those Hynix chips from the Kepler cards. If you ignore the somewhat higher latencies for quasi-random access (which wreak havoc on the memory controller), they have a not-too-shabby 70 GB/s of bandwidth.
I wouldn't bet on DDR5 appearing on the market. They might start to use the Linux naming convention: odd DDR numbers for graphics cards, even DDR numbers for desktops.
BTW, from what I've seen, DDR4 would have brutal CL15+ latencies. (Considering my current RAM is CL5, that's a lot.) Thus DDR4 would stay a lemon until they drop latencies a bit.
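One thing worth keeping in mind about those CL numbers: CAS latency is counted in clock cycles, so CL15 DDR4 is not automatically three times slower than CL5 memory once the higher clock is factored in. A quick back-of-the-envelope conversion (the speed grades and the 128-bit GDDR5 example below are illustrative assumptions, not quotes from any spec sheet):

[code]
# CAS latency is specified in clock cycles; compare it in nanoseconds instead.
# latency_ns = CL * 2000 / data_rate_MTps  (one clock = two transfers on a DDR bus)
# Speed grades below are illustrative assumptions, not values from a datasheet.

def cas_latency_ns(cl: int, data_rate_mtps: float) -> float:
    return cl * 2000.0 / data_rate_mtps

def peak_bandwidth_gbs(data_rate_mtps: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s for a bus of the given width."""
    return data_rate_mtps * bus_width_bits / 8 / 1000

print(cas_latency_ns(5, 800))         # DDR2-800 CL5    -> 12.5 ns
print(cas_latency_ns(15, 2133))       # DDR4-2133 CL15  -> ~14.1 ns
print(peak_bandwidth_gbs(2133, 64))   # one 64-bit DDR4-2133 channel   -> ~17 GB/s
print(peak_bandwidth_gbs(4500, 128))  # e.g. 128-bit GDDR5 at 4.5 GT/s -> ~72 GB/s
[/code]

In absolute time a CL15 DDR4 part lands in roughly the same ballpark as older CL5 memory, because each cycle is so much shorter; the raw cycle count alone overstates the regression.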
They already have a naming convention, GDDR and DDR, and each one is tied to a JEDEC specification as well.
Opposite of the GPU world. NV puts out its best at a premium before its next gen. We enthusiasts have to wait a year after Intel's next gen.
The chips you're referring to are consumer versions of Xeon models made for 1-2P systems. It doesn't usually take a year; 6-9 months between the mainstream socket and the high-end enthusiast socket releases isn't unusual. Server chips go through longer validation cycles, and the high-end enthusiast market is a tiny niche of desktop processor sales. IOW, there really aren't very many people waiting.
Apples and non-apples. Well, when DDR5 basically didn't have enough bandwidth (and more importantly random access), why should DDR4 be better?
Yeah, it's the opposite of how it was for Nehalem/Bloomfield. I love my i7 920, but its day has passed. I can't wait for Haswell-E, and IB-E isn't compelling. Thus I will be buying a regular Haswell system and just dealing with it...
If AMD were competitive on the high end, we'd have had Haswell already, possibly Haswell-E, but Intel can take its time while we hunger for more power.