DDR3 vs. GDDR5

shantd
Gawd
Joined: Aug 2, 2008
Messages: 665
Forgive me if this thread has been done to death, I've been away for some time. Now that the next gen of consoles is coming out, I've become curious about the differences in their respective architecture. The XB1 will have DDR3 RAM, while the PS4 will have GDDR5 RAM.

According to this short video comparing the two on Gamespot ( http://www.gamespot.com/call-of-duty...so-far-6408699 ):

"GDDR5 RAM is better optimized for handling large chunks of data. Microsoft will instead incorporate DDR3 RAM, which is relatively better at handling smaller bursts of information."

Assuming that's accurate, it still doesn't reveal an advantage one way or the other. For console gaming, do you want RAM that's optimized for large chunks of data, or for smaller bursts? Bottom line, which RAM is better? Lastly, what's with the "G" in GDDR5? Is it also implied in DDR3, i.e. (G)DDR3?

Thanks.
 
There's so much more to the memory system than the kind of RAM used.

Most pressingly:

1. The frequency
2. The latency
3. The bus width - number of channels

...

As you know, DDR is used as general system RAM and GDDR is used as video memory. Neither one is perfectly in its element when asked to do both.

It all comes down to the implementation used by the manufacturers.

If one is better than the other, it's not by much; the game designers will push both consoles to their limits and the games will be no more or less fun because of which RAM the console used.
 
"GDDR5 RAM is better optimized for handling large chunks of data. Microsoft will instead incorporate DDR3 RAM, which is relatively better at handling smaller bursts of information."
That may be because the One, which uses DDR3, also has a small amount of eSRAM (which is very fast) to buffer the significantly slower DDR3 RAM. GDDR5 is superior to DDR3 for performance in every way: it's based on DDR3 but adds features that make it much faster, at a higher cost.
Assuming that's accurate, it still doesn't reveal an advantage one way or the other. For console gaming, do you want RAM that's optimized for large chunks of data, or for smaller bursts? Bottom line, which RAM is better? Lastly, what's with the "G" in GDDR5? Is it also implied in DDR3, i.e. (G)DDR3?
In GDDR5, the G stands for Graphics: GDDR5 is typically used on graphics cards because of its significantly faster performance. Memory bandwidth correlates directly with GPU performance. Notice how only the low-end/cheap GPUs use DDR3?
 
GDDR5 has high latency and high bandwidth, while DDR3 has low latency and low bandwidth.

PCs use DDR3 for system RAM because low latency is much more important than high bandwidth. PC video cards use GDDR5 because high bandwidth is much more important than low latency.

For more technical detail, I'll quote this thread:

"System memory (DDR3) benefits from low latency (tight timings) at the expense of bandwidth; GDDR5's case is the opposite. Timings for GDDR5 would seem unbelievably slow in relation to DDR3, but the speed of VRAM is blazing fast in comparison with desktop RAM. This results from the relative workloads that a CPU and a GPU undertake. Latency isn't much of an issue for GPUs, since their parallel nature allows them to move on to other calculations when latency cycles cause a stall in the current workload/thread. The performance of a graphics card, for instance, is greatly affected (as a percentage) by altering the internal bandwidth, yet altering the external bandwidth (the PCI-Express bus, say lowering from x16 to x8 or x4 lanes) has a minimal effect. This is because there is a great deal of I/O (textures, for example) that gets swapped in and out of VRAM continuously. The nature of a GPU is many parallel computations, whereas a CPU computes in a basically linear way."
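To put a rough number on why GPUs shrug off latency, here's a toy Python model of latency hiding (my own sketch, not from the quoted thread; the cycle and thread counts are made-up illustrative values):

```python
# Toy model of GPU latency hiding (illustrative only, not real hardware math).
# While one thread waits on memory, others compute; throughput only drops
# once there aren't enough threads in flight to cover the latency.

def effective_throughput(latency_cycles, threads_in_flight, issue_rate=1.0):
    # Threads needed to fully hide one memory access of `latency_cycles`.
    needed = latency_cycles
    coverage = min(threads_in_flight / needed, 1.0)
    return issue_rate * coverage

# CPU-like: a handful of threads can't cover a long stall.
print(effective_throughput(400, 50))    # 0.125 -> mostly stalled
# GPU-like: thousands of threads in flight hide the same latency entirely.
print(effective_throughput(400, 2048))  # 1.0 -> full throughput
```

Crude as it is, it captures the quoted point: the GPU's parallelism, not low-latency memory, is what keeps it fed.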
 
It's a huge advantage for graphics. Probably no benefit for the CPU, maybe even a small disadvantage due to the higher latency.

You don't see anything but cheap, low-end graphics cards using DDR3. You won't see them on HardOCP, or probably most other sites, since those cards are so weak they don't get sent out for reviews. The only new one I've seen reviewed with DDR3 is an HD 7750 with 2GB of memory. It took a 35% performance penalty by using DDR3, even though it had twice as much memory as the GDDR5 card. http://ht4u.net/reviews/2012/msi_radeon_hd_7750_2gb_ddr3_test/index36.php

The PS4 already has 50% more GPU cores, and if the Xbone takes a 35% performance hit by using the slower DDR3, then the PS4 is going to be about twice as powerful when it comes to graphics.
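For what it's worth, the raw bandwidth gap is easy to compute; peak bandwidth is just bus width times effective transfer rate. The console figures below are the widely reported specs, not official datasheet numbers, so treat them as assumptions:

```python
# Peak memory bandwidth = bus width (bytes) x effective transfer rate (GT/s).
# Console figures are the widely reported ones: 256-bit GDDR5 at 5.5 GT/s
# for the PS4, 256-bit DDR3-2133 for the Xbox One (assumed, not official).

def peak_bandwidth_gb_s(bus_width_bits, transfer_rate_gt_s):
    return bus_width_bits / 8 * transfer_rate_gt_s

ps4 = peak_bandwidth_gb_s(256, 5.5)    # -> 176 GB/s
xb1 = peak_bandwidth_gb_s(256, 2.133)  # -> ~68.3 GB/s
print(f"PS4 GDDR5: {ps4:.0f} GB/s, Xbox One DDR3: {xb1:.1f} GB/s")
```

That's roughly a 2.6x gap in raw main-memory bandwidth, before the Xbox One's eSRAM is taken into account.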
 
Lets say this: Who will win in a gunfight? The AK-47, or the bazooka?
 
It's a huge advantage for graphics. Probably no benefit for the CPU, maybe even a small disadvantage due to the higher latency.

You don't see anything but cheap, low-end graphics cards using DDR3. You won't see them on HardOCP, or probably most other sites, since those cards are so weak they don't get sent out for reviews. The only new one I've seen reviewed with DDR3 is an HD 7750 with 2GB of memory. It took a 35% performance penalty by using DDR3, even though it had twice as much memory as the GDDR5 card. http://ht4u.net/reviews/2012/msi_radeon_hd_7750_2gb_ddr3_test/index36.php

The PS4 already has 50% more GPU cores, and if the Xbone takes a 35% performance hit by using the slower DDR3, then the PS4 is going to be about twice as powerful when it comes to graphics.

Seems valid.

Xbox One = 7770 with DDR3

PS4 = 7850 with GDDR5

In the PC world one of these is obviously the clear winner in every single aspect.
 
The higher latency is essentially meaningless on the console for GDDR5. The PS4 wins in this department hugely.
 
That's what I'm thinking. Considering the PS4 won't have essentially three OSes running in the background like the Xbox, I really don't see the latency becoming an issue. Plus, since it isn't a full-on desktop system, even if DDR3 has some advantages over GDDR5, they'll be far outweighed by GDDR5's advantages... especially in a GAMING machine, where RAM bandwidth is far more crucial for graphics.

To me, taking everything as a whole, the PS4 is objectively the far better of the two in nearly every way (the only things I see MS having are the TV stuff and MAYBE Xbox Live).
 
The Xbone does have the eSRAM, though, so the GPU won't be reliant on the DDR3 memory alone. I'm not sure how much of a bottleneck the tiny 32MB will be; to be honest, I don't know how such a setup works technically, like the buffering process and the sizes needed for certain things. The PS4, meanwhile, can give the GPU many GBs of very fast GDDR5 RAM.

The Xbone's weaker specs, plus what seems to be a harder platform to develop for, will definitely show a couple of years into the next-gen consoles' lifespan.
 
The Xbone does have the eSRAM, though, so the GPU won't be reliant on the DDR3 memory alone. I'm not sure how much of a bottleneck the tiny 32MB will be; to be honest, I don't know how such a setup works technically, like the buffering process and the sizes needed for certain things. The PS4, meanwhile, can give the GPU many GBs of very fast GDDR5 RAM.

The Xbone's weaker specs, plus what seems to be a harder platform to develop for, will definitely show a couple of years into the next-gen consoles' lifespan.

On the 360, the eDRAM quickly became a bottleneck for developers because it wasn't big enough. They are using the same amount this time, which means it will be the same case again.
 
On the 360, the eDRAM quickly became a bottleneck for developers because it wasn't big enough. They are using the same amount this time, which means it will be the same case again.

Is that true? If so...sucks for Xbox.
 
Sony also eliminated the major wait states and cache flushes normally needed for the CPU and GPU to communicate and swap data. It also allows both the GPU and CPU to access all of the memory at the same time, which is not traditionally the case. This gets rid of a ton of latency issues and will maximize bandwidth.
 
So...basically what all this boils down to is that the PS4 is the far better system over all.

I'm just glad Sony delivered on its promise that the PS4 was going to be a gaming machine first and foremost. Something Microsoft seems to have put as second.

My only real complaint with the PS4 is that it uses the same CPU as the Xbox. Faster GPU? Check. Faster RAM? Check. Faster CPU? Empty check box. I know the PS4's CPU was originally said to be clocked at 2GHz a core... maybe that's something that will change with firmware updates? Who knows. I do wonder, though, whether the 1.6GHz speed will bottleneck the GPU... would 2GHz a core allow that much better performance and graphics?
 
As far as I know the CPUs were custom-made for each console, so they aren't exactly the "same."
 
So...basically what all this boils down to is that the PS4 is the far better system over all.

I'm just glad Sony delivered on its promise that the PS4 was going to be a gaming machine first and foremost. Something Microsoft seems to have put as second.

My only real complaint with the PS4 is that it uses the same CPU as the Xbox. Faster GPU? Check. Faster RAM? Check. Faster CPU? Empty check box. I know the PS4's CPU was originally said to be clocked at 2GHz a core... maybe that's something that will change with firmware updates? Who knows. I do wonder, though, whether the 1.6GHz speed will bottleneck the GPU... would 2GHz a core allow that much better performance and graphics?

The GPU is the biggest factor when it comes to graphics fidelity and performance. The CPU will matter mainly for the OS and system tasks. What's important about the CPU speed is that it's enough to let the GPU reach its full performance, and in both consoles' cases it should be.
 
As far as I know the CPUs were custom-made for each console, so they aren't exactly the "same."

Their APUs are custom and different, but the CPU portion in each is apparently identical from what we've learned so far. The GPUs are very different, though, and the Xbone has the added eSRAM.

I'm not sure if both CPUs are confirmed to run at 1.6GHz (I guess Sony confirmed it for the PS4? Can't remember, tbh). I was hoping they'd be clocked faster, but at least there are 8 cores to utilize, so it's not that bad. Also, we have no idea how much GPGPU will be used this gen; a GPU that much more powerful than the CPU definitely makes you wonder how far they can take it.
 
Sony also eliminated the major wait states and cache flushes normally needed for the CPU and GPU to communicate and swap data. It also allows both the GPU and CPU to access all of the memory at the same time, which is not traditionally the case. This gets rid of a ton of latency issues and will maximize bandwidth.

Yes, synchronization shouldn't be an issue with the added bus Sony is using. They knew that going with GDDR5 meant high bandwidth but also high latency. Again, it won't be a problem.
 
On the 360, the eDRAM quickly became a bottleneck for developers because it wasn't big enough. They are using the same amount this time, which means it will be the same case again.

Are you referring to the included eDRAM, which was a separate daughter die to the GPU? If so, it was actually only 10 MB on the Xbox 360.
 
Are you referring to the included eDRAM, which was a separate daughter die to the GPU? If so, it was actually only 10 MB on the Xbox 360.

Oh, that's right, it was 10. Even still, 32 MB is not a lot to play with.
 
The CPU portion is anemic compared to what we're used to with current PC CPUs and coding techniques. See: individual core performance being king, due to software not doing a great job of spreading tasks across threads.

But these things have 8 cores. If care is taken to spread work out in parallel over those 8 threads, we should still get a pretty nice level of performance. Good enough. Sony has stated they've been doing a lot of R&D to give developers tools to do exactly that.

To see how this approach can pay off, just look at Crysis 3 benchmarks. It scales really well with added threads and de-emphasizes traditional per-core performance. AMD processors basically match Intel's in this case.
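The Crysis 3 point is basically Amdahl's law: the larger the fraction of a frame that's parallelized, the more 8 modest cores can make up for weak per-core performance. A quick sketch (the parallel fractions here are made-up illustrative values, not measured from any game):

```python
# Amdahl's law: speedup on n cores, given a parallel fraction p of the work.
# A well-threaded engine (high p) gets far more out of 8 modest cores
# than a poorly threaded one.

def amdahl_speedup(p, n):
    return 1 / ((1 - p) + p / n)

print(f"p=0.95, 8 cores: {amdahl_speedup(0.95, 8):.2f}x")  # ~5.93x
print(f"p=0.50, 8 cores: {amdahl_speedup(0.50, 8):.2f}x")  # ~1.78x
```

So a console title engineered for 95% parallelism would get over 3x more benefit from its 8 cores than one that only parallelizes half its work.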
 
Plus, the PS4 has additional co-processors added to offload many of the sharing/video features so that they don't impede gameplay. That's a luxury that PCs, and specifically the One, don't share.
 
Plus, the PS4 has additional co-processors added to offload many of the sharing/video features so that they don't impede gameplay. That's a luxury that PCs, and specifically the One, don't share.

Do we have confirmation that the Xbone doesn't offload anything to separate chips and does everything on the CPU?
 
When I went from a Core 2 Duo with CAS 3 DDR2 RAM to a Core i7 with CAS 8 DDR3-1600, I thought the new RAM was supposed to be faster in all respects despite the higher timings. Is that not the case?
 
I don't understand what good the superior PlayStation does us if the games are cross-platform, developed for the Xbone. Hopefully Sony will just dominate them. Aren't developers inherently bottlenecked by the Xbone, since that's the lowest performance denominator? Or perhaps we'll see superior anti-aliasing on the PS4.

I would think that if Microsoft built a well-specced machine and priced it right, they would win by default because of the better online experience, Halo, etc. How could they have messed up so badly?

It should also be noted that some cable providers, like Comcast, are rolling out updated user interfaces and DVR systems in some markets. The new Comcast system is, confusingly, actually called X1! Doesn't this essentially undermine the value of the HDMI-in on the Xbone? How hard is it to change inputs on the TV! I don't get what they are doing at all.

And when every review of every cross-platform game comes down with better graphics on the PS4, what does Microsoft have to fall back on other than brand loyalty?

Edit: OMG, the PS4 has twice the ROPs on the GPU that the Xbone has. Doesn't that essentially mean gimped anti-aliasing on the Xbone?

And who killed Jimmy Hoffa and where is he?! Can someone please answer these questions for me!
 
I don't understand what good the superior PlayStation does us if the games are cross-platform, developed for the Xbone. Hopefully Sony will just dominate them. Aren't developers inherently bottlenecked by the Xbone, since that's the lowest performance denominator? Or perhaps we'll see superior anti-aliasing on the PS4.

I would think that if Microsoft built a well-specced machine and priced it right, they would win by default because of the better online experience, Halo, etc. How could they have messed up so badly?

- Multiplatform games performed better on the 360 because it was easier to develop for than the PS3's Cell, not because the console happened to be made by MS. Time costs money, and developers minimized costs that way.

Next gen is a completely different story: not only is the PS4's hardware clearly superior to the Xbone's, it's ALSO easier to develop for, with a setup very similar to what we use on PCs, unlike the Xbone, where developers will have to wrestle with the eSRAM configuration to get the best out of it.

We definitely won't see much of a difference in launch or early games (especially considering the PS4's 8GB of GDDR5 was added not long ago, compared to the 4GB the dev kits had), but I bet the difference will be very visible within 2 years of the consoles' lifespan. The PS4's GPU is 50% faster; any PC gamer knows that's no joke, since it's almost two generations' worth of the performance boost we see between graphics cards in the same lineup. For example, I'm sure some games will manage 60FPS on the PS4 that won't be possible on the Xbone, and that alone is HUGE in my book.

- XBL's online is overrated. I used it alongside PSN, and games that use P2P are still shit and depend on the host. The PS3 had serious OS limitations when it came to online features; not having cross-game chat, for example, was mainly due to hardware limitations (IIRC not enough RAM). The PS4 will definitely have much-improved online features (more capable hardware; they even have a dedicated processor for certain background tasks), and this is the second major online PS console, so they already have 6-7 years of experience in this field.

Given the MP paywall and Sony's very focused push to avoid repeating past mistakes, I have no doubt the PS4's PSN will be AT LEAST as good as XBL, if not better. PS Plus is already MUCH better than Gold when it comes to value/free games.

- Why mention "Halo" when it comes to a better online experience? Didn't people get sick of that already? Sony hands down has much better first-party support, with more studios and more variety in games.

The Xbone will need to bring some serious AAA exclusives to be viable next gen. Third-party exclusives are a thing of the past; most of those "exclusives" the Xbone showed at E3 will most likely make their way to the PS4 at some point. It's all about first-party titles now.

Also, most of the Xbone TV/sports stuff is exclusive to the US. I honestly don't see how they'll avoid being completely destroyed by the PS4 globally, just like what the PS2 did to the original Xbox.
 
When I went from a Core 2 Duo with CAS 3 DDR2 RAM to a Core i7 with CAS 8 DDR3-1600, I thought the new RAM was supposed to be faster in all respects despite the higher timings. Is that not the case?

The best low latency DDR2 was 4-4-4-12; these days 9-9-9-24 is pretty standard for DDR3.

I went looking for the latency of GDDR5, but it seems the answer is not simple; in the process of looking, I found this article about GPU memory latency, which should be of interest to this thread:

http://www.sisoftware.net/?d=qa&f=gpu_mem_latency

I can't understand the article well enough to compare those GPU latencies to DDR2 vs. DDR3 latencies.
 
So the PS4 is more like a PC than the Xbone? That's crazy to me. I can't believe the difference in hardware is so big, plus the price difference. Unreal. I would have thought they'd be almost identical systems, like Alienware and Falcon Northwest competing. One adds a TV tuner and $100 to a gaming system and comes out 50% less powerful on the GPU side? This at a time when 60fps and 4K resolutions are the big talking points. From a PC-based company. I thought Nintendo had cornered the market on confusion, but I guess not.

At least this time I can get behind someone. Last round the Xbox had the better GPU but the weaker CPU, so it was unclear; I ended up with a Wii only and my PC. The PS4 looks like the clear winner here. I think PC gamers have a lot to gain as well if we get ports: they'll be ports of games built for 8-core CPUs and 8 gigs of RAM. I wonder if the ports will just plain run better on current PCs for that reason alone.
 
The best low latency DDR2 was 4-4-4-12; these days 9-9-9-24 is pretty standard for DDR3.

I went looking for the latency of GDDR5, but it seems the answer is not simple; in the process of looking, I found this article about GPU memory latency, which should be of interest to this thread:

http://www.sisoftware.net/?d=qa&f=gpu_mem_latency

I can't understand the article well enough to compare those GPU latencies to DDR2 vs. DDR3 latencies.

I just realized I made an assumption: that CAS 8 DDR3 has higher latency than CAS 4 DDR2, or that the two are even directly comparable. But we need not settle that here. I always thought the bandwidth increase made up for the difference.
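The fair way to compare timings across generations is absolute (first-word) CAS latency in nanoseconds: CAS cycles divided by the I/O clock, which is half the DDR transfer rate. By that measure, DDR2-800 CL4 and DDR3-1600 CL9 come out nearly identical:

```python
# Absolute CAS latency in nanoseconds: CAS cycles / I/O clock (MHz) * 1000,
# where the I/O clock is half the effective DDR transfer rate.

def cas_ns(cas_cycles, rate_mt_s):
    return cas_cycles / (rate_mt_s / 2) * 1000

print(f"DDR2-800  CL4: {cas_ns(4, 800):.2f} ns")   # 10.00 ns
print(f"DDR3-1600 CL9: {cas_ns(9, 1600):.2f} ns")  # 11.25 ns
```

So the higher timings are mostly an artifact of the faster clock: in absolute time the two are within about 12% of each other, while DDR3-1600 has twice the bandwidth of DDR2-800.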
 
PC gaming comparisons aren't entirely fair to the One and PS4. In most gamers' computers, DDR3 memory isn't used for graphics; the GPU has its own memory, in most cases GDDR5.

The best comparison you can make is looking at AMD APU or Intel IGP benchmarks, because an integrated GPU does use system memory. The faster the memory, the better the GPU performs. The unfortunate part, though, is that even the fastest DDR3 still bottlenecks a GPU. GPUs need GDDR5 to show their full potential.

Review sites I use show around a 5-15% increase in FPS going from DDR3-1333 to DDR3-2400 in integrated graphics tests. Considering GDDR5 has more bandwidth, and the PS4/One have much better GPUs than current CPU IGPs, it's clear you could expect an even bigger FPS boost in a console going from DDR3 to GDDR5. I honestly wouldn't be surprised if a 1080p/30 game on the One runs at 1080p/60 on the PS4.
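To put those two DDR3 speeds in context, here's the dual-channel bandwidth arithmetic (a sketch assuming two standard 64-bit channels, as on typical desktop boards):

```python
# Dual-channel DDR3 peak bandwidth: 2 channels x 64 bits x transfer rate.

def dual_channel_gb_s(rate_mt_s):
    return 2 * 64 / 8 * rate_mt_s / 1000  # bytes per transfer x GT/s

print(f"DDR3-1333: {dual_channel_gb_s(1333):.1f} GB/s")  # ~21.3 GB/s
print(f"DDR3-2400: {dual_channel_gb_s(2400):.1f} GB/s")  # ~38.4 GB/s
```

An 80% bandwidth increase buying only 5-15% more FPS suggests today's IGPs are limited elsewhere too, but it does show that bandwidth matters for integrated graphics.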
 
That's what I'm thinking. Even if we don't see better graphical fidelity on the PS4, we're almost certainly going to see better, more stable frame rates, plus things like better AA or draw distance/level of detail.

I see a lot of people say the Xbox and PS4 will have roughly the same graphics, so it won't matter which one you go with... based on specs alone, I'd say that's clearly not true. Graphics may end up the same, but when the Xbox is dipping into sub-30 frames, the PS4 will be able to maintain 45-60 fps. To me, that's all that matters.
 
Recently a developer echoed my statements about the One vs. the PS4. He said that both consoles are a huge improvement over the current gen, and that at launch and afterwards there won't be much difference in the games; later in the life cycle, the PS4's benefits will start to show.

SOURCE
 
lol, I don't get why it's going to take any time for the PS4's advantage to show. It's not like it's hard to code for the thing now: you distribute processes to the same 8 cores you would on the Xbone, you just have more muscle to work with, so flip a couple of graphics options on.

Actually, reading the quote, the way he words it sounds more like people won't give a shit to even bother flipping those options on, because it will all just be a rush to get the games out.
 

So... does anyone even read the words around these articles? They clearly state that Microsoft never announced the GPU for the Xbox One. Neither has AMD. In fact, when I trace everything back to the article where people claim they "heard" the Xbox One's GPU, it was a single article about AMD putting the HD 7750 on the market, with the author assuming it had to be the GPU for "Durango" because of the price point.

I'd kinda like more concrete info before I shit on anything, although it would explain why Dead Rising 3 didn't look very impressive.
 
So... does anyone even read the words around these articles? They clearly state that Microsoft never announced the GPU for the Xbox One. Neither has AMD. In fact, when I trace everything back to the article where people claim they "heard" the Xbox One's GPU, it was a single article about AMD putting the HD 7750 on the market, with the author assuming it had to be the GPU for "Durango" because of the price point.

I'd kinda like more concrete info before I shit on anything, although it would explain why Dead Rising 3 didn't look very impressive.
With all the bad press surrounding the One, why on Earth would Microsoft withhold positive information (that their GPU is better than expected)? I also haven't seen a statement from Microsoft denying that the information out there is correct, and I'd expect them to dismiss rumors about their console quickly, which is the behavior they've exhibited recently.

The reason they haven't confirmed the GPU specs is that the rumored specs are the real ones, and they don't want to hand out more reasons to hate on the console.
 
Some people are really too caught up in their MS hate, thinking that PSN will be better than XBL right off the bat. Especially now that you have to pay to play online, there will be certain expectations; the only reason they offered free games in the first place was because their online experience sucked balls. That being said, with Ballmer in charge right now, I hope they announce a new head of the Xbox division. Also, if GDDR5 is so amazing, why don't normal PCs use it? (Yes, I know AMD is coming out with their unified platform.) Either way, I reserved both systems and will cancel one right before launch; I'm still undecided.
 