Console RAM upgrade?

trick0502

I was playing some N64 last night and I noticed the RAM expansion slot. Why don't consoles use that anymore? Wouldn't it be nice to upgrade the RAM in your 360 or PS3?
 
It would also become a "required" upgrade for some games. If I remember correctly, Perfect Dark on the N64 required the extra RAM to play the single-player campaign. Sure, for a small sum you could somewhat upgrade a console, but you'd still be limited by the rest of the non-upgradable hardware. To me, Perfect Dark still performed poorly even with the extra RAM.
 
I know the PS3 can use the HDD as virtual memory, something like 2GB.
 
Because a RAM upgrade would pretty much defeat the whole purpose of a console, which is to have a uniform platform.

The newer XBoxes actually have a somewhat faster processor than the original XBoxes, and they purposely underclock them to keep things uniform. Games, with a few exceptions, have to make sure that they can play without a hard drive on the XBox.

As for the hard drive on the PS3 and XBox 360, there's a difference between the hard drive and RAM. Games know how much disk space they'll use; the console owner might have exceeded their limit, but that falls on the console owner. RAM is a different beast altogether. With the HDD, you know everyone already has one. Once you start differentiating between consoles, you limit your audience more. And while certain accessories such as the Kinect may sell millions, you have to assume that most will fail. Otherwise, you just evolve into a fancy PC with a million different combinations, but with TV out.
 
Nytegard got it. You want to have a uniform platform. This is the mistake MS made with the hard drive: since it was possible to have a 360 without a hard drive, most developers just programmed to the lowest common denominator (no hard drive). So many games didn't take advantage of the extra performance you could get by caching to the hard drive. This slowly changed over the years, though, and at least you can install games to the hard drive now. Different memory amounts would be worse, though.
 

Very true, but on the other hand they sold millions of 360s without hard drives.
 
But the extra RAM could be used the way Rogue Squadron used it on the N64: you got 320x240 without the RAM and 640x480 with it. Just think, a game like Halo could run in HD instead of that odd resolution it runs at.
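For what it's worth, the framebuffer arithmetic backs up the resolution point. A rough sketch, assuming a 16-bit color framebuffer (common on the N64); the figures are illustration only:

```c
#include <stdio.h>

int main(void) {
    /* Framebuffer sizes at 16 bits (2 bytes) per pixel. */
    const long bpp = 2;
    long lo = 320L * 240 * bpp;  /* low-res mode */
    long hi = 640L * 480 * bpp;  /* hi-res mode  */

    printf("320x240: %ld KB\n", lo / 1024);  /* 150 KB */
    printf("640x480: %ld KB\n", hi / 1024);  /* 600 KB */
    /* Doubling the resolution on each axis quadruples the framebuffer
       (and the Z-buffer, which is the same size). On a console with
       4 MB of total RAM that difference is huge, which is why the
       Expansion Pak's extra 4 MB made hi-res modes practical. */
    return 0;
}
```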
 
But the extra RAM could be used the way Rogue Squadron used it on the N64: you got 320x240 without the RAM and 640x480 with it. Just think, a game like Halo could run in HD instead of that odd resolution it runs at.

That's not really RAM, but more the graphical capabilities of the XBox. The RAM would allow the levels to be larger and more open. Either way, this would only piss off users: either many people couldn't play the levels, or, if you upgraded the graphical capabilities so it could run at a true 720p, the people who spent more money would have a significant advantage over the people running at 640p. This is one of the complaints console fans use against PC fans: on the PC, a lot of a person's skill is due to their machine, whereas on the console, while there are factors such as TV size and internet connection, there are fewer "advantages" like stuttering, or one player seeing someone clearly while another on a cheaper system sees a pixel.
 
It would be nice to have a RAM upgrade. Now that the consoles are getting old, it would allow developers to get a bit more out of them. However, it would also require optimizing the game for two different specs, and I doubt they would go for that.
 
I know the PS3 can use the HDD as virtual memory, something like 2GB.

If you've ever used a PC where you significantly hit your RAM limit and it started using HDD virtual memory, you'll know that using an HDD as RAM is basically akin to smashing your head against a wall. :p

RAM expansion packs are sort of a nice idea, but they divide your consumer base. The nice thing about consoles is you have such a huge install base of everyone using identical hardware. As soon as you divide it, things start to get more complicated: devs have to start making games with two setups in mind, or make it for only one and exclude anyone who hasn't upgraded.
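To put rough numbers on the head-against-the-wall comparison above (ballpark latency figures for hardware of that era, not actual PS3 measurements):

```c
#include <stdio.h>

int main(void) {
    /* Ballpark access latencies, orders of magnitude only. */
    const double dram_ns = 100.0;  /* ~100 ns DRAM access      */
    const double hdd_ns  = 10e6;   /* ~10 ms HDD seek + rotate */

    printf("HDD is roughly %.0fx slower than RAM per access\n",
           hdd_ns / dram_ns);      /* ~100000x */
    return 0;
}
```

A read that misses RAM and has to wait on a mechanical seek costs on the order of a hundred thousand RAM accesses, which is why paging mid-game feels so brutal.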
 
What consoles really need is an idiot-proof plug-and-replace video card. I think that would do a lot for the longevity of a console.
 
That's not really RAM, but more the graphical capabilities of the XBox. The RAM would allow the levels to be larger and more open. Either way, this would only piss off users: either many people couldn't play the levels, or, if you upgraded the graphical capabilities so it could run at a true 720p, the people who spent more money would have a significant advantage over the people running at 640p. This is one of the complaints console fans use against PC fans: on the PC, a lot of a person's skill is due to their machine, whereas on the console, while there are factors such as TV size and internet connection, there are fewer "advantages" like stuttering, or one player seeing someone clearly while another on a cheaper system sees a pixel.

Having a higher resolution does not give you any sort of advantage. See top level Quake/Unreal/CS play where people run at ultra low resolutions and strip out all details, because it translates into a massive advantage.

Even with consoles, for some competitive games, if you are not both playing on the same console head to head, a person running their game off a CRT is going to have a pretty large advantage over a person running off an HDTV, simply because there is no input lag.

So no, increasing resolution and graphical fidelity has never given you an advantage over anybody else, and in most cases will hand you a disadvantage. An Eyefinity configuration with three 30-inch monitors is worse from a competition standpoint than a single 19-inch CRT pushing 800x600.

The main draw of consoles is a standard configuration though. I know when I buy a game for a console it will just work and it will look like it was intended. With a PC I have no idea if the game is going to run properly, look anything like the pictures, or if some company hasn't paid off the game maker to make sure it doesn't run properly simply because I chose a different brand of video card.

Add-ons that increase console power tend to do horribly anyway. See the Sega 32X/CD, or the N64's 64DD and RAM upgrades.
 
What consoles really need is an idiot-proof plug-and-replace video card. I think that would do a lot for the longevity of a console.

Probably the best idea right there. Make it plug and play. It would have to be limited to Sony's or MS's official or endorsed cards. Allow auto-detection of hardware changes, and release updated drivers for new cards via system updates. Turn off the console, remove the old card, plug in the new one. Boot it up, wait for it to choose the correct driver, and you're off and running.
 
If you've ever used a PC where you significantly hit your RAM limit and it started using HDD virtual memory, you'll know that using an HDD as RAM is basically akin to smashing your head against a wall. :p

I know, but if the software/game is programmed correctly and caches files that don't require that kind of speed, it can serve the purpose just fine.
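As a sketch of what "programmed correctly" could mean here (hypothetical structure, not how any actual PS3 title works): keep latency-sensitive data resident in RAM, and only pull assets off the HDD at moments when a stall is acceptable.

```c
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical two-tier asset loader: hot data stays resident in RAM,
   cold data is read from an HDD cache file the first time it's needed.
   Sketch only -- a real engine would stream this asynchronously. */

typedef struct {
    char name[64];
    unsigned char *data;  /* NULL until loaded */
    size_t size;
} Asset;

static unsigned char *load_from_disk(const char *path, size_t size) {
    FILE *f = fopen(path, "rb");
    if (!f) return NULL;
    unsigned char *buf = malloc(size);
    if (buf && fread(buf, 1, size, f) != size) { free(buf); buf = NULL; }
    fclose(f);
    return buf;
}

/* Hit the HDD only on a cache miss, ideally during a loading screen
   or behind a door -- never for data touched every frame. */
unsigned char *get_asset(Asset *a, const char *cache_dir) {
    if (a->data) return a->data;              /* fast path: already in RAM */
    char path[192];
    snprintf(path, sizeof path, "%s/%s", cache_dir, a->name);
    a->data = load_from_disk(path, a->size);  /* slow path: HDD read */
    return a->data;
}
```

The HDD's speed only hurts if something on the per-frame hot path ends up behind that slow read.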
 
Probably the best idea right there. Make it plug and play. It would have to be limited to Sony's or MS's official or endorsed cards. Allow auto-detection of hardware changes, and release updated drivers for new cards via system updates. Turn off the console, remove the old card, plug in the new one. Boot it up, wait for it to choose the correct driver, and you're off and running.
Sounds awful.

Besides, I think they already have these. They're called Macs.
 
Having a higher resolution does not give you any sort of advantage. See top level Quake/Unreal/CS play where people run at ultra low resolutions and strip out all details, because it translates into a massive advantage.

Even with consoles, for some competitive games, if you are not both playing on the same console head to head, a person running their game off a CRT is going to have a pretty large advantage over a person running off an HDTV, simply because there is no input lag.

So no, increasing resolution and graphical fidelity has never given you an advantage over anybody else, and in most cases will hand you a disadvantage. An Eyefinity configuration with three 30-inch monitors is worse from a competition standpoint than a single 19-inch CRT pushing 800x600.

The reason competitive players of those games run at ultra-low resolutions is the design of the game engine. They process more network traffic when running at higher FPS, so you're able to react quicker to things that are happening.

Similarly, those same people use CRTs because they have higher refresh rates than LCDs, so they can see changes on-screen faster than a person running at a lower refresh rate. These games also have server-side options to enforce a maximum FOV, which completely defeats Eyefinity. The maximum refresh rate of a CRT also (typically) decreases as the resolution increases, which is another reason not to run at the maximum resolution the monitor supports.

If they could keep the same FPS and refresh rate while running three 30-inch displays at higher resolutions, it would be better than what they've got now.
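For reference, the frame-time arithmetic behind the FPS point is simple (nothing engine-specific assumed here):

```c
#include <stdio.h>

int main(void) {
    /* Time between updates at common competitive framerates. */
    const double fps[] = { 60.0, 85.0, 125.0 };
    for (int i = 0; i < 3; i++)
        printf("%3.0f fps -> %4.1f ms per frame\n", fps[i], 1000.0 / fps[i]);
    /* 60 fps  -> 16.7 ms
       85 fps  -> 11.8 ms
       125 fps ->  8.0 ms
       If input sampling and network processing are tied to the render
       loop, every extra frame is another chance to see and react to a
       change -- hence dropping resolution to keep the framerate pegged. */
    return 0;
}
```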
 
The reason competitive players of those games run at ultra-low resolutions is the design of the game engine. They process more network traffic when running at higher FPS, so you're able to react quicker to things that are happening.

Similarly, those same people use CRTs because they have higher refresh rates than LCDs, so they can see changes on-screen faster than a person running at a lower refresh rate. These games also have server-side options to enforce a maximum FOV, which completely defeats Eyefinity. The maximum refresh rate of a CRT also (typically) decreases as the resolution increases, which is another reason not to run at the maximum resolution the monitor supports.

If they could keep the same FPS and refresh rate while running three 30-inch displays at higher resolutions, it would be better than what they've got now.

No it wouldn't. The pixels would be smaller and you'd have more screen to monitor, which is a major disadvantage. 30-inch screens are for graphics and e-peen; I'd never game on mine.
 
No, what they should have done was just release the Xbox 360 with 1GB of RAM or more. Especially if they want to do this 10-15 year console cycle crap.
 

Why?

MS and Sony are pretty smart about where things are going, and so is Nintendo.

When the 360 and PS3 were released, most people did not have an HDTV, and most people still don't even have a 1080p TV. So it's not as if consoles are underpowered for what people actually need.

And if you look at the best console games, and not the garbage PC-centric crap that's been dragging consoles into the mud for a while, do they really need more RAM? Looking at the console titles I have liked best on just the PS3 (Valkyria Chronicles, Demon's Souls, GoW3, Castlevania, and Bayonetta, not counting fighters), I fail to see how any of them would have been better off with "moar rams plz".

And games typically get better towards the end of a console's life as devs master the hardware; the PS3 is the best example of this right now.

And that 10-15 years makes sense given the cost of selling all these things at a loss. But they can still release something earlier; the PS2 easily had a 10-year life.
 
No it wouldn't. The pixels would be smaller and you'd have more screen to monitor, which is a major disadvantage. 30-inch screens are for graphics and e-peen; I'd never game on mine.

I agree that having a CRT is a major advantage; the lack of input lag helps tremendously. But what you're talking about with lower resolutions is due to a lot of archaic traditions.

When Quake, UT, and CS first came out, you could easily choke an opponent down to single frames by clouding up their screen with action (e.g. smoke in CS). Having a constant high frame rate was necessary. And there was a school of thought that used the whole 120 fps thing to get maximum velocity. Also, it was assumed that a lower resolution would increase the hitbox.

Outside of the input lag, do you think any of these still exist today? Part of the reason many pros haven't upgraded even today is that they got used to their resolution. It's similar to changing your keyboard or mouse: try changing either one and I guarantee your KDR will drop, at least for a while.

But we're talking about resolution, not screen size. I'd agree with you about screen size; having a larger screen interferes after a while. It's why I game on my 24", not my 27" or 30" monitors. But a higher resolution does grant an advantage: you want more pixels if you can get them. A higher resolution lets players turn that white pixel into an actual image of a player. Maybe with consolified FPSs it doesn't matter, because sniping is useless in most FPSs now due to the extremely small maps and auto-aim, but we're talking PC-oriented maps and FPSs here. A higher resolution allows me to hit a player on Bog in CoD:MW with my P90 across the entire map rather than having to pull out a sniper rifle. So while you get one shot every 5 seconds, I get 30...
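A note on the "120 fps to get maximum velocity" bit: Quake-engine games time frames in whole milliseconds, so only certain framerates divide evenly into 1000 ms, which is part of why caps like 125 fps became the magic numbers. The jump-height quirk itself is usually attributed to rounding in the physics step at those frame times; the snippet below only shows the millisecond arithmetic:

```c
#include <stdio.h>

int main(void) {
    /* Framerates whose frame time is an exact whole number of
       milliseconds, which is how Quake-engine games measure frames. */
    for (int msec = 4; msec <= 25; msec++)
        if (1000 % msec == 0)
            printf("%2d ms/frame -> %3d fps\n", msec, 1000 / msec);
    /* Prints 250, 200, 125, 100, 50, and 40 fps.
       125 fps (8 ms exactly) was the highest "clean" rate hardware of
       the era could actually hold, hence the classic com_maxfps 125. */
    return 0;
}
```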
 
CoD and large open-map games are non-competitive. I play Quake :p and I disable SLI, run at 800x600, and picmip the crap out of it still.
 
Would extra RAM really help the consoles that much? I was under the impression that for the N64 it was extra video RAM. Extra system memory, while nice, does not seem like it would help the consoles run at higher resolutions. I mean, I know it would offer more performance, but I thought that would mainly be in places where a lot of stuff is going on, not an overall graphics boost.
 
CoD and large open-map games are non-competitive. I play Quake :p and I disable SLI, run at 800x600, and picmip the crap out of it still.

You can also play Quake at 640x480, so why is 800x600 better than 640x480 but worse than 1024x768?
 
The reason competitive players of those games run at ultra-low resolutions is the design of the game engine. They process more network traffic when running at higher FPS, so you're able to react quicker to things that are happening.

Glad someone knows what they're talking about.

CS 1.6 is interesting... it kind of breaks at anything higher than 800x600. The mouse deadzone becomes apparent, and the hitboxes are all out of whack when I play at 1080p.
 
You can also play Quake at 640x480, so why is 800x600 better than 640x480 but worse than 1024x768?

You can play it at 320x240 as well. I know, because I used to have to do just that.
 
When Quake, UT, and CS first came out, you could easily choke an opponent down to single frames by clouding up their screen with action (e.g. smoke in CS). Having a constant high frame rate was necessary. And there was a school of thought that used the whole 120 fps thing to get maximum velocity. Also, it was assumed that a lower resolution would increase the hitbox.

I didn't believe it either, but I tested it. I tested 640x480 vs 1920x1080 in CS 1.6 and the hitboxes behaved very differently. I tried aim_ak-colt and went for headshots on a non-moving enemy across the map with an M4A1. At 640x480, headshot every time. At 1920x1080, I couldn't get the headshot to register at all most of the time.

There's something really fishy that just breaks when the resolution gets too high in that game. No idea why.
 
They do.

GT5 is a vision that is actually beyond the PS3's capabilities, and Uncharted 2 uses pretty much every last resource the PS3 has.

If Uncharted 2 pushes the PS3 to the limit then how is number 3 looking better and running in 3D?
 
Because game code is so complex that you can always improve it 'just a little bit'.
 
Yeah, my 360 is running sluggish with new titles... it'd be cool to upgrade the chip and GPU too, and maybe put in a big heatsink.
 
And if they made consoles upgradeable, what you'd have is a........PC :)
(Which kind of negates the purpose of the console..lol)
 
If an empty memory slot is of no benefit, then the whole console industry should and will die! Die! Die! Die!

Hardware is powerful enough and software is complex enough that the benefits of uniform hardware no longer justify its limitations. I think we'll start seeing consoles become more like computers, with varying degrees of performance and regular performance improvements. This will mean that companies won't have to take huge losses when a new console is released, nor will they have to waste potential by holding back the performance of later hardware for the sake of compatibility.

There will be a new Xbox next year and every year after that, about October. Merry Christmas.
 
Best hardware upgrade of all time...

The 4MB RAM upgrade for the Sega Saturn.


It allowed me to play the arcade-perfect port of Marvel vs. Street Fighter. It cost me $70 for an import of the game plus $40 for the RAM cartridge (my version doubled as extra save-game space plus a Game Genie)... an eternity of work at fast-food wages... but I was the envy of all my high school friends. They only had the crappy PSone port with the super slow frame rate, where you couldn't switch characters. Seriously, what is Marvel vs. SF if you can't switch your character mid-fight?

Ah, good times. Stuff was so much harder to get back then. :D
 
If an empty memory slot is of no benefit, then the whole console industry should and will die! Die! Die! Die!

Hardware is powerful enough and software is complex enough that the benefits of uniform hardware no longer justify its limitations. I think we'll start seeing consoles become more like computers, with varying degrees of performance and regular performance improvements. This will mean that companies won't have to take huge losses when a new console is released, nor will they have to waste potential by holding back the performance of later hardware for the sake of compatibility.

There will be a new Xbox next year and every year after that, about October. Merry Christmas.

I think you're dreaming. One of the top reasons people say they prefer consoles over PCs is that you just know the game will work the way it's supposed to, because everyone has the same hardware. The benefits of a unified system aren't just performance-enhancement possibilities; it's a matter of creating the same experience for all users. If consoles become upgradable in terms of performance, then developers would have to start making decisions like "Do we design the game around the new, faster console and potentially have problems with people who own the slower one, or do we design a game that potentially looks outdated compared to its competition?"

If you want a console you can upgrade, just buy a damn PC. You can put all the extra memory, faster CPUs, and better heatsinks in it that you can afford.
 
The newer XBoxes actually have a somewhat faster processor than the original XBoxes, and they purposely underclock them to keep things uniform

Wait... wasn't the original Xbox equipped with a PIII clocked at around 700MHz? I'm pretty sure the 360 has a triple-core processor at around 3GHz... and that's a lot more than just somewhat faster.

I mean unless I'm misunderstanding what you're saying...?
 

I'm actually talking about just the XBox 360, comparing the newer units to the launch units. I figured the context would make the difference easy enough to discern, so I apologize that it wasn't.
 