Nytegard got it. You want to have a uniform platform. This is the mistake MS made with the hard drive. Since it was possible to buy a 360 without a hard drive, most developers just programmed to the lowest common denominator (no hard drive). So many games didn't take advantage of the extra performance you could get by caching to the hard drive. This slowly changed over the years, though, and at least you can install games to the hard drive now. Different memory amounts would be even worse.
False. Games don't push the boundaries of consoles anymore.
But the extra RAM could be used the way Rogue Squadron did on the N64: you got 320x240 without the extra RAM and 640x480 with it. Just think, a game like Halo could run in HD instead of that odd rez it runs at.
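For scale, the framebuffer math backs this up: doubling both dimensions quadruples the pixel count, which is exactly the kind of thing extra RAM buys you. A quick sketch, assuming a 16-bit color buffer (common on the N64; the function name here is just illustrative):

```python
def framebuffer_bytes(width, height, bytes_per_pixel=2):
    """Memory for one color buffer, assuming 16 bits (2 bytes) per pixel."""
    return width * height * bytes_per_pixel

low = framebuffer_bytes(320, 240)   # 153,600 bytes (~150 KiB)
high = framebuffer_bytes(640, 480)  # 614,400 bytes (~600 KiB), exactly 4x
```

And that's just one color buffer; with double buffering and a z-buffer you'd multiply that again, which is why the hi-res mode needed the extra 4 MB.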
I know the PS3 can use the HDD as virtual memory, something like 2 GB.
That's not really the RAM, but more the graphical capabilities of the Xbox. The RAM would allow the levels to be larger and more open. Either way, this would only piss off users: either many people couldn't play the levels, or, if you upgraded the graphical capabilities so it could run at a true 720p resolution, it would give the people who spent more money a significant advantage over people running at 640p. This is one of the complaints console fans level against PC fans: on the PC, a lot of a person's skill is due to their machine, whereas on consoles, while there are factors such as TV size and internet connection, there are fewer "advantages" such as stuttering, or one player seeing an opponent clearly while another on a cheaper system sees a pixel.
What consoles really need is an idiot-proof plug-and-replace video card. I think that would do a lot for the longevity of a console.
If you've ever used a PC where you significantly exceeded your RAM and it started using HDD virtual memory, you'll know that using an HDD as RAM is basically akin to smashing your head against a wall.
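Rough numbers make the point: a main-memory access is on the order of 100 nanoseconds, while a mechanical disk seek is on the order of 10 milliseconds (typical ballpark figures, not measurements of any specific hardware), so a page fault served from disk is around five orders of magnitude slower:

```python
# Ballpark latencies, not measurements:
DRAM_ACCESS_NS = 100            # ~100 ns main-memory access
HDD_SEEK_NS = 10 * 1_000_000    # ~10 ms mechanical seek

slowdown = HDD_SEEK_NS // DRAM_ACCESS_NS
print(f"Disk-backed 'RAM' is ~{slowdown:,}x slower per access")  # ~100,000x
```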
Sounds awful.

Probably the best idea right there. Make it plug and play. It would have to be limited to Sony's or MS's official or endorsed cards. Allow auto-detection of hardware changes. Release updated drivers for new cards via system updates. Turn off the console, remove the old card, box it up, plug in the new one, boot it up, wait for it to pick the correct driver, and you're off and running.
Having a higher resolution does not give you any sort of advantage. See top-level Quake/Unreal/CS play, where people run at ultra-low resolutions and strip out all detail, because it translates into a massive advantage.
Even with consoles, for some competitive games, if you are not both playing head to head on the same console, a person running their game off a CRT is going to have a pretty large advantage over a person running off an HDTV, simply because there is no input lag.
So no, increasing resolution and graphical fidelity has never given you an advantage over anybody else, and in most cases will hand you a disadvantage. An Eyefinity configuration with three 30-inch monitors is worse from a competition standpoint than a single 19-inch CRT pushing 800x600.
The reasons why competitive players of those games run at ultra low resolutions is because of the design of the game engine. They process more network traffic when they are running at higher FPS so you're able to react quicker to things that are happening.
Similarly, those same people use CRTs because they have higher refresh rates than LCDs so they can see changes on-screen faster than a person running at a lower refresh. These games also have server side options to enforce a maximum FOV which completely defeats eyefinity. The maximum refresh rate of a CRT also (typically) decreases as the resolution increases which is another reason not to run at the maximum resolution supported by the monitor.
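This ties back to how those engines structure their main loop: if incoming snapshots and input are only serviced once per rendered frame, the frame time is a hard floor on your reaction latency. A toy sketch (the function names are stand-ins, not real engine code):

```python
def poll_network():   # stand-in: drain queued server snapshots
    pass

def sample_input():   # stand-in: read mouse/keyboard state
    pass

def render_frame():   # stand-in: draw the scene
    pass

def run_frames(frames):
    """One network/input pass per rendered frame, Quake-era style."""
    for _ in range(frames):
        poll_network()
        sample_input()
        render_frame()

def worst_case_wait_ms(fps):
    """A packet arriving just after poll_network() sits for ~one frame."""
    return 1000.0 / fps

# At 60 fps a packet can wait ~16.7 ms before the engine even sees it;
# at 125 fps that worst case drops to 8 ms.
```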
If they could keep the same FPS and refresh rate and run 3 30 inch displays at higher resolutions it would be better than what they've got now.
No, what they should have done was just release the Xbox 360 with 1 GB of RAM or more, especially if they want to do this 10-to-15-year console cycle crap.
No it wouldn't; the pixels would be smaller and you'd have more screen to monitor, a major disadvantage. 30-inch screens are for graphics work and e-peen; I'd never game on mine.
CoD and large open-map games are non-competitive. I play Quake; I disable SLI, run at 800x600, and still picmip the crap out of it.
The reasons why competitive players of those games run at ultra low resolutions is because of the design of the game engine. They process more network traffic when they are running at higher FPS so you're able to react quicker to things that are happening.
You can also play Quake at 640x480, so why is 800x600 better than 640x480 but worse than 1024x768?
When Quake, UT, and CS first came out, you could easily choke an opponent down to single frames by clouding up their screen with action (e.g., smoke in CS). Having a constant high frame rate was necessary. And there was a school of thought that used the whole 120 fps to get maximum velocity. Also, it was assumed that a lower resolution would increase the hitbox.
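The frame-rate/velocity point is a real quirk: in Quake-engine games, player physics is integrated per frame and velocity gets snapped to integers each step, so how high or fast you go depends on your FPS. A minimal sketch of that effect, with constants loosely based on Quake III's defaults (270 up-velocity, 800 gravity); the real Pmove code is more involved, this just demonstrates frame-rate-dependent integration with integer snapping:

```python
def jump_apex(fps, v0=270.0, gravity=800.0):
    """Toy per-frame jump integration where velocity is snapped to an
    integer every frame (Quake-style rounding). The apex height you
    reach ends up depending on the frame rate."""
    dt = 1.0 / fps
    v, pos, apex = v0, 0.0, 0.0
    while v > 0:
        v = float(int(v - gravity * dt))  # apply gravity, snap velocity to int
        pos += v * dt
        apex = max(apex, pos)
    return apex

# jump_apex(125) and jump_apex(60) disagree by about half a unit,
# purely from where the rounding falls at each frame time.
```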
Games don't push the boundaries of consoles anymore.
They do.
GT5, a vision that is actually beyond the PS3's capabilities. Uncharted 2 uses pretty much every last resource the PS3 has.
If an empty memory slot is of no benefit, then the whole console industry should and will die! Die! Die! Die!
Hardware is powerful enough and software is complex enough that there is no longer enough benefit from uniform hardware to justify its limitations. I think we'll start seeing consoles become more like computers, with various tiers of performance and regular performance improvements. This will mean that companies won't have to take huge losses when a new console is released, nor will they have to waste potential by holding back the performance of later hardware for the sake of compatibility.
There will be a new Xbox next year and every year after that, about October. Merry Christmas.
The new Xboxes actually have a somewhat faster processor than the original Xboxes, and they're purposely underclocked to keep things uniform.
Wait... wasn't the original Xbox equipped with a PIII clocked at around 700 MHz? I'm pretty sure the 360 is a triple-core processor at around 3 GHz, and that's a lot more than just somewhat faster.
I mean unless I'm misunderstanding what you're saying...?