Next Generation Console Hardware Update

All modern x86 CPUs are CISC-based. They have a RISC-to-CISC converter built into them.
Other way round, I think. They're internally RISC-based, with a CISC->RISC μop decoder.

Kind of ironic that CISC ultimately came out on top, and meanwhile everyone - even Intel - admits that RISC is a better way to build a CPU...
 
The CISC vs. RISC argument is old and dated. All modern x86 CPUs are CISC-based. They have a RISC-to-CISC converter built into them.

As powerful as those IBM PowerPC processors seem, just remember a few fun facts about them.

#1 They're IN-ORDER, not OUT-OF-ORDER like modern CPUs. Intel has been using out-of-order execution since the Pentium Pro, which makes the console CPUs very slow in comparison.

#2 Apple dumped PowerPC in favor of Intel, which has been proven for many years to be much faster. Benchmarks have consistently shown IBM's PowerPC chips to be vastly slower than Intel's.

#3 Many developers have expressed hatred for these console CPUs. Due to the lack of out-of-order execution, many developers believed their code ran faster on the original Xbox than on the 360.


Another thing to remember is that Microsoft has expressed interest in merging the Xbox platform with their Windows platform. So rather than having two competing platforms, you'll just have one universal platform. In other words, Microsoft's next-generation console will likely have its games playable on PC as well as console. Whether that's a good thing for PC gamers remains to be seen.

Thank you for reminding me; I forgot that processors have a CISC-to-RISC decoder in them now, and you're right, the argument is dated.

I was thinking more from a programming point of view, though, because I know that OS X had to be changed to run on Intel CPUs.

Similarly, could avoiding having to port DX10 or DX11 to a PPC/RISC-based CPU be the reason they opted to return to an x86-based console?

In my opinion, it seems likely. It saves time, money, and resources not having to modify or reprogram DX10/DX11. Keep everything at 1:1 parity like I mentioned above, just with slight modifications, since there isn't a full-blown Windows OS running on the console.

Same thing with Windows 8 for ARM: it has to be reprogrammed to run on that CPU.

RISC or CISC, the CPU in the next generation will play a much more minor role. The shaders and GPU compute are way more important with APIs like OpenCL, DirectCompute, DirectX, OpenGL, etc. Beef up the shaders as much as possible while keeping the total power draw as low as possible; I'd say 150 W max as a target, a little less than the current X360. 100 W would be even better.
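To give a sense of what that GPU-compute work looks like, here's a minimal OpenCL C kernel; it's the stock SAXPY example, nothing console-specific, just an illustration of the data-parallel model that OpenCL/DirectCompute expose:

    /* One work-item per element; thousands run in parallel across the
     * GPU's shader cores. Stock SAXPY, purely illustrative. */
    __kernel void saxpy(__global float *y,
                        __global const float *x,
                        const float a)
    {
        size_t i = get_global_id(0);  /* unique index of this work-item */
        y[i] = a * x[i] + y[i];       /* each work-item updates one element */
    }

That's why shader count matters so much more than CPU grunt for this kind of work: the same tiny program runs over every element at once.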

As for memory, if you want everything to be fast with an APU, you'd better have very fast memory. In this case I would expect a custom memory controller for GDDR5, with 2-4 GB. Now, if this fast pool of memory has access to something like a 16-32 GB SSD, then game loading, level loading, and caching the game so the hard drive can spin down may make the machine draw less power, run cooler, and also run much quieter in the long run. I don't see a need for 8 GB of memory with an SSD.

I really don't think Microsoft will accept losing money on each unit sold this time around, especially if they don't have to. An APU streamlines and simplifies many aspects: no separate CPU/GPU, no separate memory controller, and a more compact layout that makes the overall design smaller and cheaper to build. Upgrades are cheaper to do as well; a new smaller-process APU shrinks the highest-cost part, which makes a big overall impact later in the console generation's lifetime.

Anyway, using an APU looks to be very smart on Microsoft's part, and the fact that it supports their other platforms better than the previous generation did is even better.

Given that this might be a Bulldozer-based APU, and knowing they'll modify it so much that it won't even look like the desktop PC version, they'll definitely have to change the integrated memory controller to handle faster RAM.

If I remember right from one of the Llano APU reviews, the memory bandwidth for the integrated GPU was sorely "gimped." They'll most likely "un-gimp" it for use in the next console.

Been there, done that, after Dungeon Siege and D&D Daggerdale turned out to be crappy Xbox ports, both without the ability to remap the keys. I am not sure I want to encourage more crappy Xbox ports.

Running Xbox 360 inside Win 8 sounds kinda weird... won't some sort of virtual box be required, even if it's hidden from the user? Yeah, it would be a nifty and well-used achievement. Given how the OS for the 360 seems to work, it would almost have to require virtualization to pull off. I wonder if XP Mode is some sort of test product along those lines.

I suggested in another thread here on [H] that it's possible XP Mode was sort of a test of, or hint at, the direction they wanted to take Windows 8.

The whole argument in that thread was that no computer hardware can currently emulate the 360. I suggested that a virtual machine might be a better option, since you could just run it on a system with an AMD-V or Intel VT-x capable CPU. You would probably need a lot of RAM to do it, since the virtual hardware would live in memory the way VMware does it with Workstation and similar products.

But can Microsoft really do that? Do they even have the capability to program an entire virtual 360 machine for Windows 8 at the level of performance VMware achieves?

Time will tell, since all we have right now is a vague hint as to what the 360 has to do with Windows 8.
 
AMD Fusion makes perfect sense for a new console. They only have to pay one license fee, it's cheaper to manufacture, and it's a single platform designed to work efficiently, instead of bringing an IBM CPU and an AMD GPU together and hoping they work in harmony, etc.
 
The whole argument in that thread was that no computer hardware can currently emulate the 360. I suggested that a virtual machine might be a better option, since you could just run it on a system with an AMD-V or Intel VT-x capable CPU.
...
But can Microsoft really do that? Do they even have the capability to program an entire virtual 360 machine for Windows 8 at the level of performance VMware achieves?
You know it's not an x86 CPU, right?
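That's the catch: AMD-V/VT-x only accelerate guests that share the host's instruction set. The 360's Xenon is PowerPC, so a "virtual 360" on x86 would really be an emulator doing interpretation or binary translation, conceptually something like the toy loop below. The opcodes and encoding here are completely made up for illustration, nothing like the real Xenon ISA:

    #include <stdint.h>

    /* Toy fetch-decode-execute loop. Every guest instruction costs many
     * host x86 instructions, which is why cross-ISA emulation is slow
     * and why VT-x/AMD-V don't help here. Encoding is invented. */
    enum { OP_ADD, OP_LOAD, OP_BRANCH, OP_HALT };

    typedef struct {
        uint64_t  regs[32];  /* guest general-purpose registers */
        uint64_t  pc;        /* guest program counter (word index) */
        uint32_t *mem;       /* guest memory, word-addressed for simplicity */
    } Guest;

    void run(Guest *g)
    {
        for (;;) {
            uint32_t insn = g->mem[g->pc++];   /* fetch */
            uint32_t op = insn >> 26;          /* decode (fake encoding) */
            uint32_t rd = (insn >> 21) & 31;
            uint32_t ra = (insn >> 16) & 31;
            switch (op) {                      /* execute */
            case OP_ADD:    g->regs[rd] += g->regs[ra];        break;
            case OP_LOAD:   g->regs[rd] = g->mem[g->regs[ra]]; break;
            case OP_BRANCH: g->pc = g->regs[ra];               break;
            case OP_HALT:   return;
            }
        }
    }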
 
Would you stop posting this garbage, Hardocp?

Let's see: just one week ago you were claiming Cell was going to be in the next Xbox. Anybody who follows console hardware closely knows how friggen ridiculous that suggestion was, completely fabricated, and what do you know, this week it's no longer Cell in the next Xbox! What a huge shock!

I'm pretty sure your "source" is some 13-year-old kid from NeoGAF emailing you from his mom's basement. Just stop, Steve/Hardocp. You're embarrassing yourselves.

Fusion won't be NEAR powerful enough to power the next-gen Xbox, especially when AMD recently went on record stating the next Xbox will have Avatar-quality graphics. How many SPs does the highest-end Fusion have? According to the wiki, currently it's 400. That's basically an IGP. You heard it here first, guys: the next Xbox is going to be powered by an IGP.

It's funny how all these rumors try to make the next Xbox out to be some weakling system. Sorry, Hardocp, no matter how much you want that, it's not going to happen. It's going to be a discrete IBM CPU and a discrete, very powerful AMD GPU.

Hell, if nothing else, MS won't deviate from their current hardware manufacturers, so that they can keep backward compatibility in software. That rules out any AMD CPU.
 
Hell, if nothing else, MS won't deviate from their current hardware manufacturers, so that they can keep backward compatibility in software. That rules out any AMD CPU.

It's not entirely impossible that they could include an Xbox 360 CPU/GPU package onboard, like Sony did with the PS2 hardware built into original PS3s.

From what I understand, the 360 CPU/GPU is on one chip in the latest slim models. Actually, they might only need the CPU part if the new GPU is from AMD; compatibility would still be pretty good.
 
AMD Fusion makes perfect sense for a new console. They only have to pay one license fee, cheaper to manufacture, it's a single platform that is designed to work efficiently instead of bringing an IBM CPU and AMD GPU together to work in harmony etc etc

It makes no sense at all. None.

-I have read that AMD's x86 license is such that they're not even allowed to be in consoles. More specifically, Microsoft needs to own the IP of its console's CPU and GPU for manufacturing purposes. This is actually why the original Xbox was such a big financial failure: MS didn't own that console's chip IP, Intel and Nvidia did, so they basically had MS by the balls; MS had no choice but to buy the CPU from Intel and the GPU from Nvidia. That was where the whole arbitration fight with Nvidia came from. With the Xbox 360 and IBM/AMD, MS now owns that IP; they can fab the chips wherever they want, and costs are much lower. But I have heard AMD's x86 license does not legally allow them to sign over the IP the way Microsoft needs, so you likely won't see AMD or Intel in any console pretty much ever again, from what I understand. That's reason one this can't happen. IBM is such a big player in consoles for good reason. We already know the Nintendo Wii U, for example, has an IBM CPU.

-Reason 2, backwards compatibility. Microsoft does this in software, not hardware, so there's pretty much no way they can switch to another brand of CPU; it would make BC a nightmare. And unlike with Sony, BC is pretty important to MS (they actually force all games to be programmed against DirectX just so BC will be easier in the future, unlike Sony, which allows lower-level programming). This is a big factor suggesting an IBM CPU.

-Reason 3, simply not powerful enough. The simple fact is you can't fab a CPU+GPU together that's anywhere near as powerful as the two separately. I suspect that's why Hardocp is pimping this rumor. IF the next Xbox uses a Fusion, it will by necessity be pretty weak, which also doesn't jibe with AMD's recent statements about Avatar.

Again, why are you all discussing this rumor like it's true when ONE WEEK AGO Hardocp told you the next-gen Xbox would have a Cell CPU? (There aren't enough LOLs for that one; the next PlayStation won't even have a Cell CPU. It's a dead technology that hasn't had active development in years, never mind that Sony owns it, never mind that programmers hate it.) If you trust a source that presented such a ludicrous rumor just to go back on it a week later, well...
 
It's not entirely impossible that they could include an Xbox 360 CPU/GPU package onboard, like Sony did with the PS2 hardware built into original PS3s.

Yeah, and Sony removed that PS2 hardware from the PS3 very quickly for a reason: it's expensive. That's a non-starter. Xbox 360 hardware would be relatively more expensive inside a next-gen console than the PS2 hardware was inside the PS3, too; every gen the hardware gets more advanced and expensive. Heck, the 360 itself is still $199/$299, still puts out a lot of heat, and still uses a big enclosure. By this time last gen, the tiny PS2 Slim was already out. There's simply no way you're getting X360 hardware in the Xbox 720, for cost reasons or for heat reasons.
 
Thank you for reminding me; I forgot that processors have a CISC-to-RISC decoder in them now, and you're right, the argument is dated.

To clarify this whole point: all current x86 CPUs are RISC cores of varying architectures, fronted by a CISC μop decoder for the x86 instruction set. The thing is, it's much easier to optimize for a small set of simple instructions than for the truckloads of instructions CISC provides. RISC came out on top, even if that isn't clearly visible.
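To make that concrete, here's roughly what the front end does with a single CISC instruction. This is expressed as C steps purely for illustration; real μop encodings are undocumented and vary from core to core:

    /* Conceptually, one CISC read-modify-write instruction:
     *     add [rdi], eax    ; memory operand
     * gets cracked into three simple RISC-style micro-ops that the
     * out-of-order core can schedule independently. */
    void rmw_add(int *rdi, int eax)
    {
        int tmp = *rdi;  /* uop 1: load from memory        */
        tmp += eax;      /* uop 2: ALU add, registers only */
        *rdi = tmp;      /* uop 3: store back to memory    */
    }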

Similarly, could avoiding having to port DX10 or DX11 to a PPC/RISC-based CPU be the reason they opted to return to an x86-based console?
No MSFT console uses DirectX in any form or shape, aside from a DX-inspired API. There'd be no point in using a cross-platform library on a console with a fixed configuration; ergo, you strip away all the cruft and use the hardware as directly as possible. Ergo, no DX :)
 
I've been saying for the last 6 months that I thought one of the big 2 was going to use a completely AMD-based setup. I do not think it's going to be Fusion-only, however. It might be a Fusion-based APU in a hybrid-type setup where the graphics card (a 6XXX variant) only powers games, to cut back on heat/noise for media. I believe this for a few reasons:

  • AMD has the technology to do it.
  • It makes sense for Microsoft, with the 720 being even more of a media hub than the 360.
  • It would not be overly complicated to implement, and in the future it would be simple to progress to a Fusion APU alone, without a discrete GPU, once the technology allows in another 5-7 years.
 
Do we really need backwards compatibility? I mean, I already have a 360...

I never played any of my Xbox games on the 360. It got packed away the day the 360 arrived. Still where I left it over 5 years ago.

As for folks deriding a custom IGP, may I remind you that the PS3 is still doing quite well with an antique custom 7800 GT. If anything, the PS3 has shown us that it's not about the GPU so much as what you can do in software on the CPU side.

Maybe MS is deciding to go more CPU-heavy this time rather than GPU-heavy.
 
AMD's integrated graphics could drive 1920x1080, but those are going to be some pretty poor-looking graphics. Maybe the next generation of APUs will fare much better, and that's what will go into the Xbox 3.

The next console will need to do 1080p natively @ 30 fps for me to consider it.
Also, it will need more than 4 GB of RAM, hopefully 6 GB. If they think 2 GB of RAM is next-gen, PC gamers are in for another long 5-6 years of sub-par games.

6 or 8 GB of GDDR5 is going to be really fucking expensive (and if the console makers use anything else, including XDR, they're nuts), while plain DDR3 won't have enough bandwidth and will use far too many chips. If you think it's a good idea, YOU try to sell the big three on the benefits of a complex PCB with 24(!) memory chips on it and all the traces routed to a less-than-2-sq.-in. area, and then on applying die shrinks and cost reductions to that design over the next 5-10 years.

2 GB is far more realistic for a unified architecture, maybe 4 GB if density goes up and prices come down in time.
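Some back-of-envelope numbers behind that, if anyone wants to check the math; the bus widths and data rates below are generic 2011-era figures, not anything leaked:

    #include <stdio.h>

    /* Peak bandwidth = (bus width in bytes) x (effective rate in MT/s).
     * All figures are generic illustrative parts, not console specs. */
    int main(void)
    {
        double ddr3  = (128 / 8) * 1600.0;  /* 128-bit bus, DDR3-1600    */
        double gddr5 = (128 / 8) * 4000.0;  /* 128-bit bus, 4 Gbps GDDR5 */
        printf("DDR3-1600,   128-bit: %5.1f GB/s\n", ddr3 / 1000.0);
        printf("GDDR5 4Gbps, 128-bit: %5.1f GB/s\n", gddr5 / 1000.0);

        /* Capacity side: 8 GB built from 2 Gbit (256 MB) chips = 32 chips,
         * which is where the crowded-PCB objection comes from. */
        printf("chips for 8 GB at 2 Gbit each: %d\n", (8 * 1024) / 256);
        return 0;
    }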
 
Do we really need backwards compatibility? I mean, I already have a 360...

I never played any of my Xbox games on the 360. It got packed away the day the 360 arrived. Still where I left it over 5 years ago.

As for folks deriding a custom IGP, may I remind you that the PS3 is still doing quite well with an antique custom 7800 GT. If anything, the PS3 has shown us that it's not about the GPU so much as what you can do in software on the CPU side.

Maybe MS is deciding to go more CPU-heavy this time rather than GPU-heavy.

I don't care about BC, and I totally agree with your reasoning. However, Microsoft does, and that is why they will not use an AMD CPU and will not use AMD Fusion.

Also, about the PS3: completely wrong. It uses a 7800 GTX, and that was definitely a high-end card in 2006 when the PS3 released, about as far from an IGP as you could get. So you just proved yourself even more wrong with your own example.

The correct example would be if the PS3 had used a 2005 IGP, in which case, of course, it would have been discontinued long ago and the Xbox would own the whole market.

I would like to see how great the PS3 would be doing "in software using the CPU side" if it had a 2005 IGP (1/20 the power of the RSX; it probably couldn't run Quake 3) instead of the RSX. Hint: not well.

But if you like that philosophy, cool, we'll use it for the PlayStation 4. A low-powered IGP in the PS4 it is; you said yourself that works fine, so no argument from you. For the Xbox 720, though, we'll use an extremely powerful AMD GPU, if it's all the same to you. The PS4 will still win because of all that CPU software, no doubt.
 
6 or 8 GB of GDDR5 is going to be really fucking expensive (and if the console makers use anything else, including XDR, they're nuts), while plain DDR3 won't have enough bandwidth and will use far too many chips. If you think it's a good idea, YOU try to sell the big three on the benefits of a complex PCB with 24(!) memory chips on it and all the traces routed to a less-than-2-sq.-in. area, and then on applying die shrinks and cost reductions to that design over the next 5-10 years.

2 GB is far more realistic for a unified architecture, maybe 4 GB if density goes up and prices come down in time.

I think the PS4 will use 2 GB because Sony is a little broke and can't afford much, so I agree with you there. It's just too expensive for Sony to put more than 2 GB in the PS4.

Microsoft is doing better financially, so probably 6-8 GB.
 
The CISC vs. RISC argument is old and dated. All modern x86 CPUs are CISC-based. They have a RISC-to-CISC converter built into them.

As powerful as those IBM PowerPC processors seem, just remember a few fun facts about them.

#1 They're IN-ORDER, not OUT-OF-ORDER like modern CPUs. Intel has been using out-of-order execution since the Pentium Pro, which makes the console CPUs very slow in comparison.

#2 Apple dumped PowerPC in favor of Intel, which has been proven for many years to be much faster. Benchmarks have consistently shown IBM's PowerPC chips to be vastly slower than Intel's.

#3 Many developers have expressed hatred for these console CPUs. Due to the lack of out-of-order execution, many developers believed their code ran faster on the original Xbox than on the 360.


Another thing to remember is that Microsoft has expressed interest in merging the Xbox platform with their Windows platform. So rather than having two competing platforms, you'll just have one universal platform. In other words, Microsoft's next-generation console will likely have its games playable on PC as well as console. Whether that's a good thing for PC gamers remains to be seen.

The first line is totally wrong. RISC won; every proc is CISC-to-RISC these days.
 
Also, about the PS3: completely wrong. It uses a 7800 GTX, and that was definitely a high-end card in 2006 when the PS3 released, about as far from an IGP as you could get. So you just proved yourself even more wrong with your own example.

The correct example would be if the PS3 had used a 2005 IGP, in which case, of course, it would have been discontinued long ago and the Xbox would own the whole market.

I would like to see how great the PS3 would be doing "in software using the CPU side" if it had a 2005 IGP (1/20 the power of the RSX; it probably couldn't run Quake 3) instead of the RSX. Hint: not well.

But if you like that philosophy, cool, we'll use it for the PlayStation 4. A low-powered IGP in the PS4 it is; you said yourself that works fine, so no argument from you. For the Xbox 720, though, we'll use an extremely powerful AMD GPU, if it's all the same to you. The PS4 will still win because of all that CPU software, no doubt.

In your rush to point out that I forgot the X in GTX, you kinda missed my point entirely, but it doesn't matter. Whether it was really a GT or a GTX, or a derivative of the 7800 family, is neither here nor there.
 
Just in case, my point was:

If the PS3 can still churn out interesting and reasonable graphics effects today using a 5-6-year-old GPU (supplemented by the Cell/software) that none of us would be seen dead with, do we really need to put the equivalent of a GTX 580 in this coming generation?

No, I don't think we do. It only has to render up to 1080p, after all.

A custom 6750/GT 440 is as high as they need to go. The rest can be bolstered by CPU power.

Plus, some folks are forgetting this all has to go in a box with an MSRP of $400 or so.
 
In your rush to point out that I forgot the X in GTX, you kinda missed my point entirely, but it doesn't matter. Whether it was really a GT or a GTX, or a derivative of the 7800 family, is neither here nor there.

No, I got your point; it's totally misguided.

You're saying, "Look, the GPU in the PS3 is crap, so it's okay for a next-gen console to have a crap GPU; it'll still be better."

What you're totally ignoring is that the GPU in the PS3 specifically wasn't crap (and not even close to an IGP) in 2006 when it came out.

These consoles may have to last until 2020 or beyond. What's below average in 2011 will be a joke then.
 
No, I got your point; it's totally misguided.

You're saying, "Look, the GPU in the PS3 is crap, so it's okay for a next-gen console to have a crap GPU; it'll still be better."

What you're totally ignoring is that the GPU in the PS3 specifically wasn't crap (and not even close to an IGP) in 2006 when it came out.

These consoles may have to last until 2020 or beyond. What's below average in 2011 will be a joke then.

Noo no noooooo, where did I say it was crap? Where? I said it was an antique, which it is now. I said folks on this forum wouldn't use it today, which is true.

All I'm saying is that if 5-6-year-old tech is still holding up today, and the bar for the top performance level (1080p) hasn't really increased, then chances are we don't need to put a top-end card in the next generation. Chances are today's mid-range will hold up for 4-5 years just as well.

If anyone thinks a 580/6970-spec chip will be in the next console, then good luck to them.
 
I think the PS4 will use 2 GB because Sony is a little broke and can't afford much, so I agree with you there. It's just too expensive for Sony to put more than 2 GB in the PS4.

Microsoft is doing better financially, so probably 6-8 GB.

I doubt that; look at the hardware in the Vita. Sony is not about compromises when it comes to hardware. If anything, they learned from the overly complicated mistakes of the PlayStation 3. Sony's next console will be on par with the next Microsoft console and probably just as easy to program for.
 
As long as the experience is enjoyable, it doesn't matter what FPS it runs at or how many GB of RAM it has. There's no way they would need 6 GB of RAM in a console with all the optimization those things go through. It's not like they have to hold Windows 7 Ultimate with all the service packs, plus all your internet junk files, patches, etc. They don't run multiple applications at once. They don't multi-box MMOs.

Honestly, for what a console does, 2 GB is probably plenty, with 4 GB being overkill, and they don't tend to spend money on things they don't need when they're taking losses anyway. Watch it have either 2 or 3 GB of RAM and be just fine with that. Hell, anything over 4 GB in a desktop is mostly wasted even today!

When I play my 360, I don't sit there trying to analyze how many FPS it's running at. I play on a 55" 1080p HDTV. If it weren't smooth I would be complaining, but I'm not. I really don't care if it's 10 FPS.

Devs are asking for 8 GB in the next consoles. Why? Because they want to make better games.
The ignorance of console gamers isn't a factor here; they'll buy whatever MS and Sony are selling.
For the sake of the gaming industry, consoles have to move into the realm of PC gaming.

Don't even count on that first one. It's going to be 720p for the vast majority of games. Most people don't have a TV big enough to distinguish 720p from 1080p at 5 feet, much less 8-10 feet.

I'm predicting no less than 8 GB, based simply on trends from previous consoles and what devs have said this gen about the lack of RAM, not to mention Crytek calling for 8 GB.

I'll honestly just laugh if it has 2 GB. By the time these things are through, PCs will be topping 64-128 GB and phones will be at 8-16 GB, if not more. If these things aren't super powerful, phones will overtake them before the generation is out. They were pretty powerful this gen, and phones will probably match them just before or after the new ones come out. So unless they want to price themselves out of the market, they'd better blow our minds with these damn machines.

Crytek did say 8GB is needed to develop the games they want in the future. Epic also wants a beefy console. I agree with them.

720p may be accepted, but 1080p TVs are the standard in today's market. 720p native would be another step back in gaming.
If it weren't for BF3, this year would have been a sub-par year for PC gaming. That will only continue if consoles remain the lead platform with low-end hardware.
 
I game on a PC every day with 4 GB of memory and never hit the ceiling, or even come close, really. I don't play today's cutting-edge games; I play BFBC2 occasionally, and Crysis Warhead/Wars at 1080p with 4-8x AA and everything maxed out, no problem. Console games aren't going to go from using 512 MB of RAM to 4+ GB in one generation, either. Hell, name 5 PC games that use over 4 GB of RAM at 1080p at any settings. We all know console games will be stripped down and optimized compared to their PC counterparts.

They probably said it would be nice if the development team were working with a dev kit with 8 GB of RAM, but I seriously doubt you will see anything like that in the next generation of consoles sold to consumers.
 
Because IBM Cell processors kick major ass. The closest thing that comes to it will be the Fusion line from AMD, provided that they code correctly for it.

The Cell processor was a piece of junk. John Carmack hates it, and Gabe Newell can't stop talking trash about it. The Fusion processor is nothing like the Cell.

Fusion = x86 cores + graphics core
Cell processor = PowerPC core + 8 SPE cores

I believe one of those SPE cores is even disabled. Most developers don't even use the SPE cores, so for the most part the Cell chip is nothing more than a crippled PowerPC.

Fusion won't be NEAR powerful enough to power the next-gen Xbox, especially when AMD recently went on record stating the next Xbox will have Avatar-quality graphics. How many SPs does the highest-end Fusion have? According to the wiki, currently it's 400. That's basically an IGP. You heard it here first, guys: the next Xbox is going to be powered by an IGP.
Technically, all current console systems are powered by an IGP. Modern Xbox 360s combine the Xenos and Xenon chips into a single package. Plus, the graphics chip in the Xbox 360 has to share memory with the Xenon CPU. The Xenos graphics chip is considerably behind even the $50 graphics cards you can buy for your PC.

Going Fusion is doing what the Xbox 360 already does.
It's funny how all these rumors try to make the next Xbox out to be some weakling system.
Technically, a Fusion would make the next Xbox a powerhouse. Even back when the Xbox 360 was first released, it wasn't comparable to PCs in performance. Considering the Xbox 360's graphics are stuck at DX9 and it has a crippled PowerPC processor, a Fusion would bring the Xbox up to speed in performance.

-Reason 2, backwards compatibility. Microsoft does this in software, not hardware, so there's pretty much no way they can switch to another brand of CPU; it would make BC a nightmare. And unlike with Sony, BC is pretty important to MS (they actually force all games to be programmed against DirectX just so BC will be easier in the future, unlike Sony, which allows lower-level programming). This is a big factor suggesting an IBM CPU.
Ever heard of emulators? If it's done in software, then emulation is going to be easy, given a Fusion chip powerful enough to emulate the 360 hardware, which it most likely will be. BTW, direct-to-metal programming is already being done on the 360.
-Reason 3, simply not powerful enough. The simple fact is you can't fab a CPU+GPU together that's anywhere near as powerful as the two separately. I suspect that's why Hardocp is pimping this rumor. IF the next Xbox uses a Fusion, it will by necessity be pretty weak, which also doesn't jibe with AMD's recent statements about Avatar.
Modern Xbox 360s already have the CPU+GPU together on a single chip, and having the GPU sitting right next to the CPU will also increase performance. Also, why do so many people believe that the hardware in current consoles is anywhere near comparable to Fusion, let alone surpassing it? What evidence has shown you that a Fusion won't utterly destroy what's in modern consoles?
 
I very seriously doubt they will have 8 GB in them. There are very few PC exclusives that actually use that amount, if any. Cost is another concern: do you expect them to use 8 GB of DDR3, much less GDDR5?

You guys are dreaming. Home consoles are about all the components supplementing each other while meeting a certain price point. 3 GB GTX 580s in tri-SLI are for insane resolutions, AA, AF, detail settings, etc., and y'all want 8 GB for 1920x1080 with a video card that won't be even remotely that powerful?

How much system memory or video memory do you honestly need at 1920x1080 with a 68XX series video card?
 
We all know console games will be stripped down and optimized compared to their PC counterparts.
It's funny how the games on PC end up that way also.

You guys are dreaming,
How much system memory or video memory do you honestly need at 1920x1080 with a 68XX series video card?

Devs asked for 8 GB; I asked for 4 GB. I guess the devs are dreaming, but they are the ones making the games.
 
Devs asked for 8 GB; I asked for 4 GB. I guess the devs are dreaming, but they are the ones making the games.

Links? I'd be very interested to hear these developers' reasoning for wanting such asinine amounts of RAM.
 
It wouldn't surprise me if they had a discrete GPU plus Fusion. Figure a Bulldozer-based APU in 2012-2013 could have 800 Southern Islands SPs (especially a console-specific one). That's much faster than the current GPUs in consoles; then CrossFire in another 800 SPs or more. You could have low-power, decent graphics for web and movies, or high-power, kick-ass graphics for demanding games.

Then, in a later hardware revision, you could probably go to 1600 SPs on the APU and drop the discrete card.

I could see going from 512 MB of RAM to about 2 GB of GDDR5, I suppose.

One last thing: at 1080p we're not looking at much AA, and you don't need crazy amounts more hardware than the current gen unless you think graphics are going to get a lot better. I find that hard to believe, since game cost is a function of game artists' time; basically, it's going to be too expensive to make games that look significantly better. That's part of the stagnation in graphics (besides console-itis).
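For scale on the 1080p point, the raw framebuffer math is pretty small; the figures below are rough and ignore the compression and tiling tricks real GPUs use:

    #include <stdio.h>

    /* Ballpark framebuffer footprint at 1920x1080: 32-bit color plus
     * 32-bit depth per sample. Illustrative only. */
    int main(void)
    {
        double pixels = 1920.0 * 1080.0;
        double mb     = 1024.0 * 1024.0;
        double color  = pixels * 4 / mb;  /* one 32-bit color buffer, MB */
        double depth  = pixels * 4 / mb;  /* one 32-bit depth buffer, MB */
        printf("1080p color+depth, no AA: %.1f MB\n", color + depth);
        printf("with 4x MSAA:             %.1f MB\n", 4 * (color + depth));
        return 0;
    }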
 
Yeah, 2 GB max. Look at the success they had with 512 MB; 2 GB is 4x as much. Easily enough for any next-gen console game, and arguably enough for most of today's PC games, with a few exceptions.
 
You have to remember that consoles also don't have the operating-system overhead a typical PC has. They are much more focused on one task: running the program.
 
If this indeed turns out to be true, it's going to be a HUGE win for console and PC gamers. Since we get mostly console ports these days, having a leading console use the same architecture PCs use will be big, in that it will likely increase the quality of the ports as well as the quality of the content we get. It will also make it insanely easier for developers to put out quality PC versions while adhering to strict deadlines.

It's also a big win for developers and MS. Anything that makes development easier cuts costs, and that lures the suits like flies to honey.
 
So will games be on Blu-ray on the new Xbox?

Better question yet: will we get Blu-ray movie playback on the new Xbox?

Of course, most folks will already have BD players and our old PS3s, so it won't matter much.
 
Better question yet: will we get Blu-ray movie playback on the new Xbox?

Of course, most folks will already have BD players and our old PS3s, so it won't matter much.

I say they might bring back HD DVD tech for the game discs. After all, it's sitting there, paid for, and they don't have to pay Sony. :D

I don't care for having a BD player in the new machine. I would say this issue is probably causing MS the most concern.
 
You have to remember that consoles also don't have the operating-system overhead a typical PC has. They are much more focused on one task: running the program.

Yeah, 'cause when friends log onto their Xbox 360, it doesn't pop up on my screen. Also, the Xbox 360 can play music while playing a game. Yeah, extremely focused. :rolleyes:

It's getting to the point where I expect to enter a username and password to log onto my Xbox.
 
I say they might bring back HD DVD tech for the game discs. After all, it's sitting there, paid for, and they don't have to pay Sony. :D

I don't care for having a BD player in the new machine. I would say this issue is probably causing MS the most concern.

I believe that is a distinct possibility. HD DVD-30 dual-layer discs offer far more space than 99.9% of games will require; it will probably be mostly JRPGs that might need more. Even better is the fact that the format is dead, which puts a huge kink in the plans of potential software pirates who aren't as hardcore. Since the format is dead, Microsoft could probably license it for a bargain, and maybe offer BD playback through an add-on you'd have to buy, like you had to do with the original Xbox.
 
Yeah, 'cause when friends log onto their Xbox 360, it doesn't pop up on my screen. Also, the Xbox 360 can play music while playing a game. Yeah, extremely focused. :rolleyes:

It's getting to the point where I expect to enter a username and password to log onto my Xbox.

I'll trust that you realize there is far, far more going on in a desktop PC environment than in a gaming console.
 
I say they might bring back HD DVD tech for the game discs. After all, it's sitting there, paid for, and they don't have to pay Sony. :D

I don't care for having a BD player in the new machine. I would say this issue is probably causing MS the most concern.

This makes sense, and HD DVD would be a good step up from dual-layer DVD.
 
Links? I'd be very interested to hear these developers' reasoning for wanting such asinine amounts of RAM.

Crytek wants 8GB
http://www.neoseeker.com/news/16241-crytek-wants-minimum-8gb-memory-next-console-generation/
http://www.google.com/webhp?rlz=1C1....&fp=e90cd4f8d77db75c&ion=1&biw=1920&bih=1113

Also, Epic Games' Good Samaritan demo was run on a PC that would put mine to shame.
One could say that demo can be maxed out on 2 GB of RAM, but it would be drastically scaled down if turned into a real game.
 