Nintendo's next console "Project Cafe" to be powered by AMD R700 GPU

Consoles are not going to move to a yearly upgrade cycle, at least not for hardware; it makes no sense in their business strategy. A console sells you a subsidized, decently powerful machine for a year, then for five more years they sell you outdated junk. Because it is all proprietary, you buy it for the exclusive games, and because in the beginning you underpaid for it. If they updated every year, they would need to charge more to make money on the consoles, and then how would they be any different from a PC?
 
This would be a tricky comparison to make, because we still have to see what Nintendo's and developers' design philosophies turn out to be. While the X1950 XT-class GPU in the Xbox 360 is noticeably less powerful in raw terms than a 4xxx-series GPU (not necessarily a 4870), the 360 renders games at under 720p. If the 4870 has to render at 1080p, often triple the pixel count, that's a lot harder on any GPU.

The Xbox 360 GPU, as implemented, does not even match the power of a Radeon X1800XL, let alone anything newer.

ALL graphics-intensive games are rendered at sub-720p, with maybe 2x AA.

COD gives you 614,400 pixels at 30fps, 2x AA, and unknown aniso (1024*600).

The X1800XL gives you 786,432 pixels at 32fps, 2x AA, 4x aniso (1024*768).

So the 360 renders only 78% of the pixels the X1800 does, and does it slightly slower, despite the X1800 having to deal with Windows, the DirectX layer, and a driver.
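The 78% figure falls straight out of the pixel counts quoted above; a quick check (the helper name is mine, not from the thread):

```python
def pixels(w, h):
    """Total pixels in a w x h render target."""
    return w * h

cod_360 = pixels(1024, 600)   # COD on the 360:        614,400 px
x1800xl = pixels(1024, 768)   # X1800XL benchmark res: 786,432 px

ratio = cod_360 / x1800xl
print(f"{ratio:.0%}")  # → 78%
```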
 
Yes, the post describes Nintendo as having ultimate seniority. How exactly does Nintendo's history of making card games give it seniority in the game console market? Yes, 1977 was their first console, and it came after several other companies, like Sega and Atari, had already made consoles. Their seniority didn't guarantee them future or lasting success in any case: just look at all of Atari's and Sega's current-generation gaming consoles.

I think you missed the point of that post....
 
ALL graphics-intensive games are rendered at sub-720p, with maybe 2x AA.
Umm, thanks for repeating what I said.

COD gives you 614,400 pixels at 30fps, 2x AA, and unknown aniso (1024*600).

The X1800XL gives you 786,432 pixels at 32fps, 2x AA, 4x aniso (1024*768).

So the 360 renders only 78% of the pixels the X1800 does, and does it slightly slower, despite the X1800 having to deal with Windows, the DirectX layer, and a driver.
Thanks again for repeating what I said, only said differently. The point of the post was that the sub-HD resolutions generated by the Xbox 360's GPU are often only one-third or less of the pixels generated when rendering natively at 1080p.

It's pointless to compare equal benchmarks between an R700 GPU and an R600 GPU at the moment, because we don't know whether Nintendo will stick with a sub-720p or 720p-ish native resolution and upscale, or render natively at 1080p.
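For reference, the resolution gap being argued about here, in raw pixel counts:

```python
def pixels(w, h):
    return w * h

p1080  = pixels(1920, 1080)  # 2,073,600 px
p720   = pixels(1280, 720)   #   921,600 px
sub_hd = pixels(1024, 600)   #   614,400 px (a common 360 render target)

print(p1080 / p720)    # 2.25  -- 1080p is 2.25x the pixels of 720p
print(p1080 / sub_hd)  # 3.375 -- and ~3.4x a typical sub-HD target
```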

I think you missed the point of that post....

Ditto. Yes, I ignored part of the earlier post's main point. Just being an old company doesn't mean you have more seniority in a new product line you're a newcomer to. I agree I was wrong about what Nintendo was doing back in the 1800s, and only wish my knowledge of collector card games from the 1800s were more up to date. Alas, I am not cool enough to know some of those overwhelmingly :rolleyes: interesting facts, and Wikipedia and the modern Internet weren't around when I was still young enough to play Nintendo games targeted at the 8-12 year old demographic. Just BBSs.

In my defense, though, I don't think the Nintendo of the 1800s, hand-making collectors' cards, has anything to do with the modern-day incarnation other than in name. I wouldn't be surprised if Henry Ford's grandfather ran a printing press, but I wouldn't associate Ford with making books... apart from car manuals... parts-ordering books... bah, never mind.
 
the Xbox 360 GPU as implimented does not even match the power of a Radeon X1800XL let alone anything newer...........

ALL graphic intensive games are rendered at sub 720p and maybe 2x aa

COD gives you 614400 pixels at 30fps 2xaa and unkown ansio (600*1024)

X1800XL gives you 786432 pixels 2Xaa 4x ansio 32fps at (1024*768)

So the 360 only renders 78% of the pixels that the 1800 does and does it slightly slower dispite the 1800 having to deal with windows, dx layer, and a driver.....

I don't understand where people come up with these comparisons between the X360 and PS3 GPUs, but I like your approach because you base it on pixels. The X360 is difficult because at the time there wasn't a GPU like it. The PS3, on the other hand, used a tweaked 7800 GPU with a fancy name (RSX).

Moreover, I think many here don't realize the talent of some of these console developers. Sega's VF5 looked phenomenal, yet it ran on inferior hardware (6800GT/Pentium arcade hardware). We all know that the GPUs we buy each new release will never reach their true potential in their lifetime. That is the PC's curse.


If anything, think of it as Nintendo honoring the memory of one of AMD (ATI)'s previous GPUs by taking it to new heights. :D
 
WBurchnall, Nintendo can still use a lower internal resolution and scale up like the others, since all that requires is exactly what the Xbox 360 and many DVD players already use: a chip designed only for output scaling.
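What such an output-scaler chip does is conceptually simple; here is a minimal software sketch of bilinear upscaling, for illustration only (real scalers use fancier filters in fixed-function hardware, and this operates on a toy grid of brightness values rather than a real framebuffer):

```python
def bilinear_upscale(src, out_w, out_h):
    """Upscale a 2D grid of values to out_w x out_h with bilinear filtering."""
    in_h, in_w = len(src), len(src[0])
    out = []
    for y in range(out_h):
        # Map output row back to a fractional source row.
        fy = y * (in_h - 1) / (out_h - 1) if out_h > 1 else 0.0
        y0 = int(fy)
        y1 = min(y0 + 1, in_h - 1)
        wy = fy - y0
        row = []
        for x in range(out_w):
            fx = x * (in_w - 1) / (out_w - 1) if out_w > 1 else 0.0
            x0 = int(fx)
            x1 = min(x0 + 1, in_w - 1)
            wx = fx - x0
            # Blend the four neighboring source samples.
            top = src[y0][x0] * (1 - wx) + src[y0][x1] * wx
            bot = src[y1][x0] * (1 - wx) + src[y1][x1] * wx
            row.append(top * (1 - wy) + bot * wy)
        out.append(row)
    return out

# A 2x2 "framebuffer" scaled to 3x3: corners pass through, midpoints blend.
print(bilinear_upscale([[0, 10], [10, 20]], 3, 3))
# → [[0.0, 5.0, 10.0], [5.0, 10.0, 15.0], [10.0, 15.0, 20.0]]
```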


And thus my question still stands, for my personal interest, if anyone has hard numbers on the X1950 XT vs. the HD 4850.


Btw, YeuEmMaiMai, I would give the benefit of the doubt that the chip really is something like an X1950 XT, and that the internal settings were chosen so the minimum framerate stays high enough at all times, vsynced. I don't have first-hand experience with it, but I would guess it is in their best interest to keep the framerate smooth that way (meaning it could potentially average a higher non-vsynced number, but at a higher resolution the framerate dips would be unpleasant).
 
Umm, thanks for repeating what I said.


Thanks again for repeating what I said, only said differently. The point of the post was that the sub-HD resolutions generated by the Xbox 360's GPU are often only one-third or less of the pixels generated when rendering natively at 1080p.

It's pointless to compare equal benchmarks between an R700 GPU and an R600 GPU at the moment, because we don't know whether Nintendo will stick with a sub-720p or 720p-ish native resolution and upscale, or render natively at 1080p.

No, it's not pointless to compare, because it demonstrates the relative power of each chip, and a 1900-series card clearly outclasses the 360's GPU. It appears you took a rather inaccurate guess at what the 360's GPU was capable of, and I just corrected it; the HD 4770 will utterly destroy it. Btw, a 1900 XT has roughly twice the rendering power of the 1800 XL I compared to the 360's GPU.



Ditto. Yes, I ignored part of the earlier post's main point. Just being an old company doesn't mean you have more seniority in a new product line you're a newcomer to. I agree I was wrong about what Nintendo was doing back in the 1800s, and only wish my knowledge of collector card games from the 1800s were more up to date. Alas, I am not cool enough to know some of those overwhelmingly :rolleyes: interesting facts, and Wikipedia and the modern Internet weren't around when I was still young enough to play Nintendo games targeted at the 8-12 year old demographic. Just BBSs.

In my defense, though, I don't think the Nintendo of the 1800s, hand-making collectors' cards, has anything to do with the modern-day incarnation other than in name. I wouldn't be surprised if Henry Ford's grandfather ran a printing press, but I wouldn't associate Ford with making books... apart from car manuals... parts-ordering books... bah, never mind.

Actually, it has everything to do with the Nintendo of today, because without that company being around then, Nintendo would not be here now.

BTW, an HD 4770 can easily render Fallout 3-quality graphics at 1920*1200*32 with 8x AA, 16x aniso, and very-high-quality textures at a comfortable 35.6fps. That is 2.30 Mpix vs. 2.07 Mpix for 1080p, meaning 1080p is only about 90% of that, leaving you with even more headroom. I do not see this particular chip having any issues rendering at 720p, or even 1080p with great image quality, especially for Nintendo games, which for the most part ARE directed at younger audiences and casual gamers.
 
In my defense, though, I don't think the Nintendo of the 1800s, hand-making collectors' cards, has anything to do with the modern-day incarnation other than in name. I wouldn't be surprised if Henry Ford's grandfather ran a printing press, but I wouldn't associate Ford with making books... apart from car manuals... parts-ordering books... bah, never mind.

If Henry Ford, sitting in the factory he inherited, had watched the presses until a light bulb went off, closed the factory one summer, retooled it, and opened the first mass-production car assembly line, then yes, it would apply to Ford today.

Sure, making cards doesn't make them the greatest in the line of video game consoles. But they have managed, as a company, for over 100 years to make leisure-time toys that are fun for the whole family. I think that applies.
 
Consoles are not going to move to a yearly upgrade cycle, at least not for hardware; it makes no sense in their business strategy. A console sells you a subsidized, decently powerful machine for a year, then for five more years they sell you outdated junk. Because it is all proprietary, you buy it for the exclusive games, and because in the beginning you underpaid for it. If they updated every year, they would need to charge more to make money on the consoles, and then how would they be any different from a PC?

How about, instead of losses for one year followed by five years of dated junk, they take no losses the first year and then don't sell dated junk the following five? Why doesn't this make any sense?

In the old days, consoles were unique creations that were costly to design, and every last ounce of speed really had to be wrung out by the software. Now, consoles are essentially locked-down generic computers, and they're powerful enough that every detail of the game doesn't need to be hard-coded. It's time for Sony, Nintendo, and MS to treat consoles like tightly controlled computers (like Apple does, except more so).

For example, in the old days games were written in machine language and hard-coded to a specific resolution (and the console makers didn't take big losses, as Sony and MS have done). Now, if MS came out with a new console every year, they could put some performance details in ROM that a new game reads at startup, tweaking itself according to those details. It's similar to a PC game that can be tweaked, except that on the console it would always be automatic and far more efficient (because the hardware is tightly controlled and pre-benchmarked). Someone with an old console would get 720p with some features turned off; someone with a new console would get 1080p with more features enabled.
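A minimal sketch of that idea; the profile fields, tier numbers, and settings here are all made up for illustration, not anything an actual console ships:

```python
# Hypothetical: each console revision ships a small hardware profile in ROM;
# a game reads it at boot and picks its settings automatically.
HW_PROFILE_2011 = {"gpu_tier": 1, "vram_mb": 512}
HW_PROFILE_2013 = {"gpu_tier": 3, "vram_mb": 2048}

def pick_settings(profile):
    """Choose render settings from a pre-benchmarked hardware tier."""
    if profile["gpu_tier"] >= 3:
        return {"resolution": (1920, 1080), "shadows": "high", "aa": 4}
    return {"resolution": (1280, 720), "shadows": "low", "aa": 2}

print(pick_settings(HW_PROFILE_2011)["resolution"])  # (1280, 720)
print(pick_settings(HW_PROFILE_2013)["resolution"])  # (1920, 1080)
```

Because every revision's hardware is fixed and known in advance, the lookup can be exact rather than the heuristic auto-detect PC games rely on.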

This means a company doesn't have to put out a relatively powerful console and take losses at the start just to ensure a very long shelf life, and it means that later in the console's life the makers don't fall far below the value curve. Consider: any of the current consoles, especially the Wii, could be sped up several times over without raising the cost of production.

I feel as if I have to spell this out for you. The original Xbox 360 had a 20GB HDD; the current Xbox 360 has a 250GB HDD, over ten times as big, and it costs MS less than the original did. Essentially the same is true of every other part of the Xbox 360, except that on those parts MS is forced to slow them down for compatibility, or simply buy weak parts that cost as much as much stronger ones.

The big console companies are just stupid if they keep (and continue to extend) the 20th-century business model. Big companies tend to be stupid; they're like elephants, able to squash people but not gracefully steered.
 
The marketing of what you suggest would be a nightmare, Stone Cold. How is the average consumer going to work this out? "OK, let's see, this is a Wii 5 game but I only have a Wii 1, so x, x, x, and x features won't be available to me." It just doesn't make sense from a marketing perspective, and it would create segmentation and confusion.
 
That sounds like paying for a low-end PC every year while losing out on all the other features a computer brings.

It raises the question of why they make consoles in the first place if I can hook a Wiimote up to my PC and play Zelda as a PC title. These companies clearly take a loss on the actual hardware, so "piracy" can't be an acceptable answer, since the hardware is subsidized and single-player content is pirated on the respective consoles anyway. What's the difference, besides being a few percent more idiot-proof?
 
That would be madness, Stone Cold. If you leave too many choices to the marketplace, people will just throw up their hands and walk away. For example, look at what happened to Atari. They tried to treat their console just like a computer, which it is not.

First of all, as you suggested, they skimped on the 2600's hardware so it wouldn't be too expensive to manufacture, figuring they could upgrade the hardware further down the road. But the finalized specs made developers irate, because the cartridge budget was amazingly small and many games glitched for lack of memory; Atari famously asserted, in response to some customer complaints, that the glitches were actually special effects. They released upgrade after upgrade, like the 5200, the 7800, and even the Jr., and none of them could save the company. Developers hated it, so all the good ones stopped coding for the platform, which in turn opened the floodgates of third-party crap, and the rest is history. That's why the console industry doesn't do that anymore.

One console, built from long-lasting yet economically feasible parts and good for a whole life cycle of five or more years, will be the norm as long as we play on mass-produced goods. This also has the added benefit of making things easier for the folks who code the games, since identical hardware makes it simple to optimize their products.
 
Not everyone can, or wants to, mess with a PC just to play simple, fun games on their TV. That is what Nintendo bet on this round, and big N seems to have won this go-round. Oh, and it would seem Nintendo does not lose money on their hardware; they actually make money.
 
It raises the question of why they make consoles in the first place if I can hook a Wiimote up to my PC and play Zelda as a PC title. These companies clearly take a loss on the actual hardware, so "piracy" can't be an acceptable answer, since the hardware is subsidized and single-player content is pirated on the respective consoles anyway. What's the difference, besides being a few percent more idiot-proof?

Console makers build them because they can make money off them, directly or indirectly. They cannot collect licensing fees on titles that aren't sold for their console.

Sony and MS also push consoles to support other long-term interests. Sony, for instance, used the PS3 to push Blu-ray.

Consumers buy them because they are cheaper and simpler to use.
 
I don't understand where people come up with these comparisons between the X360 and PS3 GPUs, but I like your approach because you base it on pixels. The X360 is difficult because at the time there wasn't a GPU like it. The PS3, on the other hand, used a tweaked 7800 GPU with a fancy name (RSX).

Moreover, I think many here don't realize the talent of some of these console developers. Sega's VF5 looked phenomenal, yet it ran on inferior hardware (6800GT/Pentium arcade hardware). We all know that the GPUs we buy each new release will never reach their true potential in their lifetime. That is the PC's curse.


If anything, think of it as Nintendo honoring the memory of one of AMD (ATI)'s previous GPUs by taking it to new heights. :D

And there is still no dual-die GPU on the market, yet. The Xbox 360's GPU put what is equivalent to half of the render back-ends onto that 10MB daughter die...
 
It raises the question of why they make consoles in the first place if I can hook a Wiimote up to my PC and play Zelda as a PC title. These companies clearly take a loss on the actual hardware, so "piracy" can't be an acceptable answer, since the hardware is subsidized and single-player content is pirated on the respective consoles anyway. What's the difference, besides being a few percent more idiot-proof?


You're myopic about the needs of your fellow consumers. You sound like everyone who doesn't understand why the tablet market exists and why tablets are threatening laptops and desktops.

This is why MS or Sony is going to come out with its next console next year; being behind Nintendo on the tech curve is just sad. Nintendo has to release a new console because it has lost most of its advantages (lower production cost, a monopoly on motion controllers, low demand for HD consoles, etc.).

Next year? Definitely not happening.

Sony is bleeding in too many divisions, not just its gaming division. On top of the recent PSN debacle, with consumer confidence degrading rapidly and lawsuits flying left and right, they have to hold off for at least two years.

MS is profitable in many of its business ventures, but MS has made a point of stating that it does not want the entertainment division relying on the others to cover its shortcomings. There is also a credible rumor that MS is currently divided on whether to be profitable with its console, like Nintendo, or fall back on its old strategy. If Nintendo's specs are too high, an MS release one year later won't be nearly as impressive, or as effective at improving profits, as waiting a year and a half.

Note that I didn't say two years, because that would be far too long IMO and would more likely set Nintendo up to become the next PS2, where the inferior console was de facto king because it had the strongest third-party support of all the consoles.
 
I think this would be massively awesome. And come to think of it, if the new console used a 4770, that would be a pretty formidable machine, albeit still old by today's standards.

EDIT: Plus, for those of you whining about DX11, consider this: would you really want an HD 4850-class GPU doing native 1080p with tessellation? Do you know how badly that would kill performance? It would be totally unacceptable for a console.

But remember that consoles aren't PCs. PC gaming uses a standard API to accommodate a wide range of hardware configurations, but the PS3, Xbox 360, and whatever comes from Nintendo use special software development kits that work as close-to-metal APIs, meaning little overhead and maximum performance and quality, something that can't be said of PCs. PCs are on average ten times or more as powerful as an Xbox 360 or PS3, and yet the games don't look ten times better, not even Crysis.

Another scenario: can you run Crysis 2 on medium/high settings on a 7800GT at 720p? Yet you can do that on a PS3, which uses practically the same GPU. That's why there was an article on AMD's site saying many developers wanted to drop the API and use close-to-metal approaches, which would increase performance and free developers' creativity, since they're severely limited by the API itself. (Of course it isn't practical, since every GPU is unique and coding for each one would be hell.) A good example: DX9 on the PC can only process around 5,000 draw calls in a rendering scenario, while the similar kits on the Xbox 360 and PS3 can issue up to 30,000 draw calls thanks to the close-to-metal nature of the PS3/Xbox 360, which has very little overhead.
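To put the draw-call numbers in perspective, a back-of-the-envelope sketch: at 30 fps each frame gets about 33.3 ms, so the per-call overhead budget implied by each ceiling differs by 6x.

```python
# Frame budget at 30 fps, in milliseconds.
frame_ms = 1000 / 30

for calls in (5_000, 30_000):
    # Time available per draw call, in microseconds.
    per_call_us = frame_ms / calls * 1000
    print(f"{calls:6d} calls -> {per_call_us:.2f} us each")
# → 5000 calls allow ~6.67 us each; 30000 calls require ~1.11 us each,
#   which is only feasible when the API adds almost no overhead per call.
```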
 
You're myopic about the needs of your fellow consumers. You sound like everyone who doesn't understand why the tablet market exists and why tablets are threatening laptops and desktops.



Next year? Definitely not happening.

Sony is bleeding in too many divisions, not just its gaming division. On top of the recent PSN debacle, with consumer confidence degrading rapidly and lawsuits flying left and right, they have to hold off for at least two years.

MS is profitable in many of its business ventures, but MS has made a point of stating that it does not want the entertainment division relying on the others to cover its shortcomings. There is also a credible rumor that MS is currently divided on whether to be profitable with its console, like Nintendo, or fall back on its old strategy. If Nintendo's specs are too high, an MS release one year later won't be nearly as impressive, or as effective at improving profits, as waiting a year and a half.

Note that I didn't say two years, because that would be far too long IMO and would more likely set Nintendo up to become the next PS2, where the inferior console was de facto king because it had the strongest third-party support of all the consoles.

At this point in time there is NO reason MS could not make a console that is profitable from launch day, as there are plenty of ATI graphics chips and PowerPC chips that would have no problem driving a 1080p display with high-fidelity graphics. In the last five years we have seen STAGGERING increases in video processor performance, with ATI (and Nvidia) typically delivering 2x the performance with each new generation, up until the 5xxx series, with the 6xxx series being a performance/power tweak of the 5xxx series due to delays in the manufacturing process shrink.
 
The hardware increases are always staggering. The custom RISC processor in the PS2, running at roughly 300 MHz, gave way to a 7800GT-class GPU in the PS3. Just as big a jump.
 
Don't be surprised if the next Xbox uses an APU, possibly even an AMD Trinity-based chip. That keeps cost down and performance up; they could work with AMD to create a one-off version that wouldn't take much development time and would be compatible with the Xbox 360's video subsystem, which should prevent backward-compatibility issues.
 
But remember that consoles aren't PC. PC gaming uses a standard API to accomodate a broad wide of different hardware/configurations, but PS3/Xbox 360 and whatever thing that comes from Nintendo uses a special Software Developing Kit that works as a Close To Metal API, which means no overhead and maximum performance/quality, something that can't be said from PC's. PC's are about 10 times in average or more powerful than a Xbox 360 or PS3 and yet, the game's doesn't look 10 times better, not even Crysis.

I disagree with this; in reality it does not work like that. It is possible to highly optimize for anything, but in practice it is not done nearly as much as people think. Many game devs are developing for four platforms at once and run porting software to help them; there is no way they are digging into all three consoles and optimizing down to assembly. They are just dropping settings and letting the console or TV upscale. Think back to the early console wars: Sony, even with more power, was losing out to M$ and its better development platform, showing that companies cared more about speed of development than power or optimization. The point is that almost no one does a really good job of optimizing for the hardware. This FUD is just something sold to people to justify how far behind consoles are, just like Apple spent all those years justifying PowerPC while falling behind, and eventually abandoned it, eating every word they had said.

Stone Cold, as others said, people have tried the regular-update model, and as I said, that makes a console very much like a PC. Then you have to ask: why would the maker build the console without the huge reward of years of low-cost production and high profits? Why would developers write games for it, if not for a single platform with a very large user base? And why would customers buy it when they could get a comparable all-in-one PC? If M$ looked at that, they would probably just say skip it: let's get XBL working on PCs, let people buy their own hardware, and get our peripherals working better on PCs. Basically, a dedicated add-on suite for Windows that turns any computer into a four-controller device. Personally I think that would be great, but I know it won't happen, because the high-profit rip-off model isn't there.

Also, as others said, a lot of companies use consoles as a way to support their primary business. Heck, did you know the only reason M$ made Office was to sell more Windows?
 
This is pretty great news.

We have to remember: The Radeon 4000-series going into "Project Cafe," as others have mentioned before, is not saddled by a complex OS, its libraries, and APIs.

Consoles are much more elegant and "simple" (so to speak) in their design. By focusing on a specific set of hardware, console developers can exploit the hardware to its near 100% capability.

For example, take a look at the X1950 GT-class GPU in the Xbox 360. Take a look at the games developed so far for that system and ask yourself: would those games even be possible on desktop PCs running Windows, with the same GPU and the common DirectX API? No.

When the X19xx series was out, the games that were only possible on the 360 did not exist yet on PC. It took the GPUs of the past few years (AMD 4xxx to 5xxx and Nvidia 2xx to 4xx) to deliver performance capable of running games that look equivalent to what the consoles have been doing on older hardware since 2005/2006.

Crazy to think about, isn't it?

I know computers are more powerful than consoles themselves. But, just imagine what a new Nintendo console is capable of without a full-fledged desktop OS and a DirectX API limiting the GPU's capability, and Nintendo using a CUSTOM API to handle calls to the entire GPU. Let's say that Nintendo uses AT MINIMUM a Radeon 4770 GPU and we compare it to the 360 (X1900-series) and PS3 (7800-series) GPUs.

(Note: Source is from http://en.wikipedia.org/wiki/Comparison_of_ATI_graphics_processing_units)

"Project Cafe" w/ Radeon 4770 un-customized (as an example)
750 MHz GPU
640 Unified shaders
32 texture mapping units
16 ROPs
24 GT/sec fillrate (Gigatexels)
12 GP/sec fillrate (Gigapixels)
51.2 GB/sec memory bandwidth to 512 MB GDDR5 memory at 800 MHz
XBox 360 Xenos GPU (Custom X1950 GT)
500 MHz GPU
48 Unified shaders
16 texture mapping units
8 ROPs
8 GT/sec texture fillrate
4 GP/sec pixel fillrate
22.4 GB/sec memory bandwidth (GPU to 512 MB GDDR3 Shared RAM at 700 MHz)
(Note: 500 MHz Logic chipset with 10 MiB eDRAM on GPU is used to assist the Xenos GPU to handle 4x FSAA, alpha-blending, and Z-buffering without affecting the GPU's performance and has 256 GB/sec bandwidth to its own eDRAM memory. Take that into consideration.)
Playstation 3 RSX GPU (Custom 7800-series GPU)
500 MHz GPU
24 Pixel shaders
8 Vertex shaders
24 Texture filtering units
8 ROPs
12 GT/sec texture fillrate
4 GP/sec pixel fillrate
22.4 GB/sec memory bandwidth (GPU to 256 MB GDDR3 RAM Dedicated at 650 MHz)
(Note: Compared to the 360, AA, alpha-blending, and Z-buffering is handled directly on the PS3's GPU with no assistance.)
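The fillrate and bandwidth figures in these spec lists follow directly from clock speed times unit count, and from memory data rate times bus width. A quick sanity check for the 4770 and Xenos entries (assuming the 128-bit memory buses those parts use; the function names are mine):

```python
def texel_fillrate(core_mhz, tmus):
    """Texture fillrate in GT/s: core clock x texture mapping units."""
    return core_mhz * tmus / 1000

def pixel_fillrate(core_mhz, rops):
    """Pixel fillrate in GP/s: core clock x ROPs."""
    return core_mhz * rops / 1000

def mem_bandwidth(mem_mhz, bus_bits, pumps):
    """Bandwidth in GB/s: effective transfer rate x bus width / 8 bits.
    pumps = 2 for GDDR3 (double data rate), 4 for GDDR5 (quad data rate)."""
    return mem_mhz * pumps * bus_bits / 8 / 1000

# HD 4770: 750 MHz core, 32 TMUs, 16 ROPs, 800 MHz GDDR5 on a 128-bit bus
print(texel_fillrate(750, 32))    # 24.0 GT/s
print(pixel_fillrate(750, 16))    # 12.0 GP/s
print(mem_bandwidth(800, 128, 4)) # 51.2 GB/s

# Xenos: 500 MHz core, 16 TMUs, 8 ROPs, 700 MHz GDDR3 on a 128-bit bus
print(texel_fillrate(500, 16))    # 8.0 GT/s
print(pixel_fillrate(500, 8))     # 4.0 GP/s
print(mem_bandwidth(700, 128, 2)) # 22.4 GB/s
```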
Going by the numbers alone, this system, if it uses a Radeon 4770, will be more powerful than either the PS3 or the 360 by a big margin. And there are critics here saying Nintendo should have gone with a 5000-series or even 6000-series GPU?

That would, one, increase the cost of the console and, two, make the 360 and PS3 look obsolete for several years. Nintendo is about affordability and gameplay first; they have never been concerned with graphics or eye-candy. Games are their sole business and nothing more, which is why they've lasted this long against the likes of Microsoft and Sony, whose businesses are not solely games. And Nintendo would not want to repeat Sony's mistake of releasing a console at an exorbitant price built on exotic hardware. A Radeon 4770 goes for around $100, and that's the retail card price, not the bare GPU, so the system should be relatively cheap to produce given that it is an older GPU.

If the "Project Cafe" console does use an R700-series part, likely a 4770, it will not be burdened by a Windows OS, the DirectX API, or the hardware libraries needed to run a full desktop OS. So, if you remember how the Radeon 4770 handled Crysis in Windows, imagine it with a smaller OS kernel, far fewer libraries, and a custom API, and what kind of games would be possible on this new Nintendo system.

The system will probably be the first console to render games at 1080p without fancy hardware/software scaling, if it is done right. You have to realize that the majority of 360 and PS3 games are rendered at half of 1080p's resolution, or at full 720p, and are then scaled up to 1080p... yet still look good.

What would a Radeon 4770-based Nintendo console be capable of, then? A much less pixelated, smoother-looking Super Smash Bros. Brawl and Twilight Princess at 1080p, that's for sure.
 
I disagree with this; in reality it does not work like that. It is possible to highly optimize for anything, but in practice it is not done nearly as much as people think. Many game devs are developing for four platforms at once and run porting software to help them; there is no way they are digging into all three consoles and optimizing down to assembly. They are just dropping settings and letting the console or TV upscale. Think back to the early console wars: Sony, even with more power, was losing out to M$ and its better development platform, showing that companies cared more about speed of development than power or optimization. The point is that almost no one does a really good job of optimizing for the hardware. This FUD is just something sold to people to justify how far behind consoles are, just like Apple spent all those years justifying PowerPC while falling behind, and eventually abandoned it, eating every word they had said.

Stone Cold, as others said, people have tried the regular-update model, and as I said, that makes a console very much like a PC. Then you have to ask: why would the maker build the console without the huge reward of years of low-cost production and high profits? Why would developers write games for it, if not for a single platform with a very large user base? And why would customers buy it when they could get a comparable all-in-one PC? If M$ looked at that, they would probably just say skip it: let's get XBL working on PCs, let people buy their own hardware, and get our peripherals working better on PCs. Basically, a dedicated add-on suite for Windows that turns any computer into a four-controller device. Personally I think that would be great, but I know it won't happen, because the high-profit rip-off model isn't there.

Also, as others said, a lot of companies use consoles as a way to support their primary business. Heck, did you know the only reason M$ made Office was to sell more Windows?

Well, that's interesting. Then can you explain why Crysis 2 runs on medium-high settings on the PS3/Xbox 360 while it would be unplayable on a PC with a Radeon X1800 GTO or a GeForce 7800GT at the same settings? Of course, not all games are heavily optimized; some run like crap and look like crap on both consoles. Plus, the PS3 is hell on earth due to its hard programming interface, and the Xbox 360 has more general-purpose computing power, which matters more in games than the PS3's theoretical performance from Cell SPUs that will never be unleashed in games. Remember that the PS3 and Xbox 360 were released in 2005-2006; it isn't as if they could have used a brand-new 8800GTX or a Radeon HD 2900XT at the time, especially when those consoles were engineered years before launch.
 
This is pretty great news.

We have to remember: The Radeon 4000-series going into "Project Cafe," as others have mentioned before, is not saddled by a complex OS, its libraries, and APIs.

Consoles are much more elegant and "simple" (so to speak) in their design. By focusing on a specific set of hardware, console developers can exploit the hardware to its near 100% capability.

For example, take a look at the X1950 GT in the Xbox 360. Look at the games developed so far for that system and ask yourself: would those games even be possible on desktop PCs running Windows with the same GPU through the common DirectX API? No.

When the X19xx series was out on PC, games like the ones only possible on the 360 did not exist yet. It took the GPUs of the past few years (AMD 4xxx to 5xxx and Nvidia 2xx to 4xx) to deliver performance capable of running games that look equivalent to what the consoles have been doing on older hardware since 2005/2006.

Crazy to think about, isn't it?

I know computers are more powerful than consoles. But just imagine what a new Nintendo console is capable of without a full-fledged desktop OS and a DirectX API limiting the GPU, with Nintendo using a CUSTOM API to handle calls to the entire GPU. Let's say Nintendo uses AT MINIMUM a Radeon 4770 GPU, and compare it to the 360 (X1900-series) and PS3 (7800-series) GPUs.

(Note: Source is from http://en.wikipedia.org/wiki/Comparison_of_ATI_graphics_processing_units)

"Project Cafe" w/ Radeon 4770 un-customized (as an example)
750 MHz GPU
640 Unified shaders
32 texture mapping units
16 ROPs
24 GT/sec fillrate (Gigatexels)
12 GP/sec fillrate (Gigapixels)
51.2 GB/sec memory bandwidth to 512 MB GDDR5 memory at 800 MHz

XBox 360 Xenos GPU (Custom X1950 GT)
500 MHz GPU
48 Unified shaders
16 texture mapping units
8 ROPs
8 GT/sec texture fillrate
4 GP/sec pixel fillrate
22.4 GB/sec memory bandwidth (GPU to 512 MB GDDR3 Shared RAM at 700 MHz)
(Note: 500 MHz Logic chipset with 10 MiB eDRAM on GPU is used to assist the Xenos GPU to handle 4x FSAA, alpha-blending, and Z-buffering without affecting the GPU's performance and has 256 GB/sec bandwidth to its own eDRAM memory. Take that into consideration.)

Playstation 3 RSX GPU (Custom 7800-series GPU)
500 MHz GPU
24 Pixel shaders
8 Vertex shaders
24 Texture filtering units
8 ROPs
12 GT/sec texture fillrate
4 GP/sec pixel fillrate
22.4 GB/sec memory bandwidth (GPU to 256 MB GDDR3 RAM Dedicated at 650 MHz)
(Note: Compared to the 360, AA, alpha-blending, and Z-buffering is handled directly on the PS3's GPU with no assistance.)

Just going by the numbers alone, this system, if it uses a Radeon 4770, will be more powerful than either the PS3 or the 360 by a big margin. And there are critics here stating that Nintendo should have gone with a 5000-series or even a 6000-series GPU?
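For what it's worth, the fillrate and bandwidth figures quoted above are just clock × units, so they're easy to sanity-check. A quick sketch (the formulas and data-rate factors are standard; the clock and unit counts are the Wikipedia numbers quoted above):

```python
# Sanity-check the quoted spec numbers: fillrate = clock * units,
# bandwidth = effective transfer rate * bus width in bytes.

def texel_fillrate(core_mhz, tmus):
    """Texture fillrate in gigatexels/sec."""
    return core_mhz * tmus / 1000.0

def pixel_fillrate(core_mhz, rops):
    """Pixel fillrate in gigapixels/sec."""
    return core_mhz * rops / 1000.0

def bandwidth(mem_mhz, bus_bits, data_rate):
    """Memory bandwidth in GB/s. GDDR5 is effectively quad data
    rate (data_rate=4), GDDR3 is double data rate (data_rate=2)."""
    return mem_mhz * data_rate * (bus_bits / 8) / 1000.0

# Radeon 4770: 750 MHz core, 32 TMUs, 16 ROPs, 128-bit GDDR5 @ 800 MHz
print(texel_fillrate(750, 32))   # 24.0 GT/s
print(pixel_fillrate(750, 16))   # 12.0 GP/s
print(bandwidth(800, 128, 4))    # 51.2 GB/s

# Xenos: 500 MHz core, 16 TMUs, 8 ROPs, 128-bit GDDR3 @ 700 MHz
print(texel_fillrate(500, 16))   # 8.0 GT/s
print(pixel_fillrate(500, 8))    # 4.0 GP/s
print(bandwidth(700, 128, 2))    # 22.4 GB/s
```

The quoted numbers check out: roughly 3x the texel and pixel fillrate and better than 2x the memory bandwidth over the 360's Xenos.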

It would, one, increase the cost of the console and, two, make the 360 and PS3 look obsolete for several years. Nintendo is about affordability and gameplay first; they have never been concerned with graphics or eye candy. Games are their sole business and nothing more. It's why they've lasted this long against the likes of Microsoft and Sony, whose businesses are not solely games. And Nintendo would not want to repeat the mistake Sony made by releasing a console at an exorbitant price using exotic hardware. A Radeon 4770 goes for around $100, and that's the retail price of the card, not the GPU itself. So the system will be relatively cheap to produce given that it is an older GPU.

If the "Project Cafe" console does use an R700-series part, likely a 4770 GPU, the console will not be burdened by a Windows OS, its DirectX API, or the hardware libraries needed to run a full desktop OS. So, if you remember how the Radeon 4770 handled Crysis in Windows, imagine it with a smaller OS kernel, far fewer libraries, and a custom API, and what kind of games seem possible on this new Nintendo system.

The system will probably be the first console to render games at 1080p without fancy hardware/software scaling, if it is done right. You have to realize that the majority of games on the 360 and PS3 are rendered at half of 1080p's resolution, or at the full 720p, and are then scaled up to 1080p... yet still look good at it.
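To put the 1080p claim in perspective, the raw pixel math (nothing console-specific here, just arithmetic):

```python
# Pixels per frame at each resolution: native 1080p more than doubles
# the work compared to the 720p most current console games target.
p1080 = 1920 * 1080   # 2,073,600 pixels
p720 = 1280 * 720     #   921,600 pixels
print(p1080 / p720)   # 2.25 -- every 1080p frame is 2.25x the pixels
```

That 2.25x per-frame cost is why upscaling is so common, and why the fillrate gap between a 4770 and the Xenos/RSX matters.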

What would a Radeon 4770-based Nintendo console be capable of, then? A much less pixelated and smoother-looking Super Smash Bros. Brawl and Twilight Princess at 1080p, that's for sure.

I strongly agree with you; what's killing PC gaming creativity is its overhead and its strict DX limitations. I hope someday it can be fixed with a much better CTM-like (close-to-metal) API.
 
Nintendo is about affordability and gameplay first. They were not concerned with graphics or eye-candy. And, Nintendo would not want to repeat the same mistake Sony did by releasing a console at an exorbitant price.

Really?? Where the hell have you been??? $250 for a 3DS with a $5 piece of parallax paper over a 3-inch screen is an affordable 3D system??? What glorious world of money growing on trees do you live in? The Wii was the most overpriced POS that ended up being GameStop's biggest trade-in item EVER. That was $250. Miyamoto said after its release that HE wanted it released for under $100: a dual-GPU GameCube and a $10 wand. Wow, how far can you kiss a company's ass???

Here is the article on Miyamoto. Yeah, your Big N is no different than the rest of the business, period. Don't make them out to be any freakin' different.

http://kotaku.com/#!215349/miyamoto-i-wanted-wii-to-cost-100
 
@Snowbeast

I agree. The problem with the 3DS is hardware costs. We're talking about relatively new hardware, specifically the glasses-free 3D screen and its PICA GPU. It is the same problem the PS3 had at launch, using a Cell CPU, an RSX GPU, and the then-new Blu-ray drive at a price of $499 to $599 for a console whose bill of materials (BOM) ran around $800 to $900.

This reminds me of a discussion I had with a friend last night. He asked my opinion on whether, if the PS3 had included ONLY a DVD-ROM drive, a quad-core CPU, and shared RAM instead of two separate dedicated RAM pools, it would have cost less to make and to sell. I answered, "Yes." The hardware that goes into a console or handheld determines its price. A DVD-based, quad-core PS3 would have cost much less than $499/$599 for sure. And that's not even considering R&D costs.

If Nintendo went with a non-3D screen and a slightly less powerful GPU, it would definitely have cost much less than $250 for sure. If Nintendo went with a PowerVR GPU like in the Sony NGP and quad core CPU, it would cost a bit more than $250.

It is always the hardware that determines the price of the system. The Wii cost $250 because of the hardware. Like Miyamoto said in the Kotaku article itself, if they hadn't used NAND memory and other parts, probably including the motion controllers and tracking system, maybe, just maybe, the console would have cost less than $250. But Nintendo was trying something new, not tried before on a console system at that time: motion controls.

But remember, and this is something you forgot in your tirade: the Wii came out at $249 originally, much less than the $399 the 360 launched at and considerably less than the $499/$599 the PS3 was originally priced at. It was still relatively affordable compared to the 360 and PS3, and was priced to be purchasable by non-gamers and gamers alike, especially those without deep pockets and fat wallets.

Nintendo would have priced the 3DS at less than $250 if they didn't go with something new and not seen on handhelds yet. Would you rather be paying for something old that you have seen before or something new that you haven't seen yet? It's the same argument with graphics card purchases. Would you rather get the slower or older GPU or the faster and newer GPU to run a game at the highest settings to look good? I would go with the latter for sure, just like a lot of [H] posters here.

Can you blame Nintendo for trying something new? If you can make a 3D-capable handheld with graphics nearly equivalent to a PSP that cost less than $250, go ahead and I'll be sure to take a look at it. Until that time happens, Nintendo should at least be congratulated for being the first with a 3D-capable handheld even if it costs a bit.
 
I strongly agree with you; what's killing PC gaming creativity is its overhead and its strict DX limitations. I hope someday it can be fixed with a much better CTM-like API.

Most likely, it would take Microsoft to completely redo the DirectX API and Windows OS kernel to make it modular or expandable. It would probably also take Microsoft to redo its entire drivers system.

In my opinion, it would probably require GPU makers to ship a specific driver for each GPU that is made. But at the same time, that would increase costs and complexity.

It would most likely take a concerted effort by both Microsoft and the GPU makers, Nvidia and AMD, to redo the entire driver and API system. If games are to use the full power of the GPU, I believe it would require a more modular/expandable driver-and-API combination: an operating system that automatically custom-tailors itself to the GPU in use.

Think of something like this:

Game -> DirectX API -> Driver base -> GPU "plugin"/"extension" -> GPU

The base driver would take the DX API and expand (or contract) its capabilities based on the plugin used. And the plugin would just be a small piece of software that fully exploits the GPU. I believe the right word for it is "scalable." Microsoft would have to make a DirectX API that scales itself to all the shader and rendering units on the GPU; the driver and a plugin/extension system would in turn give the DX API full access to those units.

If you take a look at the Windows NT kernel here, a system that would have to exploit full access to the GPU would need direct access to the GPU itself. And, that would mean going from software (Win32/x64 application) to DirectX API to driver to hardware.

DirectX would have to become its own set of hardware drivers, or even a kernel with an object manager, from the looks of it. Then a base driver and a driver extension for the GPU would sit below that.

Something like...

Win32/x64 application -> (Executive) DX API "driver" -> (Kernel mode drivers) [Base GPU driver -> specific GPU extension] -> GPU
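To make the plugin idea concrete, here is a toy Python sketch of that layering. Everything in it (class names, the capability dictionary, the command strings) is a hypothetical illustration of the poster's proposal, not any real DirectX or Windows driver interface:

```python
# Toy model of the "base driver + per-GPU extension" layering
# sketched above. The base driver scales the API surface up or
# down based on what the plugged-in GPU extension reports.

class GpuExtension:
    """Per-GPU plugin: reports exactly what this chip can do."""
    name = "generic"

    def capabilities(self):
        return {"shader_units": 0, "tessellation": False}

    def submit(self, command):
        raise NotImplementedError

class RV770Extension(GpuExtension):
    """Hypothetical plugin for R700-class hardware, which carries a
    tessellator even though desktop DX10.1 never exposed it."""
    name = "rv770"

    def capabilities(self):
        return {"shader_units": 800, "tessellation": True}

    def submit(self, command):
        return f"rv770 executed: {command}"

class BaseDriver:
    """Base driver: routes API calls through the extension,
    contracting the API when a feature is missing."""
    def __init__(self, extension):
        self.ext = extension
        self.caps = extension.capabilities()

    def draw(self, command):
        if command == "tessellate" and not self.caps["tessellation"]:
            return "emulated in software"   # contract the API
        return self.ext.submit(command)     # expand to the hardware

driver = BaseDriver(RV770Extension())
print(driver.draw("tessellate"))  # rv770 executed: tessellate
```

The point of the sketch: the game-facing API stays fixed while the plugin decides, per GPU, whether a call hits hardware directly or falls back, which is the "scalable" behavior described above.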

(All theory in my head just by going over what I read in the past hour.)
 
One of the benefits of launching any console or handheld is that you know there are millions of people that are happy to pay a hefty premium to be the first to own a device.

There were about 4 million 3DS devices allocated for sales in Europe, the US and Japan. About 3.6 million of them sold. They could have sold all 4 million if they priced the 3DS at $199 but it was better to sell 3.6 million at $250 for their bottom line.
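The math behind that bottom-line claim, using the post's own unit estimates:

```python
# Revenue at each price point, with the sell-through numbers assumed
# in the post: 3.6M units sold at $250 vs. a sellout of 4M at $199.
rev_250 = 3_600_000 * 250   # $900,000,000
rev_199 = 4_000_000 * 199   # $796,000,000
print(rev_250 - rev_199)    # 104000000 -- $104M more at the higher price
```

So even leaving 400,000 units on shelves, the $250 price nets over $100M more revenue, before even considering the higher per-unit margin.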

If Sony, Microsoft, or Nintendo could get away with selling you their consoles at five times the price, they would. Businesses are not charities; they watch the market to find the best price point that brings in customers while still making fat margins.

The 3DS is relatively new and has little competition and any hit it is taking in sales is mitigated by very high margins. By the time the competition rolls around or the sales get too low they will just drop the price of the system again.

This console model is hardly any different from the GPU model. When ATI puts out a new GPU and has no competition, they can charge what they want for it, and by the time Nvidia catches up they slash their prices.
 
I also think it doesn't really matter too much that the hardware itself isn't DX11 capable. There will never be a version of DX on a system that isn't made by Microsoft, so you can count that out of the picture right away. The 4800 series does, however, have tessellation hardware, so it's possible there could still be some DX11-style graphics thanks to the close-to-the-metal API approach.

At any rate, I'm excited that it has the R700 architecture, as I think that it can really be quite powerful in the console space.
 
This is pretty great news.
snip.

As I pointed out earlier, the X360 GPU does not even outperform the X1800 XL, let alone the 1950 series... and the 1950 series has roughly twice the render power of the X1800 XL (go look it up on Tom's VGA Charts 2008).

Pixel-pushing power (aka end results) bears this out, as you can easily find 360 games' render resolution and FSAA and aniso levels (if used).
 
R700 as in RV770, the core found in the 4800 series? That's just weird; why use a 55nm chip in 2011? They could get a lot better yields and power efficiency out of a new 40nm chip. The 360 and PS3 have been using 45nm for almost a year now.
Just like Microsoft and Sony, Nintendo doesn't use off-the-shelf graphics chips; they license the technology to make their own GPUs, like what is currently done. More than likely, the GPU in Nintendo's next machine will be produced on a 40nm process, with a bit of tweaking.

The CPU will be at least a dual core PPC 800MHz - 1GHz, and more than likely 512MB RAM.
 
I don't understand where people come up with these comparisons of the X360 and PS3 GPUs, but I like your approach because you base it on pixels. The X360 is difficult because at the time there wasn't a GPU like it. The PS3, on the other hand, was a tweaked 7800 GPU with a fancy name (RSX).

Moreover, I think many here don't realize the talent of some of these console developers. Sega's VF5 series looks phenomenal yet ran on inferior hardware (6800 GT/Pentium arcade hardware). We all know that the GPUs we buy at each new release will never show their true potential in their lifetime. That is the PC's curse.


If anything, think of it as Nintendo honoring the memory of one of AMD(ATI)'s previous GPUs by taking it to newer heights than before. :D

Pixel-pushing results are the only way you can make meaningful comparisons between systems.

Just like Microsoft and Sony, no one uses off-the-shelf graphic chips, they license the technology to make their own gpus, like what is currently done. More than likely, the GPU in Nintendo's next machine will be produced on a 40nm process, with a bit of tweaking.

The CPU will be at least a dual core PPC 800MHz - 1GHz, and more than likely 512MB RAM.

The CPU will be at least dual core running at 3+ GHz. A single core cannot deliver the performance required to game at 1080p with any respectable eye candy... doubt me? Go look at single- vs dual-core gaming performance. As for using an off-the-shelf GPU, at today's costs it makes perfect sense: since the chip has already recouped its development cost, it can be sold cheaply enough for the big N to buy it at a price they like.
 
The CPU will be at least dual core running at 3+ GHz. A single core cannot deliver the performance required to game at 1080p with any respectable eye candy... doubt me? Go look at single- vs dual-core gaming performance. As for using an off-the-shelf GPU, at today's costs it makes perfect sense: since the chip has already recouped its development cost, it can be sold cheaply enough for the big N to buy it at a price they like.


1. He said the cpu will be at least a dual core
2. Says in the first post "It will be a triple core"
3. Using off the shelf doesn't make sense. Again, the first post says it's a revamped design. You want proof? Wait till it comes out. Nintendo will use a GPU that fits the parameters they chose, not whatever they can save $3 on.
 
The CPU will be at least dual core running at 3+ GHz. A single core cannot deliver the performance required to game at 1080p with any respectable eye candy... doubt me? Go look at single- vs dual-core gaming performance. As for using an off-the-shelf GPU, at today's costs it makes perfect sense: since the chip has already recouped its development cost, it can be sold cheaply enough for the big N to buy it at a price they like.

Do you think the 360 actually uses all six threads while playing games? The GPU does most of the work, as long as the CPU doesn't hold it back. A ~2 GHz dual-core PPC would be enough to push 1080p.

You seem not to understand that they do not use already-made chips besides the CPU; off-the-shelf chips would take up too much space on the console's PCB.

They license the graphics technology to make it themselves. This was the problem Microsoft had with Nvidia on the original Xbox. Nvidia would not license the underlying technology to Microsoft, so MS had to abide by Nvidia's manufacturing schedule.

Nintendo licensed the 'Flipper' technology from ArtX before they were acquired by ATI, and Nintendo manufactured the GPU itself, just like they did with 'Hollywood' in the Wii.
 
Wii – 84.64 million as of 31 December 2010

That's a lot of collected dust. :p

Then again, what with the past week's recent unfortunate events and all, you could probably say the same thing for the PS3.

I'm extremely excited to hear that Nintendo's GPU will be much more powerful, in raw hardware performance, than the current generation's top performer (the PS3). This should mean that Microsoft and Sony will up the ante and offer even more powerful hardware, which in turn means games will actually start looking like, well, not shit, and stop using DX9 in favor of DX11.
 
I'm just happy that we don't have to wait until 2015 or something for the next generation of consoles to hit. After Nintendo launches, I'm sure Microsoft won't be too far behind. Who knows when Sony will launch the PS4. In the end, this should translate to more graphically advanced games on PCs, and finally the jump that we PC gamers have been waiting for.

Exactly. Sony and Microsoft are very much ready for a next generation console war. No way they'd let Nintendo have all the spotlight for long.
 