Xbox Prototype

erek

Would this be even more of a holy grail than the Nintendo PlayStation? What's new is apparently the shots of the posterior revealing it was a functional unit with a DVD drive, etc.

"A look at the first Xbox Prototype"

 
I remember that like it was yesterday. Someone did a case mod inspired by that; you could also do it with a hacksaw:

X-BOX-Cut-by-waterjet.jpg
 


Nah. There is meaningful history with the SNES PlayStation system. There is zero worthwhile history with the Xbox.
 
Without Xbox the PS4 would be Sony's current and only console - 2.5 times faster than the PS3 and only $799. No mid-generation refreshes, no reason to use high-end hardware and manufacture at near-cost.
 
Without consoles the PC market would actual evolve graphics without being tied to console hardware, like it currently is now. The PS4's biggest competitor isn't the Xbox but PC. There's a reason why all of the Xbox's exclusives have been ported to PC, and Sony is considering this either. The PC is a neutral platform that has a very large market, and that market would go unchanged without consoles.
 

PC games aren't limited by console hardware; they're limited by cost. Anyone could make a game that looks a hell of a lot better than anything a console can produce, but the money isn't there to do so. Consoles not existing wouldn't change that. I'd argue that without consoles the game industry would be a shell of what it is now. Consoles are the sole reason gaming is as big as it has become. It never would have happened if PC was the only option, especially in the 90s and early 2000s.
 
In the early 90s PC gaming was expensive and dumb. By the late 90s PC gaming was taking off. You had games like Doom, Quake, Unreal, and Half-Life, and later Doom 3 and Half-Life 2. These games demanded serious hardware; a 486 DX at 66 MHz wasn't going to cut it. You could run Doom on it, but that's about it. Today PC gamers are limited by console hardware. You don't need an RTX 2080 to game on PC. Something like a GTX 660 would still be fine today, just at low settings and around 30 fps. I would say that Red Dead Redemption 2 is as good as consoles will get in terms of graphics.

There's also a reason why the generational gains in PC graphics cards keep shrinking. The GTX 970 performed like the 780 Ti, which was roughly a $700 graphics card, for $330. The GTX 1060 performs like a GTX 980, but an RTX 2060 only performs like a GTX 1070. The majority of Steam users are on GTX cards from 2016. There's no need to upgrade when games are limited by console hardware. Until the PS5 and Xbox Series X are released, we PC gamers aren't going to see hardware-demanding games.
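To put rough numbers on that comparison, here's a quick back-of-the-envelope sketch in Python. The performance pairings are the ones described above; the launch prices for the GTX 980, GTX 1060, GTX 1070 and RTX 2060 are approximate ballpark figures I've added for illustration, not exact MSRPs:

```python
# Back-of-the-envelope look at the generational value jumps described above.
# Prices are approximate US launch prices (illustrative ballpark figures only),
# paired with the rough "new card ≈ old card" performance equivalences.
pairs = [
    # (newer card, ~price, older card it roughly matches, ~price)
    ("GTX 970",  330, "GTX 780 Ti", 700),
    ("GTX 1060", 250, "GTX 980",    550),
    ("RTX 2060", 350, "GTX 1070",   380),
]

for new, new_price, old, old_price in pairs:
    saving = (1 - new_price / old_price) * 100
    print(f"{new} (~${new_price}) ≈ {old} (~${old_price}): "
          f"same performance for ~{saving:.0f}% less money")
```

With these rough inputs, the 970 and 1060 each deliver the previous generation's performance for roughly half the money, while the 2060 barely undercuts the 1070 at all, which is the shrinking-value trend being described.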
 
To add to this, the PlayStations back in the day had good hardware, whereas now they're just low-end hardware running custom software.
 

It was taking off, but it wouldn't have allowed gaming to become this big on its own. A lot of PC stuff back then was hobbyist, or kids getting into it because one of their parents liked to mess around with hardware. It really wasn't until around Windows 7 that computers started to get easy to both build and maintain. Even with XP you'd run into a lot of issues that required command-line repairs or digging into ancient settings that took some real computer knowledge to work around. Vista was better, but had too many early issues that really hurt it in the long run; by 7, things had gotten a lot better.

Again: it's not about being limited by console hardware; it's about being limited by money. You're putting the blame in the wrong place. Consoles are only the limit because they make the most money and offer the least "risk". While PC gaming is bigger than it's ever been, the money is still in consoles. We don't see games like Crysis anymore not "because consoles" but because publishers are risk-averse and don't want to spend $50+ million on an AAA PC exclusive with graphics that completely max out modern hardware.

As for RDR2: hard to say. If we take the sheer size of the game world and the insane amount of detail into account, then it probably is the limit of current consoles. On a purely technical level, however, we'll see. The Last of Us Part II and Ghost of Tsushima look really good and could give RDR2 a run for its money.

GPU gains are slowing for the same reason most tech advances have slowed down: we've hit a brick wall in what can reasonably be achieved generation to generation. This was predicted ages ago and we're seeing it now. Going from 14nm to 7nm does not provide the same boost that going from 20nm to 14nm did, and 20nm itself was a lesser change compared to prior die shrinks. There are diminishing returns at every step; nothing is ever going to keep increasing at the same rate forever. There is always a drop-off. Even the generational improvements on ARM have started to slow down compared to what they were a few years ago. There is a limit to what silicon can achieve.
 
PC gaming was really taking off long before Win 7 and was commonplace by XP, especially since a PC was a useful tool you could also game on, while a console was a luxury.

The money is in PC as much as in consoles, but both are beaten by mobile. The money is also primarily in microtransactions, and games that push graphics forward have no place in that model.

Consoles didn't hurt PC; retarded gamers did.
 
To add to this, the PlayStations back in the day had good hardware, whereas now they're just low-end hardware running custom software.

Which "back in the day" are you referring to? Reality, or rose-tinted glasses?

The PS1 had an awful CPU with no FPU, and an equally anemic VDP (GPU) which had no concept of a Z-buffer. It was a miracle that they got 3D working as well as they did. The MIPS R3000 host CPU was already six years old by the time of the console's launch in 1994. A PC from the same time frame, or even a couple of years earlier, would crush the PS1 in basically every category. But you can't really fault the console, because it was a 1980s design, significantly delayed by Nintendo, and Sony picked up the pieces of that aftermath. The PS2 was a whole lot better, but it was still far behind PC hardware of the time.

Consoles are always going to fall behind PCs in specs because they are built to a cost the market will realistically pay. The only outlier I can think of that built a no-holds-barred console is SNK, whose Neo Geo AES ran upwards of $700 in 1990, with individual games at a minimum of $150-300 apiece.

Consoles built in the last couple of decades have been built at cost or at a loss, with a business model that makes up the revenue through licensing fees on publishers, something very lucrative in the console business. The PS3 was probably the worst when it came to cost vs. retail; the early PS3s were fantastically complicated, with basically an entire PS2 subsystem built in for backwards compatibility. Sony was smart to release a later revision without the backwards compatibility; it was eating them alive in cost.

It was taking off, but it wouldn't have allowed gaming to become this big on its own. A lot of PC stuff back then was hobbyist, or kids getting into it because one of their parents liked to mess around with hardware. It really wasn't until around Windows 7 that computers started to get easy to both build and maintain. Even with XP you'd run into a lot of issues that required command-line repairs or digging into ancient settings that took some real computer knowledge to work around. Vista was better, but had too many early issues that really hurt it in the long run; by 7, things had gotten a lot better.

I think you have your decades confused. PCs as a hobby were a thing of the late 70s and 80s. PC gaming took off in the mid-to-late 80s and was very widespread by the early 90s with games like Wolfenstein 3D, Doom, Commander Keen, Duke Nukem and more. Gaming exploded in the mid-to-late 90s with the advent of 3D-accelerated graphics cards like the 3dfx Voodoo, and there were huge gaming communities by then. We were playing FPSes like Quake 3 and Unreal Tournament long before XP even existed.

And Windows XP required command-line what? I think you're confusing it with Windows 1.0 through ME, which were all DOS-based. Windows XP was based on NT, which had ditched the DOS underpinnings back in the early 90s. If you have to escape into safe mode and muck around in a command console, something is VERY wrong with your computer, like it is on fire and burning wrong. I'd also argue that a text console is a far better debugging tool than the dumpster fire Microsoft has shipped since Windows 8. The failed Metro UI added a second control panel with incomplete, duplicated functionality, where you have to go back and forth to set things. Some stuff was removed from the conventional control panel and is only available in Metro, and vice versa. With a text console, everything is laid bare, not hidden behind hundreds of windows and cryptic setting names.

While PC gaming is bigger than it's ever been, the money is still in consoles. We don't see games like Crysis anymore not "because consoles" but because publishers are risk-averse and don't want to spend $50+ million on an AAA PC exclusive with graphics that completely max out modern hardware.

To many of us old-timers, PC gaming (and games in general) is a decrepit, zombified husk of its former self. The golden era was the 80s and 90s, when there were huge genres of games that all tried to be unique and entertaining. As the millennium rolled around, large publishers started subsuming smaller ones, and games were increasingly made based on market research and "tried and true" tactics that worked in the past. It's why the "AAA" games industry now churns out shit like "call of booty 69 electric boogaloo" for $69.95 every year, which has been the same game for the last 10 years, just with a couple of new maps and a new coat of paint.

Back in the 90s, games were built on their community-generated content. People were attracted to games based on the community and bought into it. With games now, the publisher doesn't want the community to do anything but fork over that predictable chunk of change every year, plus microtransactions for content only they have control over. The few that do allow community content want complete control and rights to all of it; case in point, Blizzard going insane after Dota 2 was picked up by Valve (DotA being an original mod for WC3 that Blizzard flagrantly ignored) and made millions. Their response was a shitty EULA update that means anything you make for their games is their property and they don't have to give you anything for it.

The only good games made these days are from small indie studios; I quite literally have not bought a new game in probably close to a decade now.
 
It was taking off, but it wouldn't have allowed gaming to become this big on its own. A lot of PC stuff back then was hobbyist, or kids getting into it because one of their parents liked to mess around with hardware. It really wasn't until around Windows 7 that computers started to get easy to both build and maintain. Even with XP you'd run into a lot of issues that required command-line repairs or digging into ancient settings that took some real computer knowledge to work around. Vista was better, but had too many early issues that really hurt it in the long run; by 7, things had gotten a lot better.
By Windows XP, the differences from newer OSes like Vista and Windows 7 were minor. By XP, the use of jumpers was largely gone and hardware was mostly plug-and-play. Also, with the HAL carried over from Windows 2000, XP was far more stable. If anyone remembers the days before XP: when an application crashed, it often took the whole OS down with it. And PC gaming wasn't a niche, either; in 1995, Doom was estimated to be installed on more computers than Windows 95.
Again: it's not about being limited by console hardware; it's about being limited by money. You're putting the blame in the wrong place. Consoles are only the limit because they make the most money and offer the least "risk". While PC gaming is bigger than it's ever been, the money is still in consoles. We don't see games like Crysis anymore not "because consoles" but because publishers are risk-averse and don't want to spend $50+ million on an AAA PC exclusive with graphics that completely max out modern hardware.
It's common sense, dude. Developers will make games based on the lowest common denominator, which is the console. Also, I think PC has about a third of the gaming market, which isn't enough to justify making a game exclusive to PC and then later having a nightmare trying to port it to console. This is what happened to a lot of popular games back in the late 90s and early 2000s, because consoles were relatively weak compared to PC. So much so that games like Doom had features removed when ported to console: levels were made smaller or music was omitted. Doom 3 was released in 2004, and the only console that got a port was the original Xbox.

GPU gains are slowing for the same reason most tech advances have slowed down: we've hit a brick wall in what can reasonably be achieved generation to generation. This was predicted ages ago and we're seeing it now. Going from 14nm to 7nm does not provide the same boost that going from 20nm to 14nm did, and 20nm itself was a lesser change compared to prior die shrinks. There are diminishing returns at every step; nothing is ever going to keep increasing at the same rate forever. There is always a drop-off. Even the generational improvements on ARM have started to slow down compared to what they were a few years ago. There is a limit to what silicon can achieve.
GPUs are not CPUs, in that they don't have the same limits. Because CPUs do serial work, there's a limit to how fast you can make a CPU. GPUs are pure math machines, and the math they perform doesn't care what is done first or last. There's a reason GPUs have little cache and use high-latency RAM compared to CPUs: it doesn't help in the execution of that math. So if you want a faster GPU, you put in more CUDA cores or stream processors and more memory bandwidth. The problem with this is cost, since you're pushing for a bigger GPU die and faster memory. Things like 7nm are as much about reducing cost as about performance and power consumption. The bigger the GPU, the more likely a defect will occur on the silicon, and the lower the yield. Fewer GPUs per wafer means higher cost. AMD's chiplet design would do wonders here to reduce cost, but so far I've yet to see an AMD GPU using it.
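To illustrate the die-size vs. yield vs. cost point, here's a minimal sketch using the classic Poisson yield approximation (yield ≈ e^(−defect density × die area)) together with a standard dies-per-wafer estimate. The wafer cost, wafer size and defect density below are made-up illustrative numbers, not figures for any real process:

```python
import math

# Illustrative-only parameters; not real foundry figures.
WAFER_DIAMETER_MM = 300.0   # standard 300 mm wafer
WAFER_COST = 8000.0         # assumed wafer cost in dollars
DEFECT_DENSITY = 0.001      # assumed defects per mm^2

def dies_per_wafer(die_area_mm2):
    """Gross die count: wafer area / die area, minus an edge-loss term."""
    radius = WAFER_DIAMETER_MM / 2
    return (math.pi * radius ** 2 / die_area_mm2
            - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2):
    """Poisson yield: bigger dies are more likely to catch a defect."""
    yield_fraction = math.exp(-DEFECT_DENSITY * die_area_mm2)
    good_dies = dies_per_wafer(die_area_mm2) * yield_fraction
    return WAFER_COST / good_dies

for area in (200, 400, 800):  # die sizes in mm^2: midrange, big, huge GPU
    print(f"{area} mm^2 die: ~${cost_per_good_die(area):.0f} per good die")
```

With these assumed numbers, doubling the die area from 200 mm² to 400 mm² roughly 2.6×'s the cost per good die, and doubling it again more than triples it, which is exactly the cost pressure that pushes toward smaller dies and chiplet designs.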
 
Back in the 90s, games were built on their community-generated content. People were attracted to games based on the community and bought into it. With games now, the publisher doesn't want the community to do anything but fork over that predictable chunk of change every year, plus microtransactions for content only they have control over. The few that do allow community content want complete control and rights to all of it; case in point, Blizzard going insane after Dota 2 was picked up by Valve (DotA being an original mod for WC3 that Blizzard flagrantly ignored) and made millions. Their response was a shitty EULA update that means anything you make for their games is their property and they don't have to give you anything for it.

The only good games made these days are from small indie studios; I quite literally have not bought a new game in probably close to a decade now.
People forget that some of the best games we play today came out of the 90s as free mods. Team Fortress was a mod for Quake, which is how we have Team Fortress 2. Counter-Strike was a free mod for Half-Life that became the smash hit it is today. DotA was a free mod for Warcraft 3. So many games started out as mods and became games in their own right. Today we have Natural Selection, Chivalry: Medieval Warfare, DayZ, and the list goes on. If you enjoyed any of these games, then you owe it to PC gaming.

 
PC games aren't limited by console hardware; they're limited by cost.
Actually, developers have specifically mentioned console constraints as the reason for certain limitations in multi-platform titles. There's no question that consoles hamstring design size and scope. Example:

A one-handed ballistic shield that was intended to be introduced in Battlefield 4's fourth DLC Dragon's Teeth had to be dropped because of memory constraints of Xbox 360 and PlayStation 3, DICE has said.

The shield was supposed to offer players protection while firing a handgun, but the required animations supposedly took last-gen consoles past their allocation, forcing DICE to drop the idea across all platforms.

"We didn't have enough memory," senior animator Ryan Duffin said during a GDC talk. "It added about a megabyte, but a megabyte on a 2006 console is a lot, so it didn't work out."

I remember something similar from a Gearbox dev re: Borderlands 2. These anecdotes are few and far between, since it's dirty laundry and there's no business upside for developers to shine a light on differences between platforms in a multiplat title.

That said, I'd agree that PC gaming has net benefited from consoles - many of the big AAA multiplatform titles simply would not have been feasible to produce for PC gaming alone.
 
Whether PC or console or what have you, I'd venture a guess that there are two main things game makers have to deal with: the board of directors, and our ever-increasing demands for more, better, and more realistic, and on and on it goes, right? We want "open world" with limited-world gameplay, LOL. I know I've made comments about games good and not so good, but it was usually about the incomplete games I paid full price for, such as Mass Effect Andromeda (which, BTW, I still play often). Metro Exodus has some things I feel they ought to have fixed by now but haven't (crosshair dead on target, pull the trigger multiple times, and the enemy still doesn't get hit). That seems to be a common problem when I play Battlefield 4 and other FPS games as well. And in Exodus, who the heck jumps side to side while firing a rifle and still manages to score a hit on me? The AI in Exodus has uncanny aim.

So maybe for a while, future game releases shouldn't be about increased eye candy and the like, but about refining how the game plays, improving the story line, and limiting the number and length of cut scenes :barefoot:

IMO, the gaming industry has actually been doing quite well:
TF1.jpg
 
People forget that some of the best games we play today came out of the 90s as free mods. Team Fortress was a mod for Quake, which is how we have Team Fortress 2. Counter-Strike was a free mod for Half-Life that became the smash hit it is today. DotA was a free mod for Warcraft 3. So many games started out as mods and became games in their own right. Today we have Natural Selection, Chivalry: Medieval Warfare, DayZ, and the list goes on. If you enjoyed any of these games, then you owe it to PC gaming.

QWTF was an amazing Quake mod. I played it up until about 10 years ago when the ProzacTF server I frequented was shut down. Having an entire team of TF players vs Quake monsters was great. I still run bot servers for NS, TFC and Sven-Coop and occasionally make a newer map for each.

I will gladly take 20-30-year-old games over the modern dumpster fires that exist now; I couldn't care less how good the games look as long as they are easily moddable. Thank Jesus for Valve keeping these old games alive and working. I can tell you with certainty that if they were titles from EA or any other "AAA" game studio, they would have been gone and buried decades ago, especially with the push for online-only content where the publisher has full control over what you can do with games you buy but no longer own.
 
That said, I'd agree that PC gaming has net benefited from consoles - many of the big AAA multiplatform titles simply would not have been feasible to produce for PC gaming alone.
That's only true because consoles make up the majority of the market. This is why Crysis felt like a mistake for Crytek: the return wasn't justified compared to the investment. You had a game that pushed PC hardware so hard that to this day it will still make modern PCs struggle, and only a select few could actually play the game properly. Like console ports before it, the game ended up with some elements missing from the PC version. Remember when Dark Souls 2 had graphics that looked impressive at E3 but were stripped back because of console limitations? How about Watch Dogs at E3 vs. release?

But this does create issues for console exclusives, because the pendulum swings both ways. Making a game exclusive to the PlayStation or Xbox will help sell hardware, but you're also losing out on potential sales on PC. Multi-platform games don't have this problem, as they benefit greatly from releasing on all platforms, but exclusives are getting harder to justify. When a console fails, like the Xbox, you have no choice but to port your games over to PC. Nobody wants to make their games exclusive to the Xbox, and Microsoft isn't going to pony up that kind of money to make up for the loss. Hence Bungie left Microsoft and left the Halo franchise behind: Destiny was far more profitable.

 