Why Microsoft switched from Intel to PowerPC

erek

Impressed? The Xbox 360 was a good console.

"Microsoft and Intel's partnership stems back to the early 80's with MSDOS and Windows. Microsoft would use Intel to power the Original Xbox in 2001. Yet in 2005 with the next generation Xbox 360, they famously split from the chipmaker in favor of IBM and their PowerPC architecture - made famous by Apple and with PowerMac line of computers. In this episode we take a look at why Microsoft dumped Intel for the Xbox 360 game system."

 
The Xbox 360 really wasn't a good console, but neither was the PS3, for the same reasons. Design flaws with the console led to widespread failures, and it took years and several console revisions to fix. They couldn't be entirely blamed, though; the European Union had a lot to do with it with its then-recent ban on hazardous materials in consumer electronics, namely leaded solder. The RoHS-compliant replacement solder formulas, rushed into production, were horrific and caused widespread failures of basically everything that used them, which is why you don't see a lot of electronic gear from the mid-to-late 2000s around anymore: it all died. Even the heavily refined RoHS solders today are still crap and engineers hate working with them, especially on BGA chips with weird mechanical loads due to thermal expansion. There's no material or amalgam as forgiving as lead was.
 
The Xbox 360 really wasn't a good console, but neither was the PS3, for the same reasons. Design flaws with the console led to widespread failures, and it took years and several console revisions to fix. They couldn't be entirely blamed, though; the European Union had a lot to do with it with its then-recent ban on hazardous materials in consumer electronics, namely leaded solder. The RoHS-compliant replacement solder formulas, rushed into production, were horrific and caused widespread failures of basically everything that used them, which is why you don't see a lot of electronic gear from the mid-to-late 2000s around anymore: it all died. Even the heavily refined RoHS solders today are still crap and engineers hate working with them, especially on BGA chips with weird mechanical loads due to thermal expansion. There's no material or amalgam as forgiving as lead was.
You know, it's funny you mention that, because I remember that from around 2006-2009 there were a LOT of equipment failures with electronics, workstations, parts, etc.
What you are saying makes sense, and I remember RoHS becoming a new standard around that exact same time period, which would have been right around when all of those equipment failures really started to ramp up, especially compared to previous years.

It wasn't until mid-2009 that I started to see a reprieve from the vast increase in seemingly random equipment failures, and with what you are saying, that makes perfect sense.
Good memory, you are definitely right on the nose! (y)
 
No console has ever impressed me /shrugs

Same from a hardware perspective, but their software/games/exclusives have been generally better than most PC games at least, esp. in the past few years on PS4, and it's pretty amazing the kind of image quality they can squeeze out of 8-year-old hardware.
 
Same from a hardware perspective, but their software/games/exclusives have been generally better than most PC games at least, esp. in the past few years on PS4, and it's pretty amazing the kind of image quality they can squeeze out of 8-year-old hardware.

Not impressed... e.g. The Witcher 3 performs and looks better on PC than on any console... exclusives are needed because the consoles lack the power of a PC... so they need to lock people to their platform, and since performance/image quality is not better than on PC they cannot rely on the hardware... so exclusives it is.
 
Not impressed... e.g. The Witcher 3 performs and looks better on PC than on any console... exclusives are needed because the consoles lack the power of a PC... so they need to lock people to their platform, and since performance/image quality is not better than on PC they cannot rely on the hardware... so exclusives it is.

Image quality and specs mean little to the quality of the game to me, and I have a gaming PC with a 2080 and a 120 Hz ultrawide G-Sync monitor. I don't care why the consoles need the exclusives. I love games and I'm going to acquire them on any platform that has games worth playing, and it just so happens that most of my favorite games in the past several years have been on consoles (PS4 and Switch specifically). There are a few exceptions for PC games of course, and I will buy any cross-platform game on PC as well. I just don't buy into the PCMR shit or any fanboyism in general.
 
Same from a hardware perspective, but their software/games/exclusives have been generally better than most PC games at least, esp. in the past few years on PS4, and it's pretty amazing the kind of image quality they can squeeze out of 8-year-old hardware.

Exclusives, maybe. Any game that's been properly ported to PC, or was built as a multi-platform game from the start, always looks better on the PC. I am sometimes impressed by how good console games can look, but they don't impress me beyond that "not bad for such ancient hardware" kind of context.
 
Exclusives, maybe. Any game that's been properly ported to PC, or was built as a multi-platform game from the start, always looks better on the PC. I am sometimes impressed by how good console games can look, but they don't impress me beyond that "not bad for such ancient hardware" kind of context.

Yeah, see above, I pretty much only play the consoles for their exclusives. But given they've made up most of my favorite games lately, they've been well worth the purchase. Of course I'm not impressed with their graphics compared to my PC either, and I'd certainly like to have 60+ FPS on them, but personally I have no issues going from 120 Hz on my PC to 30 on my PS4, esp. for slower-paced single-player and story-focused games that make up most of the exclusives.
 
You know, it's funny you mention that, because I remember that from around 2006-2009 there were a LOT of equipment failures with electronics, workstations, parts, etc.
What you are saying makes sense, and I remember RoHS becoming a new standard around that exact same time period, which would have been right around when all of those equipment failures really started to ramp up, especially compared to previous years.

It wasn't until mid-2009 that I started to see a reprieve from the vast increase in seemingly random equipment failures, and with what you are saying, that makes perfect sense.
Good memory, you are definitely right on the nose! (y)

Plus there was the whole capacitor plague going around at that time, which added even more failure points in most electronics back then.
 
Image quality and specs mean little to the quality of the game to me, and I have a gaming PC with a 2080 and a 120 Hz ultrawide G-Sync monitor. I don't care why the consoles need the exclusives. I love games and I'm going to acquire them on any platform that has games worth playing, and it just so happens that most of my favorite games in the past several years have been on consoles (PS4 and Switch specifically). There are a few exceptions for PC games of course, and I will buy any cross-platform game on PC as well. I just don't buy into the PCMR shit or any fanboyism in general.
Yep. I own everything. Missing out on something because of biases is dumb. I'm a gamer. I love games. That's as complicated as it gets for me.
 
Damn, I thought those design flaws were simply because they crammed a lot of heat into a package that didn't adequately cool it off.
 
Damn, I thought those design flaws were simply because they crammed a lot of heat into a package that didn't adequately cool it off.

That was part of it, but the RoHS solder was the nail in the coffin. Lead-free RoHS solders, back then and today, lack the ductility of lead-based solders and have a tendency to work-harden from both thermally induced material movement and the heat itself. The joints get really hard and fail from the stress of the minute chip movements.

BGA chip packages, which started becoming common in complex and/or high-powered chips in the late 90s, see lots of heavy, uneven stresses that require the joints to be able to move around, something leaded solder did well. The solder balls under the chip do a lot to anchor it firmly in place, which is a problem as the chip heats up. Since a chip doesn't heat up evenly, usually being hotter in the center than at the periphery, you end up with a radial stress pattern where the solder balls are pushed in and out from the center of the chip.



This is obviously greatly exaggerated, but shows what happens when a material heats up. These forces are measured in tons, so even tiny imperceptible movements can generate thousands of pounds of force, which is how the solder balls fracture, usually from a shearing stress.
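
To put rough numbers on that, here's a back-of-envelope sketch (in Python) of the shear strain one corner ball sees over a single heat-up/cool-down cycle. Every value in it is an illustrative assumption, roughly typical of a large BGA on FR-4, not a measured Xbox 360 figure:

# Back-of-envelope estimate of BGA corner-ball shear strain from CTE mismatch.
# All values below are illustrative assumptions, not measured Xbox 360 figures.

dnp_mm         = 24.0   # distance from package center to a corner ball, mm (assumed ~35 mm BGA)
delta_cte_ppm  = 4.0    # CTE mismatch between package substrate and FR-4 board, ppm/degC (assumed)
delta_t_c      = 60.0   # temperature swing of one on/off cycle, degC (assumed)
ball_height_mm = 0.4    # standoff height of a solder ball, mm (assumed)

# Differential expansion at the corner ball over one thermal cycle
displacement_mm = dnp_mm * (delta_cte_ppm * 1e-6) * delta_t_c

# Shear strain = lateral displacement / joint height
shear_strain = displacement_mm / ball_height_mm

print(f"corner-ball displacement: {displacement_mm * 1000:.1f} um per cycle")
print(f"shear strain:             {shear_strain * 100:.2f} % per cycle")

With those assumed numbers it's only about 6 µm of movement, but as a strain across a 0.4 mm joint that's on the order of a percent per cycle, which is enough to matter over thousands of power cycles; it's also why keeping the device at a steady temperature (fewer cycles) stretches its life.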

[Images: a cracked BGA solder joint, and a progressive crack in a 17 mm BGA solder ball]

The only real fix for a failed BGA chip is to remove it from the PCB, get rid of all of the crap solder, and reball and remount it to the PCB with leaded solder. You can sometimes get away with drowning the chip in flux and then reworking it with a hot-air station or a BGA rework station. This isn't really a fix, though, and the joints will usually fail again. You can prolong the life of such devices by keeping them at a uniform temperature; keeping the device running for long periods of time will greatly extend its life since it will be subject to fewer heating and cooling cycles.
 
Yeah, I used to work for a large Japanese manufacturer, and our northbridge chips dying due to lead-free solder and poor flex was a known design flaw. I sent hundreds of boards for reballing.
Sad part is LG screwed this up on the otherwise near-perfect V10 and V20, in a slightly different way, with poor underfill, so even if you reball with actual lead it still fails eventually, just lasts longer. On my to-do list...
 
Yeah, I used to work for a large Japanese manufacturer, and our northbridge chips dying due to lead-free solder and poor flex was a known design flaw. I sent hundreds of boards for reballing.

I worked at Keysight on a temporary project several years ago and had the pleasure of talking with one of their engineers about RoHS solder. He told me about all of the horrific problems they had with it and how much engineering time had to go into their products to mitigate it, and even then stuff was still failing. He showed me a piece of prototype gear where you could induce BGA failure by just lightly pressing down on the PCB with your finger; it was very interesting.
 
I worked at Keysight on a temporary project several years ago and had the pleasure of talking with one of their engineers about RoHS solder. He told me about all of the horrific problems they had with it and how much engineering time had to go into their products to mitigate it, and even then stuff was still failing. He showed me a piece of prototype gear where you could induce BGA failure by just lightly pressing down on the PCB with your finger; it was very interesting.
Lmao, that was exactly one of the ways we tested and induced failures too. To the point where I had to cut reliefs out of the top covers of 300+ laptops with a Dremel and a jig. Still have that as a memento xD
 
I'd agree, but damn it had some good games! I think it turned out to be a good console in the end. But the first few years were rough.

Yes, the games were good, the console definitely not. I knew people who had bought 3, 4, and even 5 consoles because they kept failing. It wasn't until the "Elite" and "S" variants in 2009/2012 that the reliability of the consoles came way up. I still see broken earlier versions of the console in second-hand stores and the junkyard all the time.


Haha, replaced a lot of motherboards from those Dell OptiPlex GX260 and GX270 units due to that nonsense.
Thanks for bringing back so many fun memories! :p

I did that back in 2007, replaced 4000 motherboards in GX270 machines at a local government office. My thumbs got callused and bled from removing memory modules from those boards, not fun times.

But it wasn't fun times for the office either: the replacement boards failed less than a year later because they used the same garbage fake Nichicon/Rubycon capacitors. By that time, though, they were due for a refresh and the entire PC got replaced.

I've personally recapped at least a few hundred Dell machines over the years, and they never failed because I sourced legit capacitors from known good suppliers.
 
The Xbox 360 really wasn't a good console, but neither was the PS3, for the same reasons. Design flaws with the console led to widespread failures, and it took years and several console revisions to fix. They couldn't be entirely blamed, though; the European Union had a lot to do with it with its then-recent ban on hazardous materials in consumer electronics, namely leaded solder. The RoHS-compliant replacement solder formulas, rushed into production, were horrific and caused widespread failures of basically everything that used them, which is why you don't see a lot of electronic gear from the mid-to-late 2000s around anymore: it all died. Even the heavily refined RoHS solders today are still crap and engineers hate working with them, especially on BGA chips with weird mechanical loads due to thermal expansion. There's no material or amalgam as forgiving as lead was.

Yep. Furthermore, working with lead-free solder simply sucks. It's easier to 'touch up' with leaded solder and then desolder components, unless you want to risk lifting tracks and damaging electronics with 450 degrees C.

I by far prefer working on vintage electronics from the '80s/'90s and earlier.
 
Yep. Furthermore, working with lead-free solder simply sucks. It's easier to 'touch up' with leaded solder and then desolder components, unless you want to risk lifting tracks and damaging electronics with 450 degrees C.

I by far prefer working on vintage electronics from the '80s/'90s and earlier.

I never mix leaded and lead-free solder. The resulting amalgam is a garbage stew of incompatible metals and fluxes that creates an even worse solder joint than it started out as. The resulting joint often looks like fractured boulders in the side of a mountain under the microscope, rather than having the normal sheen of a tin/lead solder joint.

Lifting pads/tracks isn't a problem unless the PCB is damaged from overheating or by using improper desoldering equipment. I use a desoldering gun with a vacuum pump and rarely have such problems. I have had to fix butcher jobs done by people who had no idea what they were doing with a $10 Walmart soldering iron; in cases like those, anything is possible.
 
Yeah, see above, I pretty much only play the consoles for their exclusives. But given they've made up most of my favorite games lately, they've been well worth the purchase. Of course I'm not impressed with their graphics compared to my PC either, and I'd certainly like to have 60+ FPS on them, but personally I have no issues going from 120 Hz on my PC to 30 on my PS4, esp. for slower-paced single-player and story-focused games that make up most of the exclusives.

The 30Hz PS4 just kills the experience for me. Personally, I'm hoping that the PS5 finally lets me go back and play these older games at higher refresh rates to see how they were envisioned to be played.
 
The Sega Saturn blew my mind when I was younger and played Virtua Fighter for the first time.
Me and my friend saved all of our money and went halves on one. You should have seen the cashier's face when we started counting out change, lol.
 
The 30Hz PS4 just kills the experience for me. Personally, I'm hoping that the PS5 finally lets me go back and play these older games at higher refresh rates to see how they were envisioned to be played.
Yeah, I hear that. I actually bought Far Cry New Dawn on PS4 (even though I just beat it on PC) because I wanted to compare the HDR implementation.

Surprisingly, the picture quality on a 4K TV was breathtaking; it definitely looked as good as (or maybe better in some ways than) it does on PC with my monitor.

However, the frame rate killed it. 30 fps is a joke. It was so choppy it was on the verge of unacceptable. I don't know how people live like that.
 
The 30Hz PS4 just kills the experience for me. Personally, I'm hoping that the PS5 finally lets me go back and play these older games at higher refresh rates to see how they were envisioned to be played.
Not all games are frame-locked to 30 fps on the PS4, and the remasters of Dark Souls 1 & 2 both run at 60 fps at 4K on the PS4.
So, just like you are saying, hopefully older titles will be re-released and will run at 60 fps, like Bloodborne, Dark Souls 3, and Spider-Man.
 
Not all games are frame-locked to 30 fps on the PS4, and the remasters of Dark Souls 1 & 2 both run at 60 fps at 4K on the PS4.
So, just like you are saying, hopefully older titles will be re-released and will run at 60 fps, like Bloodborne, Dark Souls 3, and Spider-Man.

I have a hard time with Horizon Zero Dawn. It's just not fluid enough for me to be able to target accurately. I thought it was going to be more like Tomb Raider, but it feels unplayable. I had the same problem with Batman Arkham Knight. I played through the buggy mess on the computer years ago, but when I picked it up on the PS4, I couldn't play it. Too unsmooth to get into it.
 
I never mix leaded and lead-free solder. The resulting amalgam is a garbage stew of incompatible metals and fluxes that creates an even worse solder joint than it started out as. The resulting joint often looks like fractured boulders in the side of a mountain under the microscope, rather than having the normal sheen of a tin/lead solder joint.

Lifting pads/tracks isn't a problem unless the PCB is damaged from overheating or by using improper desoldering equipment. I use a desoldering gun with a vacuum pump and rarely have such problems. I have had to fix butcher jobs done by people who had no idea what they were doing with a $10 Walmart soldering iron; in cases like those, anything is possible.

I gotta say, I don't experience this problem mixing leaded and lead-free solder, and I have all the correct equipment. There's nothing to say I can't desolder lead-free solder directly, just at a far higher temperature and therefore with increased risk to the board and components, no matter what the equipment.
 
Yeah, I hear that. I actually bought Far Cry New Dawn on PS4 (even though I just beat it on PC) because I wanted to compare the HDR implementation.

Surprisingly, the picture quality on a 4K TV was breathtaking; it definitely looked as good as (or maybe better in some ways than) it does on PC with my monitor.

However, the frame rate killed it. 30 fps is a joke. It was so choppy it was on the verge of unacceptable. I don't know how people live like that.

I think it depends on the game, and on the person. I find some games much more tolerable at lower frame rates than others. I have an XOX for the convenience of couch play, though I'm mostly a PC gamer, and some titles are ok at 30 for me. FF15 is fine. I mean, I notice the lower FPS, don't get me wrong, but I am ok with it. Other games, not as much.

Also, I think it just bothers some people more. Just depends on the person.
 
Yeah. For some reason, third-person games are okay at 30 fps, like Until Dawn, Ratchet and Clank, etc. But first-person games really need a decent refresh rate.
 
Not impressed... e.g. The Witcher 3 performs and looks better on PC than on any console... exclusives are needed because the consoles lack the power of a PC... so they need to lock people to their platform, and since performance/image quality is not better than on PC they cannot rely on the hardware... so exclusives it is.

It's all about price to performance. Most people don't buy their kids gaming PCs that outperform a console. Most adults don't buy themselves PCs that outperform them either (at least in the first year or two after a new console generation).

Yes, PC gaming will always be better... but until a console is 3-4 years old, building a PC that does a better job in general carries a price premium most people won't pay.

I don't care who you are: if you're a middle-aged gamer today, chances are your first real gaming was done on a console. (There are very few of us who remember having VIC-20s, Commodore 64s and 128s, Amigas, and strings of PCs... not all of us were born to computer-programming parents lol.) Same is true of kids today... I just helped one of my kids' friends (they're in their mid-20s now) become a member of the PC master race. To really get there you have to reach a point in your life where you have a few grand of disposable income (or have parents with a little more than the average income willing to bankroll your early serious PC gaming lol).
 
I gotta say, I don't experience this problem mixing leaded and lead-free solder, and I have all the correct equipment. There's nothing to say I can't desolder lead-free solder directly, just at a far higher temperature and therefore with increased risk to the board and components, no matter what the equipment.

The only equipment I've had issues removing lead-free solder from was Pegatron gear (ASUS, ASRock), because their pad sizes for through-hole components are like half to a quarter of the usual norm. They also like to print the component masking backwards, making it extra fun.

I know there are lead-free solders that require much higher temperatures to melt, but I have not come across any on electronic gear I've repaired; it just looks like crap when it melts. If I had to compare it to something, it looks a bit like dirty gallium.

Not trying to discredit your skills or anything, just going by what I've worked on.
 
The only equipment I've had issues removing lead-free solder from was Pegatron gear (ASUS, ASRock), because their pad sizes for through-hole components are like half to a quarter of the usual norm. They also like to print the component masking backwards, making it extra fun.

I know there are lead-free solders that require much higher temperatures to melt, but I have not come across any on electronic gear I've repaired; it just looks like crap when it melts. If I had to compare it to something, it looks a bit like dirty gallium.

Not trying to discredit your skills or anything, just going by what I've worked on.

Lead-free solder looks like crap when it melts, this I 100% agree with; that's why you don't use it when performing repairs and instead dilute it with leaded solder first. Lead-free solder is total garbage and the cause of most BGA failures.
 
I liked my Xbox 360... or actually 360s (red-ringed the first). It was a pretty cool system for the time, with some good exclusives that took a long time to make it to the PC. I have a lot of good memories with it; far more than with the current crop of consoles.

Personally, I consider modern consoles the entry point into gaming. They're appliance-simple to use, inexpensive, and offer a curated gaming experience that usually results in a relatively good time. I've been spoiled by high-frame-rate gaming, so 30 or even 60 Hz gaming isn't for me anymore. However, I have family who prefer to play on them even though they have gaming PCs, because 30 Hz doesn't bother them and they just want to chill on the couch with a controller and not mess with settings, mice and keyboards, resolutions, drivers, or anything like that.

I have to admit, I do miss the late 80s and early 90s, when consoles had custom chips that could do things PCs struggled with. Sonic the Hedgehog ran at 60 Hz on a 7 MHz Motorola 68000 (a 1979 design) with a VDP and 72 KB of RAM. It took a Pentium-class CPU, a Sound Blaster, and a decent graphics card to run Sonic when it was released on PC in 1995, and it still slowed down. Plus, there was something fun about knowing there was a chip in your video game cart that powered the game. Nintendo's most famous was the Super FX chip, but they had dozens of different chips across the NES and SNES era. It was an exciting time.

Modern consoles just don't have that magic anymore.
 
I liked my Xbox 360... or actually 360s (red-ringed the first). It was a pretty cool system for the time, with some good exclusives that took a long time to make it to the PC. I have a lot of good memories with it; far more than with the current crop of consoles.

Personally, I consider modern consoles the entry point into gaming. They're appliance-simple to use, inexpensive, and offer a curated gaming experience that usually results in a relatively good time. I've been spoiled by high-frame-rate gaming, so 30 or even 60 Hz gaming isn't for me anymore. However, I have family who prefer to play on them even though they have gaming PCs, because 30 Hz doesn't bother them and they just want to chill on the couch with a controller and not mess with settings, mice and keyboards, resolutions, drivers, or anything like that.

I have to admit, I do miss the late 80s and early 90s, when consoles had custom chips that could do things PCs struggled with. Sonic the Hedgehog ran at 60 Hz on a 7 MHz Motorola 68000 (a 1979 design) with a VDP and 72 KB of RAM. It took a Pentium-class CPU, a Sound Blaster, and a decent graphics card to run Sonic when it was released on PC in 1995, and it still slowed down. Plus, there was something fun about knowing there was a chip in your video game cart that powered the game. Nintendo's most famous was the Super FX chip, but they had dozens of different chips across the NES and SNES era. It was an exciting time.

Modern consoles just don't have that magic anymore.

It wasn't so much that older consoles had that magic... it's that older PCs didn't. IBM had a business focus with their systems and didn't put any kind of gaming features in their graphics chips. The biggest omission was sprites, but there was also no scaling, only one palette, etc. Some other earlier systems actually did, like the C64, which could do some cool stuff despite how primitive and cheap it was. Once 3D accelerators became a thing, well, then PCs got all that magic. A good 3D card had support for everything they could pack in; they were designed for gaming, after all. So that's where we are today: on a PC you get all the gaming features out there. AMD and nVidia are always going to pack in whatever latest features they can, because of course they are. Thus PCs will have all the magic consoles do.

History could have been very different had IBM put gaming features in their video processors. There is no reason a VGA card couldn't have had hardware sprites, a scaler, and other functions that consoles did. Had it had those, games could have been much more advanced and/or needed much less CPU. However, since it didn't, you had to do all that on the CPU, which is why such a seemingly weak console could do so well.
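
To make "do all that on the CPU" concrete, here's a tiny hypothetical sketch in Python (not from any real game or engine) of what a software sprite blit looks like when there's no sprite hardware: every visible pixel of every moving object gets copied into the framebuffer by the CPU, every frame.

# Minimal software-sprite sketch: with no hardware sprites, the CPU copies
# every visible pixel of every object into the framebuffer each frame.
# Purely illustrative; all names and sizes here are made up.

WIDTH, HEIGHT = 320, 200                 # VGA mode 13h-style framebuffer
framebuffer = bytearray(WIDTH * HEIGHT)  # one byte per pixel (palette index)

def blit_sprite(sprite, sw, sh, x, y, transparent=0):
    """Copy an sw x sh sprite to (x, y), skipping transparent pixels."""
    for row in range(sh):
        for col in range(sw):
            pixel = sprite[row * sw + col]
            if pixel == transparent:
                continue                 # color-key transparency, tested per pixel on the CPU
            px, py = x + col, y + row
            if 0 <= px < WIDTH and 0 <= py < HEIGHT:
                framebuffer[py * WIDTH + px] = pixel

# 50 sprites of 16x16 pixels = 12,800 per-pixel operations, 60 times a second,
# all on the CPU - work a console's video chip handled in dedicated hardware.
sprite = bytes(range(256))               # dummy 16x16 sprite
for i in range(50):
    blit_sprite(sprite, 16, 16, (i * 13) % WIDTH, (i * 7) % HEIGHT)

A Genesis-style VDP did the copy, the transparency test, and the scrolling in silicon, which is a big part of why a 7 MHz 68000 could keep pace with far faster PC CPUs of the day.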
 
It wasn't so much that older consoles had that magic... it's that older PCs didn't. IBM had a business focus with their systems and didn't put any kind of gaming features in their graphics chips. The biggest omission was sprites, but there was also no scaling, only one palette, etc. Some other earlier systems actually did, like the C64, which could do some cool stuff despite how primitive and cheap it was. Once 3D accelerators became a thing, well, then PCs got all that magic. A good 3D card had support for everything they could pack in; they were designed for gaming, after all. So that's where we are today: on a PC you get all the gaming features out there. AMD and nVidia are always going to pack in whatever latest features they can, because of course they are. Thus PCs will have all the magic consoles do.

History could have been very different had IBM put gaming features in their video processors. There is no reason a VGA card couldn't have had hardware sprites, a scaler, and other functions that consoles did. Had it had those, games could have been much more advanced and/or needed much less CPU. However, since it didn't, you had to do all that on the CPU, which is why such a seemingly weak console could do so well.


That's a great perspective on it, and I wish IBM had gone down that route. I recall the Apple II being a relatively good gaming machine for its time, as well as the C64, but as far as the old 8086 machines go, I can't think of much from my childhood that I played on them. Those were all closed platforms compared to the rather open IBM compatibles. I also remember the early days of gaming video cards: S3, Rendition, Voodoo, PowerVR. All good times. One thing that my memory likes to block out is how much of a pain it was making sure your card was compatible with the game due to drivers. It seemed everyone had their own proprietary API back then, and it led to some serious game incompatibilities. I remember having both an Nvidia card and a Voodoo card because some games simply weren't compatible. And that was when there were only 5 or so main chips on the market.

If IBM had standardized the parts across their entire line, including the chips, I could see it having been a better gaming platform. But back then there were dozens of chips and dozens of PC manufacturers, which could lead to hundreds of chip combinations depending on price point. Without standardization across the board, it would have meant programming and optimizing your game dozens of times over. For the younger crowd: DirectX wasn't a thing back then.
 
That's a great perspective on it, and I wish IBM had gone down that route. I recall the Apple II being a relatively good gaming machine for its time, as well as the C64, but as far as the old 8086 machines go, I can't think of much from my childhood that I played on them. Those were all closed platforms compared to the rather open IBM compatibles. I also remember the early days of gaming video cards: S3, Rendition, Voodoo, PowerVR. All good times. One thing that my memory likes to block out is how much of a pain it was making sure your card was compatible with the game due to drivers. It seemed everyone had their own proprietary API back then, and it led to some serious game incompatibilities. I remember having both an Nvidia card and a Voodoo card because some games simply weren't compatible. And that was when there were only 5 or so main chips on the market.

If IBM had standardized the parts across their entire line, including the chips, I could see it having been a better gaming platform. But back then there were dozens of chips and dozens of PC manufacturers, which could lead to hundreds of chip combinations depending on price point. Without standardization across the board, it would have meant programming and optimizing your game dozens of times over. For the younger crowd: DirectX wasn't a thing back then.

In the early days it was pretty standard by fiat. IBM released a graphics adapter and all the others were compatible: CGA, EGA, VGA. Occasionally someone would extend it a bit; Tandy did with CGA, making a version with more RAM that could do 16 colors, which a few games supported. However, basically everyone just did what IBM did, and IBM didn't offer any kind of gaming features. Then the Super VGA days came about and everything went to shit. IBM was really slow getting anything out, and it was expensive, so companies started going their own way, and that is when all the compatibility nightmares started. Some cards DID offer acceleration back then, but it was still mostly "business targeted," so there didn't tend to be things like sprites; it was faster blitters and so on, which games could make use of, but it wasn't as good as hardware dedicated to doing "game stuff." Also, games often didn't rely on it and did things on the CPU, since not everything had the same features and all you could really rely on was a given resolution/color support.

However, we don't have to worry about that now. We get to have ALL THE TOYS ALL THE TIMES! :D
 