PCI Express 2.0????

DoggyDaddi

Ok.. are any applications, games, or interactive/useful software even CLOSE to approaching bandwidth saturation with current PCIe? I get the creepy feeling this is just another "Why improve/support what we have, when we can induce upgrades yet again?"

We never really got close to filling AGP 8x, did we? What about AGP Pro and all that nonsense? It seems development only continues while something is still 'buzzworthy'?


*flails!!* :mad:
 
What makes you think the update is all about bandwidth saturation?

It's not like the AGP -> PCIe switch. It's more like the AGP to AGP 2x switch. New motherboards coming out are just going to support the newer standard. I'm also pretty sure (though not 100% positive, having not read the tech briefs) that PCIe 2.0 will be backwards compatible with PCIe 1.0.
 
No one is forcing you to upgrade.

Systems will automatically have the new feature set this fall and onwards, which is compatible with current PCIe video cards.

The bandwidth improvements are certainly welcomed when considering SLI and Quad SLI and other multiple-GPU solutions. The 8800 GTX in SLI eats up tons of bandwidth, and faster PCIe bus speeds can improve performance at very high settings such as 2560x1600 with AA.

There are many other improvements in PCIe 2.0 though besides just bandwidth. One new feature I am really looking forward to is external PCIe. Imagine setting up an external box JUST for video cards.

Also, remember, PCIe is not just for graphics, it includes other devices as well, it is a true "system" bus, not a graphics only system like AGP.
 
Hardware has to come before software; if there's no hardware to work on, then no game developer is going to code for it. That's been the norm since the early days of hardware video acceleration. I remember when the GeForce 1 came out, everybody was asking what the hell T&L was, what it did, and why we should pay $300 for a card with features that no game (at that time) supported. If I remember correctly, the first T&L game was Evolva, and it came out later that year. But it's still inevitable that the hardware has to come BEFORE the software. And right now we have a LOT of unused bandwidth with current PCIe.
 
Also, one other very welcomed spec improvement is that PCIe 2.0 slots provide more power to video cards, reducing the amount needed from auxiliary 6-pin power connectors.
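For rough numbers, the power budget arithmetic looks like this. Treat the figures below as the commonly quoted connector limits rather than spec quotes: 75 W through the slot, 75 W per 6-pin plug, and 150 W for the 8-pin plug that arrived alongside the 2.0-era cards:

```python
# Back-of-the-envelope power budget for a graphics card.
# These limits are the commonly quoted ones, used here as assumptions.
SLOT_W = 75        # max draw through a PCIe x16 slot
SIX_PIN_W = 75     # per 6-pin auxiliary connector
EIGHT_PIN_W = 150  # per 8-pin auxiliary connector

def card_power_budget(six_pins=0, eight_pins=0):
    """Total power (W) available to a card with the given aux connectors."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(card_power_budget(six_pins=2))               # dual 6-pin card: 225 W
print(card_power_budget(six_pins=1, eight_pins=1)) # 6-pin + 8-pin: 300 W
```

Either way, the practical upshot is the same: more headroom per card without stacking ever more auxiliary plugs onto it.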
 
I guess my 'creepy feeling' comes from the almost immediate declaration of its effectively doubled bandwidth. Heck.. we can't even get games to properly support dual cores, and now we have quads... and 64-bit support is mediocre at best.

I can't even upgrade to XP-64 or Vista because most of the support software I use won't function properly under them :mad:

All my new gear is arriving tomorrow, and dang if the only thing holding me up NOW is software.

I look at it like this. You can use a sledgehammer to drive a ball 400 yards or better off the tee, but in the end you're still using a sledgehammer to do the work of a fine club and skill.

I would like to see tools and software that actually implement and optimize the situations we have NOW, rather than continuously 'brute forcing' our way through every situation in pursuit of a handful more frames per second.

My shoulder hurts, and I'm tired of swinging this damn sledgehammer :p
 
I guess my 'creepy feeling' comes from the almost immediate declaration of its effectively doubled bandwidth. Heck.. we can't even get games to properly support dual cores, and now we have quads... and 64-bit support is mediocre at best.

64-bit support is weak, I'll give you that. But I tend to agree with Gabe Newell who, while talking about retooling Source to be multithreaded, said that the ROI on dual-core processor support isn't worth the cost of development. He said games won't truly take off on multicore platforms until at least 4 cores are in place. I tend to believe him.

I would like to see tools and software that actually implement and optimize the situations we have NOW, rather than continuously 'brute forcing' our way through every situation in pursuit of a handful more frames per second.

My shoulder hurts, and I'm tired of swinging this damn sledgehammer :p

The bandwidth is just one part of the upgrade. The important part about that, even, is that the transceiver technology they are recommending for it is a mature product. If it was all new tech, I'd be concerned. But it's not. It is a logical, evolutionary, next step. It's hard to look past the news bulletins and read between the lines, but there are plenty of other improvements and changes in PCIe 2.0, beyond bandwidth.

Also, PCIe has nothing to do with 64-bit or multi-core support in games. And don't forget, we are currently on version 3 of the good ol' PCI bus. Everything changes and evolves, especially with computer technology.
 
It always blows my mind when people freak out about stuff like this.

If they don't stay ahead of the game with the PCIe specification, then video cards which require the bandwidth will never be built.

PCIe is the foundation upon which all other devices will be built. If PCIe doesn't stay ahead of everyone's needs, then nobody has room to grow.

Why in the hell would anyone look at this announcement and think someone was trying to make you upgrade? It's short-sighted and paranoid, I swear.
 
It always blows my mind when people freak out about stuff like this.

If they don't stay ahead of the game with the PCIe specification, then video cards which require the bandwidth will never be built.

PCIe is the foundation upon which all other devices will be built. If PCIe doesn't stay ahead of everyone's needs, then nobody has room to grow.

Why in the hell would anyone look at this announcement and think someone was trying to make you upgrade? It's short-sighted and paranoid, I swear.

Ummm... because AGP was abandoned before its capacity was even close to being approached?

Sure.. everyone likes to claim PCIe isn't JUST for graphics..

Yeah.. sorry, but it has no PRACTICAL use for 99% of the consumer base, and has even less support from the industry itself. For over a year we've had PCIe 1x, 2x, 4x, and 8x... and jack crap for a selection implementing them. No hardware, and CERTAINLY no software.

I keep reading people bitch about Creative "getting off their ass" to make a PCIe sound card.... why the hell SHOULD they? The industry just keeps trotting out new standards and dumping support for anything previous. All that does is crush developing companies who simply can't afford to retool and redirect their dev every 6 months when a new standard or 'feature' gets thrown at them. Creative isn't terribly interested in jumping on any bandwagons, and I don't blame them.. I wish MORE companies were willing to wait, as all it does is drive costs up.

Something else to look at: DDR has been thoroughly embedded in the computer enthusiast community, with DDR2 being slow to be adopted... I sit here thinking, "It's been almost 10 years; shouldn't the costs of production be almost negligible by now?"

Yes.. but they keep the costs artificially inflated, otherwise folks avoid upgrading to DDR2. They still make both, so it's not like going from 939 to AM2.

I constantly see posts and 'news' that quote corporate buzz-speak.. using words like 'quantum'.... To me it's a lot like swearing: you do it enough, and it stops really carrying any relevant meaning.. it's just more horseshit to try and make a point. Problem is, by the time you actually get around to MAKING your point, everyone's already stopped listening.

BTW... why do you think Microcrap made DX10 Vista-only? It's ALWAYS been a separate install, irrelevant to the OS version. And DON'T tell me it's to cut down on scattered support and dev costs for multiple platforms, as they plan to have something like what... SIX versions of Vista? From everything I've read, most of the 'steps' are isolated and separately installable modules, and the more you pay, the more of those are included. You CAN'T tell me DX10 couldn't have been a separate module. It can't possibly be that hard to adapt DX10 to XP Pro or XP-64, as Vista is supposed to be backwards compatible already..

Perhaps you've taken the situation as "the glass being half full".... I'm sitting here thinking, "Why the HELL should I pay someone $5 for that glass of water when I've got ice-cold water waiting for me in my fridge at home?" I go home to enjoy MY water; you pay for that half glass of water, and when he pours it into your hands instead of a glass, don't take "The industry hasn't adopted or implemented this water yet, so you're going to have to wait for a glass" as an answer!
 
We don't need it yet, take it back! Seriously, they should probably burn their warehoused quad-cores as well. Terabyte hard drives? Crap platters. :rolleyes:
 
It makes sense to worry when stuff like this could make your next upgrade a pain in the neck and force you to buy a new video card. That's reason enough for me to worry. If it's backwards compatible, though, it's all good.
There is a long history of worthless upgrades. I don't have a problem with what Microsoft is doing; why should they have to spend the time making DX10 XP-compatible, especially since it would shoot their sales in the foot? The only problem I have with it is the fact that it's illegal for anyone else to try, and DirectX is a monopoly.
 
With some of you guys' attitudes we wouldn't progress very far with computer technology; we'd stagnate. Honestly, I don't understand it. Is this not the [H]?

Me personally, I want to see technology move forward and improve toward the ultimate goal of lifelike graphics, holograms, and holodecks. Technology is great, and it must keep moving forward; the software is always going to lag behind. It's been this way forever.

What came first, chicken or egg? In the case of computers the chicken has to come first, and the egg follows.
 
Ummm... because AGP was abandoned before its capacity was even close to being approached?

Sure.. everyone likes to claim PCIe isn't JUST for graphics..

Yeah.. sorry, but it has no PRACTICAL use for 99% of the consumer base, and has even less support from the industry itself. For over a year we've had PCIe 1x, 2x, 4x, and 8x... and jack crap for a selection implementing them. No hardware, and CERTAINLY no software.

I keep reading people bitch about Creative "getting off their ass" to make a PCIe sound card.... why the hell SHOULD they? The industry just keeps trotting out new standards and dumping support for anything previous. All that does is crush developing companies who simply can't afford to retool and redirect their dev every 6 months when a new standard or 'feature' gets thrown at them. Creative isn't terribly interested in jumping on any bandwagons, and I don't blame them.. I wish MORE companies were willing to wait, as all it does is drive costs up.

Something else to look at: DDR has been thoroughly embedded in the computer enthusiast community, with DDR2 being slow to be adopted... I sit here thinking, "It's been almost 10 years; shouldn't the costs of production be almost negligible by now?"

Yes.. but they keep the costs artificially inflated, otherwise folks avoid upgrading to DDR2. They still make both, so it's not like going from 939 to AM2.

I constantly see posts and 'news' that quote corporate buzz-speak.. using words like 'quantum'.... To me it's a lot like swearing: you do it enough, and it stops really carrying any relevant meaning.. it's just more horseshit to try and make a point. Problem is, by the time you actually get around to MAKING your point, everyone's already stopped listening.

BTW... why do you think Microcrap made DX10 Vista-only? It's ALWAYS been a separate install, irrelevant to the OS version. And DON'T tell me it's to cut down on scattered support and dev costs for multiple platforms, as they plan to have something like what... SIX versions of Vista? From everything I've read, most of the 'steps' are isolated and separately installable modules, and the more you pay, the more of those are included. You CAN'T tell me DX10 couldn't have been a separate module. It can't possibly be that hard to adapt DX10 to XP Pro or XP-64, as Vista is supposed to be backwards compatible already..

Perhaps you've taken the situation as "the glass being half full".... I'm sitting here thinking, "Why the HELL should I pay someone $5 for that glass of water when I've got ice-cold water waiting for me in my fridge at home?" I go home to enjoy MY water; you pay for that half glass of water, and when he pours it into your hands instead of a glass, don't take "The industry hasn't adopted or implemented this water yet, so you're going to have to wait for a glass" as an answer!


So what's the problem? Really? The spec is backwards compatible. All of the current PCI-e 1.0 spec devices will work in boards supporting the 2.0 specification, just as 2.0 devices will work in 1.0 spec boards.

There are a number of devices available for PCI-e. Has Creative done it? No, but that's because they are fat lazy fucks that are sitting on their ass playing the, "We're the only game in town, so what are you going to do about it!" card. The fact is there is no good reason for them not to develop for the PCI-e interface. They better face it. That is where the technology is going, and they better get onboard, or get the fuck out.

PCI replaced ISA, and went through a number of revisions during its time. PCI-e is going to replace PCI, and the new 2.0 specification is no different from the numerous updates made to PCI during its life cycle.
 
No one is forcing you to upgrade.

Systems will automatically have the new feature set this fall and onwards, which is compatible with current PCIe video cards.

The bandwidth improvements are certainly welcomed when considering SLI and Quad SLI and other multiple-GPU solutions. The 8800 GTX in SLI eats up tons of bandwidth, and faster PCIe bus speeds can improve performance at very high settings such as 2560x1600 with AA.

There are many other improvements in PCIe 2.0 though besides just bandwidth. One new feature I am really looking forward to is external PCIe. Imagine setting up an external box JUST for video cards.

Also, remember, PCIe is not just for graphics, it includes other devices as well, it is a true "system" bus, not a graphics only system like AGP.

Exactly. One possible improvement MIGHT be latency. According to Creative Labs the high latency of PCIe is why there won't be a PCIe X-Fi anytime soon. PCIe 2.0 might address this issue.

People need to get over thinking that PCIe was a design intended to replace AGP. It wasn't. Or at least, replacement of AGP wasn't its only design goal. It replaces PCI primarily. Although AGP had plenty of bandwidth, and arguably still does, it has other limitations which PCIe addresses.
 
I was telling Mark yesterday how I can't wait for an all-PCIe motherboard, with no PCI at all. I'm looking forward to that day. He said that Creative is most likely what's holding people back; as Dan stated, they think the latency is too high to make the switch to PCIe. Until Creative does, we'll be stuck with PCI.
 
With some of you guys' attitudes we wouldn't progress very far with computer technology; we'd stagnate. Honestly, I don't understand it. Is this not the [H]?

Me personally, I want to see technology move forward and improve toward the ultimate goal of lifelike graphics, holograms, and holodecks. Technology is great, and it must keep moving forward; the software is always going to lag behind. It's been this way forever.

What came first, chicken or egg? In the case of computers the chicken has to come first, and the egg follows.

The problem is they keep trying to sell us tires speed-rated for 200 mph on cars that go 60. And the only way to get the car that goes 65 is to buy it with 250 mph-rated tires and get a new motherboard too (couldn't think of a car analogy for that one, lol).
 
It takes a simple mind to look at a complex issue that way. If Creative wants PCI, then what's to say board makers don't just start releasing mobos with only 1 or 2 PCI slots? :p First off, I believe the ASUS Socket F mobo only has 2 PCI slots, 4 PCIe (2 @ x16, 2 @ x8), and 2 PCIe x1, and on most SLI mobos the PCI slots are mostly unusable because of the bigger, badder PCIe cards. The X-Fi series of cards was not that great and really has way too many flaws, with way too many people I know switching back to Audigy cards because they couldn't get their X-Fi cards to work.

Glad I'm waiting; this should make the end of the year look like a lot of fun :D
 
I don't know about that. I have an X-Fi, and I have no problems. It sounds 5000% better than the old Audigy before it.

Like it or not, Creative is a major player in hardware, and them supporting (or not supporting) PCI-Express is important.
 
I don't know about that. I have an X-Fi, and I have no problems. It sounds 5000% better than the old Audigy before it.

Like it or not, Creative is a major player in hardware, and them supporting (or not supporting) PCI-Express is important.

No, it's not. They aren't that significant at all. The bulk of computers on the market use integrated audio, and their users never, ever upgrade to a standalone card. With onboard audio on every consumer-level board made, Creative's sound card presence is relegated mostly to enthusiast-level markets.

It basically comes down to the same reason Intel actually has the largest share of the video market... integrated video. The same applies here. Most motherboards use Realtek or Crystal audio codecs, and I'm willing to bet they have a far larger market presence than Creative when you look at the big picture.
 
I was telling Mark yesterday how I can't wait for an all-PCIe motherboard, with no PCI at all. I'm looking forward to that day. He said that Creative is most likely what's holding people back; as Dan stated, they think the latency is too high to make the switch to PCIe. Until Creative does, we'll be stuck with PCI.

I was thinking about this just the other day. I would like to see this as well. The only thing holding us back right now are hardware companies that won't get with the program. (Ageia, Creative Labs)

No, it's not. They aren't that significant at all. The bulk of computers on the market use integrated audio, and their users never, ever upgrade to a standalone card. With onboard audio on every consumer-level board made, Creative's sound card presence is relegated mostly to enthusiast-level markets.

It basically comes down to the same reason Intel actually has the largest share of the video market... integrated video. The same applies here. Most motherboards use Realtek or Crystal audio codecs, and I'm willing to bet they have a far larger market presence than Creative when you look at the big picture.

Sad, but true. Creative Labs cards are typically found in enthusiast computers, but those are the boards we are talking about right now. That doesn't mean that Creative is insignificant; the enthusiast crowd has the loudest voice, even though we are a small percentage of the market. We are responsible as a group for these companies pushing new technologies. It is because of us that these advances are made and $600 graphics cards are created. Eventually, as the features in new high-end cards become cheaper, they filter their way down to the other market segments. Creative Labs is like 3Com/US Robotics used to be: though most systems used the junk modems that came bundled with them, US Robotics and later 3Com sold more aftermarket modems than anyone else. Creative Labs sells more aftermarket sound cards than any other company. Though they are outnumbered by the onboard audio codecs, they are nonetheless a significant voice in the market.

I don't know about that. I have an X-Fi, and I have no problems. It sounds 5000% better than the old Audigy before it.

Like it or not, Creative is a major player in hardware, and them supporting (or not supporting) PCI-Express is important.

They are a major player, and while I agree that their support of PCIe is important, I don't think it would make or break PCIe as a standard. Companies like Creative releasing PCIe products will affect just how much longer legacy PCI slots remain on our boards.

It takes a simple mind to look at a complex issue that way. If Creative wants PCI, then what's to say board makers don't just start releasing mobos with only 1 or 2 PCI slots? :p First off, I believe the ASUS Socket F mobo only has 2 PCI slots, 4 PCIe (2 @ x16, 2 @ x8), and 2 PCIe x1, and on most SLI mobos the PCI slots are mostly unusable because of the bigger, badder PCIe cards. The X-Fi series of cards was not that great and really has way too many flaws, with way too many people I know switching back to Audigy cards because they couldn't get their X-Fi cards to work.

Glad I'm waiting; this should make the end of the year look like a lot of fun :D

There are several boards with only 1 or 2 PCI slots. I don't know where you were going with this.

On the second part, the Creative X-Fi sounds FAR superior to all Creative audio cards that came before it. The only people I know of that had to switch back to an Audigy are people that own 680i boards and are experiencing the Rice Krispies sound issue with SLI enabled.
 
Whether or not they dominate market share doesn't make Creative less of a player. Look at percentages of PC to Mac. Are you going to also say that Mac is not a major player? When you think of a sound upgrade, the first word that pops into your head is 'Creative'. Why? Because they're a major player.

And, for the record, when I get my EET degree and a job with Intel, integrated everything is going to go bye-bye.

THIS IS TIMESTAMPED, STEAL MY IDEA AND YOU HAVE TO PAY ME ROYALTIES.

Why aren't northbridges just sockets like CPU sockets are? Why can't you have a GPU socket on a motherboard? Sure, you could bundle an Intel Extreme Media Graphics Accelerator Plus XLXGA 3990 (names will have to get longer in the future :D ) with your motherboard, and enthusiasts like myself would still want a standalone card, but integration helps no one. The standards will change, prices will go down, and computers will not be exactly like they are now ever again.
 
Whether or not they dominate market share doesn't make Creative less of a player. Look at percentages of PC to Mac. Are you going to also say that Mac is not a major player? When you think of a sound upgrade, the first word that pops into your head is 'Creative'. Why? Because they're a major player.

And, for the record, when I get my EET degree and a job with Intel, integrated everything is going to go bye-bye.

THIS IS TIMESTAMPED, STEAL MY IDEA AND YOU HAVE TO PAY ME ROYALTIES.

Why aren't northbridges just sockets like CPU sockets are? Why can't you have a GPU socket on a motherboard? Sure, you could bundle an Intel Extreme Media Graphics Accelerator Plus XLXGA 3990 (names will have to get longer in the future :D ) with your motherboard, and enthusiasts like myself would still want a standalone card, but integration helps no one. The standards will change, prices will go down, and computers will not be exactly like they are now ever again.


Wow. Just wow.

I'll leave this one to the electronic/electrical engineers.
 
Whether or not they dominate market share doesn't make Creative less of a player. Look at percentages of PC to Mac. Are you going to also say that Mac is not a major player? When you think of a sound upgrade, the first word that pops into your head is 'Creative'. Why? Because they're a major player.

And, for the record, when I get my EET degree and a job with Intel, integrated everything is going to go bye-bye.

THIS IS TIMESTAMPED, STEAL MY IDEA AND YOU HAVE TO PAY ME ROYALTIES.

Why aren't northbridges just sockets like CPU sockets are? Why can't you have a GPU socket on a motherboard? Sure, you could bundle an Intel Extreme Media Graphics Accelerator Plus XLXGA 3990 (names will have to get longer in the future :D ) with your motherboard, and enthusiasts like myself would still want a standalone card, but integration helps no one. The standards will change, prices will go down, and computers will not be exactly like they are now ever again.

No, Mac isn't a major player in the computer market. In the scheme of things they are a niche market, and the same goes for Creative Labs.

I'm sorry to say, but the bulk of the market just doesn't recognize Creative Labs the way the people on these type of forums do. Do you really think "Joe Schmuck" is going to go down to CompUSA to look at sound cards, see the information below, and then grab that X-Fi card?

X-Fi Fatal1ty: $149.99
X-Fi Gamer: $99.99
X-Fi Platinum: $236.99

And that's just the X-Fi line. Let's look at some of the older cards.

SB Audigy SE: $32.99 ---> Very plausible choice
SB Live! 24-bit: $29.99 ---> Very plausible choice
SB Audigy 4: $79.99 ---> Not much more likely than the X-Fi
SB Audigy 2ZS Plat: $255.99 ---> Yeah, right...
SB Audigy 2ZS: $123.99

Now when "Joe Schmuck" sees those prices for a sound card, he starts thinking it had better clean the house and do BJs too. Now you have the more generic and CompUSA-branded stuff right next to these more expensive sound cards, and the older and lower-end CL cards are going to be within the price range that "Joe Schmuck" is willing to sink into this product.

Yes, you and I both know that you can get better prices at online stores, but "Joe Schmuck" doesn't run in these circles; he goes down and pays full MSRP at B&M stores, figuring that is what everyone pays... or at least close to it.

Sorry to say, but anyone who thinks that CL is going to slow down hardware progress because they aren't making the move to PCI-e is just fooling themselves. What will likely happen is that mobo makers that make enthusiast-level boards will continue to offer a PCI slot or two for a while to accommodate the enthusiast who likely has an add-in sound card.
 
I hate asking the obvious question, but how long before Intel, ASUS, etc. come out with new boards supporting PCIe 2.0? :confused:
 
Whether or not they dominate market share doesn't make Creative less of a player. Look at percentages of PC to Mac. Are you going to also say that Mac is not a major player? When you think of a sound upgrade, the first word that pops into your head is 'Creative'. Why? Because they're a major player.

And, for the record, when I get my EET degree and a job with Intel, integrated everything is going to go bye-bye.

THIS IS TIMESTAMPED, STEAL MY IDEA AND YOU HAVE TO PAY ME ROYALTIES.

Why aren't northbridges just sockets like CPU sockets are? Why can't you have a GPU socket on a motherboard? Sure, you could bundle an Intel Extreme Media Graphics Accelerator Plus XLXGA 3990 (names will have to get longer in the future :D ) with your motherboard, and enthusiasts like myself would still want a standalone card, but integration helps no one. The standards will change, prices will go down, and computers will not be exactly like they are now ever again.

Hope you are minoring in Chinese. Take a couple of business classes as well.
Sockets are expensive and reduce reliability.
Same answer, and good luck getting ATI and Nvidia to agree on a pinout standard.
Integration helps the 95% of the PC market that is NOT composed of people like us, but of office workers and business users with entirely different criteria when making computer purchases.
Yep, yep, and yep, but not in ways any of us can imagine. I miss my punch card machine.
 
I hate asking the obvious question, but how long before Intel, ASUS, etc. come out with new boards supporting PCIe 2.0? :confused:

Not sure, but I doubt it will take long once the chipset is complete. ASUS is very quick to bring products to market. Then three or six months later, there'll be a 32, Premium, or -E version that's far superior.
 
Not sure, but I doubt it will take long once the chipset is complete. ASUS is very quick to bring products to market. Then three or six months later, there'll be a 32, Premium, or -E version that's far superior.

Unless I'm wrong, this is just a completion of the standard, not a chipset. Now it's up to the chipset makers to implement the new standard. Unless chipset makers have a new chipset ready and would only need to tweak it to comply with PCI-e 2.0 specs, I don't see any compliant chipsets in the very near future.

Or maybe I misread what you wrote, and that is exactly what you were saying?
 
Unless I'm wrong, this is just a completion of the standard, not a chipset. Now it's up to the chipset makers to implement the new standard. Unless chipset makers have a new chipset ready and would only need to tweak it to comply with PCI-e 2.0 specs, I don't see any compliant chipsets in the very near future.

Or maybe I misread what you wrote, and that is exactly what you were saying?

You are right. The fact is the standard does need to be complete, and then a chipset needs to be made that complies with the standard and implements the features of it. A reference board needs to be designed, and then the motherboard manufacturers need to either use that reference design, or design their own motherboard using reference schematics as a point of reference, and go from there. We could be looking at 6 months, or 2 years for this to happen. I really don't know. I don't know exactly at what stage of development PCIe 2.0 is in.
 
So I was jumping the gun a bit, then? :D Still, from what I read it seemed like manufacturers would have products supporting PCIe 2.0 by Q3 2007...
 
And, for the record, when I get my EET degree and a job with Intel, integrated everything is going to go bye-bye.

THIS IS TIMESTAMPED, STEAL MY IDEA AND YOU HAVE TO PAY ME ROYALTIES.

Why aren't northbridges just sockets like CPU sockets are? Why can't you have a GPU socket on a motherboard? Sure, you could bundle an Intel Extreme Media Graphics Accelerator Plus XLXGA 3990 (names will have to get longer in the future :D ) with your motherboard, and enthusiasts like myself would still want a standalone card, but integration helps no one. The standards will change, prices will go down, and computers will not be exactly like they are now ever again.

1. Remember the days of 486s, when the ONLY integrated thing was the keyboard? Good luck trying to fit your SLI video cards into a case that needs at LEAST 3 other cards for basic I/O functions.
2. Different northbridges require different pinouts and different traces on the motherboard; in short, each chipset has its own "road map". Trying to make a standard and then a socket-based NB would be incredibly dumb and incredibly expensive.
3. Video sockets: as above, only with memory latency problems. You know how much HyperMemory sucks? You know why? No dedicated memory. And if you're going to have to add another 4 RAM slots just to have dedicated memory for your video chip, what's the point? Why not just stick it on a card and bump up the speed (and then we're back to where we are right now)? (Faulty) circular reasoning.
 
So I was jumping the gun a bit, then? :D Still, from what I read it seemed like manufacturers would have products supporting PCIe 2.0 by Q3 2007...

IMO, I think we'll see PCI-E 2.0 this year.
 