Why are video cards so expensive?

Motley

2[H]4U
Joined
Mar 29, 2005
Messages
2,497
Isn't it strange that we spend more on video cards than on the actual CPU? For example, in my current system I spent $500 on 2x 260s in SLI, while the E8400 was only $200.

I'm upgrading to the 920, which is ironically only $200, and I want to upgrade to Fermi or 5870s, which are upwards of $400 each! OMG.

So there lies the question: why? Does the technology to produce a high-end Intel CPU like the 920 cost less than half as much as the latest video cards? Seems strange.
 
Modern graphics cards are much more complex than modern processors. On top of that, you're not only paying for the GPU chip; you're also paying for the graphics card's RAM, the display outputs, and the PCB itself.

With a CPU, you're only paying for the chip.
 
Isn't it strange that we spend more on video cards than on the actual CPU? For example, in my current system I spent $500 on 2x 260s in SLI, while the E8400 was only $200.

I'm upgrading to the 920, which is ironically only $200, and I want to upgrade to Fermi or 5870s, which are upwards of $400 each! OMG.

So there lies the question: why? Does the technology to produce a high-end Intel CPU like the 920 cost less than half as much as the latest video cards? Seems strange.

When you buy a CPU, you are buying the chip only. When you buy a video card, you are buying the GPU, the board the GPU sits on, 1 GB of GDDR5 (which is far more expensive than DDR2 or DDR3), etc. It's like its own mini-platform, and you'd really have to compare it against the cost of a CPU + motherboard + RAM before you're on even terms when it comes to complexity.

You also have to keep in mind that the 920 is NOT the top-end i7. An i7-975 Extreme Edition is $1000 compared to "only" ~$650 for a 5970. The point is that both CPUs and GPUs range from expensive to under $150, so you could pick any two and make a claim that one or the other is more expensive.

The only thing that would really lean things toward GPUs being "overpriced" is the fact that CPUs tend to have a bit more overclocking headroom than most GPUs right now. That lets us get away with not buying the very fastest processor while still running at basically the fastest speeds anyway. On the other hand, you can't just get a 5770 and overclock it to 5870 levels of performance.
 
Isn't it strange, that we spend more on video cards than on the actual cpu processor. For example my current system I spend $500 on 2x260s SLI and an E8400 was only $200.

I"m upgrading to the 920 which is ironicly only $200 and I want to upgrade to the fermi or 5870s which are upwards of $400 each! omg

So there lies the question, why? Is the technology to produce an Intel high-end cpu like the 920 is less than half of the latest video cards? Seems strange.

i7: 731 million transistors

5870: 2.15 billion transistors

So that's just a gnat's hair under 3x the transistors. You also get free ram with that, and a heatsink, and a "motherboard" (graphics card pcb).
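The arithmetic checks out; a quick sanity check in Python, using just the transistor counts quoted above:

```python
# Transistor counts quoted in the post above (approximate public die specs)
i7_transistors = 731_000_000        # Core i7 (Bloomfield)
hd5870_transistors = 2_150_000_000  # Radeon HD 5870 (Cypress)

# Ratio of GPU transistors to CPU transistors
ratio = hd5870_transistors / i7_transistors
print(f"5870 has {ratio:.2f}x the transistors of the i7")  # ~2.94x
```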
 
Wouldn't it be cool if they could build a graphics card where you could replace the GPU, like a motherboard? I'm sure it could be done.
 
Wouldn't it be cool if they could build a graphics card where you could replace the GPU, like a motherboard? I'm sure it could be done.

AMD and Intel are both trying to do this.
Posted via [H] Mobile Device

EDIT: Sorry, I got that wrong. I meant something else.
 
Wouldn't it be cool if they could build a graphics card where you could replace the GPU, like a motherboard? I'm sure it could be done.

They did do that. But the idea never took off. It was more cost-effective to buy a new GPU altogether.
 
Wouldn't it be cool if they could build a graphics card where you could replace the GPU, like a motherboard? I'm sure it could be done.

No, it wouldn't. Think about it: the main difference between the 4850 and 4870 was GDDR3 vs. GDDR5. Cards also have vastly different power needs and circuitry. The 5970, for example, can handle a 400 W GPU load. So now you're talking about swapping out the GPU *and* the RAM while staying within a fixed power envelope. Factor in that you now need to make sockets for swappable RAM and GPU and all sorts of other stuff, and it all adds up to a ridiculously expensive base card that would suck.

Look at motherboards. How many people keep the same motherboard across more than maybe 1 CPU upgrade, if that?
 
Wouldn't it be cool if they could build a graphics card where you could replace the GPU, like a motherboard? I'm sure it could be done.

I remember:

Shorter trace lengths for tighter memory timings: something not as easily accomplished with a socket-based CPU/GPU + RAM.
 
Also, put a ton of research into a CPU technology and the benefits of that technology will give you more sales of your product than a GPU technology would. There are lots of computers that only have onboard video, but they ALL have a CPU in them. More units sold means you don't have to sell each unit for as much to make your profit, and you can be more competitive. GPUs have a much shorter shelf life and a much smaller market, so to make the money back on them you have to charge a lot more.

It's surprising video cards don't cost more than they do. :eek:
 
I remember paying $210 for an AMD 486 DX2-80, which was the fastest non-Pentium at the time (back when a Pentium 60 was like $1000). So I guess CPU prices are about the same? :)
 
A GPU is almost its own computer designed to render graphics and do specific work (very very fast). A GPU is lightyears faster than a CPU in some ways, but is less versatile.
 
I spent $1000 on my 5870 and 3 monitors for eyefinity.. don't regret spending a cent of it.
 
Doesn't matter; prices change.

There was a time when CPUs cost a fortune, hard drives cost a lot, and PSUs and motherboards were cheap.

Some other year, CPUs are dirt cheap, RAM is dirt cheap, cases and PSUs cost a lot, and video cards are mid-range.


In the end, if you look back every 2-3 years, for the same amount of money you could always build a nice rig for that time frame.

Eleven years ago I built a PIII 533 system with an ATI AIW 32MB 2x AGP card and a 19" CRT. The CRT was $500, the 40 GB hard drive was $400, and the total was about $2k, and that was high end. Today you can still build a high-end rig for $2k; just the parts fluctuate.

You don't have to get SLI either, and you don't have to get the top card; the fastest is always going to be the most expensive.
 
2x 260s for $550 is, in my opinion, excessive, the same way the Extreme chips are excessive. For $150 you could get a card that would perform just as well under most conditions, plus another $150ish for a CPU, and you'd have a system that will play all games on high at most resolutions.

Really, graphics cards aren't any more expensive than processors.
 
2x 260s for $550 is, in my opinion, excessive, the same way the Extreme chips are excessive. For $150 you could get a card that would perform just as well under most conditions, plus another $150ish for a CPU, and you'd have a system that will play all games on high at most resolutions.

2x 260s for $550 is excessive, considering even the EVGA ones don't go for much more than $400 for two. :)
 
Isn't it strange that we spend more on video cards than on the actual CPU? For example, in my current system I spent $500 on 2x 260s in SLI, while the E8400 was only $200.

I'm upgrading to the 920, which is ironically only $200, and I want to upgrade to Fermi or 5870s, which are upwards of $400 each! OMG.

So there lies the question: why? Does the technology to produce a high-end Intel CPU like the 920 cost less than half as much as the latest video cards? Seems strange.

Isn't it strange that we spend more on CPUs than on the actual GPU? For example, in my current system I spent $500 on a Core i7 940, while the ATI 4890 was only $200.

Do you REALLY have to ask this question?

If you want top-end GPU performance, you buy a top-end GPU.
If you want top-end CPU performance, you buy a top-end CPU.
If you want top-end everything, you spare no expense.
If you don't really care that much, you choose neither, and compromise.

You have obviously made your decision on what you value more.
 
Isn't it strange that we spend more on video cards than on the actual CPU? For example, in my current system I spent $500 on 2x 260s in SLI, while the E8400 was only $200.

I'm upgrading to the 920, which is ironically only $200, and I want to upgrade to Fermi or 5870s, which are upwards of $400 each! OMG.

So there lies the question: why? Does the technology to produce a high-end Intel CPU like the 920 cost less than half as much as the latest video cards? Seems strange.

That's somewhat of an apples-to-oranges comparison, though.

The 920 isn't the top-of-the-line Core i7 CPU. $400 buys the top-of-the-line ATI card (well, ignoring the 5970), give or take some markup due to demand. The top-of-the-line Intel CPU is roughly $1000 for the i7-975 Extreme Edition; ignoring the Extreme Edition parts, the i7-960 is, I think, around $600 or so. So if anything, the top-of-the-line i7 CPU is more expensive than a single top-of-the-line GPU card right now.
 
Everyone seems to be forgetting the industrial and engineering aspects. Intel is a huge company that produces probably 3-4 times as many chips as Nvidia does. Because of its size, Intel can keep the cost of its product down. The price of the PCB, silicon, and RAM is nothing compared to the cost of engineering.
 
The price has to be jacked up in order to make a profit. Probably around 5% of people buy high-end video cards. The price of a video card depreciates pretty quickly, too.

I wouldn't be surprised if the $400 card cost something like $50 to make.
 
You're buying a top-of-the-line video card and comparing it to a mainstream CPU? Intel Extreme Editions cost $999, about the price of 2x 5970s.

High-end stuff is high-margin; simply put, the margins need to be high enough to make money on top of the tiny margins from mainstream products.

Combine that with sufficient demand, and you've got yourself a pretty high price for the consumer.

That's why competition is a beautiful thing: it helps everyone except the people competing.

Here's hoping Fermi kicks ass, for everyone's sake!
 
Isn't it strange that we spend more on CPUs than on the actual GPU? For example, in my current system I spent $500 on a Core i7 940, while the ATI 4890 was only $200.

Do you REALLY have to ask this question?

If you want top-end GPU performance, you buy a top-end GPU.
If you want top-end CPU performance, you buy a top-end CPU.
If you want top-end everything, you spare no expense.
If you don't really care that much, you choose neither, and compromise.

You have obviously made your decision on what you value more.

Except with CPUs you can buy the cheaper counterpart and easily surpass top-end speeds by overclocking, unlike with GPUs.
Personally, I wouldn't value a 940 at $500 when the 920 is $200.
 
They're really expensive because it's a marketing thing.
The people who really care will pay, and the people who only somewhat care will wait for the price to drop.
 
Not much competition. With only two competitors, they can charge a fairly high price for their product.
 
Wouldn't it be cool if they could build a graphics card where you could replace the GPU, like a motherboard? I'm sure it could be done.

It wouldn't work out too well. The costs would end up being significantly higher to begin with, because they'd have to add a socket, which wouldn't be cheap because it would have to handle very high current loads, much higher than modern processors'. The socket would also make cards even bigger than they are now, and we certainly don't need that. And finally, when was the last time you upgraded your video card and the ONLY difference was the actual GPU?
 
Because people in sufficient numbers are willing to pay that "expensive" price. Blame your fellow consumers.
 
The comparison inherent in the question is a non sequitur.

CPUs are a single piece of silicon with a heatsink. A graphics card is a complete subsystem that includes the GPU and, among other things, specialty DRAM that costs 2-4x the per-unit price of system DRAM. The assumption that anyone in the graphics business is making outrageous profits is terribly wrong. Just look at the annual reports of the GPU suppliers versus that of Intel.

As for a socketed GPU, yes it could be done, but the performance would suck. Among other things the memory paths between a GPU and graphics DRAM are essentially point-to-point and highly tuned, so that they can run anywhere from 2-4x faster than system DRAM. The capacitive and inductive loads imposed by a socket would greatly slow things down. There would be other penalties as well. On the whole a socketed GPU would make you wonder why you bothered to upgrade.
 
i7: 731 million transistors

5870: 2.15 billion transistors

So that's just a gnat's hair under 3x the transistors. You also get free ram with that, and a heatsink, and a "motherboard" (graphics card pcb).

I wouldn't say "free".
 
As for the technology guesses and whatnot, it's currently all headed on-die. CPUs will turn almost completely into self-sufficient machines that don't need anything else to operate your computer. Look at what's coming in Q1 2010: the new Core i5s will contain not only the PCIe controller but also the GPU itself, in the form of an IGP. Performance will be nothing short of atrocious compared to standalone GPUs, but you get the idea of where Intel is heading with this.

This is just pure speculation on my part but there are a few things that I find interesting in the tech industry today, specifically with Intel:

1. The majority of investment capital for Lucid comes from Intel

2. Intel's significant interest in "Larrabee"

3. Carbon nanotube technology (specifically to transistors in the upcoming future)

4. Nvidia's insistent move towards GPGPU

All of these things, to me, represent an imminent fusion of CPUs and GPUs as we know them today. GPGPU sheds some light on this, but only by demonstrating the ability of a GPU to process other things. The Lucid reference is there to show Intel's interest in GPU-related controllers (future knowledge for CPU-related drivers?). Larrabee is obvious, and carbon nanotube transistors will allow all of this to happen even as Moore's law winds down. Haha, well, maybe not by the time the tubes can be utilized effectively, but it would sure be cool.
 
Isn't it strange that we spend more on video cards than on the actual CPU? For example, in my current system I spent $500 on 2x 260s in SLI, while the E8400 was only $200.

I'm upgrading to the 920, which is ironically only $200, and I want to upgrade to Fermi or 5870s, which are upwards of $400 each! OMG.

So there lies the question: why? Does the technology to produce a high-end Intel CPU like the 920 cost less than half as much as the latest video cards? Seems strange.

Well, the 920 isn't really high-end. It's more like mid-range. The i7 975 is $999 and the i7 960 is $590 (those extra 130 MHz are tasty, but pricey).

A 5870 crossfire setup should be compared to a dual-socket Nehalem-based Xeon setup, not to a single mid-range CPU.

I spent about the same on my video card and CPU last time I upgraded: about $180 each. The CPU is turning out to be a bit more future-proof than the video card, though.
 
R&D is probably what makes any component expensive. Other non-material costs add up too, such as management, QA, marketing, facility overhead, etc.
 
Modern graphics cards are much more complex than modern processors.

That is far from the truth... And for those measuring the complexity of a chip by the number of transistors: that's like saying, "OMG, my P4 3.8GHz >>> your i5 2.66GHz!!!"

It is true, however, that you are paying for the PCB, VRAM, and everything else that comes with it (like a bigger cardboard box than a CPU box, video cables like VGA or DVI, or maybe even a Crossfire/SLI bridge! It's a bargain, really).
 
Look at motherboards. How many people keep the same motherboard across more than maybe 1 CPU upgrade, if that?

Just me apparently.

I went from Athlon 3500+(939) to Athlon X2 3600+(AM2) to Athlon X2 6000+(AM2) on the same motherboard.
 
You're also paying for engineering and continued support from the manufacturer. Nvidia and AMD have to pay employees to constantly release new drivers.
 
Modern graphics cards are much more complex than modern processors.

That is far from the truth...

No, it's not; he's absolutely right on several levels, including power.
A GPU core is much more powerful than a CPU; look at CUDA and F@H, for instance, and compare apples to apples.

It takes a GT200 core a fraction of the time that it takes a CPU to do the same work.

You can argue "oh, well, it's optimization," but I'd still say there's more horsepower in a GPU than there is in a CPU... therefore they are definitely more complex.
 
The assumption that anyone in the graphics business is making outrageous profits is terribly wrong. Just look at the annual reports of the GPU suppliers versus that of Intel.

nVidia and AMD are both struggling to break even on their GPU business. Only Intel makes decent profits on GPUs.
 
No, it's not; he's absolutely right on several levels, including power.
A GPU core is much more powerful than a CPU; look at CUDA and F@H, for instance, and compare apples to apples.

It takes a GT200 core a fraction of the time that it takes a CPU to do the same work.

You can argue "oh, well, it's optimization," but I'd still say there's more horsepower in a GPU than there is in a CPU... therefore they are definitely more complex.

No, he isn't right. GPUs have more single and double precision massively parallel floating point performance than CPUs, yes, but that is it. CPUs aren't designed for floating point work, and they aren't designed to be massively parallel. GPUs and CPUs have extremely different goals and priorities, so your comparison isn't apples to apples. Switch to single threaded integer work (which is what CPUs primarily do), and suddenly your GPU is as useful as glasses to a blind man.

As for complexity, GPUs are a huge array of very simple cores. Each GPU core is also much *slower* than a CPU core. GPUs are faster because of the sheer number of things they can do at once. GPU cores are also exceedingly simple compared to a CPU core.

Performance != complexity.
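The distinction can be sketched in a toy Python example (illustrative only; the workloads are made up and this is not real GPU code). The first computation is embarrassingly parallel, so a GPU's thousands of simple cores could each take one element; the second carries a serial dependency, so extra cores can't shorten it and a single fast CPU core wins.

```python
# Data-parallel floating-point work: every element is independent,
# so it splits trivially across many simple cores (the GPU's strength).
data = [0.5 * i for i in range(8)]
scaled = [x * 2.0 for x in data]

# Serial integer work: each step depends on the previous result,
# so the chain must run in order -- no number of extra cores helps.
acc = 0
for i in range(8):
    acc = acc * 3 + i
```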

Just me apparently.

I went from Athlon 3500+(939) to Athlon X2 3600+(AM2) to Athlon X2 6000+(AM2) on the same motherboard.

How did you pull that off? 939 and AM2 aren't compatible...
 