When will video cards be obsolete with the CPU taking over that function?

computerinfo

Limp Gawd
Joined
Oct 19, 2007
Messages
381
When will video cards be obsolete with the CPU taking over that function?

I assume that as the CPU gets more powerful, the functions of current video cards can be integrated into the CPU. Where can I find information on this, and what is the current status?
 
From the propaganda spreading around the internet (largely perpetrated by nVidia) it'll be the other way around.

Don't think there's anyone that can actually answer questions like this though lol.
 
I think that for the CPU OR video card to become obsolete, it first takes the assumption that digital rendering will peak. That is to say, the human eye will not be able to distinguish between reality and digital images. In my opinion, by the time this happens, we will be at a point where we can improve the human eye to operate on a higher level.

To put things simply, I do not think that computer hardware will ever advance to the point where either the CPU or the GPU is no longer needed.
 
Probably never. Integrated graphics are so far behind... it's like asking when Power Wheels motors will replace big-rig 18-wheeler engines.
 
In the mid '90s, the CPU did all the 3D stuff... in software :p

Discrete video cards will always be more powerful than integrated graphics. There's more space for the GPU, power circuitry, cooling, etc. There's also the problem of memory bandwidth: a 4870 has well over 100 GB/s of bandwidth, while even DDR3 in dual channel would only provide a little over 20 GB/s, which has to be shared between the GPU, CPU, etc.
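
To put rough numbers on that bandwidth gap, here's a back-of-the-envelope sketch in Python. The clocks are assumptions chosen for illustration (DDR3-1333 in dual channel, and the 4870's 256-bit GDDR5 at roughly 3.6 GT/s effective), not exact specs for any particular board:

```python
# Peak theoretical bandwidth = transfer rate x bytes per transfer x channels.
# Figures below are assumed/ballpark, just to illustrate the gap described above.

def bandwidth_gb_s(transfers_per_sec, bus_width_bits, channels=1):
    """Peak theoretical memory bandwidth in GB/s."""
    return transfers_per_sec * (bus_width_bits / 8) * channels / 1e9

# Dual-channel DDR3-1333: 1333 MT/s on two 64-bit channels, shared by CPU and IGP.
system_ram = bandwidth_gb_s(1333e6, 64, channels=2)   # ~21.3 GB/s

# HD 4870: GDDR5 at ~3.6 GT/s effective on a 256-bit bus, dedicated to the GPU.
card_vram = bandwidth_gb_s(3600e6, 256)               # ~115.2 GB/s

print(f"Dual-channel DDR3-1333: {system_ram:.1f} GB/s")
print(f"HD 4870 GDDR5:          {card_vram:.1f} GB/s")
```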

However, integrated graphics (whether integrated into the mobo or into the CPU) might become more powerful in the future, making video cards below, say, $200 pointless.
 
The idea that has gotten this into people's heads is that video card GPUs are becoming more like CPUs. You've got ATI and nVidia GPUs able to do general processing, being used for other applications such as accelerating encoding or protein folding. The next generation will be even more flexible. And with Intel's Larrabee, where all of the cores are x86-based, it really will be a case of the CPU replacing the GPU (granted, Larrabee will still have some fixed-function graphics API support so it can be backwards compatible). But even though it's basically a CPU and could theoretically fill that role, it is still going to be a separate add-on graphics card used primarily for graphics.
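
To make the "general processing" point a bit more concrete, here's a minimal, purely illustrative sketch of the kind of workload that maps well onto a GPU: the same independent operation applied to millions of elements. NumPy on the CPU stands in for what CUDA or a Larrabee-style part would do, and the function name and numbers are made up for the example:

```python
# Data-parallel "kernel" style that GPGPU (CUDA and friends) exploits:
# the same independent computation over millions of elements at once.
# On a GPU, each element would conceptually get its own thread.
import numpy as np

def kinetic_energy_kernel(mass, velocity):
    """Per-element computation with no dependencies between elements."""
    return 0.5 * mass * velocity ** 2

n = 1_000_000                                     # a million independent particles
mass = np.random.uniform(1.0, 5.0, size=n)
velocity = np.random.uniform(-10.0, 10.0, size=n)
energy = kinetic_energy_kernel(mass, velocity)    # trivially parallel across elements
```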

Video cards obsolete? If it ever happens, it's not going to be until 2014 at the earliest.
 
I can see it replacing integrated graphics but not dedicated graphics cards in the near future.
 
When will video cards be obsolete with the CPU taking over that function?

I assume that as the CPU gets more powerful, the functions of current video cards can be integrated into the CPU. Where can I find information on this, and what is the current status?

You've got the trend reversed. Video cards are getting more powerful, not CPUs. CPUs have changed very little compared to what has happened with 3D video cards. Considering how programmable video cards are nowadays, with things like CUDA, there won't even be a need for APIs such as DirectX in the future. CPUs will never be able to replace that.
 
From the propaganda spreading around the internet (largely perpetrated by Intel) it'll be any day now.

Don't think there's anyone that can actually answer questions like this though lol.
Fixed.
 
I think that for the CPU OR video card to become obsolete, it first takes the assumption that digital rendering will peak. That is to say, the human eye will not be able to distinguish between reality and digital images. In my opinion, by the time this happens, we will be at a point where we can improve the human eye to operate on a higher level.

To put things simply, I do not think that computer hardware will ever advance to the point where either the CPU or the GPU is no longer needed.

I think you’re on to something and I think what will happen is that what we call graphics today will be defined differently tomorrow.

Did anyone even realize how awful standard television really was until HDTV signals were available? The next step for video card graphics is equally difficult to perceive. I remember when the first Far Cry came out; it looked so real, so beautiful. I thought video graphics were finally reaching their peak. Nowadays games are so much more visually appealing than Far Cry was, making it look old and dated in comparison. Was I wrong in thinking it was a gorgeous game? No, because I was comparing it to what existed then. The bar is perpetually being raised: longer draw distances, more detail on screen, physics engines, and a ton of stuff I don't understand, but it all adds up to pretty stuff.

Perhaps in the future we'll be going in a new direction, like three-dimensional holograms. Generally, science fiction has many ideas that eventually get turned into reality. If the future of games is to play on a holodeck of sorts, wouldn't this take specially designed hardware? And wouldn't that hardware always have to be perpetually upgraded to make its immersion even more real? That's something hard to perceive, because nothing like it exists today.

So I don't think the GPU will ever get reintegrated back into the CPU; in fact, I think the opposite. The GPU will continue to evolve and will spawn off different types of processors that will all work in conjunction. Graphics cards will spawn different types of visual hardware, just like tape drives are the great-great-great-grandparents of USB memory sticks. The end result will be foreign, different, and impossible to predict.
 
It's simply a cost issue. Most computer applications today do not require a powerful video card, and keeping them as separate entities allows CPUs to be sold more cheaply. Not to mention most computing tasks do not require the same kind of processing power that GPUs provide, such as fast floating-point calculations.
 
It will never happen. Graphics are a distinctly, VERY multithread-friendly application. All other programs, word processing, web browsing, and the like, are distinctly SINGLE-threaded applications. You will never be able to make a CPU that is competitive with what a GPU can do, because the advances needed in one ARE the advances already made in the other. I wouldn't be surprised if a CPU today could give you graphics circa 2000, but who TODAY would want circa-2000 graphics?

The reality (not the propaganda) is that some applications, like Photoshop, MATLAB, and others that are very multithread-friendly, are being ported to GPUs. The GPU won't kill the CPU, though, because the CPU will always be faster at a single thread.
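
Here's a rough, hypothetical sketch of that contrast in Python/NumPy: per-pixel work has no dependencies between pixels, so it can spread across thousands of GPU threads, while a computation where each step needs the previous result is stuck on one fast core no matter how many cores you have:

```python
import numpy as np

# GPU-friendly: every pixel is independent, so the work can be split across
# thousands of threads in any order.
image = np.random.rand(1080, 1920)
gamma_corrected = image ** 2.2          # conceptually, one thread per pixel

# Single-thread bound: each iteration depends on the previous one,
# so extra cores don't help; only faster single-thread speed does.
values = np.random.rand(1_000_000)
running_total = 0.0
for v in values:
    running_total = running_total * 0.99 + v
```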
 
The only way a CPU will replace a GPU is if they integrate the GPU into the core of the CPU, in effect combining the two. That's not really doing away with either, just doing away with discrete graphics. But even that is unlikely unless there is also dedicated graphics memory in the system.

Current CPU designs just don't have the capability to produce the graphics for modern games or design applications. If the CPU were changed to perform these functions, it would essentially become a GPU, so the GPU would really be replacing the CPU. Larrabee is all Intel marketing hype; the GPU is going to be around for quite a while, at least until graphics APIs are changed to work across a large array of CPUs, which is doubtful.
 
Can you explain to me how this question was brought up in the first place? I can't see the logic behind such a flawed premise.
 
Even if a GPU could be replaced entirely by a CPU, the question is whether it would be worthwhile to do so. I think specializing the components ultimately makes them more efficient and effective.
 