How much does an upgraded graphics card help with digital photo editing?

Archaea

My friend has an older GeForce FX 5800 AGP card.

It's got very fast memory and core speeds.

He saw an Nvidia 6200 AGP card on sale for $50 with $45 in rebates, so $5 out of pocket.

He only uses the PC for video and photo editing. No gaming.

He was considering 'upgrading' his 5800 card to the 6200 card. I told him it's probably actually a downgrade, because the 6200 has much slower memory and core speeds.

But as I've been doing some reading, I can't verify how much a card with faster memory even helps applications like Photoshop.

Would there be a difference? Do video cards even help much at all in the world of photoshop?


5800
Manufacturer: nVidia
Series: GeForce FX
GPU: NV30
Release Date: 2003-01-27
Interface: AGP 8X
Core Clock: 400 MHz
Memory Clock: 400 MHz (800 DDR)
Memory Bandwidth: 12.8 GB/sec
Shader Operations: 1600 MOperations/sec
Pixel Fill Rate: 1600 MPixels/sec
Texture Fill Rate: 3200 MTexels/sec
Vertex Operations: ? MVertices/sec

http://www.gpureview.com/geforce-fx-5800-card-144.html


6200

Manufacturer: nVidia
Series: GeForce 6
GPU: NV43 + HSI
Release Date: 0000-00-00
Interface: AGP 8X
Core Clock: 300 MHz
Memory Clock: 275 MHz (550 DDR)
Memory Bandwidth: 8.8 GB/sec
Shader Operations: 1200 MOperations/sec
Pixel Fill Rate: 1200 MPixels/sec
Texture Fill Rate: 1200 MTexels/sec
Vertex Operations: 225 MVertices/sec

http://www.gpureview.com/show_cards.php?card1=192&card2=
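
Quick sanity check on those bandwidth figures: both cards appear to use a 128-bit memory bus (the bus width isn't listed above, but the quoted numbers imply it), so bandwidth is just bus width times effective memory clock. A minimal Python sketch, assuming 128-bit buses on both cards:

def bandwidth_gb_s(effective_mhz, bus_bits=128):
    # bytes moved per second = bytes per transfer * transfers per second
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

print(bandwidth_gb_s(800))  # FX 5800: 16 bytes * 800 MHz = 12.8 GB/s
print(bandwidth_gb_s(550))  # 6200:    16 bytes * 550 MHz =  8.8 GB/s

Either way, the FX 5800 has roughly 45% more memory bandwidth than this 6200.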
 
Video card performance doesn't matter at all. Some might say that different generations of video cards present colors slightly differently, but that's about it.
 
On my Dell workstation at work, the onboard GPU is so slow it lags when scrolling some web pages up and down. If you aren't experiencing anything like that, you are probably good to go (8 MB GPU memory FTW!).
 
I guess some features in Photoshop CS3 would benefit from a faster GPU:
[attachment: cs3-settings.jpg]
 
Adobe has not said anything about GPU acceleration in CS4; it's just an online rumour. Read John Nack's blog:

-- I didn't say whether the next version of Photoshop would or would not be called CS4. Instead, I was simply trying to point out that what I was showing was a technology demonstration that was independent of a particular version.
-- Similarly, I didn't say that GPU-enabled features would or would not ship in the next version of Photoshop. Think, "I can neither confirm nor deny..." When developing any product, details are always subject to change, and it's always possible that some unforeseen roadblock will appear. That's why we try so hard to wrap a lot of caution tape around any future-looking statements: we're excited to be showing you some of what we're building, and we hope you are, too, but we want to manage expectations & not over-promise anything. Make sense?
-- Lastly, I didn't say that the next version of Photoshop would or would not ship on a particular date. My (badly made) point was that nothing had been announced, so the fact that a date of "October 1" kept getting repeated should be taken with the appropriate grain of salt.



Video cards do make a difference for photo editing, but it depends on what software you're using. Basically, unless you use specialist software (or run Mac OS X), it makes a very marginal difference.

Stay with the 5800, though. There's no sense paying money (even if it's only $5) for a card that's WORSE than what you have now.
 
In time, the software used for image processing will tap the GPU in a "hardware assist" fashion similar to what video cards already do for DVD or HiDef content playback: they either do all the decoding of the content's compression on the GPU itself, or they "assist" the CPU in doing the math.

The primary benefit of having a fast video card with fast RAM is being able to display the images you're working with that much faster. Imagine you've got a monster PC with gobs of physical RAM in it, but you go cheap on the video card and get something low end with maybe 128 MB of RAM on the card. Say it runs fairly fast, but it's a budget edition of a given GPU chipset.

Now load up an image in your image processing software. The software and the computer itself can manipulate it all over the place quick as lightning, but the data still has to be passed to the video card, and there's a bottleneck because of the small amount of RAM on the card and its limited processing power.

Contrast that with something high end, maybe with a 128-bit or, even better, 256-bit wide data path between the RAM and the GPU itself. Now we're talkin'...

So the low-end card will still work, but it can't really keep up: you move the image around onscreen and it lags behind wherever the mouse pointer ends up, and that slows your productivity down.
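
To put rough numbers on that (a hedged sketch; the 24-megapixel photo and the 128 MB card below are illustrative assumptions, not figures from this thread), here's how quickly a single image buffer eats into video memory:

def image_mb(width, height, channels=4, bytes_per_channel=1):
    # raw, uncompressed size of one RGBA image buffer, in megabytes
    return width * height * channels * bytes_per_channel / (1024 ** 2)

print(image_mb(6000, 4000))                       # ~92 MB at 8 bits per channel
print(image_mb(6000, 4000, bytes_per_channel=2))  # ~183 MB at 16 bits per channel

A single large photo in 16-bit mode already outgrows a 128 MB card, before the OS and the application take their share.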

I was pretty excited a few years back when they started tapping GPUs with programs like Maya and 3DStudio to do "hardware assist" for the rendering parts of 3D graphics work, that was a boon for developers and creative artists, it really was.

At some point, Photoshop will do GPU-assisted rendering and processing, that's a fact. Whether or not they'll admit it yet is up in the air, but it's coming, bet on it. The same goes for other uses of the massive number-crunching power most high-end GPUs have nowadays. There have been great strides in using GPU cycles for things like Folding@Home (ATI has been doing it for a while; Nvidia just announced a client coming soon), among other things. Look at what the Cell processor in the PS3 is doing these days too.

They ain't always about the gaming... ;)
 
I was pretty excited a few years back when they started tapping GPUs with programs like Maya and 3DStudio to do "hardware assist" for the rendering parts of 3D graphics work, that was a boon for developers and creative artists, it really was.
The only thing people ever render with GPU acceleration is particles and such, and even that is usually done entirely on the CPU. If you want to find an example of hardware acceleration used in real-world production, 3D rendering is pretty much as far off as you can get. The only thing hardware acceleration is really used for is OpenGL/D3D viewports.
 
http://www.nvidia.com/page/gz_learn.html

Just released a few days ago in an updated format. I use it myself, and now that Gelato Pro is free also, I have to say I'm very pleased with the results...

Of course the downside is that the Gelato project is now toast as this is the final release; Nvidia will refocus all their 3D GPU rendering assist applications and support towards their Mental Ray featureset from now on.
 
Gelato...

-- Is no faster than CPU-based rendering, negating the advantage of using GPUs in the first place
-- Is used by, like, five people
-- Has been cancelled, as you mentioned

Also, I suspect we're OT. ;)
 
Question: Does the AGP Acceleration in PS CS3 only work in Vista? The option's greyed out on my end in XP 32bit.

If you want to find an example of hardware acceleration used in real-world production, 3D rendering is pretty much as far off as you can get. The only thing hardware acceleration is really used for is OpenGL/D3D viewports.
Exactly. No major renderers use the video card for non-real-time rendering as far as I know, and I've used Brasil R|S, MentalRay, VRay and FinalRender Stage1, all with 3DS Max.

Nvidia will refocus all their 3D GPU rendering assist applications and support towards their Mental Ray featureset from now on.
I hope so. With video cards being insanely faster than CPUs and having so many more cores (instead of 2-4), it really surprises me that people don't use them for rendering yet.

Being able to use video cards for rendering (even if not 100% of the time) could bring a HUGE leap in performance and visuals in many fields... plus it would really help those of us without million-dollar render farms get nice-looking animations with light bounces, soft shadows, etc. like the big boys.
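
Back-of-envelope on why the core count matters (a toy sketch; the core counts, clocks, and ops-per-cycle below are illustrative assumptions, not measurements):

def gflops(cores, clock_ghz, ops_per_cycle):
    # peak arithmetic throughput = cores * clock * ops issued per cycle per core
    return cores * clock_ghz * ops_per_cycle

cpu = gflops(cores=4, clock_ghz=3.0, ops_per_cycle=4)     # quad-core CPU with SIMD
gpu = gflops(cores=112, clock_ghz=1.5, ops_per_cycle=2)   # a 2008-era GPU's shader cores

print(cpu, gpu)  # 48.0 vs 336.0 GFLOPS, roughly 7x

Of course that peak only materializes when the work splits cleanly per pixel or per sample, which is exactly why viewports and particles got accelerated first.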
 
If he isn't using any 3D applications/games, then buy the 6200, as it runs cooler/quieter and consumes less power than the 5800. Then sell the 5800 on eBay or something. Even if you don't, you'd probably recoup the $5 in power savings over the year.
 
It doesn't. In the CS3 suite there are new features that allow you to integrate and manipulate 3d elements placed on your canvas, but in reality, a gfx card makes very little difference.

There is talk of full blown GPU support in CS4 though, really putting them to work, which couldn't hurt.

Tut tut, you guys, video cards are used in rendering: viewport rendering and some particles. Beyond that they are, in a nutshell, good at processing quick-and-dirty stuff (which, in the scheme of things, is what a game is: quick and dirty), but they lack the raw punch to do the massive lighting, raytracing, caustics, and shadow calculations that take up the majority of render times, which is why the CPU gets dumped with the work when it comes to rendering.

Essentially, your GPU is only good for pumping out a lot of rough frames per second, not for doing big calculations.
 