Discussion in 'Video Cards' started by erek, Jun 24, 2018.
Poor Lamar, so misunderstood.
Looks super VRAM limited. Remember the default is 128MB / 4 chips = 32MB effective VRAM. I think if he'd done the upgrade to 64MB per chip like some folks did with the Voodoo 5 5500, it'd be a lot better.
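The per-chip VRAM math above can be sketched as a few lines of Python. This is a rough illustration of my understanding (an assumption, not a spec): under 3dfx's SLI, each VSA-100 chip keeps its own copy of the textures, so memory doesn't add up across chips.

```python
# Rough sketch of why a 128MB Voodoo 5 6000 behaves like a 32MB card.
# Assumption: in 3dfx SLI, assets are mirrored per chip, so the memory
# visible to each chip is the total divided by the chip count.

def effective_vram(total_mb: float, num_chips: int) -> float:
    """Memory effectively available per chip when assets are mirrored."""
    return total_mb / num_chips

print(effective_vram(128, 4))  # stock V5 6000: 4 chips -> 32.0
print(effective_vram(64, 2))   # stock V5 5500: 2 chips -> 32.0
print(effective_vram(256, 4))  # the 64MB-per-chip mod -> 64.0
```

Which is why the 64MB-per-chip mod helps so much: it doubles what each chip can actually see.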
For reference, the best GPUs available at HL2's release were the GeForce 6800 Ultra and ATI Radeon X850 XT, with 256MB VRAM.
It was so tight playing this on my x800xt-pe back in the day.
My oppy 165 @ 2.8GHz + x800xt monstered HL2 and the lost coast tech demo, to the point it even smashed it out smooth as silk 1920x1080 60fps 8 years later!!! I couldn't believe how well it handled source all those years later at that resolution to be honest. Not bad for 256mb
I had an X800XL with a P4 2.4c when this game came out.
It's remarkable how well the graphics have held up over the past 14 or so years.
There is a 256MB mod for the Voodoo 5 6000.
Yeah, but why in the world would you try to mod a collector's piece for more performance. There is no value in the performance of these cards. Just like your precious 5800 Ultra...it's not about performance otherwise you'd get a cheaper and better performing 9800 Pro.
i ask myself the same thing, but seems there's people modding them
Or 5950 Ultra. But I agree 100%, I don't see the point in modifying one to that extreme considering their age and performance potential at this point.
I think the desire for the 200 MHz VSA-100 chips and 256/512M memory mods is people wanting to know what the V5 5500/6000 could have been. I've seen a few benchmarks and the higher clocked VSA-100s and extra memory give noticeable performance gains. The V4 4500 also apparently performs much better with these mods.
There's apparently only one guy you can get the VSA-100 chips from; he bought up trays of them from Quantum3D or something years ago after 3dfx went out of business.
I'd never get my V5 5500 modded unless it actually developed a fault that stopped it from working. I replaced the heatsinks on it earlier this year because both fans had failed, and getting low profile fans like it had is nigh impossible.
I remember exactly how in awe I was when I first fired up HL2 and saw the realism and the physics. No game since then has ever made me feel like I was playing something truly special.
That didn't happen for me until I started throwing saw blades at zombies with the gravity gun. I'm pretty sure I had to pick my jaw up off the floor. Couldn't get enough. And that was playing on a slot A Athlon at 950 and a Radeon LE.
I also remember being so stoked to see how these new Voodoo cards performed. And how disappointing it was
To this day I still use HL2 as my reference game. Every time I build a new computer, I will install HL2 and play all the way through it. I love that game.
Dang, and I thought my Athlon 2400 with a Radeon 9600 was slow at the time. I remember running the DX9 demo benchmark at 1024x768 and was getting 10-20 fps.
I remember the CompUSA demo rig they had at the front entrance advertising the Voodoo5 5500 next to whatever ATi and nvidia had at the time. They had Quake 3 timedemos running and a sticker conveniently placed on the upper corner of the CRT monitor where the frame rate was lol.
It was on the shelf for $599.99 next to the other cards, which maxed out at like $329.99.
You only got to 2.8? Wuss I pushed mine to 3. But that was on water...
Yeah it was slow. But it worked. I remember being able to walk away from my desk with my wireless headphones between levels and do something while I waited for the next level to load
Because there are 3dfx glide api specific games that wrappers don't run 100% on. You rarely ever see any Nvidia or ATI modding, because there isn't any point to those.
With a late revision VSA-100 chip and increased memory capacity, you could run older games at modern resolutions. I don't have the time or money to do it, but I assume that's the reasoning.
I have no idea the technical reasons for its demise, but Glide was years ahead of DX and OGL in image quality. Pity it died with 3dfx.
Glide was not ahead of its time. OpenGL at the time was mainly used for workstations and professional graphics, so a lot of the API had limited uses for games (back then). In the early days (starting late 1995 with the Voodoo), image quality was NOT the first objective; frame rate was the priority in games. Hardware was very limited in capability. But by the time Direct3D 6 rolled around (late 1998), the situation had reversed itself. Video cards like the Nvidia TNT/TNT2 and ATI Rage128 were fast enough, and things would only get progressively better.
It didn't help that 3dfx was aggressively suing glide emulators and was trying to promote Glide API games that made it unplayable (quite literally) on other vendor hardware. Even Direct3D compatible games (like FFVII or Unreal) played like sh1t on anything except 3dfx hardware for months after release.
Much of this may be true, I do not know. I do know glide games on 3dfx hardware weren't matched in IQ for years. This I know because I was there. Ymmv.
I think that's nostalgia more than anything else. As noted in my previous post, the Voodoo / Voodoo2 seemed to have the best IQ simply because they were the only cards able to deliver acceptable frame rates for months after release.
As noted at the time, Voodoo 2 was actually a step down in terms of image quality (its dithering was well known). But it was the first card capable of 60+ fps at 800x600 in most era games, so no one gave 2 sh1ts. Voodoo 3 was a big step up, especially with its 22-bit internal rendering. And finally, we have the VSA-100 (Voodoo 4/5). They were a HUGE step up in IQ (native 32-bit rendering, 2K texture support, HW FSAA). Hence they are the most sought after Voodoo cards, as they are also 100% Glide API compatible.
Another thing to note is that analog output quality mattered. Voodoo 1/2 used a passthrough card, so most people used Matrox Millenniums, as their IQ was top notch. On the Nvidia side, it ranged from bad (most OEMs) to decent (STB) to good (Diamond/Hercules) to excellent (Canopus). So if you were one of the 2 dozen people to get a Canopus TNT/TNT2U, your experience was much better than most people's at the time.
Diamond Viper V770 vs V3 3500, and then later on Radeon LE vs V3 3500. The Radeon beat the piss out of the V3, provided the game didn't use Glide. Glide enabled effects that made the game look super freakin sweet (Jane's F-15 actually blew my mind at the time). Those effects were not available on DX. The Radeon had superior image quality on the desktop.
The stuff you're talking about is true; however, Glide enabled things to be done that I'm assuming couldn't be done with the other APIs on other hardware.
Love it! I actually owned stock in 3dfx back in the day. Memories.