Search results

  1. J

    What is the worst and best video card you ever owned?

    I'd say it's relative to the timeframe in which you owned the card. So the one that blew me away the most would have to be the Voodoo 1. Gaming at 640x480 with no AA and bilinear filtering never looked so good or felt so right. :)
  2. J

    Nvidia for Better Or Worse

    HPC stuff is a niche market that can't compete with the sales volume NV currently garners from the retail and OEM markets. And even if inefficient in comparison, x86 hardware is cheap and, perhaps more importantly, more compatible with today's existing enterprises. What you're...
  3. J

    Nvidia for Better Or Worse

    I'm currently running a GTX 285 and I'm playing through Batman: AA for a second time. Sure would be nice to be able to buy a Radeon 58xx for faster rendering and keep my GTX 285 installed so I don't lose out on the PhysX support in the game.
  4. J

    NVIDIA SLI & ATI CrossFire - Experiences & Opinions

    The dongle is a bit of a non-issue for me (as cluttered as the wires are beneath my desk, one more that connects to a PC on both ends and doesn't join the tangled nest isn't a big deal IMO), but the inability of users to create custom profiles for Crossfire is a major limitation. ATI may have...
  5. J

    AA high res use, who would really do it

    I wear special glasses that add aliasing to my world. Why? Because I live such a fast-paced lifestyle that the aliasing doesn't bother me. </sarcasm>
  6. J

    What videocard has impressed you, and that hasn't?

    Boards with a special place in my hardware geek's heart that I purchased with my own $$$: Voodoo 1 for showing me a bilinearly filtered gaming world. Voodoo 2 SLI for giving me 3D graphics at 1024x768. Voodoo 5 for showing me what an incredible benefit good AA is to overall image...
  7. J

    no AA with HDR... but why?

    This thread is a shining example of why I'm not fond of programs like the AEG one. Just how many unnecessary flame wars have been created by people with vested interests in always defending one particular company's parts? And, yes, if ATI has a similar program I won't be a fan of it either.
  8. J

    Hdr + Aa

    Yeah, I tend to prefer feature implementations that can be more broadly appreciated by the hardware/gaming community too. Developer support of HDR is still in its infancy and hopefully the DX10 parts from both companies are more robust (for lack of a better word off the top of my head).
  9. J

    Hdr + Aa

    No chance a member of the dev rel team wouldn't ping them for an opinion on the lack of a widely expected option (HDR + AA) for their latest parts? Especially with the biggest title of '06 so far? Aren't you doing this in the above? Stating unequivocally that Chuck would be the last person...
  10. J

    Hdr + Aa

    It's a huge mess, IMO. Even the Wikipedia entry for HDR is full of bad information.
  11. J

    Hdr + Aa

    That functionality is derived from the backend, the ROPs. And I'm not sure what you're suggesting is "shared" in NVIDIA's current shader pipelines that would be relevant to the topic at hand.
  12. J

    Hdr + Aa

    You sure nothing in your post was wrong? Like suggesting it's a limitation in the ALUs that prevents NVIDIA hardware from applying multi-sampling to FP16 render targets?
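    A minimal Direct3D 9 sketch of where that backend distinction actually surfaces: whether multi-sampling works on an FP16 render target is a runtime capability query, not something deduced from the ALUs. Hedged illustration; the function name and the 4x/windowed choices are my own assumptions.

    ```cpp
    // Sketch: ask the D3D9 runtime whether the GPU's backend (ROPs) can apply
    // 4x multi-sampling to a 64-bit FP16 render target. On the GeForce 6/7
    // parts discussed above this check fails for D3DFMT_A16B16G16R16F,
    // which is the ROP-level restriction at issue, not an ALU one.
    #include <d3d9.h>

    bool SupportsFp16Msaa(IDirect3D9* d3d)
    {
        DWORD qualityLevels = 0;
        HRESULT hr = d3d->CheckDeviceMultiSampleType(
            D3DADAPTER_DEFAULT,
            D3DDEVTYPE_HAL,
            D3DFMT_A16B16G16R16F,      // FP16 render target format
            TRUE,                      // windowed (illustrative)
            D3DMULTISAMPLE_4_SAMPLES,
            &qualityLevels);
        return SUCCEEDED(hr);
    }
    ```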
  13. J

    Hdr + Aa

    Well, ATI doesn't have all forms of HDR covered by any stretch of the imagination (particularly float-based formats usable with multi-sampling). Both companies have a pretty limited set of formats they each support right now.
  14. J

    Hdr + Aa

    No one disputed that. But you're trying to compare apples to oranges with your screenshots.
  15. J

    Hdr + Aa

    And here I thought we were talking float-based HDR and not integer formats.
  16. J

    Oblivion running @ 1920x1200, thoughts?

    Well, the Carmack question isn't very relevant since he's never coded an engine intended to render large outdoor areas. Sure, some of his engines have been extensively modded for such usage, but that's another issue. Anyways, no doubt the engine could run better since...
  17. J

    Oblivion running @ 1920x1200, thoughts?

    I play at 1920x1200 with 4x AA, HDR, and 8x HQ AF. Even edited the oblivion.ini for ugridstoload=9, water reflections on, and have some higher res. texture mods running. My game system is a FX-60, 2GB of memory, and two X1800 XTs. Played a stealth character who used his marksman skill quite a...
  18. J

    Hdr + Aa

    I've got two X1800 XTs in Crossfire and I play the game at 1920x1200 with 4x AA, HDR, and 8x HQ AF enabled. Simply jaw-droppingly beautiful, the best graphics I've ever seen rendered real-time on my PC. And Oblivion isn't a shooter so anything close to 30fps is plenty smooth for me (heck, I...
  19. J

    HDR+AA in Oblivion for Ati X1K!

    The 1680x1050 score was with a single X1900 XTX. I can't get that rez in Crossfire mode either. I had to reboot to force AA via the CCC once I reinstalled Crossfire.
  20. J

    HDR+AA in Oblivion for Ati X1K!

    40fps at 1920x1200 with 4x AA, HDR, and 8x HQ AF with two X1800 XTs.
  21. J

    HDR+AA in Oblivion for Ati X1K!

    35fps at 1280x1024 and 41fps at 1280x720 with 2x AA and 2x regular AF.
  22. J

    HDR+AA in Oblivion for Ati X1K!

    1680x1050, 4x AA, 8x HQ AF, HDR, average frame rate of 27 fps on a FX-60 and X1900 XTX. Ran around the waterfront area outside the Imperial City in daylight, by the docks and then outside the buildings facing the water and the hills across Lake Rumare. Very little grass, though some trees...
  23. J

    HDR+AA in Oblivion for Ati X1K!

    I've got two X1800 XTs I reviewed a few months ago, so I'm hoping for playable frame rates at 1680x1050 with 4x AA and HDR on my 2405 display. If they drop this driver tonight I'll find out after work. I've also got a X1900 XTX so I might give that a whirl too to see how single-card...
  24. J

    HDR+AA in Oblivion for Ati X1K!

    Be afraid, be very afraid. :D
  25. J

    Good Oblivion Benchmarks on Firingsquad

    http://www.beyond3d.com/forum/showpost.php?p=733800&postcount=183 Gee, dare I try this at 1920x1200 with 4x AA and HDR enabled on two X1800 XTs? :D
  26. J

    Good Oblivion Benchmarks on Firingsquad

    What I find odd is why a 7900 GTX would lose almost 33% of its frame rate going from 10x7 to 12x10 with 4x AA and 8x AF enabled.
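    A quick sanity check on that number (my own back-of-the-envelope arithmetic, not from the article): 1280x1024 is 1,310,720 pixels versus 786,432 at 1024x768, roughly a 1.67x increase, so a purely fill-rate-limited game would be expected to lose about 40% of its frame rate; a 33% drop is actually a bit better than linear scaling with pixel count.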
  27. J

    Good Oblivion Benchmarks on Firingsquad

    I've been told Crossfire support for Oblivion is coming with the 6.4s (AFR to be exact), so, yes, benchmarking the game in Crossfire is a bit daft. Not sure about tweaks/optimizations in the 6.4s, though.
  28. J

    ATi vs. NV --- Availability, Plenty of X1900's, very few 7900's

    Firingsquad's Oblivion benchmark article has a comment that claims there's a 7900 GT shortage going on. For whatever that's worth (insider knowledge or just an offhand opinion thrown out).
  29. J

    ATi vs. NV --- Availability, Plenty of X1900's, very few 7900's

    Thank you for providing the one sane post in this entire thread.
  30. J

    Oblivion Video Card Benchmarks

    There is no SM3 support in Oblivion, and the only way an HDR format would require a specific shader model is if the developer arbitrarily tied the feature to a shader profile in their game. HDR and DirectX shader models have nothing to do with each other.
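    For what it's worth, the two capabilities are queried separately under Direct3D 9; a hedged sketch, assuming an already-created IDirect3D9 interface (the function name and the X8R8G8B8 display format choice are illustrative):

    ```cpp
    // Sketch: FP16 render-target support and the exposed pixel shader version
    // are independent Direct3D 9 capability queries; nothing in the API ties
    // an HDR surface format to a particular shader model.
    #include <d3d9.h>
    #include <stdio.h>

    void ReportHdrAndShaderCaps(IDirect3D9* d3d)
    {
        // 1) Is an FP16 texture usable as a render target? A pure format query.
        HRESULT hr = d3d->CheckDeviceFormat(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
            D3DFMT_X8R8G8B8,           // assumed display mode format
            D3DUSAGE_RENDERTARGET,
            D3DRTYPE_TEXTURE,
            D3DFMT_A16B16G16R16F);
        printf("FP16 render target: %s\n", SUCCEEDED(hr) ? "yes" : "no");

        // 2) Which shader model does the driver expose? A separate cap.
        D3DCAPS9 caps;
        if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
            printf("Pixel shader version: %lu.%lu\n",
                   (unsigned long)D3DSHADER_VERSION_MAJOR(caps.PixelShaderVersion),
                   (unsigned long)D3DSHADER_VERSION_MINOR(caps.PixelShaderVersion));
    }
    ```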
  31. J

    x1900xt or 7900GTX for TES4: Oblivion

    Eric told me that in e-mail Friday too. Bear in mind he's not on the software team. I was told by someone else, a hardware reviewer at a major site who chats with IHV employees quite a bit, both companies are working on updated drivers to tune performance now that they have final code (that's...
  32. J

    x1900xt or 7900GTX for TES4: Oblivion

    Hey, if both companies can eke out another 15-20% (enough to be noticeable while playing) by tuning their drivers for the game now that they have their hands on final code, more power to them. I heard NVIDIA has it running with QuadSLI (I can only drool at the thought of playing it at 2560x1600...
  33. J

    x1900xt or 7900GTX for TES4: Oblivion

    Heh, apparently both companies are quickly working on updated drivers for this game. The race to win the Oblivion benchmark battle is on!
  34. J

    x1900xt or 7900GTX for TES4: Oblivion

    Absolutely.
  35. J

    x1900xt or 7900GTX for TES4: Oblivion

    Depends on what you want. The 360 gives 1280x720 with, I believe, 2x AA and HDR. Techies with disposable income and high-end machines may prefer Oblivion on the PC as a showcase title that finally delivers some of the eyecandy that justifies spending a lot of $$ on their gaming rigs.
  36. J

    x1900xt or 7900GTX for TES4: Oblivion

    One of the PC Gamer reviewers has a review copy of Oblivion and says that on his 4800+, 1GB RAM, single X1900 rig he plays at 1680x1050 with HDR and all in-game settings maxed out and, using Fraps to monitor the frame rate, the game stays pegged at 60 (Vsync is enabled and I suspect the...
  37. J

    x1900xt or 7900GTX for TES4: Oblivion

    Contributory as always. You rock, dude.
  38. J

    x1900xt or 7900GTX for TES4: Oblivion

    Gonna depend on the amount of off-angle surfaces in the scene being rendered. ManicOne: I doubt either company considers their ROPs as part of their pixel shader pipeline. And ATI does not consider the texture units a part of the pipeline with the X1xxx parts since they were decoupled...
  39. J

    x1900xt or 7900GTX for TES4: Oblivion

    Considering you butchered those fragment pipes in your descriptions I'd suggest you follow your own advice.
  40. J

    x1900xt or 7900GTX for TES4: Oblivion

    http://www.beyond3d.com/forum/showpost.php?p=716900&postcount=40 This post is a guess by a fairly technical poster at B3D, but I wouldn't be surprised if he's right. The 360 can do HDR + AA, but the developer has said no existing PC graphics chip can, which means an FP16 format certainly...