Search results

  1. B

    GTX 750 Ti based on Maxwell with 64-bit ARM CPU?

    Who cares about benchmarks? How about some real-world applications/games? GPGPU is still in its infancy; in 2012, when Kepler was launched, it was even more so. What tessellation was for AMD (in the end not really important aside from synthetic benchmarks and a few select games) with HD5000/6000 now is...
  2. B

    4 Weeks with Radeon R9 290X CrossFire @ [H]

    I cannot understand how the author arrives at the opinion that 290X CF is better than Titan SLI (in Far Cry 3 which he mentions mainly). In the [H] review 290X CF does 43.3 fps avg, Titan SLI does 41.1 fps avg. That is basically the same. Titan would perform even better if the temp target were...
  3. B

    Question for [H] (Kyle and Brent)

    Thank you Kyle and Brent! I have read this explanation, but it still doesn't address the following question: All of what you write there also applies to Nvidia's Boost temp/power targets. It is not overclocking, it is under warranty, it was set conservatively to make the card quieter and less...
  4. B

    Question for [H] (Kyle and Brent)

    Out of the box is out of the box, no changes to the card. Nvidia's Boost is also an advertised feature with all the possibilities it comes with. So we have to deal with both. Take a look at this slide from the 780 Ti review slide deck. It clearly conveys the fact that you can increase...
  5. B

    Question for [H] (Kyle and Brent)

    No it is not. If you open the window, put a fan in front of the card etc. it will boost higher without any change to the card itself or the software at all. Boost 2.0 allows for selecting different settings, it is within the specification and within warranty. No, it is neither overclocking nor...
  6. B

    Question for [H] (Kyle and Brent)

    Welcome to the world of temperature-dependent clock speeds. Both go hand in hand nowadays. And neither uber mode nor increased temperature target settings are overclocking. See above. Performance and cooling go hand in hand now. You cannot separate the two anymore. This isn't about 4K or...
  7. B

    Question for [H] (Kyle and Brent)

    No, why? Is AMD's BIOS switch in their drivers? ;) I know one could nitpick over this until the end of time, but the fact of the matter is that with both solutions you have the ability to change these settings without needing to download extra software (the tools come with the cards) and keep...
  8. B

    Question for [H] (Kyle and Brent)

    Hi guys, I was wondering about something. You test the new AMD cards in uber mode, thus changing the default setting. Yet on the Nvidia side you don't change the temperature target or fan speed although those modifications would be covered by warranty as well. Why is that? In my opinion...
  9. B

    GTX 780/Titan Owners, are you jumping ship to 290X?

    No. And anyone who selected "yes" in the poll is stupid. The 290X won't be a sizable upgrade over the 780, even with Mantle. The reasonable thing to do is to upgrade for a 70+% performance gain.
  10. B

    Nvidia DX11 AA-bits petition

    Just a heads up: https://forums.geforce.com/default/topic/544701/geforce-drivers/petition-for-directx-11-anti-aliasing-driver-profiles/post/3891774/#3891774 So if you haven't already, please support this petition.
  11. B

    2x Radeon HD 7990s (4 GPUs) or 2x GTX 780s (2 GPUs) ?

    Wrong. If you know how to use decent settings, at least 3 GPUs can scale very well in most games.
  12. B

    Cuda Core Comparison

    Hm, no they are not. The K6000 is GK110; the GTX 680 is GK104. GK110 has dedicated double-precision compute units alongside its CUDA cores, which GK104 lacks. GK110 also supports CUDA compute capability 3.5, while GK104 only supports 3.0.
  13. B

    GTX titan paired with FX 8350

    Depends on the game and settings and scene. The 8350 can be definitely quite a bit slower than the 2500K. I would not pair it with a Titan. If one can spend 1000 bucks on a gfx card, one can spend 200-300 on the best gaming CPU.
  14. B

    Nvidia DX11 AA-bits petition

    As many of you know, Nvidia has AA-bits that can be used to force antialiasing in (some) games where AA is not natively supported or to improve the quality of AA vs. the ingame solution. The problem: These bits are only available for DX9. There is a handful of DX10 games with predefined bits...
  15. B

    AMD Crossfire a scam - Almost no benefit over single card

    Lorien is posting the same shit in other forums, too. He isn't objective and he doesn't think before he posts, making ridiculous comparisons. He should be thankful that problems are fixed, no matter who finds them. Instead he is spewing hate - disgusting and ungrateful.
  16. B

    NVIDIA Prepares GeForce Titan Ultra Graphics Card

    And yet the same source is cited... You should definitely change the thread title and first post to clarify that this is only speculation, nothing else.
  17. B

    NVIDIA Prepares GeForce Titan Ultra Graphics Card

    No, you're posting speculation from one single source as fact. Just because other sites regurgitate these speculations doesn't give them more substance. That is the problem.
  18. B

    NVIDIA Prepares GeForce Titan Ultra Graphics Card

    People, please! Read the original source before posting nonsense. This is pure speculation only. And this speculation speaks of Q4'13/Q1'14 for a possible Titan Ultra. There is absolutely no hard evidence, nor are there concrete rumors that corroborate these speculations.
  19. B

    NVIDIA Prepares GeForce Titan Ultra Graphics Card

    Nope. If you had bothered to check the original source, it says that this is what a Titan Ultra could look like. Nothing more, nothing less.
  20. B

    Enthusiast setups now CPU limited?

    No. CPU bottlenecks don't ask "why". If the game is CPU bottlenecked, it just is. We laypersons do not have the competence or knowledge to judge which title could be better optimized and which one could not.
  21. B

    7970 Crossfire smooth in Bioshock Infinite?

    Bioshock Infinite is not very demanding. You will be CPU bottlenecked, thus no microstutter. I run it with two Titans@1100 MHz at 2880x1620 and my GPU load is 60% at maximum on both GPUs. I'm using vsync, though. Don't know if you would.
  22. B

    Interesting data on Crossfire vs SLI

    Not if you fall below the fps cap, in your case 60 fps. What if you want to increase image quality so that you get drops to 40 fps in some scenarios? Limit the frame rate to 40? Maybe CF should be benched with an fps cap at the minimum fps, then :D
  23. B

    GTX Titan (final specs and bench)?

    A CPU bottleneck doesn't have to be absolute; it can be relative as well. Meaning card A could be 40% faster than card B when GPU-bound, but due to the CPU the observed advantage is only 20%.
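The arithmetic in that excerpt can be sketched with hypothetical numbers (the 120 fps CPU ceiling and the per-card GPU-bound rates below are assumptions chosen for illustration, not measurements):

```python
# Relative CPU bottleneck: the delivered frame rate is capped by
# whichever of GPU and CPU is slower, which compresses the gap
# between two cards without hiding it entirely.

def observed_fps(gpu_fps: float, cpu_fps: float) -> float:
    """Delivered frame rate: the slower of the GPU and CPU rates."""
    return min(gpu_fps, cpu_fps)

cpu_limit = 120.0                          # assumed CPU ceiling (fps)
card_b = observed_fps(100.0, cpu_limit)    # card B: GPU-bound at 100 fps
card_a = observed_fps(140.0, cpu_limit)    # card A: 40% faster GPU, but CPU-capped

print(f"GPU-bound advantage: {140.0 / 100.0 - 1:.0%}")   # 40%
print(f"Observed advantage:  {card_a / card_b - 1:.0%}")  # 20%
```

With a faster CPU (or a more GPU-heavy scene) the ceiling rises and the full 40% gap reappears; with a slower CPU both cards hit the same ceiling and the bottleneck becomes absolute.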
  24. B

    Nvidia has no new cards planned for 2013 either?

    What about a traditional refresh in between?
  25. B

    What's the point of SGSSAA ?

    The newer drivers (310 series and above) have automatic LOD adjustment, which is needed to restore image sharpness and detail under SGSSAA. Sometimes SGSSAA can conflict with post-processing effects, thus it may blur. In those cases custom AA bits can be employed that reduce or eliminate blur: It's German, but the list...
  26. B

    New GeForce Titan??? (GK110)

    The 690 is dual-GPU. Two performance cards in SLI have always beaten the single high-end part of a given generation: 560 Ti SLI > 580, 460 SLI > 480, 260 SLI > 280, etc.
  27. B

    Manufacturing costs of video cards?

    Q2/2009: Q3/2011: Both from Mercury Research.
  28. B

    Upgraded gtx570 to hd7970 but not much faster?

    GTA 4 and Guild Wars 2 are quite CPU bound. Test some more demanding games.
  29. B

    Any news on GeForce GTX 780?

    SLI/CF with low-end cards doesn't make sense.
  30. B

    Anyone running SLI 670/680 and play FarCry 3?

    First, see whether you even need them. But I read about a low GPU usage bug when shooting at specific things and in cut scenes, iirc. If you want to use the new custom SLI bits, you would need Nvidia Inspector (which only modifies driver settings that are not exposed via the NV control panel)...
  31. B

    Anyone running SLI 670/680 and play FarCry 3?

    I'm running SLI on FC3. You'll need to make sure that the game's exe is included in the profile; this was a bug that was corrected with the latest drivers. But you never know. Additionally, you should use custom SLI bits. I'm using 0x080942F5 for DX11 and get perfect scaling when GPU bound...
  32. B

    Far Cry 3 Performance on Nvidia cards

    Force vsync via the driver instead of using ingame vsync? No stutter here. Interestingly, the ingame vsync stutter only occurred after a game update; in the beginning it was fine.
  33. B

    Best Driver Removal Software?

    This. Don't use these tools, they are completely unnecessary and do more harm than good.
  34. B

    Far Cry 3 Performance on Nvidia cards

    Good to hear :) Below 50 fps it doesn't seem smooth to me with 580 SLI; probably microstutter.
  35. B

    Far Cry 3 Performance on Nvidia cards

    You cannot force AA in DX10+ applications. Run the Nvidia update to get the DX11 exe added to the profile and use these SLI compatibility bits: 0x080942F5
  36. B

    7970 CFX vs GTX 670 SLI

    SLI: smoother gameplay due to frame metering and another goodie that is coming soon, but I have promised my source not to divulge more information. Editable SLI profiles: low GPU usage, stuttering, or no SLI profile at all (yet)? Just ask at the guru3d forums, Nvidia forums, or 3DCenter, and mostly...
  37. B

    Which card coming first ? HD 8970 or GTX-780 ?

    It's normal. New gen on a new process: 50-70% faster. Refresh on the same process: 10-20% faster. Rinse and repeat.
  38. B

    Which card coming first ? HD 8970 or GTX-780 ?

    It's not next-gen though, just a refresh on the same process node. 6970 was 15% faster than 5870, too ;)
  39. B

    Which card coming first ? HD 8970 or GTX-780 ?

    33% faster than 7970 GHz? How would they do that without blowing the power envelope? They cannot increase memory bandwidth, they already have that maxed out. 15-20% would be my guess.
  40. B

    Which card coming first ? HD 8970 or GTX-780 ?

    Never :D We are still on 28nm. GTX680+50% at best, that could fit into a 250W TDP. I think HD8970 will come first, but GTX780 will come shortly afterwards. My guess would be that they will be about one month apart.