I've been using Windows 8.1 Pro with Media Center 64-bit since I upgraded from Windows 8 Pro the weekend before last.
I've found Windows 8 itself to be OK although I'm not a fan of the Start screen at all; it's pretty much useless for me, particularly as I've been using Start8 more or less...
I upgraded to Windows 8.1 Pro with Media Center 64-bit from Windows 8 Pro with Media Center the weekend before last, ahead of its official release, because ever since I did a clean install of Windows on my new Haswell i7-4770K system back in June of this year I've been unable to play The Raven...
I downloaded this GTX TITAN driver using Internet Explorer 10 from http://catalog.update.microsoft.com/v7/site/ScopedViewInline.aspx?updateid=82fad0a4-4a8e-46b2-9609-02ec610d6176, extracted the CAB file and was able to install the drivers without any problems or INF modifications just by running...
Yeah, I have noticed the same issue as JCNiest5 and consequently only score 6.7 in the W.E.I. test with SLI but 7.8 if I force Single GPU mode. Not a big deal, I know, but the idea behind W.E.I. was to assign a rating to your PC so that you'd know whether a game would run well or not (via Game...
Thanks David. I thought as much but I thought I'd ask as I'm waiting for new drivers from NVIDIA to fix the constant freezing I'm getting with Tomb Raider.
On the second page, you refer to using v314.70 beta drivers for the NVIDIA graphics cards at the bottom of the page, but v314.70 WHQL in the Video Card Configuration graphic. Is that a typo and you meant v314.07, which is WHQL certified, or are you using an as-yet-unreleased driver, one that...
This game is absolutely excellent but, despite having the hardware to run at maximum settings at 1920x1200, 2xSSAA, 16xAF with tessellation, High Precision and TressFX on (the benchmark returned an ave. of 50 fps), I had two total system freezes that required a hard reset at the point where Lara...
The news that this game is locked at 1024x720 and 30 fps reads to me like the developers have ported the console engine over without any real optimization. I'm betting that if we could set resolutions higher than 1024x720, performance would diminish and show just how poor the...
I was getting at least one or two crashes per browsing session when I had Firefox 13.0 and Firefox 14.0 beta 7 installed with Flash 11.3.300.257. However, I was able to fix those by setting ProtectedMode=0 in the Flash mms.cfg file. I'm now using Flash 11.3.300.262 with Firefox 14.0 beta 8 and...
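For reference, the whole fix is a single entry in Flash's mms.cfg (the file's exact location varies by Windows version and Flash bitness, so check Adobe's documentation if you can't find it on your system):

```
ProtectedMode=0
```

This disables Flash Player's protected-mode sandbox, which was what kept crashing with those Firefox builds.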
Erm... why is it? I mean NVIDIA cards perform better than AMD cards in DiRT 2 and 3, both AMD Evolved games, but that doesn't mean it's sad, just that their drivers are better optimised for those games. :confused:
I saw one post on the Guru3D forum (I think) that claimed SSAO looked better than HDAO and, sure enough, when I tested it I found that it did look better. HDAO has very thin, faint shadowing whereas SSAO looks bolder and more obvious. Maybe HDAO is more realistic in terms of ambient occlusive...
Since the patch, the retail and non-Steam versions now work with MSI Afterburner and EVGA Precision X's OSD, and from my testing the reported VRAM usage appears correct: with 0xMSAA + FXAA=Very High it shows around 1.9 GB of memory required out of the 4 GB available (the VRAM is doubled...
P.S. I should add that Max Payne 3 is a great game. Apart from the embarrassing custom install issue, which caused the game to not load if there were spaces in the pathname (thankfully this issue was fixed in the patch released within a week of the game's launch), and the apparently broken MSAA...
Great article, but I'm a little disappointed at how poor MSAA looks in this game. As you can see in my lossless PNG comparison *HERE*, MSAA barely offers any anti-aliasing at all compared with FXAA.
What is even more puzzling, at least on my setup as the issue may be specific to my PC, is that...
Buy a 4 GB card for a game with broken MSAA support?!? :confused:
Seriously, I don't know whether it's a bug in the game, something specific to the GTX 680 or its graphics driver, but MSAA simply doesn't work properly for me, as my extensive screenshot comparisons show. MSAA even breaks FXAA...
No, although the first time I ran the game after it applied today's patch it started up windowed. However, pressing Alt-Enter fixed it and the game is now back to loading up fullscreen as it should.
The patch also enabled the use of MSI Afterburner and FRAPS in the retail version (which...
Interesting, so it isn't just NVIDIA/GTX 600 cards that are affected. Based on your screenshots, it looks very much like MSAA isn't working properly on AMD's HD 7970 either. Thanks for that.
MSAA + FXAA on my system looks as bad as MSAA + *NO* FXAA, which is strange because you'd think having both...
I've posted some better comparison shots of the different combinations of MSAA and FXAA using lossless PNG images here: http://forums.guru3d.com/showpost.php?p=4333270&postcount=59
MSAA appears to not be working properly on my PC as the images confirm that only FXAA *on its own* effectively...
I'd be interested to hear more feedback and may try to take some uncompressed PNG images at full 1920x1200 resolution, from the same angle, so I can better show the differences. One other person on the Guru3D forum has also said that MSAA looks worse than FXAA, but I'd like to see what others...
Here is a screenshot comparison showing the differences between various combinations of MSAA and FXAA; granted, I couldn't get exactly the same angles, and the site I used to upload the images downsampled them from 1920x1200 to 1600x1000.
http://forums.guru3d.com/showpost.php?p=4332692&postcount=26...
I have a pretty decent setup so I've been able to run this game maxed out except for 8xMSAA as I have insufficient VRAM (going from 4xMSAA to 8xMSAA requires an extra 1.4 GB or 700 MB per GPU!). So I settled on 1920x1200, maxed settings (Tessellation=Very High), 4xMSAA, FXAA=Very High and...
I bought a Point of View GTX 680 at launch to replace my GTX 580 and was very happy with the performance increase on my 24" 1920x1200 display. In benchmarking I saw between 15% and 70% increases in my framerates (Heaven 3.0 and 3DMark11 saw the biggest increases) and overall I'd say the card...
This game is definitely an outstanding one I've thoroughly enjoyed and it deserves all the praise it gets. However, it seems to me that the game was designed from the off with an eventual console port in mind as I remember thinking that as soon as I loaded it up last year and noticed that the...
I guess it ultimately comes down to user experience as I've seen both those kinds of tearing on LCD and CRT TV/monitors.
For example, as I mentioned earlier, on the same HDTV, GTA IV on the Xbox 360 has screen tearing only at the top of the screen, usually in the overscan area or with 1:1...
It is. The article is incorrect IMO as many others have pointed out. If a game with v-sync disabled drops to, say, 50 fps on a 60 Hz display (as would happen with Adaptive V-Sync) and stays at that level for a few seconds (unlikely but let's pretend it does for the sake of argument) then it will...
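To put some numbers on that (a quick toy model of my own, in Python, not from the article): with v-sync off, each buffer swap lands at some point within the current scanout, and any swap that isn't exactly at vblank shows up as a tear at that screen height. At 50 fps on a 60 Hz display the swaps walk through the refresh period, so you still get tears even though the framerate is *below* the refresh rate.

```python
# Toy model: with v-sync off, a buffer swap that lands mid-scanout
# produces a tear at that point on the screen.
refresh_hz, fps = 60, 50

tears = []
for n in range(1, 11):                 # first 10 frame swaps
    # swap time measured in refresh periods is n * refresh_hz / fps;
    # keep the fractional part exact using integer arithmetic
    frac = (n * refresh_hz) % fps
    if frac:                           # swap mid-scanout => visible tear
        tears.append(frac / fps)       # tear position, fraction of screen height
print(tears)  # tears at 20/40/60/80% of the screen height, cycling
```

Only every fifth swap happens to line up with vblank; the rest tear at varying heights, which matches what I see in practice.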
Exactly. I wish I'd written that!
I've seen screenshots of tearing in action that were captured, using dedicated hardware, directly from the frame buffer before the image was sent to the display, so that is proof that the display has no bearing on whether tearing happens or not.
Again I disagree completely with that statement as I've been using triple buffering in both OpenGL and Direct3D games for many years now and all I've noticed is how much smoother the gameplay feels in comparison to double-buffered v-synced games. As I cannot stomach screen tearing, even in minor...
I respectfully disagree with that, as I've played games across a wide variety of displays and I've seen bad tearing on all of them, even on my old 17" CRT monitor. LCDs in my experience are no worse than CRTs for showing screen tearing but, of course, the amount of tearing and how obvious it...
Screen tearing is a very subjective thing, though, even if you can prove it occurs: I've seen console game benchmarks where the percentage of screen tearing is shown and the tears appear as red vertical lines in graphs displayed during gameplay videos (such as the ones on Eurogamer.net's...
P.S. It would have been interesting if this article had also compared the framerates from using Adaptive V-Sync and normal V-Sync + Triple Buffering (using D3DOverrider).
Also, wouldn't capping the framerate at the refresh rate of your display with V-Sync off also effectively be the same...
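To illustrate what I mean by a cap, here's a toy sketch (Python, my own illustration; the names are made up): a limiter just sleeps off the remainder of each frame's time slot. Note that, unlike V-Sync, nothing here waits for the display's vblank, so a swap can still land mid-scanout; a cap limits how often tearing can occur rather than eliminating it.

```python
import time

def run_capped(render_frame, cap_hz=60, frames=10):
    """Crude frame limiter: after rendering, sleep off the rest of the
    1/cap_hz time slot so no more than cap_hz frames/sec are presented.
    Unlike V-Sync, swaps are not aligned to vblank, so tearing remains possible."""
    period = 1.0 / cap_hz
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                     # the game's draw call would go here
        elapsed = time.perf_counter() - start
        if elapsed < period:
            time.sleep(period - elapsed)   # pad the frame out to the cap
```

So my guess is a cap gets you the steady frame pacing but not the tear-free output.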
I just came into this thread to post the very same thing, as I was surprised to read an article on this site implying that tearing doesn't occur when the framerate dips below the refresh rate of the display. It most certainly does, at least on every display I've ever used, CRT and LCD...
I'm surprised that the GTX 680 starts to throttle its clocks at 70 C, particularly as the thermal shutdown temperature of this GPU is 98 C according to NVIDIA. Considering almost every other high-end GPU hits 75-85 C under heavy load, 70 C seems a little overcautious IMO. 75 C would have been a...
Yeah, I see the same behaviour here with my GTX 680.
With 132% Power, +150 MHz GPU offset, +400 MHz memory offset, a custom fan profile (at 40 C, it raises the fan speed by 1% for every 1 C increase in temp) and NO voltage tweak, my card will run at 1,287 MHz GPU/6.8 GHz memory clocks until...
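For anyone wanting to replicate that fan profile, this is its shape as a sketch (Python, my own illustration; the 30% base speed below 40 C and the 100% cap are assumptions, only the +1% per 1 C slope from 40 C is what I actually set in Afterburner):

```python
def fan_speed(temp_c, base_pct=30, start_c=40):
    """Fan duty % for a profile that adds 1% per 1 C above start_c.
    base_pct (30%) and the 100% cap are assumed values, not from my settings."""
    if temp_c <= start_c:
        return base_pct
    return min(100, base_pct + (temp_c - start_c))

print(fan_speed(40), fan_speed(70))  # 30 60
```

So at the 70 C boost-throttle point the fan would already be running 30 points above its base speed.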
My Point of View GTX 680 was doing 1,137 MHz on the GPU clock out of the box (i.e. the default 100% Power, +0% GPU/memory clocks) when I tested it with a number of games, including Crysis 2 and Alan Wake (@ 1920x1200 with max settings and 8xAA/16xAF/FXAA where possible). With Power set to 132%...
@Sojuuk - My problem was that when I loaded a game which had previously run at a core clock of 1,006 MHz, it was still showing 705 MHz, though the performance didn't seem noticeably worse, which you'd expect if it really were 30% underclocked. That 705 MHz core clock never wavered either...
P.S. I can't help but wonder why the core clock showed as 705 MHz when it 'stuck' on my system. The pre-0.6.0 versions of GPU-Z for example erroneously display the GTX 680's default core clock as 705 MHz and FutureMark's SystemInfo still shows that value too even when everything else shows it is...
I'm not normally one to overclock my graphics cards as I've never considered the extra power consumption/heat worth the extra 1-3 fps I'd typically get from testing. That all changed with the GTX 680 though which auto-clocks itself anyway without any user intervention. I was regularly seeing my...