Looks legit. Their site says at the bottom,
Site down until further notice
February 21, 2015
FrozenCPU.com will not be taking new orders or processing pending orders for at least the next 30 days. We currently are attempting to hire new staff to get things moving again. We apologize for...
Uhh... the headline is misleading.
In the article he never says "nobody plays shooters with a mouse and keyboard". He talks about how Halo bucked various shooter conventions, and says that nobody plays shooters like they used to. He probably just meant that those things that were, in his words...
Well yeah, but the question is... can it play Crysis?
:p
I could personally never justify spending $10k to play video games, but hey man.... to each their own. Most of us wouldn't be able to assemble a rig like that even if we wanted to.... but I think we all sorta secretly want to.
Aren't all these beta drivers a bit excessive? I mean, just put out a quality WHQL driver each month and I think most everyone would be plenty happy.
7970. That's essentially the issue (I'm running card -> receiver -> monitor), but it happens constantly unless there's something graphically intensive open, like Flash or a game or video. So if I try to listen to iTunes without, say, my browser open to the Youtube home page, the audio keeps...
Did they fix the issue of HDMI audio cutting out intermittently... y'know, the issue that's forced me to keep rolling back to 12.4?
No, they didn't. Sigh. With driver 'support' like this, my next card will probably be from nVidia.
I agree that it seems weird that they sort of left out that the oc'd 7970 was the fastest card, but there aren't any guarantees that a reference card will clock that high, so it's kind of thorny to use that as the basis for a recommendation when at stock speeds nVidia has the advantage. They...
The most interesting thing about this is the comment at the end... that you guys are "going Green". I guess that's because you have multi-monitor dual-GPU rigs, and you've made it clear in your evals that nVidia smokes AMD in those departments.
But for us mortals who have a single-monitor rig...
Alright, I stuck a video on Youtube showing VSync forced from the CCC with no tearing or jumping. Upon playing around a bit, it appears we were both half-right. My experience is this:
VSync forced in the CCC + in-game VSync OFF = no vsync, lots of tearing
VSync set to "application" in the CCC...
I upgraded from a 22" 1680x1050 to a 24" 1080p monitor with a nice LED backlight, and I like it much more. You don't really miss the height; the 1080p format is closer to your peripheral vision anyway, so it's better to go wide than tall. Plus, the LED backlight makes a huge difference in the...
Wrong. It's triple buffering, not VSync, that does not work in DirectX. Forcing VSync at the driver level works fine in DirectX and always has; in fact, that's how I've almost always done it, even when I used Direct3D Overrider with Windows XP. I'd be happy to post a video to Youtube with it on...
????????? If that's the case then I have no idea why you objected to my original post or why you care about adaptive VSync. It sounds like you're saying you don't get the jumping at all (even with the control panel set to application setting and in-game VSync on) regardless of your settings...
THIS:
I booted up Crysis Warhead, with VSync enabled in-game. Sure enough, both FRAPS and the in-game counter showed a jump between 30 and 60 fps. But then I hopped over to the Catalyst Control Center, switched the VSync option to "On, unless application specifies", and my framerates returned...
Yes, cool guy, I read the review. And I just booted up a game of BF3 to see if I was tripping. No jumps... the frame rates were consistent regardless of where I was... around 50 in some areas, low 30s in others. None of this 60, 30, 20 stuff. I'm familiar with jumping because it annoyed me to no...
I'm confused here. I thought the old issue of framerate jumping was just a relic of Windows XP being designed primarily for CRTs, because LCD monitors don't technically have "refresh rates" the same way CRTs do.
I've been using VSync for years on all my games in Vista and now Windows 7, with...
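For anyone following along, the "jumping" being argued about above comes from plain double-buffered VSync: a frame that isn't ready at a refresh boundary has to wait for the next one, so sustained framerates snap to refresh/n (60, 30, 20, 15... on a 60Hz display). Here's a toy sketch of that quantization; it's a simplified model of the classic behavior, not any particular driver's implementation, and triple buffering or adaptive VSync exist precisely to avoid it:

```python
import math

# Toy model of double-buffered VSync on a 60Hz display: a frame that misses
# a refresh boundary waits a whole extra refresh interval, so the delivered
# rate snaps to 60/n (60, 30, 20, 15, ...).
REFRESH_HZ = 60

def vsync_locked_fps(render_fps):
    """Delivered fps under double-buffered VSync (no triple buffering)."""
    if render_fps >= REFRESH_HZ:
        return REFRESH_HZ
    # Each frame occupies ceil(refresh_rate / render_rate) refresh intervals.
    intervals = math.ceil(REFRESH_HZ / render_fps)
    return REFRESH_HZ / intervals

for fps in (75, 59, 45, 25):
    print(f"renders at {fps} fps -> displays at {vsync_locked_fps(fps):.0f} fps")
```

Note how a card rendering at 59 or 45 fps both get knocked down to 30, which is exactly the 60-to-30 jump people see in FRAPS.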
I think it's a huge waste to blow $600 on liquid cooling when a high-end air cooler and an Accelero for your GPU will get you comparable results without any of the hassle or risk; at least, comparable enough to the degree that it affects actual clock speeds, noise, and real-world performance...
Funny.... I guess I'm old school, but I'm gaming at 1920x1080 and I can still run all my games maxed out with my old 5870. Honestly, the fact that [H] has to use triple-monitor rigs to push just a single video card just shows that developers aren't really keeping pace with hardware technology...
Doesn't this require people to, y'know, actually be using Google Plus? I don't know if they've visited their own site lately, but there are a lot of cobwebs and crickets.
I do think it's pretty crazy how times have changed when these days we're disappointed when we "only" get an 800MHz overclock. A 4.2GHz Core i7 is still a ridiculously fast chip that far exceeds the normal usage needs of most. It wasn't that long ago that AMD FX chips were all the rage and if...
It's truly baffling that id completely ignored basic things like
- optimizing the game for current drivers
- providing video settings since not everyone can run the game at full spec
- calibrating the texture streaming so it's optimized for the platform
Having said that, these are in the...
Hmm... well, it looks like the update went fine for those of us who have the Steam version. It auto-updated to the current version fine, and I installed the patches in moments without issue.
The game now looks closer to the first game, though personally I still think the first looks better (...
Honestly, I'm growing more than a little weary of reviewers whining about "consolization" every time a game doesn't push graphical boundaries like Crysis. And hell, when Crysis came out, everyone still complained because no one had a rig powerful enough to run it on the highest settings. You...
You don't talk smack about your competitor's products unless you have a reason to be worried. Despite ATI's relative success with the previous generation, they still control only about a third of the discrete GPU market, while nVidia controls the rest. nVidia has what could well be a very...
Nice card... BUT...
I'm getting a little annoyed at how [H] editors seem to be gushing about how much it's beating nVidia cards at the same price point. They always seem so totally amazed that a new card is faster than a last-generation card that was built on architecture that's over a year...
Okay, so wait a second. You're comparing a $170 card to the 4870, which is around $100 more, and you're surprised that the latter performs better? It was my understanding that the 9800GTX+ was meant to compete with the 4850, not the 4870. The 4850 is around the same price as the 9800GTX/+. Like...
Puh-leeze. We've been hearing this nonsense from Charlie at the INQ forever. He has a penchant for spreading FUD about nVidia, and I can't believe anyone would actually think anything he has to say qualifies as "news". If this is ever confirmed by anyone who has a shred of journalistic integrity...
Believe it or not, I don't use any RAM cooling at all. I tried using the Swiftech RAM sinks but the stupid things didn't stick worth a crap and kept falling off. They didn't make any difference at all in overclocking. My stress test is ATITool's artifact scanner, so I'm pretty confident in the...
I run a water cooling rig myself, but while the 8800GTX got a nice boost from water cooling, with the 9800GTX my overclocks were the exact same on both stock air and water. The only real benefit is the noise elimination, because the 9800GTX has a loud, whiny fan.
Generally though I don't see...
I don't think the article is as baseless as [H] portrays it. I've known people who couldn't keep a job because they played WoW nearly constantly. A number of them used meth to be able to play for long stretches. Just because we're gamers doesn't mean we should assume that games...
I think the nature of the PC gaming community, with all its patches, mods, driver updates, custom game profiles, and frequent hardware changes, clashes with the Apple ideology of "it just works". PC gamers don't necessarily expect it to "just work"; they expect a high degree of personalization...
The surprising thing to me is the "playable settings" the [H] team found. As far as I can tell, on my 8800GTX + Q6600 (3GHz), many of the settings make zero difference in performance. In fact, the only ones that make a difference at all are Object Quality, Shadows, Shaders, and Post Processing...
I'm surprised that anyone is surprised at the possibly short lifespan of these cards. The G92 is nothing more than nVidia lowering their manufacturing costs; thus they're able to bring 8800 Ultra performance for a hair over $300. The G92 was never a completely new design.
We all know nVidia's...
I don't think that would make a difference. The problem isn't the limited RAM so much as the memory bus. If it had a 384-bit bus like the G80, with those kinds of memory clocks we would be seeing some truly outstanding high-res/AA performance. As it is though, the high memory clock is...
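The bus-width point is just back-of-the-envelope arithmetic: peak bandwidth is bus width (in bytes) times effective memory clock. The clocks below are illustrative ballpark figures for G92-class vs G80-class parts, not exact board specs, but they show how a narrower bus eats the advantage of faster memory:

```python
# Rough memory-bandwidth arithmetic: bandwidth = (bus_bits / 8) * effective clock.
# The clocks here are illustrative ballpark numbers, not exact board specs.

def bandwidth_gbps(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

g92_style = bandwidth_gbps(256, 2200)  # narrow bus, fast GDDR3
g80_style = bandwidth_gbps(384, 1800)  # wide bus, slower clock

print(f"256-bit @ 2200MHz effective: {g92_style:.1f} GB/s")  # 70.4 GB/s
print(f"384-bit @ 1800MHz effective: {g80_style:.1f} GB/s")  # 86.4 GB/s
```

Even with a ~20% higher memory clock, the 256-bit part ends up with meaningfully less total bandwidth, which is exactly where high-res/AA performance lives.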
This is pretty much right in line with what I expected. Clearly nVidia is focusing more on cost reduction with the G92 than any kind of significant performance improvement. But at the same time, this is exactly what they did with the GeForce 7 series: new core, then a fabrication...
I'm genuinely dumbfounded by these kinds of comments. Surely they can't be coming from people who have actually played Crysis. I own all the graphics-whoring next-gen games, and there is positively nothing that even comes close to Crysis. I play Crysis in DX10 with all settings at "Very High"...
Ha! Yeah, that wouldn't surprise me one bit. That would be a sweet card. I'm truly dumbfounded as to why nVidia is using a 256-bit bus on these cards. I've seen countless benchmarks where the G92 cards might be competitive or even have a negligible lead, but as soon as you turn on AA, the...