I finally ripped the bezels off my Alienware OptX AW2310's and wrote a guide for it - let me know what you think!
http://australia.tweaktown.com/guides/4714/alienware_optx_aw2310_23_inch_monitor_de_bezel_modification_guide/index.html
Without the bezels, it's just eye candy :)
I know... but my problems disappeared when I did that.
My temps dropped drastically - from overheating at ~100C the entire time just trying to keep the clock speeds up, to running much cooler.
Yeah I did... I had all sorts of problems at first (especially with the GPUs down-clocking on 3 screens) - turned out it was an x8/x8 lane limitation.
I'm using an Asus Maximus III Extreme, and slots 1 and 2 run at x8/x8, so I moved to 2 and 3, which I think are x16/x16, and my clock speeds and hence idle temps...
FWIW, I run 3 x Alienware AW2310's in Surround Vision Portrait Mode.
GTX570 SLI
Core i7 860 @ 4GHz
No problems at all - I only play BC2. 120 - 130fps 90% of the time, on the more hardcore maps ~100fps.
All I require is 120fps for my 120hz goodness.
My 5970 was a pile of shite for doing it...
I think it's my large resolution + requirement for 120fps + 4GHz chip.
So many people with, yeah, as you said, Q6600s and a GTX580, wondering why it doesn't perform well :P
I run:
Core i7 860 @ 4GHz
GTX570 SLI
3 x Alienware AW2310's in portrait
BF:BC2 graphic details @ high, 1x AA, 16x AF
3240x1920
~90fps constant. Impressive.
I get ~96 - 98% GPU use on both GPUs.
Edit:
Turned off HBAO... now:
120 - 130fps nearly constant! :D
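For anyone checking the numbers on that resolution: three 1920x1080 panels rotated to portrait stack their short edges side by side, which is where 3240x1920 comes from. A quick sanity check:

```python
# Three 1920x1080 panels rotated 90 degrees (portrait) and placed side by side.
panel_w, panel_h = 1920, 1080
count = 3

# In portrait, width and height swap, and the short edges line up horizontally.
surround_w = panel_h * count   # 1080 * 3 = 3240
surround_h = panel_w           # 1920

total_pixels = surround_w * surround_h
single_1080p = panel_w * panel_h

print(surround_w, surround_h)                 # 3240 1920
print(round(total_pixels / single_1080p, 1))  # 3.0x the pixels of one screen
```

Three times the pixels of a single 1080p screen, which is why the SLI setup and the HBAO toggle matter so much here.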
In today's world of console DLC milking, are people really that tight that they won't pay $15 for this?
I'd rather pay DICE $15 than some other company $15 for yet-another-DLC.
Which set? The 263 drivers are the newest?
http://www.nvidia.com/object/win7-winvista-64bit-263.09-whql2-driver.html
That's what I'm using... so you're using the 260.xx drivers and it's OK in Portrait? I might have to try that tonight :D
Plus it's getting harder to know which GPUs are better for certain things - especially since virtually every game is a console port, with frame limitations, caps and poor support for PC GPUs.
Oooh that news post caused quite the kerfuffle.
http://www.kitguru.net/components/graphic-cards/zardon/amd-hd6970-update-and-performance-indicators/comment-page-1/#comment-6097
From what I read, the 6970 will beat the 580 and be cheaper.
This quote:
The HD6970 is not aimed at the...
1.) I have a 5970 - if you're referring to my post about 3 x AW2310's - you'll see I have given details out of what I'm running.
2.) - Read 1.)
3.) - Read 1.)
4.) - Read 1.)
Below - I'm running a 5970; it has 2 x dual-link DVI ports and a miniDP. I used a miniDP to dual-link DVI...
My thoughts on the release, being pushed up so quickly:
What do I think? NVIDIA want to get in now, to show they have a better GPU than what is available *today*. In all benchmarks it will be compared to a 5xx0 card at best, or the 68x0 release. It will beat them all and come out on top as...
The only way we'll get OOTB support for Eyefinity/Surround Vision is never.
Console games are FTL - Oblivion worked because it wasn't designed on a console first and then butchered in the port over to the PC.
The sums don't work that way.
480 quad SLI uses 1500W+ under load.
4 x 250W TDP = 1000W, so no idea where the other 500W is being used...
5870 4 way doesn't chew through that.
Not when the 480's TDP is kinda bullshit.
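For reference, the arithmetic behind that: four cards at the rated 250W TDP only account for 1000W, leaving a big gap to a claimed 1500W+ system draw. The overhead breakdown below is my illustrative guess, not a measured figure:

```python
# Rough power budget for a quad-SLI GTX 480 box. The TDP and load figures come
# from the discussion above; the "unaccounted" interpretation is illustrative.
card_tdp = 250          # NVIDIA's rated TDP per GTX 480, in watts
cards = 4
claimed_load = 1500     # claimed system draw under load, in watts

gpu_budget = card_tdp * cards       # 1000 W on paper
unaccounted = claimed_load - gpu_budget

print(gpu_budget)    # 1000
print(unaccounted)   # 500 W left over for CPU, board, drives and PSU losses -
                     # or evidence the 250 W TDP figure is optimistic
```

Either the rest of the system eats 500W+ (plausible at 4GHz with PSU inefficiency), or the cards pull well past their rated TDP - which is the point being made.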
Read this thread which explains a fair bit of it, actually started by myself:
http://forums.overclockers.com.au/showthread.php?t=912686&page=13
http://www.amdzone.com/phpbb3/viewtopic.php?f=527&t=137582
Necro revival.
Was googling about a successor to the RV02 and found this thread.
That case is good for cooling - but what if you wanted to install a longer PSU (1000W+ units are large)? I like the cooling design and 90-degree installation, the cable management and HDD installation...
That's right - so then the question comes down to, what the fuck are NVIDIA and AMD doing with these gaming developers - why not work closer with them to scale the hardware better somehow?
This, my man.
I wish people would get over the "fanboy" shit that I deal with on various forums.
Whoever has the best - gets my cash. I've never been a fanboy of any company, why would you? What do NVIDIA fanboys do in times like this? Buy a 480? Wait for months during delays?
Why...
http://www.tweaktown.com/news/17066/nda_lifts_on_new_6k_amd_range_on_22nd_oct/index.html
22nd of October, eh?
Can't wait for the 69xx parts - hopefully it's not too far behind.
25% less power? The GTX 480 uses between 260 and 320W (depending on which review you read), so the above poster meant more like 40% less power.
AMD seem to have done it - nearly double the performance at the same (if not just slightly higher) wattage.
We won't see great performance-per-watt NVIDIA GPUs until...
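The headline percentage really does depend on which end of that 260-320W range you measure against. A quick illustration - the ~190W rival figure here is hypothetical, not a published spec:

```python
# Percent power reduction relative to the quoted GTX 480 load range (260-320 W).
# The 190 W comparison figure is hypothetical, purely to show how the chosen
# baseline shifts the headline percentage.
gtx480_low, gtx480_high = 260, 320
rival_watts = 190

low_saving = (gtx480_low - rival_watts) / gtx480_low     # vs the kindest review
high_saving = (gtx480_high - rival_watts) / gtx480_high  # vs the harshest review

print(round(low_saving * 100))   # 27
print(round(high_saving * 100))  # 41
```

So "25% less" and "40% less" can both be honest numbers depending on which review's GTX 480 figure you start from.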
It's been sent out and some reviews are popping up - but they're useless.
They review it as a motherboard and don't test the Crossfire/SLI/Hydra mix - that's what I want to see: how it performs with CFX/SLI and then Hydra, all separately.
I'll be getting this board the second it hits...
LOL
People need to calm down - it's all part of the "close to launch craziness" that goes on.
There are no exact specs - or else AMD would list them on their site. Right now everything is a maybe; most sites are using each other as sources.
I also think, people don't want to believe...
Specs included!
http://www.tweaktown.com/news/16943/amd_radeon_6850_and_6870_are_go_to_launch_on_october_18/index.html
Looks good, can't wait to see what they can do with a non-stripped down SP model.
Pulling the same performance from 960 SPs as the current 1400-SP parts... impressive...
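To put a number on that: matching a 1400-SP part with only 960 SPs implies roughly 46% more work per shader, assuming equal overall performance and similar clocks (both assumptions, not published figures):

```python
# Per-shader efficiency implied by matching a 1400-SP card with a 960-SP one.
# Assumes equal overall performance and similar clock speeds.
old_sps, new_sps = 1400, 960
per_sp_gain = old_sps / new_sps - 1

print(round(per_sp_gain * 100))  # 46 -> ~46% more work per SP
```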
No, I mean the 480.
http://www.yougamers.com/forum/showthread.php?t=127567
The 480 isn't 100% faster than its previous gen (the now very old 285).
It's 25 - 40% faster, yet from 1.4 billion to 3 billion transistors.
A mid range part that is twice as fast as current gen, keeping up with current gen 2nd fastest GPU? What more do you want?
NVIDIA haven't done this in years - their 480 can barely beat a GTX285 (only ~25% faster), and that's a pretty old card now.
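Putting the transistor claim in numbers: going from 1.4 billion to 3.0 billion transistors is a ~2.1x increase for only 25-40% more performance, so performance per transistor actually went backwards. A quick check, using the figures from the post:

```python
# GTX 285 -> GTX 480: transistor count vs the quoted performance gain.
gtx285_transistors = 1.4e9
gtx480_transistors = 3.0e9
perf_gain_low, perf_gain_high = 1.25, 1.40   # "25 - 40% faster", per the post

transistor_ratio = gtx480_transistors / gtx285_transistors
print(round(transistor_ratio, 2))  # 2.14x the transistors

# Relative performance-per-transistor this generation:
print(round(perf_gain_low / transistor_ratio, 2))   # 0.58
print(round(perf_gain_high / transistor_ratio, 2))  # 0.65
```

Roughly 0.6x the performance per transistor of the previous generation - which is the efficiency regression being complained about.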
It's also a board for AMD users who want the best of both worlds (AMD and NVIDIA GPUs).
Lucid Hydra isn't on every board either..... so it comes with a premium.
And, who else better to sell it to? The overclocking market.
When you consider most good X58 boards are $300USD+, it isn't that...
I like that comparison... but also remember how much NVIDIA came back after the 9700 - it took the 5, 6 and then 7 series to bounce back to their former glory days.
I truly believe we'll see that again.
I don't think NVIDIA will be able to be on top (I don't count using 300W+, dual card...
Here's the link to the Anandtech article that was mentioned before:
http://www.anandtech.com/show/2937
As for people saying it takes years in R&D - that's right, and for that reason alone NVIDIA won't be able to swing back against AMD until Q3 2011 or later.
This tech will be all they have...
The same could be said for NVIDIA - even with their magical "perfect" drivers (that don't kill GPU's or overheat them ya know........).
If NVIDIA had great hardware, they'd put ATI down for the count.
The mighty has fallen, people will see it when the 6xxx comes out and NVIDIA have nothing...
What the shit?
Fermi had 7 months on AMD... seriously, it should've kicked AMD in the ribs with that much time.
Thank you. Not a big member of [H], so thanks.
That's my point, ATI never really pushed the 5k series - same with the 6k series. Yet, NVIDIA were pushing Fermi before it was a...
I love how ATI never really bragged about how good this architecture was, and blew everyone away and stole 10% market share in <1 year.
NVIDIA brag about Fermi, and it fails miserably compared to the 5 series in every aspect - power consumption, heat, and the fact it was delayed for 7 months.
I too would like to see higher resolution tests, with 4x/4x or even 16x/4x.
Interesting results though - goes to show people they don't need $800 motherboards to run tri SLI/CFX.