Three 680 cards, and one 089 card, that's gonna be a monkey wrench ...
LMAO. Well played, sir, well played.
Would it be a good idea for me to SLI 680s on a single 1200p monitor? I play BF3 and I would like it to never dip below 60 fps while playing at ultra with 4x AA.
lol. I am not OCD or anything. But how does someone put those in the boxes, see the three and the fourth sticking out, and not get the twinge to turn it over?
I'm one of those people who must always keep his money in his wallet facing the same way, ordered from the smallest denomination to the largest. Yeah, the one box drives me nuts.
If you get that twinge, you are OCD to some degree. lol
You could also post in the nvidia forums on the official website. Sometimes they keep an eye there too.
Comparing SLI 680 to CrossFire 7970, there are areas that are 10-20 fps slower in Crysis 1/2; this is at 2560 resolution with ultra textures / DX11. Same for Metro 2033 and The Witcher 2 at the same resolution.
Those are all games I want to play on my new 30" LCD at 2560. Is one 680 enough to play them MAXED OUT?
Not a chance. Even 680 SLI won't max out The Witcher 2 and Metro 2033 at 1600p.
But looking at the numbers, I think it's pretty clear that both Kepler and Tahiti are overkill on a single monitor at 1080/1200p.
Are there titles that bring a GTX 570 or 560 Ti or Radeon 6970 or 7800 series to their knees at 1080p? (I'm curious.)
Most games, to be "maxed out", have one inane setting that makes FPS plummet but does little for overall visual quality. In Metro 2033 it's "Depth of Field", which is just an overzealous bokeh filter; in The Witcher 2 it's "Uber Sampling", which is just a crappy supersampling method. Either can be turned off with no noticeable visual quality loss, and then those games will play fine on a single high-end card (7970, GTX 680, etc.).
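To put a rough number on why a setting like Uber Sampling hurts so much: supersampling renders the scene at a higher internal resolution and downsamples, so shading work grows with the square of the render scale. A back-of-the-envelope sketch of my own (the per-axis scale factors are illustrative, not what any game actually uses, and it assumes cost scales linearly with pixel count):
[CODE]
# Back-of-the-envelope cost of supersampling at 1080p. Assumes shading
# cost scales linearly with shaded pixel count (a simplification); the
# per-axis scale factors are illustrative, not from any specific game.
width, height = 1920, 1080
base = width * height

for scale in (1.0, 1.5, 2.0):  # internal render scale per axis
    pixels = int(width * scale) * int(height * scale)
    print(f"render scale {scale}x per axis: {pixels:,} pixels "
          f"(~{pixels / base:.2f}x the shading work)")
[/CODE]
At a 2.0x per-axis scale you are shading four times the pixels of plain 1080p, which is why turning one such setting off frees up so much headroom.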
YES! LOGICAL! THANK YOU!
The term "MAXED OUT" gets overused, and in the hands of an anal gamer it can be very misleading.
Nvidia brought some much-needed price competition, which is great for us consumers, though not so great for my 7970's resale value. I guess that's the price to pay for getting this level of performance 3 months early.
Google Chrome extension: Page Monitor
It refreshes a page at a specified interval (don't set it to something crazy like 1 sec; 30 secs is more reasonable) and, more importantly, it tells you when that page has changed.
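For anyone who would rather roll their own than install the extension, the same idea is a few lines of Python. The URL and interval below are placeholders, and note that hashing the raw page will also fire on ads or timestamps, so a real version might watch a specific element instead:
[CODE]
# Minimal page-change monitor: fetch a page at an interval and report
# when its contents change. URL and INTERVAL are placeholder values.
import hashlib
import time
import urllib.request

URL = "https://example.com/stock-status"  # hypothetical page to watch
INTERVAL = 30  # seconds; polling every 1 sec would just hammer the server

last_digest = None
while True:
    with urllib.request.urlopen(URL) as resp:
        digest = hashlib.sha256(resp.read()).hexdigest()
    if last_digest is not None and digest != last_digest:
        print("Page changed!")
    last_digest = digest
    time.sleep(INTERVAL)
[/CODE]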
Says the guy who used it in the question in the first place. If you don't define "maxed out" as all settings at their highest at close to 60 FPS, then what is it?
Great post; let me address a couple of points. Battlefield 3, especially with 4x MSAA, is where it matters. I'm going from a 580 to a 680. Looking at the review: a min of 40 fps vs 29 fps and an average of 59.4 vs 45.6. This is at 1080p. To me, especially in a fast-paced multiplayer shooter, this makes a huge difference. The min fps is especially where it comes into play. Deus Ex also shows an improvement against a 7970: while the average fps is the same, the min fps is far higher, which smooths out the experience. I also hate tearing, so Adaptive V-sync is something I'm going to spend a lot of time with, and I hope it will work out very well. Some of the new tech in this generation is amazing. Add in the lower temps/power usage and it's a real winner.
I want to address what I think is a VERY common misconception or maybe even elitist attitude on the forums: that these cards are "overkill" for 1080p. In many of the games tested these cards are not overkill. A lot of it has to do with personal preference, so I will state mine. I have a 27" 1080p monitor and a 50" TV, both of which are connected to the PC. Depending on the game I am playing I use the monitor or TV (BF3 vs Street Fighter, for example). However, I like to CRANK things all the way up. Having a 580 GTX (soon 680 GTX) allows me to turn on all the eye candy, including MSAA, etc., to have the visuals really shine. I also expect 60 fps as often as possible. Combine these settings and the 680 GTX is a perfect fit for max eye candy 1080p gaming.
I have also noticed that people who game on multi-monitors have to frequently turn down effects because the cards cannot handle everything maxed out. They also run into issues with proper multi-monitor support in some games, as well as SLI issues if they are using a 2nd or 3rd card. Don't get me wrong, there is nothing wrong with multi-monitor setups. If that is what you prefer, then by all means enjoy it and ignore my rant about 1080p single monitor. I simply prefer to game on one display and wanted to raise awareness that high-end cards are needed even at 1080p to drive the newest games with all settings maxed out.
Agreed that the 7970 or 680 may or may not be overkill for 1080p depending on personal preferences. Minimum frame rates are the key, and the newer cards do a much better job of holding to a 60 fps minimum. Obviously some of us do not mind the lower frame rates as much.
I have friends who tolerate fps below 60, but I cannot. I even have a friend who can't stand being below 120+ fps, and to achieve that he just lowers settings; it seems fps is more important than texture resolution for him.
It all looks so stuttery below 60 to me.
That's a very good point.
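For context on the 60 vs 120 fps preferences above, here's a quick frame-time calculation (my own illustration, not from any poster). The frame budget shrinks fast as the fps target rises, and on a 60 Hz display any frame that misses its deadline gets shown twice, which is exactly what reads as stutter:
[CODE]
# Frame budget at common fps targets. On a 60 Hz display, a frame that
# misses its ~16.7 ms deadline is displayed twice, which reads as stutter.
for fps in (30, 45, 60, 120):
    print(f"{fps:3d} fps -> {1000 / fps:5.1f} ms per frame")
[/CODE]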
I just got my POV GTX 680 today, but I'm a little disappointed that there isn't at least one tech demo from NVIDIA showcasing TXAA. That means we'll have to wait until games include it to see just how good it really is. Hopefully, NVIDIA will eventually be able to add it as a selectable AA setting in the driver profiles, like they've done for FXAA (finally!!!) in the v301.10 drivers. If they don't, then I fear it'll be underused and as much of a letdown as PhysX was.
Zarathustra[H] said: Really? How come this bug doesn't prevent my 7970 from running at PCIe 3.0 speeds?
It is if you have plenty of PCIe lanes or a single GPU.
Where it becomes relevant is when you are trying to run multiple GPUs on boards with fewer PCIe lanes, or on poorly laid-out motherboards that drop the x16 lanes down to x8 or even x4 for no good reason.
If you are running SLI and are forced into an x16/x4 config due to the layout, then running at PCIe 3.0 still means the card in the x4 slot gets the equivalent of x8 2.0 speeds, so there is less of an impact. If you are running at 2.0 speeds, there is a good chance of a performance impact from the card running at x4.
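The x4-at-3.0 roughly equals x8-at-2.0 claim falls straight out of the link-encoding math: PCIe 2.0 runs 5 GT/s with 8b/10b encoding, while PCIe 3.0 runs 8 GT/s with 128b/130b. A quick sketch of the arithmetic (my own, per direction, ignoring protocol overhead):
[CODE]
# Rough per-direction PCIe bandwidth, illustrating why an x4 slot at
# PCIe 3.0 moves about as much data as an x8 slot at PCIe 2.0.
# Gen 2.0: 5 GT/s with 8b/10b encoding (80% of raw bits are payload).
# Gen 3.0: 8 GT/s with 128b/130b encoding (~98.5% payload).

def lane_gbs(gts, payload_bits, total_bits):
    # GB/s per lane: transfer rate * encoding efficiency / 8 bits per byte
    return gts * payload_bits / total_bits / 8

gen2 = lane_gbs(5.0, 8, 10)     # ~0.50 GB/s per lane
gen3 = lane_gbs(8.0, 128, 130)  # ~0.98 GB/s per lane

print(f"x8 PCIe 2.0: {8 * gen2:.2f} GB/s")  # 4.00 GB/s
print(f"x4 PCIe 3.0: {4 * gen3:.2f} GB/s")  # 3.94 GB/s
[/CODE]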
Screenshots show what?
I meant camera shots. They show the blurring on 120 Hz LCDs compared to CRTs.
Yes, but it is not a plus, nor does it make for a smoother and more fluid picture.
It does work that way. Non-instantaneous pixel response times yield an after-image, i.e. ghosting, making each frame appear individually less discrete.
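As a toy illustration of that last point (my own model, not from the post): treat an LCD pixel as a first-order system with time constant tau. A slow pixel completes only part of its transition within one frame, so the previous frame's value visibly bleeds into the next one:
[CODE]
# Toy first-order model of LCD pixel response: a pixel with time
# constant tau (ms) does not fully reach its target within one frame,
# so the previous frame bleeds into the next (ghosting). The tau
# values below are illustrative, not measurements of any real panel.
import math

def pixel_after_frame(current, target, tau_ms, frame_ms):
    # fraction of the remaining transition completed within one frame
    alpha = 1 - math.exp(-frame_ms / tau_ms)
    return current + (target - current) * alpha

frame_ms = 1000 / 120  # ~8.3 ms frame on a 120 Hz panel
for tau in (2.0, 8.0):  # fast vs slow pixel response
    level = pixel_after_frame(0.0, 1.0, tau, frame_ms)
    print(f"tau={tau} ms: pixel reaches {level:.0%} of target in one frame")
[/CODE]
With the slow (tau = 8 ms) pixel, roughly a third of the old frame is still on screen when the next one arrives, which is the after-image the post describes.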
2. No motherboard forces x16 and x4 for an SLI config. They drop to x8/x8 at worst because x8 vs. x16 has no perf implications in real world gaming scenarios. Look up any article on GPU PCIE scaling. You're manufacturing a use case here.
While I'm not going to go digging for a source right now, I am almost positive that this statement is false and that there are boards that will not drop to x8/x8. Of course, these would typically be lower-end boards not bought primarily for SLI/CrossFire, but I believe your blanket statement is incorrect.
You are correct, Tiporaro - there are boards that only support x16/x4 CrossFire and SLI because they lack the PCIe switches to split the lanes to x8/x8. The Asus P8Z68-V LE is one such motherboard, and there are quite a few others. They are lower-end boards, though.