I'm really waiting for these cards to appear. They might be a worthy upgrade to my HD 6870.
Any word on when they're arriving? I've read March, but still no news.
Right now, I'm running my HD 6870 at 1015 MHz on the core and 1172 MHz on the memory.
This is done with the voltage slider maxed out, at 1.3 V.
Is there any way to override this? I'd like to try a little more voltage to see how far this GPU can go.
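The only lead I have so far (and this is just me going from memory of Afterburner's documentation, so treat every line below as an assumption and check the readme first) is the unofficial overclocking section in MSIAfterburner.cfg, something along these lines:

[Settings]
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 1
UnlockVoltageControl = 1
UnlockVoltageMonitoring = 1

No idea whether that actually lifts the 1.3 V cap on the 6870, so if anyone has tried it, let me know.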
I have an HD 6870. It is exceptionally strong @ 1080p, which is my resolution.
However, at higher resolutions like yours, you really need a faster GPU, or better yet, two HD 6870s in CrossFire.
The reason your laptop beats the desktop in games is the CPU:
an i7 vs. a very old Core 2 Duo E6400.
The GTS 250 is actually a better GPU than the GTX 460M.
The mobile versions are ALWAYS much slower than their desktop equivalents.
Well, the GTX 560 Ti 448 is pretty close to the GTX 570.
Can you wait another week or so and see how much the HD 7950 will cost? According to benches, it outperforms the GTX 580 out of the box, and it's expected to cost only $299.
Is this thread for real?
What are the extra cores for? LOL
That's like asking, "what does the extra horsepower on that car do?"
Can you seriously not work out for yourself that more shaders = more power?
OH MY GOD.
I use a SINGLE HD 6870 for gaming at 1080p. It's a great card in the mid-high-end range, pretty close to the HD 5870 at stock, and mine even surpasses a stock HD 6950 @ 1015 MHz / 1172 MHz OC.
WHY ON EARTH WOULD YOU USE CROSSFIRE FOR 1280x1024??!! WHY?
A single HD...
Ridiculous? No.
Unwise? Yes.
Ridiculous would be pairing the HD 7970 with a Pentium 4. That would be epically absurd.
With a Q6600 (an overclock is a must) I'd say it's definitely not your best option, but it's not bad either.
It will be enough for CSS (maxed out, no problem, since the Source engine is lighter than most) and Oblivion (although don't expect to max it out at high resolutions).
However, keep in mind that the HD 6450 is a very crippled GPU in terms of 3D performance.
If you have prospects of...
It's NOT the CPU.
Check any BF3 performance review.
http://www.techspot.com/review/458-battlefield-3-performance/page7.html
Such as this one: an AMD dual-core Athlon II X2 (weaker than a Phenom II X2) giving an average of 67 FPS.
Sixty. Seven. Frames per second. On a dual core...
Even at stock speeds, a Q6600 wouldn't bottleneck that hard. The CPU is getting old, but it still games like a champ. Yes, it IS holding back that HD 5870 at stock speeds, but all I'm saying is that it shouldn't be giving you only 15 FPS.
In any case, try to bring it to 3ghz at least and see...
It will bottleneck a lot, unfortunately.
In fact, there's a very good chance your current 8800GT is already being held back somewhat by your CPU, even with the OC.
I'll tell you why:
Back in 2008, I got my LGA775 system with a Core 2 Duo E8400 (3 GHz stock), 4 GB of DDR2 and an 8800GT.
At the time...
The problem lies solely with that Pentium D of yours. It's a bottleneck for pretty much every GPU from 2005 onwards.
No amount of overclocking can compensate for the sheer horrible performance of those CPUs.
NetBurst is probably the Intel architecture with the lowest IPC ever (not counting Atom...
What? NO!
Sandy Bridge-E is NOT Ivy Bridge.
Sandy Bridge-E is just the extreme (enthusiast) version of the Sandy Bridge chips, available on the enthusiast platform (LGA 2011).
Architecturally, they are no different from LGA 1155 CPUs. The only advantages are in the platform itself: quad channel...
Any quad core will do.
BF3, while requiring some CPU power, requires a LOT of GPU power.
Your SLI GTX 560s won't be enough for 120 fps with maximum details.
A single GTX 580 on MAX details @ 1680x1050 only provides around 50 fps.
Of course it will.
Just as 32 nm Sandy Bridge is spread over two platforms (LGA 1155 and LGA 2011), Ivy Bridge will come to the SAME sockets.
Ivy Bridge is not a new architecture, just a Sandy Bridge die shrink. Technically, Ivy is Sandy on a different manufacturing process.
LGA 2011 is just the enthusiast version of the "tock" (Sandy Bridge).
We shouldn't mix up CPUs with platforms. The CPU architecture is just one (Sandy Bridge), spread over two platforms for different market segments.
The CPU architecture between LGA 1155 and LGA 2011 is exactly the same. It's not the second generation of anything. It's the same Sandy architecture on a different platform, that's all.
They are both Sandy Bridge CPUs. All that changes are the features of the platform such as PCI-E lanes...
Not me.
The LGA 1155 platform is more than enough for my needs: high-end gaming.
Let's be realistic: a CPU of the caliber and power of the 2600k is going to remain a viable gaming chip for many years. And I do mean A LOT of them.
Take a look at LGA 1156 and 1366 i5 and i7s. They are 3 years...
You can go for Sandy right now on LGA 1155, and then simply upgrade to an Ivy CPU when they release.
I'm sure any 2500k, 2600k, or 2700k will be an easy sell on the used market.
I'm holding on to my 2600k for at least another 2 years, and then I'll just grab an Ivy to slap on my P67 Evo. Even...
On to your question then:
For $105, the upgrade is 100% worth it. Not only will the i3 2100 be immensely faster than any previous-generation dual core (including the LGA 1156 i3s), but it will also run cooler, be more power-friendly and last longer, since in essence you have a four-threaded CPU.
Heck...
Unrelated question:
Why are you running such an OLD videocard?
I mean, that thing has to be either AGP (not possible on your mobo), or PCI (non-express).
EDIT: The OP has edited his post, but it previously said he was running a Radeon 7500.
You've got it all wrong, buddy.
Sandy Bridge-E is not a new generation. It is just a different segment (hence the Extreme designation.)
Sandy Bridge (and Ivy Bridge) will be split into 2 platforms: 1155 and 2011.
The two platforms are not "generations", just different segments of the same...
My 2600k will last me a good few years down the road, but still, I want to upgrade to Ivy sometime in the future.
I'm sure Intel will resolve this situation in some way, maybe by altering some code in the IB chips themselves, or with new BIOSes.
They basically confirmed a few months ago that...
Exactly.
I don't know where you got the idea that the HD 6850 unlocks to HD 6870. In fact, neither of them unlock to anything.
They are great GPUs and overclock fairly well.
It doesn't unlock and never will.
Hey man, relax a bit, I was just pointing out the new feature. No need for all that aggression :/
I'm not worried about my VRAM running out, just asking if the readings are accurate, since memory usage monitoring on AMD GPUs is kind of a new thing.
From what I know, most HD 6870s can reach 1 GHz on the core with some voltage tweaking.
As you can see in my sig, mine is sitting @ 1004 MHz on the core with 1.274 V. That's a bit high, and to counter the high temps I have to use custom fan settings to keep it cool, but I don't mind.
Save a few coins...
I'm sure most of you are familiar with GPU-Z, but if you are not, take a look:
http://www.techpowerup.com/gpuz/
The thing is, until now I'm fairly sure there was no way to check memory usage on AMD GPUs (it has been possible on most recent NV GPUs), unless I'm missing something.
I just installed...
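Side note: if anyone wants to pull these readings into their own logger, GPU-Z publishes its sensor data through a Windows shared-memory block. The sketch below is only my recollection of the interface posted on the TechPowerUp forums, so treat the mapping name and the struct layout as assumptions and verify them against the official post before relying on it:

/* Minimal sketch: dump whatever sensors GPU-Z is currently exposing,
   including the new memory-usage reading if your GPU-Z version reports it.
   The "GPUZShMem" name and the struct layout are assumptions from memory. */
#include <windows.h>
#include <stdio.h>
#include <wchar.h>

#define MAX_RECORDS 128

#pragma pack(push, 1)
typedef struct { WCHAR key[256]; WCHAR value[256]; } GPUZ_RECORD;
typedef struct { WCHAR name[256]; WCHAR unit[8]; UINT32 digits; double value; } GPUZ_SENSOR_RECORD;
typedef struct {
    UINT32 version;                          /* interface version */
    volatile LONG busy;                      /* non-zero while GPU-Z updates the block */
    UINT32 lastUpdate;                       /* tick count of the last refresh */
    GPUZ_RECORD data[MAX_RECORDS];           /* static info (card name, BIOS, etc.) */
    GPUZ_SENSOR_RECORD sensors[MAX_RECORDS]; /* live sensor readings */
} GPUZ_SH_MEM;
#pragma pack(pop)

int main(void)
{
    HANDLE map = OpenFileMappingW(FILE_MAP_READ, FALSE, L"GPUZShMem");
    if (!map) { printf("GPU-Z doesn't seem to be running.\n"); return 1; }

    GPUZ_SH_MEM *mem = (GPUZ_SH_MEM *)MapViewOfFile(map, FILE_MAP_READ, 0, 0, 0);
    if (!mem) { CloseHandle(map); return 1; }

    for (int i = 0; i < MAX_RECORDS; i++) {
        if (mem->sensors[i].name[0] == L'\0') break;
        wprintf(L"%ls: %.*f %ls\n", mem->sensors[i].name,
                (int)mem->sensors[i].digits, mem->sensors[i].value,
                mem->sensors[i].unit);
    }

    UnmapViewOfFile(mem);
    CloseHandle(map);
    return 0;
}

If the layout I remembered is right, one of the printed lines should match the memory-usage number the GPU-Z window shows, which would be an easy way to cross-check whether the reading is accurate.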
My HD 6870 absolutely maxes out @ 1004 MHz on the core and 1158 MHz on the memory.
All of this @ 1.274 V with custom fan settings on MSI Afterburner. Yes, it is loud, but I game with headphones :)
But I see many people hitting 1200 MHz on the memory, so mine must be from a bad batch :/