Extremely Disappointed with Ivy-Bridge

Got my 3570K to 4.0 without touching the voltage control. I don't want to go any higher since I'm using the stock cooler for now. I was hoping to find an 1156 bolt-through kit for my TRUE, but it seems to be discontinued or out of stock everywhere.

I'll be holding out on getting a new cooler until I come back in June from a three-week trip, hoping some next-gen coolers will be available or announced by then. I am very disappointed with the stock cooler; it was almost falling off the first time I attached it, no wonder my system crashed. The push pins are awful.
 

A buddy and I both bought i7-930's a couple years ago for our new builds. His stock cooler was so bad that his CPU was overheating at stock speeds, like 90 degrees Celsius. He had no choice but to get an aftermarket cooler. I wanted to overclock mine, but my CPU was pushing 80-plus degrees Celsius just running full burn tests with the stock cooler. I have no faith in those stock Intel coolers.
 

This is what you need:

http://www.frozencpu.com/products/1...1366_Ultra_120_Series_HR-01_True_Spirit_.html

It's a bit expensive, but that will certainly work.
 
I'm using the stock cooler on my i3 530 (unRAID server) and haven't had any issues at all. Maybe I'll pick up a Hyper 212 on the cheap and it'll hold me over until something higher-end comes out that addresses the issue of the super small CPU surface area.

I'm hoping that new Cooler Master with vapor chambers performs well on Ivy.


This is what you need:

http://www.frozencpu.com/products/1...1366_Ultra_120_Series_HR-01_True_Spirit_.html

It's a bit expensive, but that will certainly work.

Wow that looks legit.
 
I love that in the first post, most of the "I am disappoint" items you list are actually improvements. Why are you disappointed in improvements?

1.) It is 3% faster. I didn't expect much IPC improvement from a "Tick" but 3% is 0%, nothing more.
You are disappointed that it is faster, even though you didn't expect much IPC improvement? And no... 3% is 3%. Also, most reviews put it at about 7% quicker, not 3%. See an example here: http://www.silentpcreview.com/article1259-page7.html

2.) It takes about 10% less power, going from 32nm to 22nm.
This is another improvement... imagine if your car suddenly used 10% less gas, your girlfriend/wife lost 10% of her body fat, or you suddenly took 10% off your mortgage payment. Sounds like nothing to be disappointed in, especially in a "Tick" like you say.

3.) There are a few reports (suspicious), but it seems fairly clear that Ivy Bridge is hotter than Sandy Bridge.
This is your only point that actually is something to be disappointed about. However, considering that the GPU in Ivy delivers anywhere from 30-130% more performance, it kind of makes sense; it also looks like there may be heatspreader differences, and the digital temperature sensor may be different as well.


However, despite some additional power consumption/heat, look what we see here:

[perf-watt.gif: performance-per-watt comparison chart]

Ivy may be a bit hotter/consume more power, but the performance you gain from the platform more than makes up for the additional power consumed.
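
To put rough numbers on that chart's point (using the ~7% performance figure from the review linked above and the ~10% power reduction as assumptions, not measurements):

```python
# Back-of-the-envelope perf/watt estimate from the figures cited in this
# thread: ~7% more performance, ~10% less power. Both are assumptions
# taken from the discussion above, not official measurements.
perf_gain = 1.07   # Ivy Bridge performance relative to Sandy Bridge
power_use = 0.90   # Ivy Bridge power draw relative to Sandy Bridge

perf_per_watt = perf_gain / power_use
print(f"Relative perf/watt: {perf_per_watt:.2f}x "
      f"(~{(perf_per_watt - 1) * 100:.0f}% better than Sandy Bridge)")
# -> Relative perf/watt: 1.19x (~19% better than Sandy Bridge)
```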

So, just me reading between the lines... what you should have actually stated in your first post is that you are disappointed in mild performance increases because you expected more, and you probably should have mentioned the worse overclocking capability of Ivy Bridge vs. Sandy Bridge on normal cooling solutions. These are actually valid concerns when expressed intelligibly.

In the words of the Ivy Bridge Anandtech review:
Reviewing a tick in Intel's cadence is always difficult. After Conroe if we didn't see a 40% jump in a generation we were disappointed. And honestly, after Sandy Bridge I felt it would be quite similar. Luckily for Intel, Ivy Bridge is quite possibly the strongest tick it has ever put forth.
 
On Topic:
I don't KNOW Intel's internal processes, but I cannot see how they would function optimally (with regard to Moore's Law) without any competition, as they did when competition was stiff. Doesn't make sense. Intel is seeking to profit, not enrich the world.

They compete at their best because Intel's greatest competitor is itself.

The desktop/laptop computer market is saturated. It's still growing, but at a much reduced rate compared to a decade ago; this means that for the most part you get replacement purchases only, either for truly broken computers or by tempting users to upgrade (and gaming/overclocking enthusiasts like you and me are a very small percentage of these users).

Intel has learned how the market works, and that's why they have a product for every possible need at every price-point imaginable. And when they release a new process shrink / architecture, they insert their new products into those exact same price-points (sometimes with increased cores or features or just more performance), attempting to entice users to upgrade rather than keep their currently working computers (more performance at the exact same price-point is a damn good enticement).

This is why every die shrink and new processor architecture release is aimed squarely at the mainstream: Intel's earnings are directly dependent on how many upgrades they can encourage (versus the replacement-only market). If Intel cannot entice users to upgrade early, then the massive amount of money invested in the die shrink / architectural upgrade was a waste.

So you see, Intel's greatest competition is themselves. In order to maintain high sales, you have to entice users to upgrade yearly - and that means crafting every die shrink (tick) and every architectural revision (tock) to near-perfection. It has been this way since they released the Core 2, and as long as they want to maintain sales growth it will remain that way.

Fear of customers switching to AMD is a long-term problem, not short-term like enticing upgrades every year. If Intel dropped the ball and released complete crap it would be several years before AMD could poach significant market-share from them (see Athlon 64 and how slowly the mainstream market moved). But Intel would feel the pinch of a reduced enticed upgrade cycle almost immediately. AMD could explode off the face of the earth tomorrow and it wouldn't affect this system much: Intel would have higher growth rates for a couple years while they consumed AMD's market share, but the market would quickly saturate again and Intel would be back to competing with itself.
 
On Topic:
I don't KNOW Intel's internal processes, but I cannot see how they would function optimally (with regard to Moore's Law) without any competition, as they did when competition was stiff. Doesn't make sense. Intel is seeking to profit, not enrich the world.
That's the wonderful thing about analyzing actual products instead of trying to simply reason in the abstract (while ignoring actual products and profitability ;)).

Intel has kept pace with Moore's Law even after Conroe was released nearly 6 years ago. That's 4 cycles of Moore's Law. What people ignore is that Intel's biggest competition is no longer AMD; it's Intel itself. Intel must release faster products to entice owners to upgrade. There is no stagnation in innovation, or Intel would also suffer. The fantasy that Intel would charge $300 for a Celeron in the absence of competition is just silly, as the market wouldn't bear it.

I can understand the disappointment with IB on some levels. Extreme overclockers may not see the point of a lower-power shrink which doesn't offer more significant overclocking potential. The drop to a 77W TDP, instead of staying at 95W and using the headroom for several hundred more MHz of default speed per SKU, may also be disappointing. For most users, it's largely insignificant. These are the new performance kings on mainstream desktop platforms. It's hardly a failure though.
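
As a sanity check on that "several hundred more MHz" figure, here is a rough sketch; the baseline clock is illustrative and the f·V² scaling is a textbook CMOS approximation, not anything Intel has published:

```python
# Rough estimate of the clock headroom a 77W -> 95W TDP budget could buy,
# assuming dynamic power scales roughly with f * V^2 (a classic CMOS
# approximation). The 3.5 GHz baseline is illustrative, not an actual bin.
base_power = 77.0    # W, Ivy Bridge desktop TDP
target_power = 95.0  # W, Sandy Bridge desktop TDP
base_freq = 3.5      # GHz, illustrative stock clock

# Optimistic case: spend the whole budget on frequency at fixed voltage.
freq_fixed_volt = base_freq * (target_power / base_power)

# More realistic: higher clocks need more voltage too; if V rises roughly
# linearly with f, power grows ~ f^3 and the headroom shrinks.
freq_scaled_volt = base_freq * (target_power / base_power) ** (1 / 3)

print(f"Fixed voltage:  {freq_fixed_volt:.2f} GHz")   # ~4.32 GHz
print(f"Scaled voltage: {freq_scaled_volt:.2f} GHz")  # ~3.75 GHz
# Either way, roughly 250-800 MHz of extra default speed per SKU.
```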
 

It's more like those LGA775 quad-core owners won't upgrade.

How many of us that have Yorkfield (or even Kentsfield) CPUs, are the second (or even later) owner, but haven't been in any real hurry to move up to Sandy Bridge (despite it being a relative bargain)?

We're Intel's real targets with Ivy Bridge (desktop and mobile alike) - not those that have already upgraded to Lynnfield, and especially not those running Sandy Bridge.

We second (and third or later) owners of Intel's original and second-generation quad-core desktop CPUs are finding that there's still lots of life (and capability) left in these desktop dinos (in CPU terms). That's a big problem for Intel - actually worse than the same thing in the corporate/enterprise segment (where it is also taking place); what do you do when you lead your sector, sector growth has stalled due to poor external factors (the economy), and your older product is still largely more than enough for the current generation of software? (I own stock in Intel - thus, Intel's dilemma also affects me personally.)

Intel has to get us LGA775 second (and later) owners (especially those of us with quads) off the fence - however, it can't afford to raise prices either.

In addition, the one part of computing that HAS seen growth (albeit lower growth than usual) is the mobile segment; as good as Sandy Bridge is, there are still all sorts of areas in mobile that Sandy is ill-equipped to tackle (largely due to battery-life concerns).

Ivy Bridge is Intel's response.

It's more efficient (in terms of IPC and even IPS) than Sandy Bridge, yet can go into existing motherboards for Sandy Bridge. Even more telling, it will cost no (or very little) more. However, the biggest benefits are for power-constrained mobile users (which is what it will take to keep the mobile market growing) - more capability, better battery life, and again largely without a price rise compared to Sandy Bridge.
 
Have a Yorkfield here, but I'm moving up, because I can certainly do with the tax rebate; my government is just going to spend it enriching their pockets anyway.
 
With my i7 950 running fine at 4.0GHz (near stock voltage), I see no point in upgrading to Ivy Bridge.
As far as I am concerned, the main selling point of the Ivy platform is the set of new features such as USB3, SATA3, PCIe3, etc.

Haswell seems worth the wait.
 
I've always been curious how CPUs "degrade." How much of a CPU's surface area is reserved just for redundancy? Can one failed transistor take out an entire chip? What mechanism does a CPU use to detect a faulty transistor and reroute around it?

Zero, possibly, none!
 
Degradation under normal use happens via "electromigration". The electrons zipping around the chip gradually knock the atoms of the conducting material out of position, causing the transistor(s) to eventually fail. Running the chip out of specification speeds up this process tremendously. The relatively short life spans of highly overclocked CPUs are an example of how much it can be accelerated, with failures sometimes happening in far less than a year.
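
The textbook way to quantify this is Black's equation for electromigration MTTF (mean time to failure). A quick sketch; the exponent and activation energy below are typical published values for copper interconnects, not anything chip-specific:

```python
import math

# Black's equation: MTTF = A * J**(-n) * exp(Ea / (k * T))
# where J is current density and T is absolute temperature. Constants
# below are typical textbook values (n ~ 2, Ea ~ 0.9 eV for copper),
# purely illustrative.
K_BOLTZMANN = 8.617e-5  # Boltzmann constant, eV/K
N_EXPONENT = 2.0        # current-density exponent
EA = 0.9                # activation energy, eV

def relative_mttf(j_ratio, t_c, t_ref_c):
    """Lifespan relative to reference conditions (constant A cancels out)."""
    t, t_ref = t_c + 273.15, t_ref_c + 273.15
    return j_ratio ** (-N_EXPONENT) * math.exp((EA / K_BOLTZMANN) * (1 / t - 1 / t_ref))

# A heavy overclock: ~20% more current density, load temps 60C -> 90C.
print(f"{relative_mttf(1.2, 90.0, 60.0):.2f}x stock lifespan")  # ~0.05x
```

That ~0.05x means a roughly 20x shorter lifespan under those (made-up) conditions, which is how a chip that would otherwise outlast its usefulness can die within a year.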

In desktop CPUs, redundancy is usually baked into the chip. During testing, failed (or even working) blocks like cache can be fused off to create the configuration for a particular SKU. If some critical portion of the chip with no redundancy is bad, the die is unusable. If performance is degraded due to poor transistor switching characteristics or leakage, the CPU may be down-binned to slower speeds where it can reliably operate.
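
To illustrate the binning idea, a toy sketch (the thresholds and SKU labels here are entirely made up; real binning uses far more parameters):

```python
# Toy post-test binning flow. All thresholds and SKU labels below are
# hypothetical, for illustration only.
def bin_die(working_cores, working_cache_mb, max_stable_ghz):
    if working_cores < 2:
        return "scrap"                    # critical logic failed, die unusable
    if working_cores == 4 and working_cache_mb >= 8 and max_stable_ghz >= 3.5:
        return "quad core, full cache"    # top SKU
    if working_cores == 4 and working_cache_mb >= 6:
        return "quad core, 6MB (bad cache block fused off)"
    return "dual core salvage part"       # down-binned

print(bin_die(4, 8, 3.6))  # quad core, full cache
print(bin_die(4, 6, 3.2))  # quad core, 6MB (bad cache block fused off)
print(bin_die(2, 3, 3.0))  # dual core salvage part
```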

It is also possible for a chip to detect failures on power-on and disable some portion which isn't working. Again, this may happen with cache, where failures can be checked for (e.g., an unreachable block). But a failing chip will most likely fail in operation, particularly if the flakiness is not easily detectable.
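
A toy version of that power-on cache check might look like this (purely illustrative; real built-in self-test runs in hardware/microcode and is far more involved):

```python
# Toy power-on cache self-test: write patterns to each cache way and
# disable any way that reads back wrong. Illustrative only.
class Way:
    """Simulated cache way, optionally with one stuck-at-0xFF bit."""
    def __init__(self, size, stuck_index=None):
        self.data = [0] * size
        self.stuck_index = stuck_index

    def write(self, i, value):
        self.data[i] = 0xFF if i == self.stuck_index else value

    def read(self, i):
        return self.data[i]

def way_ok(way):
    for pattern in (0x00, 0xFF, 0xAA, 0x55):
        for i in range(len(way.data)):
            way.write(i, pattern)
        if any(way.read(i) != pattern for i in range(len(way.data))):
            return False  # mismatch: stuck or unreachable cells
    return True

# Simulate an 8-way cache where way 3 has a stuck bit.
ways = [Way(64, stuck_index=7 if n == 3 else None) for n in range(8)]
print("Enabled ways:", [n for n, w in enumerate(ways) if way_ok(w)])
# -> Enabled ways: [0, 1, 2, 4, 5, 6, 7]
```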

Some server chips have far more robust error detection hardware.
 
Have a Yorkfield here, but I'm moving up, because I can certainly do with the tax rebate; my government is just going to spend it enriching their pockets anyway.

Precisely my point - in my case, it's Kentsfield (the original Q6600) and Wolfdale (that's right, a dual-core - specifically, the Wolf-PUP E3400) before that.

It's not that the E3400 (let alone the Q6600) isn't capable of running modern applications (if not games) - both certainly are; even Intel knows this. However, between the sour economy and the lack of *gotta-have-it* software in the non-device space (for the general PC owner/user - not the [H]orde), the non-mobile computing space has been pretty much flat in terms of sales. Even the mobile space isn't growing where it usually does (laptops/notebooks/netbooks) - it's growing in the *device* (sub-PC) space (iOS and Android) and the smartphone space (both of those plus Windows Phone) - not Windows (let alone OS X).

Sandy Bridge was for those that upgrade regularly (and major overclockers); Ivy Bridge is the Intel CPU for the rest of us.
 
Haswell should be the next migration.

It *should* be a bigger leap, with a new socket that should stick around for a couple of years, hopefully. It should be a great jumping point from 775 and 1156.

I have a pile of stuff for an Ivy build, and I'm tempted to sell it before it really devalues and stockpile for Haswell, but the wife wants a baby and I have a feeling the spare cash will end up somewhere else (like the baby, or a motorcycle upgrade :p ), so I might as well build. It will still take new GPUs just fine :)
 
Yeah, Forceman is cool. I'm the one usually being called a troll. Being called a troll is like being called a witch... there's really nothing you can say that won't admit further guilt to the people doing the calling.

On Topic:
I don't KNOW Intel's internal processes, but I cannot see how they would function optimally (with regard to Moore's Law) without any competition, as they did when competition was stiff. Doesn't make sense. Intel is seeking to profit, not enrich the world.

Aren't you the guy who put his video card in the oven 4 times and then wondered why it didn't work? You tend to come here with some pretty wild assertions and expectations as to how things are supposed to work.

As stated, Intel's release schedule is faster and more productive than ever. Moving from a 2D to a 3D process was always going to throw a kink in the roadmap, and people should have reasonably expected that. This is no small adjustment to how these CPUs are made. Feel free to NOT BUY a new setup. A great many people are skipping Ivy; be one of them if you're unhappy.
 
The shipping to Canada for a bolt-through kit was more than the kit itself, so I bought a D14 locally. It was on sale for $70 CAD at the computer store across from the pub I was having lunch at :D
 
I don't KNOW Intel's internal processes, but I cannot see how they would function optimally (with regard to Moore's Law) without any competition, as they did when competition was stiff. Doesn't make sense. Intel is seeking to profit, not enrich the world.

Intel's primary threat to sales in non-emerging markets is the installed base. Intel has to create better processors so that people replace old Intel-based systems with new ones. This isn't quite true when a new device category (like laptops a decade ago, or tablets now) goes mainstream, but right now Intel is not yet in any new kinds of device . . .

Also, reducing the feature size through process technology does a lot to reduce manufacturing costs. More advanced processes do cost more per wafer, but you get a lot more die per wafer, so the per-die cost drops. So, for example, an IB die probably costs less to make, before you factor in return-on-investment (ROI), than an SB die. That said, you always have to factor in ROI, so YMMV.
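
To put rough numbers on the die-per-wafer point: quad-core Sandy Bridge is commonly reported at ~216 mm² and quad-core Ivy Bridge at ~160 mm²; treat those figures (and the simple edge-loss formula) as assumptions:

```python
import math

# Dies-per-wafer estimate using the standard approximation:
#   DPW = pi * (d/2)**2 / S  -  pi * d / sqrt(2 * S)
# Die areas are the commonly reported quad-core figures; assumptions only.
WAFER_DIAMETER_MM = 300.0

def dies_per_wafer(die_area_mm2):
    gross = math.pi * (WAFER_DIAMETER_MM / 2) ** 2 / die_area_mm2
    edge_loss = math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

for name, area in [("Sandy Bridge 4C", 216.0), ("Ivy Bridge 4C", 160.0)]:
    print(f"{name}: ~{dies_per_wafer(area)} candidate dies per 300mm wafer")
# -> Sandy Bridge 4C: ~281, Ivy Bridge 4C: ~389 -- roughly 40% more dies,
#    before accounting for yield differences on the new process.
```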
 
As stated, Intel's release schedule is faster and more productive than ever. Moving from a 2D to a 3D process was always going to throw a kink in the roadmap, and people should have reasonably expected that. This is no small adjustment to how these CPUs are made. Feel free to NOT BUY a new setup. A great many people are skipping Ivy; be one of them if you're unhappy.

I think you're avoiding the issues that have been raised. At launch, Intel's marketing team told the media that Ivy would be a 20% performance improvement. It's less than half of that. Why does Intel get to lie about their product and expect the hardware community to act like nothing happened? It sets a bad precedent.

Like I said before, it sounds to me like Intel is having problems with the tri-gate transistor tech. Intel might save money by getting more chips per wafer, but as far as I know they aren't lowering prices. A pure die-shrink of the old Sandy Bridge architecture probably would have offered more performance and less heat. From the perspective of consumers, Ivy Bridge is a disappointment. Hopefully they get things worked out with Haswell.
 
"Consumers" is a very large group, the vast majority of which will benefit from IB. Maybe what you ment to say was overclockers and enthusiasts which is a very small piece of the pie.

If Intel was folliwng AMD's lead, you'd get something that performs worse at the same clock speed, uses more power and costs more. instead IB performs better, uses less power and actually has a lower MSRP. I'd say they're operating well above the standard as it stands currently.
 

Speaking of consumer disappointment and marketing hype, how's that FX-8120 working out for you? :p
 
I think you're avoiding the issues that have been raised. At launch, Intel's marketing team told the media that Ivy would be a 20% performance improvement.
You seem to have completely misunderstood what was being discussed. The transistor performance improved by that amount under certain conditions (at low voltage).

You know what they say about "a little knowledge"... ;)
 

I don't see a problem with performance; what bothers me is the whole TIM thing, which is most likely the source of the heat problem.

I don't grasp why so many of you think they can create something like the 3D transistor and not field it at the production level to work out kinks as they go. You're literally expecting miracles. I'm simply expecting solder.
 

This should quell your TIM worries.

http://forums.anandtech.com/showthread.php?t=2242252
 

You're misunderstanding: I want solder. I don't want Intel processors from here on out to be of substandard build quality. Don't start putting solder on and then go back; it's inferior. I like my systems to work for a very long time, as I donate them. This is a penny-pinching move by Intel which is really just screwing the consumer out of a better quality product. Aren't you people tired of *everything* being downsized and degraded?

There is no argument that makes TIM acceptable versus solder. You're getting TIM, and some exec is getting a bonus on top of whatever exorbitant pay he already gets.
 

Honestly, if it doesn't affect its overall performance, and in this case it doesn't appear that it does, I'm not bothered by it. I'm not debating whether or not it's inferior to solder, just that its inferiority appears to not make any appreciable difference.
 

I don't see how it impacts you at all. It doesn't affect temps (according to that test anyway) and it doesn't impact lifespan - so how exactly are you affected by their choice to use TIM? They used to use ceramic for the substrate, and then switched to something cheaper and easier to work with - was that a huge slap in your face as well? If someone hadn't cut the IHS off you wouldn't even know.
 
Thanks for the link, hadn't seen that yet. So it basically confirms what Intel was saying. Everyone bitching about the TIM can now kindly STFU. :D
Well, maybe you can explain exactly what that guy in the link did. He certainly didn't explain it well, although I sympathize with anybody who has English as a second language. Nevertheless, as somebody already mentioned, he certainly didn't compare TIM to solder as alternate methods of attaching the IHS. Seems like he just took off the IHS and said that wasn't the problem because the temps went up. :confused:
 

It's not possible for an end-user to replace the TIM with solder, so that's not even worth considering. The argument was that the TIM was making the IHS into an insulator and causing the high temps - this test shows (or seems to show - more testing is still needed) that removing the IHS doesn't help anyway. All it really means is that the IHS, as attached now, isn't hurting things, and that there isn't a magical solution by just removing the IHS and running the chip naked.

Edit: There are more pics and discussion at the original post:
http://www.overclock.net/t/1249419/pcevaluation-intel-i7-3770k-temperature-measured-without-ihs
 
I don't see how it impacts you at all. It doesn't affect temps (according to that test anyway) and it doesn't impact lifespan..

How can you say that about lifespan so definitively? Don't most thermal greases dry out over time and become worthless, needing to be reapplied? Does Intel have magic grease that will never be affected in such a way?
 

I just removed my heatsink for the very first time in over 4 years on my Q6600 and it was not dry. You don't need magic grease, you just need stuff that isn't shit.
 

+1 to this.

Plus, for the grease to dry out, the moisture in it needs somewhere to go. As I see it, that lid is completely sealed, so there isn't really anywhere for the moisture to go, and it shouldn't dry up.

I'm sure Intel's engineers aren't so stupid as to use a TIM that would dry out in 3-4 years (RMAs, anyone?). Now, if they were able to make a TIM that would stop working right after the warranty period was over...
 

People act like this is the first time Intel or AMD has used TIM inside the IHS. What do you think was under the IHS on the P4 and Athlon? Even some of the Core 2 chips had TIM in them.
 
I'm sure Intel's engineers aren't so stupid as to use a TIM that would dry out in 3-4 years (RMAs, anyone?). Now, if they were able to make a TIM that would stop working right after the warranty period was over...

Well, I believe the warranty period is actually 3 years ;)

People act like this is the first time Intel or AMD has used TIM inside the IHS. What do you think was under the IHS on the P4 and Athlon? Even some of the Core 2 chips had TIM in them.

I believe TIM is used in Nvidia's IHS as well.
 
I don't see how it impacts you at all. It doesn't affect temps (according to that test anyway) and it doesn't impact lifespan - so how exactly are you affected by their choice to use TIM? They used to use ceramic for the substrate, and then switched to something cheaper and easier to work with - was that a huge slap in your face as well? If someone hadn't cut the IHS off you wouldn't even know.

Amazingly, you can see far into the future and tell us about the lifespan. Must be nice. The fact of the matter is we're going from a well-built product to crap to save Intel a penny or two. I have no idea why people defend this so vigorously. You are paying just as much and getting a lesser product than the previous generation. You should be offended as a consumer.

I also don't trust one random test translated from a non-English speaker to determine temps. It's odd how quickly so many of you defending this move by Intel have latched onto that one link.
 
"Consumers" is a very large group, the vast majority of which will benefit from IB. Maybe what you ment to say was overclockers and enthusiasts which is a very small piece of the pie.

No, I mean consumers in general. The lower power consumption is certainly nice, but prices aren't going down despite the lower manufacturing cost, and the performance difference isn't that great.

Speaking of consumer disappointment and marketing hype, how's that FX-8120 working out for you? :p

That would be a valid point... if AMD was marketing Bulldozer as 20% faster than Stars. As I mentioned previously, Intel claimed "20 percent more processor performance using 20 percent less average power" in a media release on the 23rd. AMD and Intel always hype their numbers before releases, but I don't really recall them lying about them after they've been reviewed. 5-7% is not 20%.

I bought the FX-8120 despite the bad reviews. My good old Q6600 setup finally started having some issues and I couldn't afford an i7 build. I do a lot of work in 3ds Max and I don't have to worry about the power bill, so it works out fine for my particular needs.

You seem to have completely misunderstood what was being discussed. The transistor performance improved by that amount under certain conditions (at low voltage).

You know what they say about "a little knowledge"... ;)
Cut the snide BS; I was talking about what Intel has been saying to the media. I posted a quote earlier from a BBC article, and here's the same thing on ABC:

http://abcnews.go.com/Technology/in...ge-chips-today/story?id=16194824#.T5vmp8XptD8
"This is Intel's fastest ramp ever," Intel PC business chief Kirk Skaugen told the BBC. "This is the world's first 22 nanometre product and we'll be delivering about 20% more processor performance using 20% less average power."

How is that not misleading? This is what's in the news; the average consumer looks at that and goes "huh, 20% faster and 20% less power than my current i-whatever CPU!"
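
For what it's worth, the gap between the claim and the reviews is easy to quantify; the ~6% performance and ~10% power figures below are this thread's rough consensus, used here as assumptions:

```python
# Intel's claim vs. reviewed numbers, expressed as perf/watt gains.
# The "reviewed" figures (~6% faster at ~10% less power) are rough
# consensus numbers from this thread, not official measurements.
claimed = 1.20 / 0.80    # "20% more performance using 20% less power"
reviewed = 1.06 / 0.90   # ~6% more performance at ~10% less power

print(f"Claimed perf/watt gain:  {(claimed - 1) * 100:.0f}%")   # 50%
print(f"Reviewed perf/watt gain: {(reviewed - 1) * 100:.0f}%")  # ~18%
```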

Don't get me wrong here: Ivy Bridge obviously runs circles around anything AMD has to offer in terms of single-threaded performance. The integrated GPU is actually pretty decent, too. I just think that it's fair to call it a disappointment after all the hype, and I don't like how Intel is continuing to build that hype by misleading consumers about the performance improvement. I get the impression they're anxious to avoid faildozer/fail-bridge comparisons after investing so much time, money, and marketing on their tri-gate tech.
 
...Don't get me wrong here: Ivy Bridge obviously runs circles around anything AMD has to offer in terms of single-threaded performance...

I think this is part of why Intel upgrades lately haven't seemed greatly compelling (to me, anyway) from one release to the next. They are so far ahead of AMD at this point that they don't have to be. AMD was king when the P4 was Intel's best. I finally upgraded from socket 775 to a Sandy Bridge system late last year and will probably sit on this machine for a long time. Back in the '90s I would upgrade twice a year. Things just don't advance at the pace they used to.
 