Ivy Bridge TDP of 95W instead of the promised 77W ...

This is [H]ard forum! It's an 18W difference, no big deal.

You can always undervolt. :D Or be [H]ard, overclock the crap out of it, not worry about the heat, and put the biggest heatsink or water cooler on it you can.
 
It's rather obvious why they changed the TDP spec. With little to no competition from AMD right now, Intel has no reason to strive to release anything but crap to the market. If AMD had something to compete against Intel with, Intel would have kept the 77W spec and marketed the hell out of it. Since they're not being challenged, we get shafted with 95W Ivy Bridges so Intel can raise its profits by widening the binning cutoff. With the 3D transistor technology being new, this is probably a back-door way of keeping their yields high while ramping up the technology and production.

And FYI, you have an eternity of this to look forward to unless AMD can get its act together and compete.
 
Agreed. I'm looking forward to the entry-level IB chips for basic builds. I need to read up on how they compare to A8 graphics. I read that IB graphics are 3-4x faster than the HD 3000.

The AnandTech preview showed the HD 4000 being about half as powerful as the Llano A8. Considering Trinity is supposedly about 30% more powerful than Llano, Intel is still way behind in the graphics department. However, Intel has nowhere to go but up with their graphics. Same with AMD and their cores; they have nowhere to go but up with the Bulldozer design now.
 
Apparently the Trinity parts are a 50% improvement over their Llano counterparts in graphics. I think that 50% refers to GFLOPs at the 35W TDP level, so I'm not sure how that translates to the desktop parts.
 
Mildly disappointing? I find it extremely disappointing. I'm not sure there's any reason to buy Ivy now, and I was waiting around for it. I hope the Microcenter $200 deal on the 2600K is still on.

Seriously, if that report is accurate and the TDPs are the same or worse than Sandy's, what reason would there be to buy an Ivy Bridge CPU now?

The CPU part is probably drawing less power than the SB one, but the TDP includes a much bigger GPU.
 
It's rather obvious why they changed the TDP spec. With little to no competition from AMD right now, Intel has no reason to strive to release anything but crap to the market. If AMD had something to compete against Intel with, Intel would have kept the 77W spec and marketed the hell out of it. Since they're not being challenged, we get shafted with 95W Ivy Bridges so Intel can raise its profits by widening the binning cutoff. With the 3D transistor technology being new, this is probably a back-door way of keeping their yields high while ramping up the technology and production.

And FYI, you have an eternity of this to look forward to unless AMD can get its act together and compete.
I really don't think this is the issue. This is Intel's first 22nm part, and also the first part using the 3D Tri-gate transistors. There are bound to be some kinks to work out.
 
I really don't think this is the issue. This is Intel's first 22nm part, and also the first part using the 3D Tri-gate transistors. There are bound to be some kinks to work out.

It always seemed strange to me to make two very large architectural changes (22nm and tri-gate) with the same silicon. Seems like something you want to spread around.
 
I really don't think this is the issue. This is Intel's first 22nm part, and also the first part using the 3D Tri-gate transistors. There are bound to be some kinks to work out.

Hence the second part of my statement, where I said they were probably compensating for yields because of the new design. Without AMD competition they feel a lot less pressure to stick to the 77W TDP, so they can increase yields by loosening their bin requirements. If there were any true competition, Intel would have stuck with the 77W TDP because OEMs love lower TDPs. It lets the OEMs use cheaper components and is something Intel would have leveraged if there were a need to. With no competition, Intel already has the bulk of OEM business and has chosen higher yields through looser binning to maximize profits.
 
Hence the second part of my statement, where I said they were probably compensating for yields because of the new design. Without AMD competition they feel a lot less pressure to stick to the 77W TDP, so they can increase yields by loosening their bin requirements. If there were any true competition, Intel would have stuck with the 77W TDP because OEMs love lower TDPs. It lets the OEMs use cheaper components and is something Intel would have leveraged if there were a need to. With no competition, Intel already has the bulk of OEM business and has chosen higher yields through looser binning to maximize profits.

Well, if performance/watt goes up across the board, IB is still attractive. Using the same power as the "old" SB chip isn't that terrible if Intel can get more work done with the same power as before. BD was a failure because the performance/watt and IPC got WORSE under many high CPU loads compared to the Phenom II chips it replaced.
 
Well, if performance/watt goes up across the board, IB is still attractive. Using the same power as the "old" SB chip isn't that terrible if Intel can get more work done with the same power as before. BD was a failure because the performance/watt and IPC got WORSE under many high CPU loads compared to the Phenom II chips it replaced.


Well, yes and no. The IPC gains are there, but not massive. The chip has been touted since inception as a lower-TDP CPU, so this is a HUGE disappointment and will probably be a big factor in a lot of people avoiding it with Sandy Bridge already available and cheaper.
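To put rough numbers on the perf/watt argument, here's a trivial sketch. The baseline score and the 5% IPC gain are made-up placeholders for illustration, not benchmark results:

```python
# Toy perf/watt comparison. All numbers are hypothetical placeholders,
# not real benchmark figures.
def perf_per_watt(score, tdp_watts):
    """Performance per watt: higher is better."""
    return score / tdp_watts

sandy = perf_per_watt(100, 95)  # arbitrary baseline score of 100 at 95W
ivy = perf_per_watt(105, 95)    # assume a ~5% IPC gain at the same 95W TDP

gain_pct = (ivy / sandy - 1) * 100
print(f"perf/watt gain: {gain_pct:.1f}%")
```

At equal TDP, the perf/watt gain just tracks the IPC gain, which is why "same 95W as Sandy" isn't automatically a dealbreaker, just underwhelming.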
 
It always seemed strange to me to make two very large architectural changes (22nm and tri-gate) with the same silicon. Seems like something you want to spread around.
The moral of the story is to purchase your CPU whenever Intel hits a 'Tock' in their schedule. Case in point: the 2500K/2600K released back in Jan 2011 would have carried you through to Haswell (mid-2013) within roughly 10-15% of the stock performance of the latest chips. Two and a half years is not a bad run.
 
The moral of the story is to purchase your CPU whenever Intel hits a 'Tock' in their schedule. Case in point: the 2500K/2600K released back in Jan 2011 would have carried you through to Haswell (mid-2013) within roughly 10-15% of the stock performance of the latest chips. Two and a half years is not a bad run.

I'm still on my first-gen Nehalem. Skipped Westmere and Sandy, and will skip Ivy. Will likely buy Haswell. Four years out of a chip and mobo is pretty damn good in my book :)
 
I'm still on my first-gen Nehalem. Skipped Westmere and Sandy, and will skip Ivy. Will likely buy Haswell. Four years out of a chip and mobo is pretty damn good in my book :)

*checks sig*

Amateur! :p :D :cool:

One week to go, folks. It'll be interesting if this 95W TDP is, in fact, a hoax.
 
While looking into this, I came across information that this isn't the first TDP bump Intel has made with Ivy Bridge.

http://www.anandtech.com/show/4764/ivy-bridge-configurable-tdp-detailed

They were stating the Extreme Edition CPU TDP was supposed to be 55W (65W turbo) for the desktop parts back in September of last year. So if they started off with 55W, then had to bump it up to 77W, and now 95W (if true), I wonder what went wrong. Intel's engineers are usually on the ball when it comes to things they're willing to state in public.
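Just to put that TDP creep in perspective, here's the arithmetic on the rumored figures:

```python
# Rumored desktop TDP creep: 55W -> 77W -> 95W
tdps = [55, 77, 95]
for lo, hi in zip(tdps, tdps[1:]):
    pct = (hi - lo) / lo * 100
    print(f"{lo}W -> {hi}W: +{hi - lo}W ({pct:.0f}%)")
```

That's a 40% jump followed by another 23% on top of it, which is a long way from the low-power part originally pitched.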
 
95W confirmed? Just took the pic:

temporary-5-1.jpg
 
Can't see pic. Is it just my phone browser, or is everyone having the same issue?
 
I could see it. The pic is huge, which might be why it's not showing up on your phone. It's larger than my 1080p monitor.
 
Sorry, fixed now. Posted from my phone.
I can't resize on Photobucket from my phone, so it will be huge till I get home :p
 
I can see the pic, and that's definitely a 95W 3770K :( Seems like they had to bring the hyper-threaded parts up yet another notch. Tri-gate may not be what most hoped it would be.
 
Wow! WTF on that one, Intel? :confused: Okay, I'm seriously confused about what the point of Ivy Bridge is now. PCIe 3.0 and better integrated graphics... who cares?

Getting seriously tempted to just get the 2700K to complete my build and be done with it.
 
Yeah, and it's a bit hypocritical, if you ask me, that Intel and its loyalists can deceive on a release like this and get away with it, yet Bulldozer and AMD were trashed beyond recognition, with AMD being painted as a liar.
 
Oh noes! 95W!

Come on guys, it's not the end of the world. It'll still be a good processor, no doubt. Let's get some official benches before we all write it off as a failure.
 
Oh noes! 95W!

Come on guys, it's not the end of the world. It'll still be a good processor, no doubt. Let's get some official benches before we all write it off as a failure.


It's not just that; it's the other things too, like the temp issues, limited OCing, etc. You're missing the bigger picture here, though. The CPU has been stated multiple times to be a 77W TDP part for a year now, and they're releasing it as a 95W one.
 
You're missing the bigger picture here, though. The CPU has been stated multiple times to be a 77W TDP part for a year now, and they're releasing it as a 95W one.

I'm still not ready to condemn them over that.
 
Yeah, and it's a bit hypocritical, if you ask me, that Intel and its loyalists can deceive on a release like this and get away with it, yet Bulldozer and AMD were trashed beyond recognition, with AMD being painted as a liar.

My theory is this:
When the Pentium 4 fiasco occurred, AMD's embarrassment of Intel was so severe that it left even current Intel partisans' inner children butthurt, resulting in the fanatically enforced double standards we have now in regard to the two companies :)
 
Wow! WTF on that one, Intel? :confused: Okay, I'm seriously confused about what the point of Ivy Bridge is now. PCIe 3.0 and better integrated graphics... who cares?

Well, I think you have to remember that 99% of their customer base are not enthusiasts, so the better integrated graphics will be far more meaningful to them, even if they don't realize it.
 
Well, I think you have to remember that 99% of their customer base are not enthusiasts, so the better integrated graphics will be far more meaningful to them, even if they don't realize it.

While I agree with you that enthusiasts make up a smaller part of the market share, isn't the 3770K marketed as an enthusiast-level chip? It's not going in laptops and mainstream PCs.

Right now I am not seeing what the benefit is between 2600K/2700K and the 3770K. What am I missing?
 
Well then, looks like I'll be putting together my 2600k rig this week for sure.
 
While I agree with you that enthusiasts make up a smaller part of the market share, isn't the 3770K marketed as an enthusiast-level chip? It's not going in laptops and mainstream PCs.

Right now I am not seeing what the benefit is between 2600K/2700K and the 3770K. What am I missing?

Ivy Bridge on LGA 1155 is not an enthusiast chip.
 
I disagree, but I think he means Intel touts Sandy Bridge-E as the enthusiast chip.

So everyone that invested in the 1155 platform needs to turn in their PC enthusiast membership cards. :D Not all Ivy Bridge chips fall into the enthusiast class, but the 3770K definitely does.
 