An Analysis of the Z390 Socket Proves the Extra Pins Aren't Necessary

I'll repeat it again. Chipset revenue from people who would otherwise just upgrade their CPU is MINISCULE. It's a rounding error for Intel and Intel doesn't care. Intel cares about the Dells, HPs, etc. of the world. This point is so blatantly obvious that it is completely shocking to me that it is debatable.

This speaks to my earlier point. Dell, HP, MSI, ASUS, etc. are all Intel's customers as much as we are. Intel creates products that are designed to appeal to OEMs and motherboard manufacturers in order to sell more product. It isn't about us. When we buy motherboards, we are ASUS's, ASRock's, GIGABYTE's, or MSI's customers, not Intel's. Intel sold the chipset to the motherboard manufacturer, not to us.
 
So, talk to Intel's engineers then. Let's revisit the premise of this entire argument - we have people in here claiming that Intel is apparently reaping huge revenue from making people upgrade chipsets in order to upgrade their CPUs, and because of that, they're lying about needing extra power and ground pins in the socket to force people to upgrade past Z270. They're claiming that there are apparently millions and millions of people who would simply drop in a new CPU if only Intel would "let" them. None of this is true, and it shows a huge lack of awareness of the PC market. And if these folks don't want to believe Intel's engineers, they can go ahead and try what der8auer showed - no skin off my back.

Intel's engineers would likely say what every single engineer in this thread has said - it might work, but there are certain conditions under which the system would be less stable than if you just used a 6700K in a Z170 or a 7700K in a Z270 (for example).

I might be able to get an answer as to why those extra grounds are needed, but if they aren't actually needed I won't hear anything. However, not hearing anything wouldn't be conclusive either. Believe it or not, I don't get answers to certain questions from various parties for many reasons. Espionage is a real thing and specific technical details are often kept as secret as possible for as long as possible. So not getting an answer isn't necessarily an indicator that those grounds aren't needed. Even if I was given an answer as to why they are needed, many armchair engineers will probably sit there and say that Intel is lying.

It feels damn near pointless to even ask the question.
 
This speaks to my earlier point. Dell, HP, MSI, ASUS, etc. are all Intel's customers as much as we are. Intel creates products that are designed to appeal to OEMs and motherboard manufacturers in order to sell more product. It isn't about us. When we buy motherboards, we are ASUS's, ASRock's, GIGABYTE's, or MSI's customers, not Intel's. Intel sold the chipset to the motherboard manufacturer, not to us.

Exactly.

I might be able to get an answer as to why those extra grounds are needed, but if they aren't actually needed I won't hear anything. However, not hearing anything wouldn't be conclusive either. Believe it or not, I don't get answers to certain questions from various parties for many reasons. Espionage is a real thing and specific technical details are often kept as secret as possible for as long as possible. So not getting an answer isn't necessarily an indicator that those grounds aren't needed. Even if I was given an answer as to why they are needed, many armchair engineers will probably sit there and say that Intel is lying.

It feels damn near pointless to even ask the question.

I'd wager Intel's engineers ran some simulations and something fell out of tolerance under certain specific conditions and management said "It isn't worth the risk - we could possibly face warranty claims and bad press if things go awry when users encounter those conditions. Let's fix it on the next generation of motherboards with the new chipset."
 
This. I don’t know who Roman is, but I’m actually an EE by education and I can confirm there is very little overlap between ME and EE courses. You take the same prerequisite math and science classes, a couple of the same engineering classes, and that’s it.

I have to just shake my head at this thread and especially one comment I read, which was something along the lines of “Well, he ran 5A through a pin and there was no sign of damage.” LOL. Is that the new basis for proof? Some guy runs 5A through a pin for a few minutes, the thing doesn’t shoot sparks, so it must be valid? LOL. As an engineer, I’ll trust Intel’s engineers on this one.

Now, that doesn’t mean Intel doesn’t deserve some criticism. They had to know the direction CPUs were headed so I’m a bit surprised they didn’t anticipate the need for additional power and ground pins and include them in the initial spec.

It was 5 amps at full load for 24 hours.

Edit: no, sorry, you are correct.

My bad.
 
This. I don’t know who Roman is, but I’m actually an EE by education and I can confirm there is very little overlap between ME and EE courses. You take the same prerequisite math and science classes, a couple of the same engineering classes, and that’s it.

I have to just shake my head at this thread and especially one comment I read, which was something along the lines of “Well, he ran 5A through a pin and there was no sign of damage.” LOL. Is that the new basis for proof? Some guy runs 5A through a pin for a few minutes, the thing doesn’t shoot sparks, so it must be valid? LOL. As an engineer, I’ll trust Intel’s engineers on this one.

Now, that doesn’t mean Intel doesn’t deserve some criticism. They had to know the direction CPUs were headed so I’m a bit surprised they didn’t anticipate the need for additional power and ground pins and include them in the initial spec.
He's not an EE. The course overlap between an ME and an EE is basically non-existent.

Jesus Christ how goddamn hard is it to Google his nickname and read the very first result that gives info about him?

He is a mechatronics engineer with a degree from one of the most respected universities in Germany.

This degree comprises 1/3 ME, 1/3 EE and 1/3 IT.

No overlap, yeah.:bored:
 
I might be able to get an answer as to why those extra grounds are needed, but if they aren't actually needed I won't hear anything. However, not hearing anything wouldn't be conclusive either. Believe it or not, I don't get answers to certain questions from various parties for many reasons. Espionage is a real thing and specific technical details are often kept as secret as possible for as long as possible. So not getting an answer isn't necessarily an indicator that those grounds aren't needed. Even if I was given an answer as to why they are needed, many armchair engineers will probably sit there and say that Intel is lying.

It feels damn near pointless to even ask the question.

If you ever get the chance, I would love to hear the answer if they gave one, even if it was a BS response.
 
Jesus Christ how goddamn hard is it to Google his nickname and read the very first result that gives info about him?

He is a mechatronics engineer with a degree from one of the most respected universities in Germany.

This degree comprises 1/3 ME, 1/3 EE and 1/3 IT.

No overlap, yeah.:bored:

He's still not an EE and still doesn't have access to hundreds of millions of dollars in simulators and test equipment.
 
Reading is evidently hard. The quote was (emphasis is mine):

"I’ll trust Intel’s engineers on this one."

But Intel management gives the orders to Intel engineering. If management told engineering to cripple something, they've done it. Didn't someone hop onto the ASRock Twitter account at some point and say yes, it would work on older boards, but it won't because of reasons?

If you want to appeal to the engineers, why not remind everybody that Intel engineers also invented their in-house 10nm process for their foundry?

I'll repeat it again. Chipset revenue from people who would otherwise just upgrade their CPU is MINISCULE. It's a rounding error for Intel and Intel doesn't care. Intel cares about the Dells, HPs, etc. of the world. This point is so blatantly obvious that it is completely shocking to me that it is debatable.

They wouldn't be making iGPU-less K models unless they cared about squeezing every last penny.
 
The pins are fairly robust; maybe the concern is for the circuit traces on the myriad boards on the market? Just hypothesizing.
 
I have worked with engineers, and one thing it taught me is that having an engineering degree doesn't make you smart in anything. I have met both smart ones and ones who were not. So saying somebody has a degree doesn't mean a whole lot. Experience is more believable than a piece of paper, IMO.

Now, as to the pins: if the chips are both 95W/125W (or whatever) and are pulling the same amount of power, then would you need the extra pins? What are their criteria for having so many extra pins? Overclocking? Lots of what-ifs and could-bes. Dan_D said it best, we may never know the exact reason. Maybe the guy needs to run it overclocked on Prime95 for a few days?
 
He's still not an EE and still doesn't have access to hundreds of millions of dollars in simulators and test equipment.

Why does that even matter? It's all speculation anyway, because we don't know how much headroom Intel engineers decide upon, and that could vary with every product.
 
Why does that even matter? It's all speculation anyway, because we don't know how much headroom Intel engineers decide upon, and that could vary with every product.

It matters because certain people here act as if der8auer is a CPU designer and that his tests conclusively proved that Intel was ripping people off. His tests show it CAN work (which everyone already pretty much knew) but what they don't answer are the tougher questions such as longer-term failure rates, stability over time, etc. That's where Intel's simulation and testing come into play. Look, I've said repeatedly in this thread that it might work OK and if you want to take the risk, knock yourself out - as long as you know there is a risk something could go wrong and accept responsibility.

And your second comment is correct in terms of speculation regarding what Intel engineered into the products in terms of headroom. However, the entire argument being made in this thread by most is that Intel intentionally lied to people about the 8700K and above CPUs not working on the Z170/270 chipsets for the sole purpose of generating some magical revenue stream for themselves. As has been pointed out repeatedly, any revenue increase generated by increased chipset sales to those folks wanting to go from a 6700K/7700K to an 8700K or above is miniscule. If Intel wanted a magical revenue stream, there are about a million better ways with higher ROI than having to redesign a CPU socket, run it through all sorts of validation testing, etc, and then push it to market.

AMD's sockets last because AMD had a thorough and achievable roadmap. Intel's sockets have been in flux because let's all be honest - Intel expected Ice Lake to be on the scene by now and didn't ever anticipate having to spin several iterations of Skylake. As a result, yes, they've had to scramble and do some redesign on the fly.
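To put the "rounding error" point in rough numbers (the upgrader counts and the $50 chipset figure below are purely hypothetical placeholders; only Intel's roughly $70B annual revenue for 2018 is a real figure):

```python
# Purely illustrative arithmetic; the upgrader counts and chipset revenue per
# unit are guesses, not published figures. Intel's FY2018 revenue was ~$70B.
INTEL_ANNUAL_REVENUE = 70e9      # approximate FY2018 revenue, USD
CHIPSET_ASP = 50.0               # hypothetical revenue per chipset sold, USD

for upgraders in (100_000, 500_000, 1_000_000):
    extra_revenue = upgraders * CHIPSET_ASP
    share = extra_revenue / INTEL_ANNUAL_REVENUE
    print(f"{upgraders:>9,} forced upgrades -> ${extra_revenue/1e6:,.0f}M "
          f"= {share:.4%} of annual revenue")
```

Even the generous one-million-upgraders case works out to well under a tenth of a percent of revenue, which is the scale of the argument being made here.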
 
It matters because certain people here act as if der8auer is a CPU designer and that his tests conclusively proved that Intel was ripping people off. His tests show it CAN work (which everyone already pretty much knew) but what they don't answer are the tougher questions such as longer-term failure rates, stability over time, etc. That's where Intel's simulation and testing come into play. Look, I've said repeatedly in this thread that it might work OK and if you want to take the risk, knock yourself out - as long as you know there is a risk something could go wrong and accept responsibility.

And your second comment is correct in terms of speculation regarding what Intel engineered into the products in terms of headroom. However, the entire argument being made in this thread by most is that Intel intentionally lied to people about the 8700K and above CPUs not working on the Z170/270 chipsets for the sole purpose of generating some magical revenue stream for themselves. As has been pointed out repeatedly, any revenue increase generated by increased chipset sales to those folks wanting to go from a 6700K/7700K to an 8700K or above is miniscule. If Intel wanted a magical revenue stream, there are about a million better ways with higher ROI than having to redesign a CPU socket, run it through all sorts of validation testing, etc, and then push it to market.

AMD's sockets last because AMD had a thorough and achievable roadmap. Intel's sockets have been in flux because let's all be honest - Intel expected Ice Lake to be on the scene by now and didn't ever anticipate having to spin several iterations of Skylake. As a result, yes, they've had to scramble and do some redesign on the fly.
Basically you are saying you have no idea either. His tests show that it can work. Didn't Intel say it wouldn't work? Or did they just say it might have problems due to needing so many pins or whatever? Unless I am wrong?
 
Jesus Christ how goddamn hard is it to Google his nickname and read the very first result that gives info about him?

He is a mechatronics engineer with a degree from one of the most respected universities in Germany.

This degree comprises 1/3 ME, 1/3 EE and 1/3 IT.

No overlap, yeah.:bored:

Yes, no overlap. Either you're an actual EE or you're not. His degree is basically an ME with electives. He might have taken some 101 classes, but those are basically useless in this area, which relies on 300-400 level electromagnetics coursework.
 
Basically you are saying you have no idea either. His tests show that it can work. Didn't Intel say it wouldn't work? Or did they just say it might have problems due to needing so many pins or whatever? Unless I am wrong?

None of us know for sure the reasons behind Intel's decision. I thought that was pretty obvious. For us to be privy to the detailed technical information, they would probably require us to sign some sort of NDA. Instead, we get the "increased power requirements" reason.

I don't recall Intel's exact wording on whether it would work or not, but I do speak corporatese, so if they said "it won't work," that translates to "it won't work within our design parameters."
 
Basically you are saying you have no idea either. His tests show that it can work. Didn't Intel say it wouldn't work? Or did they just say it might have problems due to needing so many pins or whatever? Unless I am wrong?

You realize that those are the same thing, right? We're dealing in deeply probabilistic engineering methods with these things. Circuit timing isn't deterministic but probabilistic. Electromigration isn't deterministic but probabilistic. Pretty much everything in chip/electrical design uses probabilistic methods, not deterministic design, at the levels of sophistication we are dealing with. Will a circuit work? The truth is we don't really know when we design it, because of the massive number of variables involved, so what we actually do is rely on probabilities and margins to make sure that it generally will work.

So while it may be possible to run 5A through a circuit for some set of variables, we aren't designing to that limited set of variables; we're designing to a much wider set, with fairly wide guard bands, to make sure the probability that it works in reality is significantly higher.
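As a rough illustration of designing to a distribution rather than to one bench result (a minimal sketch with made-up numbers, not Intel's actual methodology):

```python
# A minimal sketch of margin-based thinking, not Intel's actual methodology.
# All numbers are made up for illustration: per-pin current is a distribution
# (contact resistance spread, VRM imbalance, temperature, wear), and design
# targets the tail of that distribution, not one bench run.
import random

random.seed(0)
TRIALS = 100_000
PIN_LIMIT_A = 0.5          # hypothetical safe per-pin current rating
NOMINAL_PER_PIN_A = 0.35   # hypothetical average per-pin current

violations = 0
for _ in range(TRIALS):
    # Lognormal spread: a few unlucky pins always carry more than their share.
    per_pin = NOMINAL_PER_PIN_A * random.lognormvariate(0, 0.25)
    if per_pin > PIN_LIMIT_A:
        violations += 1

print(f"Samples exceeding the per-pin limit: {violations / TRIALS:.3%}")
# One test that happens to land in the fat part of the distribution says
# nothing about this tail, which is exactly what guard bands exist to cover.
```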
 
However, the entire argument being made in this thread by most is that Intel intentionally lied to people about the 8700K and above CPUs not working on the Z170/270 chipsets for the sole purpose of generating some magical revenue stream for themselves. As has been pointed out repeatedly, any revenue increase generated by increased chipset sales to those folks wanting to go from a 6700K/7700K to an 8700K or above is miniscule. If Intel wanted a magical revenue stream, there are about a million better ways with higher ROI than having to redesign a CPU socket, run it through all sorts of validation testing, etc, and then push it to market.

Who's to say it even cost them that much to activate a few extra pins if they don't do much at all in the first place?

People are picking fights with you because you're trying to talk nonsense with a straight face. Intel wouldn't do this for a few extra dollars? There are multiple decades of Intel's behavior in the market that say the opposite of what you're saying. We're talking about a company which has mastered the art of market segmentation up and down a thousand-deep product stack and turned it into de facto corporate creed. It runs in their blood. If Intel sold hamburgers, they'd slice that hamburger 100 different ways to sell to 100 different people if they thought they could squeeze another 1% out of it.
 
LOL, good old Intel. This is why my backup Z170A system will live long and will run a 7600K at 4.9GHz as long as it lives. My main system is going to be Zen 2, as I sold my other parts in anticipation of it and am using my secondary system right now. Intel can suck it with their cash grab. I am sure that will slowly change, though, as they keep losing market share to AMD.
 
Who's to say it even cost them that much to activate a few extra pins if they don't do much at all in the first place?

In engineering, you don't just make a change willy-nilly. There are validation procedures which take time, money, and resources.

People are picking fights with you because you're trying to talk nonsense with a straight face.

You guys are espousing nonsensical conspiracy theories, and *I'm* talking nonsense? The other engineers here are talking nonsense too, I suppose. LOL! On the contrary, not a single thing I've said is nonsense. If you seriously think Intel is magically going to rake in huge sums of money by forcing a tiny (and it is TINY) percentage of people to invest in a new chipset to simply change CPU, you're comically wrong.

Intel wouldn't do this for a few extra dollars? There are multiple decades of Intel's behavior in the market that say the opposite of what you're saying. We're talking about a company which has mastered the art of market segmentation up and down a thousand-deep product stack and turned it into de facto corporate creed. It runs in their blood. If Intel sold hamburgers, they'd slice that hamburger 100 different ways to sell to 100 different people if they thought they could squeeze another 1% out of it.

Intel did none of that for "a few extra dollars." Intel made the decisions you mentioned for hundreds of millions or billions in additional revenue. I don't think you have an appreciation of the scale of a company the size of Intel, their revenues and cash flow, or how they operate. Any increase in revenue by people "forced" to upgrade boards in order to use the latest CPUs is so miniscule that it is literally a rounding error for Intel. I seriously think you guys must work for some small companies and haven't worked in Fortune 50 companies before. I mean seriously, a few million to Intel is literally nothing.

By the way, I asked this question of another guy when he claimed this practice hurts non-enthusiasts. Of course, he conveniently didn't answer so maybe you'll humor me:

How does it hurt non-enthusiasts?
 
This speaks to my earlier point. Dell, HP, MSI, ASUS, etc. are all Intel's customers as much as we are. Intel creates products that are designed to appeal to OEMs and motherboard manufacturers in order to sell more product. It isn't about us. When we buy motherboards, we are ASUS's, ASRock's, GIGABYTE's, or MSI's customers, not Intel's. Intel sold the chipset to the motherboard manufacturer, not to us.

That isn't the issue. Give users choice. AMD still releases new chipsets for partners to sell, but it also gives those who want the latest and greatest the opportunity to buy a new board based on the new chipset. I pretty much know they will be releasing an X470 replacement with Zen 2. If you noticed, they did relax it a bit with the 8700K; maybe they get a pass for the 9900K, lol. But honestly, as more and more customers move to AMD and market share balances out, you will see Intel relax its policies.
 
Intel cares about the Dells, HPs, etc. of the world. This point is so blatantly obvious that it is completely shocking to me that it is debatable.
Intel also cares about elite enthusiasts, but that's just how marketing is. It's why the major car makers make/sponsor race cars.
 
BTW, let me throw out another theory:

Chipsets and motherboards take a long time to get to market, and the specs are set way before that.
So imagine the Z390 chipset was originally developed for Intel's much-delayed 10nm CPUs.
Imagine those CPUs require significantly lower supply voltages (as typically occurs when a process shrinks) which implies higher current draw for the same TDP.
What would you do to the socket to accommodate that higher current draw at the same TDP?

Answer: add more power and ground pins.

And it might not have been possible to do so while preserving backward compatibility.
Power supply to CPUs isn't always simple; there are all kinds of issues with, for example, the inductance of the package leads that complicate it.
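To make the voltage-versus-current point concrete (back-of-the-envelope only; the 0.5 A per-pin budget and the voltages are hypothetical placeholders, not published socket or CPU specs):

```python
# Back-of-the-envelope only: the 0.5 A per-pin budget and the voltages are
# hypothetical placeholders, not published Intel or LGA1151 figures. The only
# real physics here is P = V * I, so lower Vcore at the same TDP means more
# current and therefore more power/ground pins for the same per-pin budget.
import math

def pins_needed(tdp_w, vcore_v, amps_per_pin=0.5):
    """Return (total current in A, minimum supply pins) for a TDP and core voltage."""
    current = tdp_w / vcore_v                  # I = P / V
    return current, math.ceil(current / amps_per_pin)

for vcore in (1.2, 1.0, 0.8):                  # voltage tends to drop as the process shrinks
    amps, pins = pins_needed(95, vcore)
    print(f"95 W at {vcore:.1f} V -> {amps:5.1f} A -> at least {pins} pins at 0.5 A/pin")
```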
 
It matters because certain people here act as if der8auer is a CPU designer and that his tests conclusively proved that Intel was ripping people off. His tests show it CAN work (which everyone already pretty much knew) but what they don't answer are the tougher questions such as longer-term failure rates, stability over time, etc. That's where Intel's simulation and testing come into play. Look, I've said repeatedly in this thread that it might work OK and if you want to take the risk, knock yourself out - as long as you know there is a risk something could go wrong and accept responsibility.

And your second comment is correct in terms of speculation regarding what Intel engineered into the products in terms of headroom. However, the entire argument being made in this thread by most is that Intel intentionally lied to people about the 8700K and above CPUs not working on the Z170/270 chipsets for the sole purpose of generating some magical revenue stream for themselves. As has been pointed out repeatedly, any revenue increase generated by increased chipset sales to those folks wanting to go from a 6700K/7700K to an 8700K or above is miniscule. If Intel wanted a magical revenue stream, there are about a million better ways with higher ROI than having to redesign a CPU socket, run it through all sorts of validation testing, etc, and then push it to market.

AMD's sockets last because AMD had a thorough and achievable roadmap. Intel's sockets have been in flux because let's all be honest - Intel expected Ice Lake to be on the scene by now and didn't ever anticipate having to spin several iterations of Skylake. As a result, yes, they've had to scramble and do some redesign on the fly.

I don't disagree at all, but while I don't see the test as proof, I do admit the conclusion is very much a possibility after thinking about it for a while.

Intel is a platform company that likes to sell as many chips per system as possible. It's their core strategy and has been since the Pentium 1. They have been going out of their way ever since to make it difficult to avoid buying a new chipset with a new CPU, but throughout that time frame we have seen examples where it was made possible.

Slotket adapters existed. Evergreen CPUs. OverDrive CPUs (made by Intel themselves, interestingly enough). Risers. Add-on VRMs. Conversion kits.

I'm on the fence. Interesting conversation for sure though.
 
BTW, let me throw out another theory:

Chipsets and motherboards take a long time to get to market, and the specs are set way before that.
So imagine the Z390 chipset was originally developed for Intel's much-delayed 10nm CPUs.
Imagine those CPUs require significantly lower supply voltages (as typically occurs when a process shrinks) which implies higher current draw for the same TDP.
What would you do to the socket to accommodate that higher current draw at the same TDP?

Answer: add more power and ground pins.

And it might not have been possible to do so while preserving backward compatibility.
Power supply to CPUs isn't always simple; there are all kinds of issues with, for example, the inductance of the package leads that complicate it.

But it isn't for 10nm...it's for 14nm which is the whole damn point.
 
I don't disagree at all, but while I don't see the test as proof, I do admit the conclusion is very much a possibility after thinking about it for a while.

Intel is a platform company that likes to sell as many chips per system as possible. It's their core strategy and has been since the Pentium 1. They have been going out of their way ever since to make it difficult to avoid buying a new chipset with a new CPU, but throughout that time frame we have seen examples where it was made possible.

Slotket adapters existed. Evergreen CPUs. OverDrive CPUs (made by Intel themselves, interestingly enough). Risers. Add-on VRMs. Conversion kits.

I'm on the fence. Interesting conversation for sure though.

Ha! I remember the Slotket adapters - I may still have one in one of my part boxes, though I thought I threw them all out at some point.
 
In engineering, you don't just make a change willy-nilly. There are validation procedures which take time, money, and resources.



You guys are espousing nonsensical conspiracy theories, and *I'm* talking nonsense? The other engineers here are talking nonsense too, I suppose. LOL! On the contrary, not a single thing I've said is nonsense. If you seriously think Intel is magically going to rake in huge sums of money by forcing a tiny (and it is TINY) percentage of people to invest in a new chipset to simply change CPU, you're comically wrong.



Intel did none of that for "a few extra dollars." Intel made the decisions you mentioned for hundreds of millions or billions in additional revenue. I don't think you have an appreciation of the scale of a company the size of Intel, their revenues and cash flow, or how they operate. Any increase in revenue by people "forced" to upgrade boards in order to use the latest CPUs is so miniscule that it is literally a rounding error for Intel. I seriously think you guys must work for some small companies and haven't worked in Fortune 50 companies before. I mean seriously, a few million to Intel is literally nothing.

By the way, I asked this question of another guy when he claimed this practice hurts non-enthusiasts. Of course, he conveniently didn't answer so maybe you'll humor me:

How does it hurt non-enthusiasts?

If those parts of the business are a rounding error, I wish they'd stop segmenting everything 1000 ways, but instead they're out there squeezing every last penny they can out of a stone. It's what they do. Please don't ask me if I know how Fortune 50 companies work when you're still insisting their product releases are being driven by engineers. It's not the engineers who decided to paper launch Coffee Lake months before it was ready. It's not an engineer who decided Kaby Lake-X needed to be a thing. There's no reason to believe socket + chipset changes every 6 months are driven by anything other than the people who made those other decisions. You'll never see anything like the 440BX out of this group again.
 
But it isn't for 10nm...it's for 14nm which is the whole damn point.

10nm might have been its original intended market. Once you design a product, the R&D money has been spent. Sometimes that product may not be needed for its original purpose but can be adapted to suit another purpose and recoup the development costs.
 
I'm not gonna claim to be any kind of expert, but it really pissed me off personally when I found out my brand new Z270 wouldn't be upgradable to Coffee Lake. I haven't seen anything to show that it was some huge architectural change. I may be dating myself, but you usually don't upgrade until it's a vastly different system with new DDR requirements and feature sets. It just reeks of Intel either mismanaging development or really pushing the boundaries on forced upgrades. The Z370 is nearly identical to the Z270, yet the new CPUs are suddenly not backwards compatible. Malevolent or not, I'm 100% dumping Intel for the upcoming Ryzen.
 
If those parts of the business are a rounding error, I wish they'd stop segmenting everything 1000 ways, but instead they're out there squeezing every last penny they can out of a stone. It's what they do. Please don't ask me if I know how Fortune 50 companies work when you're still insisting their product releases are being driven by engineers. It's not the engineers who decided to paper launch Coffee Lake months before it was ready. It's not an engineer who decided Kaby Lake-X needed to be a thing. There's no reason to believe socket + chipset changes every 6 months are driven by anything other than the people who made those other decisions. You'll never see anything like the 440BX out of this group again.

Not once did I ever say engineers drove the product release.
 
Something I don't see being considered yet in this discussion is that there is a whole lot more to CPU & chipset design than just power and ground pins. It's not a toaster oven they are designing... we are talking about a device that has critical frequency, signal propagation, noise, etc. concerns.

Even though the CPU & chipset may run on less than the spec pin connections, that doesn't take into consideration what the signal/noise looks like without those extra connections.

As part of the development process, engineers look at wave forms of signals on the running CPU with a scope. There are going to be many times where they look at a specific signal and say "Yeah, that's not good." The fact that the CPU is running is almost irrelevant if it's not running correctly. The fix may include adding/changing power, ground, signal paths, logic gates, etc.

There's a lot more to it than simple power and ground requirements and calculations.

.
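A rough, order-of-magnitude illustration of the signal-quality angle: parallel return pins lower the effective return-path inductance, and ground bounce scales roughly as V = L * di/dt. The pin inductance and transient values below are generic assumptions, not LGA1151 measurements.

```python
# Rough illustration of why extra ground pins also matter for signal quality,
# not just current capacity. Values are order-of-magnitude assumptions only.
PIN_INDUCTANCE_NH = 5.0      # assumed inductance of a single socket pin (nH)
DI_DT_A_PER_NS = 1.0         # assumed aggregate switching transient (A/ns)

for ground_pins in (50, 100, 150):
    l_eff_nh = PIN_INDUCTANCE_NH / ground_pins        # inductors in parallel
    bounce_mv = l_eff_nh * DI_DT_A_PER_NS * 1000      # nH * (A/ns) = V, so *1000 -> mV
    print(f"{ground_pins:3d} ground pins -> ~{bounce_mv:.0f} mV of ground bounce")
```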
 
Something I don't see being considered yet in this discussion is that there is a whole lot more to CPU & chipset design than just power and ground pins. It's not a toaster oven they are designing... we are talking about a device that has critical frequency, signal propagation, noise, etc. concerns.

Even though the CPU & chipset may run on less than the spec pin connections, that doesn't take into consideration what the signal/noise looks like without those extra connections.

As part of the development process, engineers look at wave forms of signals on the running CPU with a scope. There are going to be many times where they look at a specific signal and say "Yeah, that's not good." The fact that the CPU is running is almost irrelevant if it's not running correctly. The fix may include adding/changing power, ground, signal paths, logic gates, etc.

There's a lot more to it than simple power and ground requirements and calculations.

.

Well, a 5GHz 9900K running Prime95 for 24 hours without error, with the additional power and ground pins taped off, even compared to a Z170, seems to indicate it's working just fine.
 
Well, a 5GHz 9900K running Prime95 for 24 hours without error, with the additional power and ground pins taped off, even compared to a Z170, seems to indicate it's working just fine.

No, it actually doesn't. It merely indicates that in that particular case, with those particular environmental factors, for that particular time period, it worked. It doesn't say anything about what the long-term effects are, whether there was damage from it, or whether it would work under different cases and environmental factors. I can put 30-40A through 20-gauge wire for 24 hours, but no one competent would ever do that in a house. As another example, you might want to look up the Tacoma Narrows Bridge to see what happens when you say something works for 24 hours and think that settles things.
 
Intel - China - South Korea love clever marketing. So sad that so many sell their souls just for filthy lucre.
 
No, it actually doesn't. It merely indicates that in that particular case, with those particular environmental factors, for that particular time period, it worked. It doesn't say anything about what the long-term effects are, whether there was damage from it, or whether it would work under different cases and environmental factors. I can put 30-40A through 20-gauge wire for 24 hours, but no one competent would ever do that in a house. As another example, you might want to look up the Tacoma Narrows Bridge to see what happens when you say something works for 24 hours and think that settles things.

Just because you use an engineering-failure analogy doesn't make you right.

We are talking about CPUs. When you overclock your CPU, you're already operating outside of the engineered specifications. A highly overclocked KBL is surely more stressful for the socket than a 65W i5 8400. At the end of the day, as long as the CPU is calculating correctly, nobody cares if it is 0.5% out of spec. If it lasts 15 years instead of 17, very few people will mind. And that's assuming there is any negative effect at all. At least there's a small sample showing no negative effect. The other side only has speculation.
 
Just because you use an engineering-failure analogy doesn't make you right.

We are talking about CPUs. When you overclock your CPU, you're already operating outside of the engineered specifications. A highly overclocked KBL is surely more stressful for the socket than a 65W i5 8400. At the end of the day, as long as the CPU is calculating correctly, nobody cares if it is 0.5% out of spec. If it lasts 15 years instead of 17, very few people will mind. And that's assuming there is any negative effect at all.

Yes, I know we are talking about CPUs; I've designed many of them. You assume that overclocked processors are calculating correctly. The reality is often rather mixed. One of the most significant top-line specifications for a CPU sold in volume is its FIT rate. No one is designing around a 15- or 17-year FIT; they are designing to a 10-year FIT. To give you an idea of where things stand, they are designed to roughly 50 failures in a million parts over a 10-year period. Those numbers are what warranties are generated off of. And a doubling of the failure rate can easily happen with things 0.5% out of spec; we are, after all, dealing with cascading failure conditions. Things like the number of power/ground pins on a socket are very much about maintaining proper failure rates.
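For reference, converting that figure into FIT, the failures-per-billion-device-hours unit reliability budgets are usually written in (the doubled case is hypothetical, just to show the sensitivity):

```python
# Converting the figure above (roughly 50 failures per million parts over
# 10 years) into FIT, failures per 1e9 device-hours. The "doubled" case just
# shows how a small shift in failure rate moves the warranty math at volume.
HOURS_PER_YEAR = 8760

def fit_rate(failures, devices, years):
    return failures / (devices * years * HOURS_PER_YEAR) * 1e9

baseline = fit_rate(50, 1_000_000, 10)
doubled = fit_rate(100, 1_000_000, 10)
print(f"Baseline: {baseline:.2f} FIT, doubled failure rate: {doubled:.2f} FIT")
```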
 
This thread
I can tell many didn't listen to what the creator of the video actually said in it.

And why should they? How dare they question Intel if they don't have an EE degree? Why should anyone question what Ford does if they don't have an ME degree??

Dan_D makes a great point. Perhaps there is some other reason Intel did what they did, relating to power states or whatever.

And that's fine, good for Intel.

But man, did this thread expose who the new Shintels are. Using academic accolades as the only grounds for a debate... your professors would be proud.
 