> nobody’s cracking down on overclocking

Intel still charges a premium for unlocked chips, a practice that by itself only came about due to competition. So I disagree.
Nobody’s cracking down on overclocking; they are running the silicon so close to the edge that the turbo frequencies basically do the job for you, unless you start getting into the more exotic cooling solutions (and yes, an AIO or a massive Noctua is an exotic cooler). What there is now is competition, and neither party can afford to leave anything behind, because that extra 100MHz could be the difference between a part reviewing well or bombing.
> I haven't bought anything Intel in over ten years. Thank goodness that AMD is at least competitive with both CPUs and GPUs.

Well, there was that whole pre-Ryzen period of sub-par CPUs.
> Well there was that whole pre-Ryzen period of sub-par CPUs.

In the mid range they did just fine, albeit hot. Frankly, I'm still using one and it's chugging along better than its period competition, due to having moar cores. No noticeable difference in real-world applications between it and first-gen Ryzen.
> Just corporations doing what they do. Don't like it? Want change? Stop supporting the board members' wallets.

The biggest problem with this is, I want their shit. I just don't like their methods. Yes, I can complain about their shady shit and still buy whatever I want. At the end of the day I don't own a mega corp, so I can't exactly relate directly to their situations.
> Well there was that whole pre-Ryzen period of sub-par CPUs.

Ah, you mean the period when Intel's CPUs just didn't bother with any working security? It's interesting if you bench those supposedly terrible AMD chips today against Intel, both with security issues fixed... guess what: AMD holds its own. They might have even won in some of the value segments, as the mitigations for most of those chips basically turn off Intel's broken branch predictor; on the mid-range chips with low cache, that damn near halves their performance, and in some cases it's even worse than that. Yeah, Intel beat AMD up for a stretch of 4 or 5 years by a good 30%... performance that mostly came from some Intel engineer deciding that their chips didn't need to check permissions before doing cached memory reads. (Which I still can't believe could have been done unknowingly... some engineer made that choice, and either his bosses were too dumb to understand it or chose to ignore it for the performance improvement.)
> Ah you mean the period when Intel's CPUs just didn't bother with any working security? It's interesting if you bench those supposedly terrible AMD chips today against Intel, both with security issues fixed... Guess what: AMD holds its own.

C'mon, that's a useless comparison. When they were just released is when we were comparing what to buy, not something 7 years old at this point, since the vast majority of us have moved on since then.
Intel should pray to the chip gods with thanks nightly that no one discovered those flaws at the time. (That we know of... I mean, who knows, it might have been part of state-level tools for years.)
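For anyone curious whether those mitigations are active on their own box, the Linux kernel reports a per-vulnerability status file under /sys/devices/system/cpu/vulnerabilities/. Here's a minimal sketch that classifies those status lines; the sample strings below follow the kernel's format but are illustrative, not taken from any specific CPU:

```python
from pathlib import Path

# Linux exposes one status file per known CPU vulnerability, e.g.
# /sys/devices/system/cpu/vulnerabilities/spectre_v2
VULN_DIR = Path("/sys/devices/system/cpu/vulnerabilities")

def parse_status(text: str) -> str:
    """Classify a sysfs vulnerability line as 'not affected', 'mitigated', or 'vulnerable'."""
    t = text.strip().lower()
    if t.startswith("not affected"):
        return "not affected"
    if t.startswith("mitigation"):
        return "mitigated"
    return "vulnerable"

def report() -> dict:
    """Read every vulnerability file; returns {} on non-Linux systems."""
    if not VULN_DIR.is_dir():
        return {}
    return {p.name: parse_status(p.read_text()) for p in VULN_DIR.iterdir()}

# Illustrative sample lines in the kernel's reporting format:
print(parse_status("Mitigation: Retpolines; IBPB: conditional"))  # mitigated
print(parse_status("Vulnerable: __user pointer sanitization unavailable"))  # vulnerable
print(parse_status("Not affected"))  # not affected
```

On an affected Intel chip with mitigations enabled you'll typically see several "Mitigation: ..." entries, which is exactly the branch-predictor/cache-read patching the post above is talking about.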
> This might be dangerously close to off topic, but ever since the Celeron 300A Intel has been cracking down on overclocking, tweaking, or otherwise stealing performance from them. The only reason we have overclocking and unlocked CPUs is because the competition offered them to compete. Intel is a shady multinational corp that would sell your children into slavery in order to sell you a can of beans.

Incorrect. Intel has tried to prevent overclocking as far back as the Pentium days; there is a Pentium 133MHz stepping that won't run if not set correctly. However, despite Intel's efforts, everything from the 300A on up has been overclockable. The limitations we've generally seen come from certain architectures being less capable of clocking higher. This includes the latest silicon, which is already binned at the edge of what it's capable of, meaning there is no room to overclock. About all we've been able to do for some time now is lock the all-core frequency to the boost frequency, or close to it. Even then, power consumption skyrockets and you need heavy-duty cooling to handle the consequences.
> C'mon that's a useless comparison. When they were just released is when we were comparing what to buy. Not something 7 years old at this point, since the vast majority of us have moved on since then.

Is it a useless comparison?
> Despite this, if you were using AVX-512 as a consumer... it sounds like you can just stick with an older BIOS to retain the functionality.

Lots of compelling parts for HEDT out there; it's just getting old on the Intel side (X299 has been around what, 4 years now?). I still run them because of PCIe lanes (and occasionally memory bandwidth matters for my stuff), but... yeah. Hoping HEDT isn't dead.
Still, applications that leveraged it remain exceedingly rare, at least among those relevant to most consumers. This Anand thread was one of the only resources I could find that gave examples, and it was basically just x264.
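If you want to check whether your own chip (on your current BIOS revision) still exposes AVX-512, on Linux the feature flags show up in the `flags` line of /proc/cpuinfo. A hedged sketch that just parses such a line for the foundation flag, `avx512f`; the sample flag lines below are made up for illustration:

```python
from pathlib import Path

def has_avx512(flags_line: str) -> bool:
    """True if the foundation AVX-512 flag (avx512f) appears in a cpuinfo 'flags' line."""
    flags = set(flags_line.replace("flags", "").replace(":", "").split())
    return "avx512f" in flags

def cpu_flags_line() -> str:
    """First 'flags' line from /proc/cpuinfo; empty string on non-Linux systems."""
    cpuinfo = Path("/proc/cpuinfo")
    if not cpuinfo.exists():
        return ""
    for line in cpuinfo.read_text().splitlines():
        if line.startswith("flags"):
            return line
    return ""

# Illustrative flag lines (not from real CPUs):
print(has_avx512("flags : fpu sse2 avx2 avx512f avx512bw"))  # True
print(has_avx512("flags : fpu sse2 avx2"))                   # False
```

Note this only tells you what the OS sees; a BIOS update that fuses the feature off will simply make the flag disappear from the list.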
There is a bigger question, IMO, about what Intel is doing in the HEDT segment. Most of the innovation they've brought to market has primarily benefitted consumer CPUs like the 12th-gen Core series, and I can't remember the last time mainstream manufacturers released compelling mobos for the HEDT platform anyway. A couple of years ago I switched from X99 to Z270 and, IMHO, there's no looking back.
> I expect it is going to take Intel a couple generations to be where they would like with GPUs. They are going to get to parity, though. With Intel's past aborted attempts, I believe there was a core of higher-ups at Intel that honestly didn't see the GPU future. I think they honestly believed GPUs were never going to be more than a video game market thing, which wasn't the big $$$, and that at some point the CPUs would start doing real-time ray tracing or something anyway. This time they clearly understand they need a GPU to provide 100% Intel solutions for supercomputer/server/AI/auto etc., so they will ride out the embarrassment of a couple of also-ran product cycles. (Old Intel was too quick to abandon markets they didn't crush, too thin-skinned to admit they were #2... frankly they still sort of are; their current CEO doesn't seem any different to me.)

I've worked with/under Pat Gelsinger in the past. I have two rules in my career: never bet against Michael Dell, and NEVER bet against Gelsinger. Both are ambitious, driven, creative, and arguably brilliant in very specific fields. If Pat believes that GPUs are a market need for Intel, they'll get whatever funding and support they need (and a headsman's axe for those not on board or pulling their weight); if he believes it doesn't support their core mission, that whole team will be cut loose without even a fare-thee-well. I suspect they see a need to break out of the core x86 business, hence the investment in fab lines (and licensing if needed) and the GPU market push. He's cutting things that don't make sense (Optane for consumer markets) and driving things that do (enterprise, GPU (which leads to specialized processors), fabrication). All things that are Intel's core competency.
> For this first round of GPUs I don't think Intel will have the wiggle room to really pull the old trick of one wafer, 5 SKUs ranging from -50% margin to 250%. The GPUs are outsourced... I mean, they will obviously not price the bottom of the wafer at the top; I just don't think they can get away with the shenanigans of the past as easily, having to more closely account for wafer supplier payments. At this point shareholders are going to want to know yields, and aren't going to be OK with "lasered off" (or disabled in firmware) chunks of silicon being sold at cost, or worse, under cost. When Intel moves the GPU silicon to their own fabs, though, that might be very different. If Intel can ever sort their fabs out, then I expect a lot of OEMs to be stuck with the old "you buy our CPU, you buy our GPU for $1" or "perhaps we don't have enough CPUs to fill your order" BS. At that point Intel covers those OEM deals up in NDAs, reports the sales of CPU and GPU as "computing hardware unit" sales on the quarterly reports, and everyone is happy. (Same way companies like Microsoft have hidden Xbox losses for over a decade.)

We'll definitely see. Not as worried about margins on individual products (especially early on), but definitely curious about the bundling, etc.
> nobody’s cracking down on overclocking, they are running the silicon so close to the edge that the turbo frequencies basically do the job for you...

This. I haven't touched it on my 10700K because the board exceeded my planned OC on its own.
> The limitations we've generally seen come from certain architectures being less capable of clocking higher.

Oh, so Intel making multipliers unavailable in the BIOS when they previously were, while the competition allowed it, isn't a limitation then...
> Oh, so Intel making multipliers unavailable in the BIOS when they previously were, while the competition allowed it, isn't a limitation then...

They have for a very long time now with the "K" CPU lines. However, there's little to gain on AMD or Intel from manual overclocks anyway.
> They have for a very long time now with the "K" CPU lines. However, there's little to gain on AMD or Intel from manual overclocks anyway.

Those only exist because they were forced. And you pay more for the privilege.