Discussion in 'Intel Processors' started by MorgothPl, Nov 22, 2012.
I think this is much more likely than them only offering BGA processors.
This only hurts consumers who build their own PCs. For the masses, it doesn't impact them a bit.
This makes sense on a whole lot of levels.
First, it ups Intel's margins, and that's reason #1. By doing this, it forces every OEM to buy a CPU with each board manufactured, which means more overall sales.
This allows certain design elements not available in the socket/LGA design.
It *can* lower costs. (Depending on if that's the point, or not.)
I'm sure Intel has done their internal metrics, and probably found that people who upgrade their CPUs are an endangered species. In that case, why continue to support that top few percent? I'm sure it makes less and less financial sense as time goes on.
With AMD now bowing out of the high-end (and increasingly the mid-range) PC market, Intel is the sole king. Large corporations are not there for your benefit; they're there to make money. By controlling the flow of CPUs to top-tier OEMs, they can make sure anyone who plays ball gets special perks. Every motherboard company from Gigabyte to ASRock must come to the throne of Intel and pay homage. Every OEM from Dell to eMachines must also.
Essentially, this gives Intel a way to lock down the PC market so that nothing is beyond control. They can require certain features on certain mainboards, or control prices in the PC market as they wish.
And, if you're already in bed with Intel (Dell, for example), this could be a way to squeeze other OEMs out of the market, making sure the favored players get a larger piece of the pie.
This is the death of the enthusiast market, and it isn't surprising since it's been rapidly shrinking for years. It was going to happen at some point anyway.
This is a reaction by Intel to its growing irrelevance. They have good tech, but so did many other companies that were forced to fall by the wayside. I would expect them to become a quasi-IBM, doing CPU duties for more and more specialized niches.
There's no telling how long the PC market lasts, but it's certain that Intel will wring every dollar from it as it fades. I'm not sure why anyone would expect them to do anything else. Wouldn't you?
I know plenty of electronics use BGA and are just fine. However, the nightmare that pops into my head are the old PC Chips motherboards that they used to sell at the computer store I worked at years ago. Those things were so cheap and unreliable that I cringed every time I could not convince a customer to upgrade from the "special" $249 computer.
Smells like FUD.
No, it makes no sense at all from an efficiency or product inventory management perspective. Where it does make sense is simply to fuel tinfoil-hat crackpot theories.
It came as a bit of cluelessly mistranslated info, and has somehow become "fact". Incredible.
I'm an enthusiast, and this won't affect me one bit. Even among enthusiasts, only a small handful upgrade the CPU alone. Especially on the Intel side, since there's essentially a new socket with every Tock cycle. AMD sockets stick around for a few more generations, but that doesn't really matter either, considering new AMD processors aren't any faster than the ones they're replacing. <~~~ I'm half trolling and half serious with that last comment, but the bottom line is, this will have very little effect on the enthusiast market. Heck, if anything it may strengthen it. One less complication to worry about for a newbie trying to get into the hobby. (If it even happens.)
So the motherboard makers agree to standardize their own socket dimensions then solder the chip on a PGA/LGA "circuit board" carrier.
AMD did this a long time ago, see below.
Holy shit. Windows compatible.
I like how people keep saying AMD is bowing out of desktop PCs, because they hope it happens.
Hint: I don't care that you want it to happen, it isn't. AMD is staying in high-end CPUs. I can go buy an 8350 on Newegg right now, two years after people said AMD stopped making desktop CPUs.
um......... the opposite actually............
it won't affect the masses who buy OEM... it will affect us who buy the cheapest chip and O/C the crap out of it, just as Intel made the K series for us.
ummmm... how is that the "opposite" exactly?
For a person who financed his own demise, you are pretty peppy. Good luck buying your "next" Intel system and wondering how you spent your cash in the past. Or are you waiting for the next AMD system that really isn't as fast in gaming as Intel? Maybe AMD won't be around by that time........
Ever take a look at an LGA 775? 11xx? How many pins can you fit on a board before cost becomes excessive? 4000? 10000? At some point that socket becomes so complex the cost isn't justifiable.
I don't think, for example, socket 4976 is a good idea when the legs are a nanometer thin and a puff of wind will bend them. Sure, I guess they could throw it on a board à la Pentium II, but considering the important functions are increasingly on the CPU itself, there's no point. You may as well just solder it on.
The future is the integration of everything onto the CPU itself. It's cheaper, more efficient, and requires fewer hands in the profit pie. This has been the trajectory for years. We're up to the bus, memory controller, and GPU all on the CPU. I don't see that trend stopping. Maybe they won't license anything from Realtek, but I wouldn't bet on it. At this point everything is possible. All-on-a-chip is the future: network, audio, video, etc., all in one. The motherboard will be for ports, power management, and memory access.
EDIT: Just to add, this increases margins for Intel. They can offer fewer products at higher prices. Now instead of selling 10 different CPUs, they can sell 3 or 4. Product that would have been binned into a less expensive category can now go into a higher category, increasing profit.
They used to do this back in the day, and it worked pretty well for them I think.
No one thinks this is going to be a cold-water dive. It'll be a gradual process, and it costs Intel nothing to try it out. It's also pretty much all over the tech websites now.
LOL, it's a bit of a stretch to use that argument when Intel has shipped sockets of over 1366 pins, and continues to use 1155 pins in its mainstream processors. Mainstream Haswell/Broadwell share an 1150 socket. Both are SoC designs.
It's ridiculous to suggest, particularly with SoC type CPUs, that pin count is going to quadruple. Pin count is more likely to remain fairly stable on those mainstream CPUs.
What is interesting, though, is that with DDR4-supporting processors coming in the next 2 years or so, pin count might actually increase a bit due to the point-to-point nature of the interface (max 1 DIMM per channel). A 4-DIMM memory interface, whether it's dual channel or quad, will have 4 full sets of traces to the 4 DIMMs, compared to today's dual channel memory, which can typically support 2 DIMMs per channel (sharing traces to the memory controller) on most desktop boards.
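To put rough numbers on that trace math, here's a back-of-the-envelope sketch. The 125 signals per channel is a made-up illustrative figure, not a real DDR4 spec; the point is just that a point-to-point topology multiplies the CPU-side pin budget by the number of DIMMs instead of the number of shared channels:

```python
# Illustrative only: SIGNALS_PER_CHANNEL is an assumed round number
# (roughly 64 data lines plus address/command/clock), not from JEDEC.
SIGNALS_PER_CHANNEL = 125

def cpu_memory_pins(total_dimms, dimms_per_channel):
    """CPU-side memory signal pins: one full trace set per channel."""
    channels = -(-total_dimms // dimms_per_channel)  # ceiling division
    return channels * SIGNALS_PER_CHANNEL

# Classic dual channel: 2 DIMMs share each channel's traces.
shared = cpu_memory_pins(4, 2)
# Point-to-point: every DIMM gets its own full set of traces.
p2p = cpu_memory_pins(4, 1)
print(shared, p2p)  # -> 250 500
```

Same 4 DIMMs, double the memory signal pins on the package, which is why the point-to-point interface could nudge pin counts up even on an otherwise stable mainstream socket.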
I don't normally do this, but your post is very worthy of a... "Cool story, bro," so...
Cool story, Bro
I remember a few years ago there was an article stating that Intel wanted to eventually integrate the southbridge onto the processor. Eventually Intel integrated the northbridge by adding both a memory controller and a single PCI Express x16 link onto their processors. AMD has done the same as well.
If you notice the trend by both Intel and AMD, they both want to integrate the major components of a motherboard, sans the memory, onto the processor. A lot of the reasoning is efficiency, smaller ICs, and a smaller number of parts in the computer. It has been an ongoing trend since the invention of the computer-- from vacuum tubes to transistors to the first integrated circuit, and eventually large-scale integrated circuits and then the system-on-a-chip (SoC). The computer is getting smaller and smaller. Look at the Raspberry Pi, which uses an ARM SoC. Intel has moved into the mobile SoC market with their Intel Atom "Medfield" SoCs, with hopes of taking some marketshare away from ARM-based products designed and built by the likes of Apple, Qualcomm, and Nvidia. It is still going to be a very steep uphill climb for Intel to do that, seeing that there are very few Medfield-based devices on the market at the moment. That should change by next year and the year after, when processors like Valleyview and Airmont are released.
However, the trend toward more integrated and smaller computing devices is getting more noticeable day by day. AMD and Intel are approaching it from different angles if you look at it. AMD is moving toward integrating the compute cores of a CPU with the floating-point efficiency of GPU cores. Intel is headed the other direction, it seems, if this is true. Eventually we may no longer have chipsets like Z77 or X79 on the motherboard once the northbridge, southbridge, processor, and graphics unit are integrated onto a single piece of silicon. There may be a single chip acting more like a communications hub than a controller between the processor and the peripherals of the computer. A single-chip Thunderbolt hub combining future QPI and DMI buses, USB, and SATA? Who knows?
Would it be possible that a permanently affixed Broadwell is the first sign of this move toward complete integration? Maybe, maybe not. It depends on the market itself and, of course, consumers themselves.
Consider for a moment the number of desktop PC users on the market today who replace the processor in their computer on a regular basis-- every few months to a year, or every few years. Now consider how much of the consumer desktop PC market is knowledgeable enough to replace a processor themselves. And, lastly, consider how big a jump in computing performance we've seen from one generation of processors to the next compared to the software that runs on them.
For example, when was the last time your parents upgraded the computer they just use to go on Facebook or e-mail, or to check recipes or the news? Do they even know how to replace parts in a computer? I'm talking about normal people who aren't tech-savvy enough to do that.
Aside from games, how much software that is used on a regular day-to-day basis today sees a jump in performance, when going from a mid-range to a high-end processor, big enough to justify a processor upgrade?
I'll give you one example. One of my friends is still using an older Core i7 920 processor with an Nvidia GTX 260 graphics card. He plays about the same games as I do. After nearly five years, he hasn't bothered upgrading his computer. He hasn't seen a big performance jump from his 920 to a 2nd gen i7 (Sandy Bridge) to Ivy Bridge. To get him to upgrade to another processor will require at minimum a 15% jump in performance in games like Civilization 5 or the software he uses for work. He's noticed as well that not much software today takes advantage of multithreaded cores unless you're into video editing and encoding or 3D rendering. He didn't see a big enough jump between Sandy Bridge and Ivy Bridge to warrant spending money on a new processor and motherboard. His next upgrade will likely be the graphics card before the rest of his system. My friend may eventually move to Broadwell-based processors when they're out, because he thinks it's foolish to jump onto Ivy Bridge when the next processor after that is going to be on an entirely new socket. That, and he sees Broadwell as the processor that will be a big enough performance boost over his old i7 920. For now, his Core i7 920 handles 1080p video files just fine, can play Youtube Flash videos perfectly fine, can play his games at reasonable framerates above 40 FPS, and can handle his email and work software just fine. Why should he upgrade his processor when his current one runs just fine?
You have to remember that the decisions made by companies today are decided by two things: one, to make money, and, two, market trends. So much money is spent by companies for market research. If you look at the computing market today, what is the current trend right now? Smaller, more efficient computing devices and mobile devices.
Is there still a need for large desktop PCs? Yes, for the many users who, like many of us on [H]ardforum, play games, do graphics and video work, and other CPU/GPU-intensive stuff on the computer. None of that can be done on a tablet or a cheap laptop. But, when you think about it, how much of that consumer desktop PC market do we actually contribute our money to? Is the market trend for desktop PCs and desktop PC parts going up or down? Recent articles state it's going down. The mobile market is going up, and most of the computers sold and built today are low- to mid-range OEM machines. The gaming and high-end segments don't make up a large part of that market. We are the minority in the desktop PC market.
Consumers who don't replace their processor regularly, don't overclock, and see no performance gains from one CPU generation to the next in their regular day-to-day software will see a permanently affixed processor as a great boon. "I don't know how to replace a processor, I'll just buy a new computer" is pretty much the attitude of a lot of consumers that I've seen. If Broadwell is the first sign of that, and if it is true that Intel wants to have the processor permanently affixed to the motherboard, then it'll probably hit the largest part of the desktop PC market first-- entry level to low end. They may eventually move that into mid-range processors if it proves successful. However, I don't see that as likely. Many mid-range computers are mainly used for gaming, and a permanently affixed processor will not sit well with that market. Keeping it to the low-end and entry-level desktop PC market may be more successful.
This will not sit well with desktop PC users in the mid-range to high-end, or with some manufacturers, because it will be a customer support nightmare for them. I can guarantee that will happen. It's going to be up to manufacturers like ASUS, MSI, and ASRock to make these kinds of boards reliable from the moment someone turns on their computer. This will probably be fine for normal consumers who watch the occasional Youtube video or Netflix, and do e-mail, social websites, and web browsing. It may also be good enough for educational institutions that use low-end computers as workstations in libraries or computer labs for Word or Excel, and don't replace their computers that often. You have to remember that many don't upgrade just the processor, but the entire computer itself.
In my opinion, I see this as Intel's way of trying to control the desktop PC market however they want. AMD isn't much of a threat to them, and moving into areas that make more money is more lucrative for Intel. They already charge a bit more for unlocked processors that can be overclocked compared to those that can't. And many mid-range processors don't have Hyper-Threading. If Intel wants to do this beginning with Broadwell, then it's a great way to make money. Want a more feature-filled computer? Move up to a better board with a better chipset and processor. Or are you up for replacing your own processor? Then you'll spend X amount for a dedicated processor and board, and do everything yourself.
If we don't see this happen with Broadwell, we may see a full integration of a motherboard's major components onto the processor-- a desktop SoC, in other words-- within this decade. And that "desktop SoC" may eventually be the future of desktop PC computing for us. The only replaceable things would be the memory, the hard drive/SSD, and an optional dedicated graphics card, seeing as Intel gets slightly better with each Intel HD graphics unit they integrate onto their CPUs. We may see that Intel HD graphics unit run games at 30 to 50 FPS at 1080p within the decade, which would make a fully integrated computer more realized than it is now.
brain fart on my behalf!
not crazy about the soldered cpu
The historic trajectory for PC CPUs is more pins. We went from 370 around 10 years ago to 2k+ today. I don't think 4k or 6k or more is any kind of stretch. I wouldn't expect it overnight, but if Intel is looking out over the long term, they have to begin preparing OEMs and everyone else, from designers to enthusiasts, for the eventual shift.
That's all assuming the same rate of progress continues. But if the PC is no longer the focus of intel (and it probably isn't) then switching to BGA designs makes even more sense.
I think AMD and Intel have done well with the pinout compression and speed strategies they've been using the last few years, but in the end there's no replacement for scale. More and bigger is the future if PC designs continue to be pushed forward. Obviously the yin to that yang is smaller form factors, or specialized environments. A mini-ITX board already pushes the concept of the socket.
There's no way I can read Intel's mind and say that is the kind of logic they're using, but it does make sense. In 2020, most PCs may well be Raspberry Pi size, stuck onto the side of the monitor or built into an LCD screen's base. There's no room for sockets in that world.
And on the other side, the no holds barred big as anything desktop where speed is the primary concern, by 2020 may well be on DDR5/6/7 and have who knows how many pins, features, speed etc.
The big kahuna in the room is the GPU: the more cores they have, the more pins they need, or they become starved for memory. There's no getting around that. The tricks and voodoo will only work so long before they either go bigger or hit a wall. I think what's been holding it relatively steady is the continuing drop in process size, creating a sort of stasis over the last few years. But eventually the die process will stall out, at least for a time.
Anyways I can't speak for intel, I don't know what big wigs are thinking there. But I can see the logic behind the decision. There will be a demand for a reduced number of models. In a world where intel is the only real x86 player, they don't need to hit so many price points.
Well, it's been debunked: Intel will produce socketed CPUs for the foreseeable future. But this is a strategy in their outlook, no doubt.