cageymaru

Fully [H]
Joined
Apr 10, 2003
Messages
22,077
An interesting article entitled "Has Intel Given Up on the PC?" by Ashraf Eassa sheds a different light on the [H]ardOCP article "How Intel Feels About PC Users." Ashraf makes an excellent point: Intel's client computing group generated the most revenue by a long shot, but only saw 6% year-over-year growth compared to the data center group's 27%. He explains that the slide from the presentation was a reflection of that.

Then Mr. Eassa discusses why Intel shouldn't neglect the PC-centric business, even though the company thinks it will become less relevant over time. The article goes on to explore what Intel must do to keep its revenue flowing.

Last quarter, Intel's client computing group generated $8.7 billion in revenue and $3.2 billion in operating profit. The business was the single biggest contributor to the company's operating profit in the second quarter of 2018 (although the company's data center group was close behind, raking in $2.7 billion in operating profit). Given how important the personal computer market is likely to be for the company in the years ahead, shareholders should hope that the company continues to invest appropriately in it.
 
Not sure if Intel has given up on the PC, but my PC has given up on Intel. ;)

Check the sig.

I've got an AMD 2700X en route to replace that long-serving 1090T. The i7-4790 rig? Well, it's doing alright, but it'll probably get upgraded in a year or two by an AMD CPU. (Mobo's gotta get replaced then anyway. Shrug.)

Right now my desktops are 50/50 Intel to AMD. In a week, it'll be 40/60, with AMD in the lead. In a year, it'll be 20/80. FWIW.
 
I do wonder how many households have a somewhat newish PC compared to the early 2000s.

We've got really capable cell phones and tablets. For me those aren't substitutes for a PC, but for the many who just needed email and the web, hell yeah.
And for the large number who don't like phones or tablets, laptops have surely come a long, long way if you need a keyboard and a nicer screen.

I love my desktops and I'll have them forever, but I do really wonder how long they can stay mainstream. I think the time will come when enthusiasts have to buy server parts.
 
I do wonder how many households have a somewhat newish PC compared to the early 2000s.

We've got really capable cell phones and tablets. For me those aren't substitutes for a PC, but for the many who just needed email and the web, hell yeah.
And for the large number who don't like phones or tablets, laptops have surely come a long, long way if you need a keyboard and a nicer screen.

I love my desktops and I'll have them forever, but I do really wonder how long they can stay mainstream. I think the time will come when enthusiasts have to buy server parts.

Other substitutions are creeping in too. You have smart TVs and streaming boxes for video, smart speakers for music and checking the weather, a lot of (app-based) chat services encroaching on email, and so on.




Also, I don't really like the term "PC" anyway, since it just means a big Windows computer. Anything with a chip over 10 W should fit in that bracket IMO, like game consoles, as to me a PC represents a big hunk of computing power for your sole, personal use.
 
PCs aren't all doom and gloom for Intel. From TFA: Client Computing Group revenue was $8.7 billion; the other four groups combined, roughly $8 billion. Corporate leaders would be failing stockholders if they ignored the group that generates about 50% of company revenue, even if its growth rates will be single digits at best. However, they would also be failing stockholders to ignore the growth rates in the non-client-computing areas. All you have to do is look at the financial reports from Microsoft and Amazon to see the revenue potential of supplying hardware for server farms providing cloud features. And even with the massive security issues with IoT gizmos, IoT will continue to be a large growth area for years to come.
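As a quick back-of-the-envelope check on that "about 50%" claim, here's a sketch using the figures quoted above ($8.7B for the Client Computing Group, roughly $8B for the other four groups combined — the ~$8B is the post's approximation, not an exact reported figure):

```python
# Rough share of Intel revenue attributable to the Client Computing Group,
# using the figures quoted in the post (Q2 2018, in billions of USD).
ccg_revenue = 8.7     # Client Computing Group revenue
other_groups = 8.0    # the other four groups combined (approximate)

total = ccg_revenue + other_groups
ccg_share = ccg_revenue / total

print(f"CCG share of total revenue: {ccg_share:.1%}")
```

With those numbers the CCG works out to just over half of total revenue, which is consistent with the "about 50%" figure in the post.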
 
Not sure if Intel has given up on the PC, but my PC has given up on Intel. ;)

Check the sig.

Rig 1 == CPU i7 4790k, GPU GTX970, mobo Asus Z87-Pro, Win10, 32GB Ram, on Dual 24" 1920x1200 Monitors
Rig 2 == CPU FX-8350, GPU GTX670 (4GB), mobo Asus M5A88-M, Win10, 16GB Ram, HTPC on 1080p TV/Projector
Rig 3 == CPU i7 6700k, GPU R9 390 (8GB), mobo Gigabyte Z170 UD5, Win10, 64GB Ram, on 27" 2560x1440
Rig 4 (break glass in case of emergency) == CPU 1090T, GPU HD6870, mobo Gigabyte 890GPA-UD3H, Win10, 8GB Ram, spare 24" 1080p
"(break glass in case of emergency)" Lol! :-)
 
The main thing their executives seem to miss is that the PC market wasn't expanding precisely because Intel wasn't investing in it. Now that AMD is providing competition, we're actually seeing advancement again, and the market is growing again because of it. Intel is entirely the cause of its own woes.
 
"(break glass in case of emergency)" Lol! :-)

I am sad that his "emergency PC" is nicer than the primary gaming machine of most people I know at work. They talk about LAN parties and how they are extreme gamers... and they all play MMORPGs like WoW and Guild Wars.
 
Not sure if Intel has given up on the PC, but my PC has given up on Intel. ;)

Still running Intel on my rigs but they're still solid enough to hold up a while longer. Once their time is past I'm going AMD for sure. Ryzen and TR have proven themselves and then some.
 
Honestly, perhaps the reason they saw no growth is pricing and what they were offering people.

I have quite a few friends and coworkers still on Z68 Sandy Bridge (2600K or 2700K) and X58 Bloomfield (i7 920 or 930), because they say the performance of newer chips is only a little better. Some of them were going to upgrade to the six-core 8700K, feeling it would give better performance, but with DRAM prices and graphics cards/monitors already being as expensive as they are, the CPU was the last thing they were willing to upgrade.

Intel waited too long to increase core counts on the Z series, and by doing so delayed the mainstream, which in turn leads programmers not to bother utilizing something most people don't have. They didn't even care until Ryzen came along; for that matter, AMD didn't care until the Core 2 Quad Q6600 dropped, then they had to. It always seems like whoever is on top at the time stops innovating and just cut/copy/pastes onto a smaller die with a little better frequency.

It's the same with 3D XPoint from Intel. It could revolutionize computing for consumers, turning those volatile RAM sticks non-volatile and eliminating the need to boot up every time you turn on your PC. But where is it? Why are they dragging their heels? Because they don't care about consumers. I've seen the 256 GB sticks floating around in offices for servers and data centers. They're spending more time on how to market the stuff than on just releasing it; as of this moment only certain Xeons are compatible. Five to ten years from now we may get it on the X series, with another five to ten until the Z series finally gets it.

...and they wonder why they have no growth in computing market share? If they released 3D XPoint today with a new chipset and a new processor series, within three years they would have that growth, as everyone would upgrade for how radically game-changing it is. Say they charged $250 for a 128 GB DDR4-slot module and $125 for an 80 GB SODIMM for laptops; they would sell like hotcakes. Granted, the DDR4 market would crash, but hey, talk about an industry that inflates its own prices to print money by cutting back production and creating a dysfunctional market.
 
...and they wonder why they have no growth in computing market share? If they released 3D XPoint today with a new chipset and a new processor series, within three years they would have that growth, as everyone would upgrade for how radically game-changing it is. Say they charged $250 for a 128 GB DDR4-slot module and $125 for an 80 GB SODIMM for laptops; they would sell like hotcakes. Granted, the DDR4 market would crash, but hey, talk about an industry that inflates its own prices to print money by cutting back production and creating a dysfunctional market.

I feel you're too hopeful on this, so I have to pop your balloon.

1. 3DXpoint is too expensive to produce to get it that cheap. Right now, byte for byte, it costs more than DDR4 to manufacture. They're currently taking a loss selling at the prices they are, but they're trying to build the market for it. They're trying to get the manufacturing costs down to below that of DDR4, but that will take time.

2. It's FAR too slow to take the place of DDR4 entirely. It's slower than DDR2. However, it does do great for caching. It works in conjunction with DDR4 main memory and the storage system to cache the boot and OS files so that startup and running programs is far faster. It's GREAT for database handling. It will never take the place of main memory, though. However, even that will take time to happen because it is still more expensive than DDR4.

Give it some time, and it will have a substantial effect on the PC ecosystem. They might even integrate it in certain chipsets to increase performance. (Can you imagine: Z690 with 128GB of embedded 3dXpoint storage cache!) Perhaps they may make it a 4th level cache in the CPU. Who knows? Right now, though, it's nothing more than an expensive gimmick.
 
Datacenters, servers, cloud, etc. are where they see the big bucks. Sooner or later enough will exist for that profit margin to plateau. If AMD is smart and continues to focus on both home and business then Intel will never recover but become as impotent as IBM. Odds are by then Samsung and who knows who else will be competing as well.
 
I feel you're too hopeful on this, so I have to pop your balloon.

1. 3DXpoint is too expensive to produce to get it that cheap. Right now, byte for byte, it costs more than DDR4 to manufacture. They're currently taking a loss selling at the prices they are, but they're trying to build the market for it. They're trying to get the manufacturing costs down to below that of DDR4, but that will take time.

2. It's FAR too slow to take the place of DDR4 entirely. It's slower than DDR2. However, it does do great for caching. It works in conjunction with DDR4 main memory and the storage system to cache the boot and OS files so that startup and running programs is far faster. It's GREAT for database handling. It will never take the place of main memory, though. However, even that will take time to happen because it is still more expensive than DDR4.

Give it some time, and it will have a substantial effect on the PC ecosystem. They might even integrate it in certain chipsets to increase performance. (Can you imagine: Z690 with 128GB of embedded 3dXpoint storage cache!) Perhaps they may make it a 4th level cache in the CPU. Who knows? Right now, though, it's nothing more than an expensive gimmick.

OK, I have done a bit more research. Apparently the new modules are based on a second-gen version of 3D XPoint, so to speak; it's not the same fab process as the original cache drives, and it also seems more advanced than the enterprise SSDs they released.

The ultimate goal of 3D XPoint is to replace:

RAM - because it won't be needed once the non-volatile storage is fast enough and has direct access to the CPU and GPU, rather than being bottlenecked at SATA speeds. It won't replace mass storage any time soon; over something like PCIe it will only be used as a cache for servers, in conjunction with RAM, where datasets exceed capacity (according to recent news they are making 1 TB modules and scaling to 8 TB). For the PC end user, though, this will replace RAM.

OS/small-dataset storage - will be stored on the 3D XPoint modules loaded in DDR DIMMs.

The positives: an always-on/off state, low or no power requirements, and replacing RAM and storage in one solution; hence they could charge something like $400.00 for two 256 GB sticks and still profit.

My main point is that we are talking about Intel here. They have more resources than most, but instead of allocating those resources to this and pushing the technology forward faster, they are screwing around, leading to what you stated: it needs time.

No one has mentioned Raja K. Did anyone consider that, instead of creating a GPU for Intel, perhaps he is working on the 3D XPoint side of GPU interaction?
 
I suspect Intel is half-right about all this. Or at least, those who think the PC market is going to radically shrink.

I turned my old i7-2600K rig into a home theater PC. Partly because I felt a need to find a use for the old parts, and partly because my wife's computer, an ancient Core 2-era build, finally gave up the ghost and died permanently. Motherboard toast. So I said hey, I'll swap the Fire Stick for a more capable (and even occasionally gaming-capable) rig, and then she could use that for her email and other bullshit too.

It sees a lot of use as an HTPC, but she never uses it for anything else. All her social media is on the phone or the kindle now. Shopping, browsing, all of it. I wonder how many are like her and rarely - if ever - use a PC outside of work.

Therein lies the future of computing. Games, content creators, and enthusiasts on the one hand, mass-produced shitboxes for offices on the other... and then, of course, big data. Few others would even bother, and if they did it would only be when their existing computer died. They could be served by the same machines built for offices. Joe Blow has an iPad. Jane Doe has a Galaxy. WTF do they need a PC for?

The market that remains will be well-served by AMD products and, at least for now, Intel products too. But I am starting to wonder if Intel is prepared to slowly cede whole swaths of the market to AMD because they just don't see the money in it anymore, outside big data. The CPU market will be high-end enthusiast shit in a very niche market, low-margin low-end shit for offices, and high margin products for big data. Even AMD seems to know this. Zen's weaknesses on the desktop are, in some ways, a result of the compromises they made to rapidly increase core scaling on the data side.
 
I feel you're too hopeful on this, so I have to pop your balloon.

1. 3DXpoint is too expensive to produce to get it that cheap. Right now, byte for byte, it costs more than DDR4 to manufacture. They're currently taking a loss selling at the prices they are, but they're trying to build the market for it. They're trying to get the manufacturing costs down to below that of DDR4, but that will take time.

2. It's FAR too slow to take the place of DDR4 entirely. It's slower than DDR2. However, it does do great for caching. It works in conjunction with DDR4 main memory and the storage system to cache the boot and OS files so that startup and running programs is far faster. It's GREAT for database handling. It will never take the place of main memory, though. However, even that will take time to happen because it is still more expensive than DDR4.

Give it some time, and it will have a substantial effect on the PC ecosystem. They might even integrate it in certain chipsets to increase performance. (Can you imagine: Z690 with 128GB of embedded 3dXpoint storage cache!) Perhaps they may make it a 4th level cache in the CPU. Who knows? Right now, though, it's nothing more than an expensive gimmick.

https://www.anandtech.com/show/12828/intel-launches-optane-dimms-up-to-512gb-apache-pass-is-here

...and for starters, we have zero data on the DIMMs: zero benchmarks. Any conclusion you are drawing from Optane cache modules or the SSDs is flat-out wrong, as the interface is the bottleneck. So any figures you are throwing out are conjecture at best.

The ultimate goal of 3D XPoint is to eliminate RAM completely; speed and latency are two separate things.

Either way, you need to rethink the numbers: why we use RAM and the implications of getting rid of it. Everyday mainstream users don't have the same needs as professional or workstation users. In fact, with storage, most of the sales-pitch numbers translate to little real-world performance gain, because applications can't take advantage of the speed. RAM, on the other hand, does make a difference, but only up to a point; even DDR3 speeds would be good enough, as the real-world difference from DDR2 to DDR3 to DDR4 became less and less noteworthy.

So what's holding this up from setting the market on fire?
Simple: read the articles. It's BS with proprietary controllers, the same BS Intel tried with RDRAM years and years ago. Was it superior? Sure, but the market wanted nothing to do with Intel's shenanigans. Either way, Micron wants to separate the operation, because they have fabs sitting idle while Intel twiddles its thumbs on how to extract the most money from this rather than properly getting the tech out to the mainstream. Intel is touting how much cheaper these DIMMs are compared to ECC DDR4 server RAM...
 