Save Zero Dollars by Opting for Intel's iGPU-Disabled CPUs

I think I would have liked to see the iGPU silicon removed from the die...perhaps it would have some measurable benefit to attaining higher clocks/lower thermals.



I'm honestly sick and tired of paying crazy prices. I hope AMD comes to the rescue. I really think they will. I would die laughing if I could get 10 - 12 core or more for $350 or $400 ... that would be amazing.

AMD did come to the rescue once before - about 15 years ago, in August 2004, when they worked with HP to demonstrate the first x86 dual core (Opteron) in a ProLiant server. Intel didn't ship a dual core until April 2005.

Remember the Jan 1998 craze of $800+ Pentium II 300MHz chips? And even then they were in short supply due to demand, so some sellers were gouging as high as $2000+ each... and they were still selling.

The reality today is that we can go to any major e-tailer and get an 8 core processor, in either AMD or Intel flavor, for under $600, a 6 core for under $300, and a quad core for under $200 - that is a serious bargain if we consider what the state of the CPU market was like in the distant past.
 
Just because it's not spelled out for you doesn't mean people are not aware. I understand a lot of kids don't even know or understand the multi-chip design and the associated latencies, but "a lot" of people do. If you're asking me to give you names or numbers, I can't - just like you can't give me a lot of "specific" information I might ask for. So if you want to continue playing word games, we can, but that really doesn't sound fun to me.

It is quite premature to declare performance problems in this chip based upon a high level slide and assumptions of what that implies for the internal design.

If the "a lot" of people is mostly folks doing forum speculation, I'd ignore most of that. If the "a lot" has people with CPU design experience and knowledge of how this chip handles the issues, we're much more interested in seeing their analysis.
 
Shame there isn't a way to use iGPUs for a different application if you have a dedicated GPU, I rarely used it in my laptop.
 
Weren't they pushing encoding acceleration or something, for a while? So that the iGPU was useful even on a system that has a discrete GPU installed? I can't remember. I vaguely remember reading about this in some review (probably on here)

I've actually never had an Intel iGPU in any of my desktop systems. I went from the AMD systems I had been using since ~2000 to Bloomfield on X58 in 2009, which lacks an iGPU, and then to X79, which also lacks one. So I have no real experience with Intel iGPUs.

I have certainly never missed having one.

I have to wonder, does not having an iGPU have thermal benefits? Do those chips run at lower temps, allowing them to be OC'ed higher? Or is an iGPU that is not in use effectively power-gated to the point where this is irrelevant?

Quick Sync, I think it was called.
 
I don't think there are any shocks here. Wasn't the 2550k the same price as the 2500k?

I ran one for a while because Microcenter practically gave them away after no one bought any. It was something crazy like $150 for mobo and CPU, IIRC. Hopefully that happens again, lol.
 
You've got six months left to pull this crap Intel. By the time the Ryzen 3000s come out people will be running to buy them and stick it to you at the same time. I have an 8700K but it is my last Intel chip.

Double agree on this. Holding onto my 4770K for six months till Ryzen 3000 series launches. Then jumping to AMD and not looking back. Intel screwed the pooch for so long by stagnating in the market and now they are trying to play catch up to what the competition is doing.
 
Yep. Look how much Intel has changed since Ryzen, vs the years previous. Cores be flyin' out of Intel now, like whoa.
 
I'd want an iGPU as a backup for if a GPU stops working, so that I can still use my OS and browse the internet until I'm able to replace the GPU.
 
Given that I've used an igpu in the past two weeks as a backup (970 that went bad. Using a 1060 now...), I really wouldn't want to give it up on a new CPU.

I mean, it's not great and is no replacement for a dedicated GPU, but it's useful in a pinch.


I might consider an igpu-less processor if it were appreciably faster and/or $30+ cheaper, but that's not reality, so I've got no reason to pick a unit without an igpu.
 
The first two posts addressed half of this issue: the iGPU is perfect for testing suspect cards. It also lets you limp a system along between cards.

The second issue is that they are removing something and not adjusting the price at all - that is just bull.

I'm not sure what Intel is on, but I'll bet its the same thing SixFootDuo is, and damn I kinda want some of it :p.
 
^^^ Exactly. I see no reason to buy one without an iGPU. In the whole scheme of things, I never felt like I was paying for it to begin with. Which I guess means that if F series CPUs are the same price, it is free on K series. Guess it all just depends how you look at it.
 
Given that I've used an igpu in the past two weeks as a backup ...
Oh my, that sent me down memory lane thinking about when I last/ever used an iGPU ... I once even planned for an iGPU :D

Remember the nForce chipset? (I had the MSI K7N420.)
I remember I used the integrated GPU for some time while waiting for the GeForce Ti 4400 to arrive (I had sold my GeForce 256 DDR with my previous computer).

Man, what an amazing and inventive piece of hardware the nForce was ... very thrilling to get a first-gen new chipset:
IGP, 5.1 Dolby (with riser card), dual memory controller, Ethernet, ... and all in one unified driver :)
 
That iGPU is the only damned reason I got this board to boot after a period of weirdness. In fact, I am quite inconvenienced when I need a low-power graphics card to run something - e.g., my server's previous board had no graphics out. There I go complaining about free stuff, though.
 
Is it actually useful?

On my AW17 laptop I have a processor with Quick Sync active. I've used it to encode video for streaming to Twitch via OBS; the quality is pretty good, and I don't think it affected my system performance at all.

It is also quite useful for a media server running Plex. Hardware-accelerated transcoding FTW.
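For anyone curious what that Quick Sync path looks like outside of OBS or Plex, it can be driven directly with ffmpeg - a minimal sketch, assuming an ffmpeg build compiled with QSV support and a supported Intel iGPU (the filenames are placeholders):

```shell
# List the Quick Sync encoders this ffmpeg build exposes
# (no output means the build lacks QSV support)
ffmpeg -hide_banner -encoders | grep qsv

# Transcode a clip on the iGPU, leaving the CPU and any discrete GPU free.
# -global_quality is the QSV-style quality target (lower = better quality).
ffmpeg -hwaccel qsv -c:v h264_qsv -i input.mkv \
       -c:v h264_qsv -global_quality 23 -c:a copy output.mp4
```

Whether the encode actually lands on the iGPU (and how it looks) depends on driver and hardware generation, which is why apps like Plex gate it behind a hardware-acceleration toggle.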
 
I played csgo for a month straight on my 6600k's igpu. No I couldn't keep everything at Max settings and get 200fps but it was certainly adequate.
 
Yeah, but CSGO is hardly a barometer of anything. Welcome to 2004 graphics.

Absolutely anything will run CSGO.

Side note: why is it that competitive games have to have poor graphics? Why can't a game be both competitive and highly immersive at the same time?
 
I don't get Intel these days. Do they have a bunch of 20 something MBA types trying to make a name for themselves?
 
They're also good for things like my SFF cloud server box. Saves me having to use a low profile graphics card.

There are a lot of form factors, laptops, and devices like the Intel NUC that almost couldn't exist without the embedded iGPU.

How many AMD-based SFF devices do you see?
It's either Intel or ARM 90% of the time.


Ever see an AMD based hdmi compute stick?
 