Intel Plans to Launch Its Discrete GPU Lineup Starting at $200

This is going to hurt both NVIDIA and AMD, but I think AMD will feel it a lot more since the low/mid range is all they have going for them. If Intel pulls their collective heads out of their asses and prices these $100+ below the competition for midrange/high-end GPUs in the future, they could turn the market on its head. They obviously want to chase NVIDIA right now, and the thin-and-light laptop/tablet market with an Intel CPU + low-power GPU could really hurt NVIDIA, especially if it has low/midrange performance (e.g., 1050 Ti).

Jensen shouldn't have dumped data-center GPUs on the public the way he did with Turing. It just created a lot of salty customers who will ride the Intel bandwagon when given a chance. Hopefully NVIDIA brings prices back down to realistic levels, but I somehow doubt it. AMD is desperate for margins, so they'll follow NVIDIA's lead, and people will hate them for it too, including their fanboys who like GPUs on the cheap. Intel has an opportunity to capitalize on this by tossing in solid GPUs at cheap prices.

In the future I can see Intel powering cloud gaming setups (CPU/GPU/chipset), and they could potentially have all Azure/Google/AWS systems under their control and lock NVIDIA/AMD out. If I were Jensen, I'd be shitting my pants every night trying to figure out a strategy to counter them. Even AMD will be in trouble once Intel begins rolling out its new architectures on 10 nm.
 


I'd love to see more competition in the GPU market, but I don't expect the pricing to last all that long. No matter how they try to sell it, they're going low to get their foot in the door, and then they'll expect status quo after that.
 
Update to the story: apparently Raja didn't say Intel GPUs will start at $200.

“Not everybody will buy a $500-$600 card, but there are enough people buying those too – so that’s a great market.

So the strategy we’re taking is we’re not really worried about the performance range, the cost range and all because eventually our architecture as I’ve publicly said, has to hit from mainstream, which starts even around $100, all the way to Data Center-class graphics with HBM memories and all, which will be expensive.

We have to hit everything; it’s just a matter of where do you start? The First one? The Second one? The Third one? And the strategy that we have within a period of roughly – let’s call it 2-3 years – to have the full stack." - Raja Koduri

https://www.tomshardware.com/news/intel-discrete-gpu-xe-debut-200,40083.html
 
I think Intel needs to appoint a dedicated PR manager to prep Koduri for interviews. He seems to be either dropping GPUs literally (I could not help myself) or giving too many sound bites for out-of-context news headlines.
 
The only problem I have with Intel IGPs, even up to my $500 9900K @ 5.2 GHz, is that the damn things can still only output 4K at 30 Hz.

I understand why - because it's just been kicked down the road from Skylake - but I don't have to like it.
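For the curious, some back-of-the-envelope math shows where that 30 Hz ceiling comes from: the HDMI 1.4-era TMDS link those IGPs carry over tops out around a 340 MHz pixel clock, and 4K60 needs well over that. A rough sketch (the ~12% blanking overhead is an assumed reduced-blanking figure, not an exact CVT timing):

```python
# Rough pixel-clock check: why an HDMI 1.4-era output tops out at 4K@30Hz.
# Numbers are approximate; real modes use exact CVT-RB timings.

HDMI14_MAX_PIXEL_CLOCK_MHZ = 340  # HDMI 1.4 TMDS character-rate limit

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.12):
    """Approximate pixel clock for a mode, assuming ~12% reduced-blanking overhead."""
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

for hz in (30, 60):
    clk = pixel_clock_mhz(3840, 2160, hz)
    verdict = "fits" if clk <= HDMI14_MAX_PIXEL_CLOCK_MHZ else "exceeds"
    print(f"4K@{hz}Hz needs ~{clk:.0f} MHz pixel clock -> "
          f"{verdict} HDMI 1.4's {HDMI14_MAX_PIXEL_CLOCK_MHZ} MHz limit")
```

4K30 lands around 280 MHz (fits); 4K60 needs roughly 560 MHz (doesn't), which is also why the same silicon can manage 4K60 over DisplayPort, where the link budget is higher.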
 
Could be. If you're claiming an uncommon issue and can't reference what you've done to alleviate said issue, then start a thread and go through it.

We're talking about future GPUs too, of course, which is why your post is a bit off-topic. I have five generations of Intel IGPs that work great running in house right now and more at work.

and yet scads of people are having the same black screen issue.

also before the gpus come out inb4 intel black screen threads.
 
So the question is 'lately', asked for the purpose of discussing a future product, and you're referencing a fixed bug in a niche game?

You were the one saying they had stellar drivers in the past, so I just provided a counterpoint, as it directly relates to their previous performance. "Infrequent" is usually the word people put in front of "Intel GPU driver updates," and if 6 million players is niche now, then so are games like Witcher 3 and Skyrim.
 
You were the one saying they had stellar drivers in the past

I didn't say that ;)

and yet scads of people are having the same black screen issue.

What percentage of 80% of the GPU market is 'scads'? And what evidence do you have that this problem affects current Intel IGPs, and what support do you have for it to continue to be a problem?
 
I didn't say that ;)


Someone sarcastically mocks Intel's drivers. You ask if that person has used those said drivers recently. A few examples are posted indicating various issues. You dismiss them as irrelevant, uncommon, in the past, then talk about how great they are at doing things which are presumably not games.

The slow updates go back to at least the GMA days, so complaints aren't exactly new. Since you claim to be a "long suffering AMD fan," why would you discount a track record stretching back that far because "future GPUs" will be different?
 
Someone sarcastically mocks Intel's drivers. You ask if that person has used those said drivers recently. A few examples are posted indicating various issues. You dismiss them as irrelevant, uncommon, in the past, then talk about how great they are at doing things which are presumably not games.

My response was semi-sarcastic too; I'm quite aware of Intel IGP driver issues in the past and have worked through many. I've also gamed on Intel IGPs for nearly a decade.

The reason I dismiss older issues on older hardware is not that they aren't issues, but that they're not really relevant to the current discussion.

The slow updates go back to at least the GMA days, so complaints aren't exactly new. Since you claim to be a "long suffering AMD fan," why would you discount a track record stretching back that far because "future GPUs" will be different?

They've sped up the updates. And while AMD isn't wholly relevant, I've repeatedly stated that AMD has gotten quite a bit better with their drivers.

Intel's discrete GPU push is a different situation than AMD's: AMD has an established lineup. Intel is working to get there, and it should be highlighted that they've been visibly working pretty hard. They're trying to go somewhere different, so while their past performance does somewhat inform opinions, it doesn't apply as much as AMD's past does to AMD's future. AMD shows no signs of trying to move 'up market,' but rather of just preventing themselves from becoming irrelevant.
 
Intel's discrete GPU will also contain a hidden BMC that can be exploited at any moment by the highest bidder and utilized to collect info on how many times you teabag your opponents in online FPS shooters.

You will be referred to the Global Ministry of Acceptable Behavior, this is your warning.
 
Intel's discrete GPU will also contain a hidden BMC that can be exploited at any moment by the highest bidder and utilized to collect info on how many times you teabag your opponents in online FPS shooters.

You will be referred to the Global Ministry of Acceptable Behavior, this is your warning.

Just like the one in every AMD, ARM, MIPS, and whatever else...

How's the RISC V GPU doing?

;)
 
You'd be surprised...

I'm far less worried about legacy support than I am about AAA support. With only IGPs in the wild, few developers will have optimized for an architecture that was never going to run their games (or other software).

On the other hand, Intel has come a long way in terms of drivers on Windows, and they're a lead developer on Linux.

My expectation is that they'll need a lot of TLC after release, but stuff by and large will run at release.

I hope so. I understand that 'AAA' support has to be a priority, but if Intel put some serious money/effort into supporting popular old games/Linux gaming I'd be a huge fanboy of their new GPUs.

Given the large number of titles that can be played on virtually any AMD/Nvidia GPU, this has to be considered. I have an old quad-core AM1 chip for retrogaming, and even with that piddly APU it's great for old games and emulators. My i5 laptop can play a few major titles, but it's pretty limited.
 
Intel has good feature support FYI:
[attached screenshot: Intel API/feature-support table]
 
So he said nothing, other than that some day Intel will have a full product stack. Well, I guess it's something...
 
The GMA support in the past wasn't entirely Intel's fault. They did update the drivers, and in some cases NONE of the updated drivers ever made it into the motherboard 4-in-1 driver packages.

I remember having to deal with HP's shitstorm of driver support back in the day, when they were shipping a pre-beta version of Intel's GMA driver that would barely even work with Windows. Had to do some trickery to allow a driver that wasn't part of the 4-in-1s to be installed.
 
The cynic in me, given Raja's history at AMD, translates the headline to: "We won't be able to compete on performance, so we'll try to compete on price." Though knowing Intel and their passion for high margins, I imagine their $200 part will compete performance-wise with AMD's and Nvidia's $130 GPUs.

At 720p, you'd actually be surprised at what you can achieve with a GT 710... I was pleasantly surprised...

You'd be surprised at what you can achieve with HD 630 at 720p! ;) More seriously, Intel is going to have to do more than compete at the lowest end with a discrete card that is barely better than their integrated graphics if they want any success. Intel has a track record of throwing lots of money at a new market segment, then backing off significantly when they don't meet initial success. Some of the projects linger on for a bit, but they eventually get disowned.
 
Update to the story: apparently Raja didn't say Intel GPUs will start at $200.
TPU updated the article used in the OP, as well. I added the clarification to the first post.
 
Here is an interesting tidbit about multi GPU support for Intel's new cards.

https://www.phoronix.com/scan.php?page=news_item&px=Intel-Linux-Preparing-Multi-GPU
That would definitely sell more cards. Initially it might be just hybrid, allowing the discrete card to work with the iGPU. After that, 2-3 card setups would be common on this forum. ;)

Three things I hope to see besides a low price:
1. 4K at 60+ FPS in most games.
2. Quick Sync.
3. Two M.2 slots for low-latency RAID 0 on the video card.
 
Two M.2 slots for low-latency RAID 0 on the video card.

Why would you pair slow flash with a GPU?
 
I don't expect much from their first go-around. For the future, who knows? Raja might be able to pull something out of his ass with the huge R&D budget that Intel will throw at it, unlike AMD.
 
A sub-$300 card would be what I'm interested in. Anything to get those damn RTX prices down. An RTX 2060 Super is too damn expensive for a 1080p card.
 
I still remember the old Intel onboard graphics in my laptop... they wouldn't update support for the newer version of OpenGL on Windows, even though the hardware supported it. Their official response was that it just wasn't physically possible with the hardware. Funny, as their Linux driver had support! These types of things don't give me much confidence in tech support from companies (not just Intel).
 
Erm...

Why would you want to gimp the card?
In addition to whatever GB of RAM it has... it wouldn't gimp it.

Check out the AMD SSG:
16 GB of RAM and dual M.2 slots for adding Optane SSDs.
 
I still remember the old Intel onboard graphics in my laptop... they wouldn't update support for the newer version of OpenGL on Windows, even though the hardware supported it.

When it comes to laptops and drivers... just know you're going to get fucked. It's extremely rare for an OEM not to have messed with things to the point that you can't just get drivers from the processor vendor (AMD / Intel / Nvidia).

I had to do a reinstall on my ultrabook to get the latest Intel drivers- nothing from Intel would install with the OEM load. Surprisingly, performance seems to have jumped in the few games I play that it can handle.
 
https://www.amd.com/en/products/professional-graphics/radeon-pro-ssg
Editing 8k, 360° stitching, CAD/visualization, oil and gas mapping (huge huge speed increase there) etc.

More for pro stuff. But I'm very curious whether it comes to, and benefits, games in the future.

Really, really doubtful that it would benefit games in the near future, but there are plenty of possibilities here going forward.

As to consumers, if one can slap on an Optane (or similar SSD designed to be abused) for the purpose of providing local resources for content creation workloads- or even for game streaming!- I can definitely see a market here.
 
Intel is just not going to be competitive on the high end, period. AMD was once competitive, and look where they are now. I don't get all the people who think Intel will magically dethrone NVIDIA, who are a decade ahead of them in R&D. It's nice to have more options, but that's all it's going to be for years.
 