Intel's supposedly RTX 3070-level GPU, the Xe DG2, is running in its labs right now

I’m salivating at all this competition. PC hardware is entering a new era, breaking out of the stagnation Intel and Nvidia caused in CPUs and GPUs over the past 5-10 years. Feels like the mid-to-late 2000s again.
 
I’m salivating at all this competition. PC hardware is entering a new era, breaking out of the stagnation Intel and Nvidia caused in CPUs and GPUs over the past 5-10 years. Feels like the mid-to-late 2000s again.

One more big player would be nice... one that doesn't already own some segment of the market.
 
TSMC 6nm? Guess that's better than Intel 10nm; then again, 14++++ is still better than 10nm ;). Also, is this Raja making these claims? I remember why I was super excited when he finally left AMD: he had all these grand claims that got everyone's hopes up, and then the truth came out at release. Hopefully Intel reins him in a bit and this doesn't get stupid. I think 3070 performance in a laptop is kind of extreme considering how much power 3070-class performance takes; 6nm should be a little more efficient, but still, the thermals. I'd love to be proven wrong, but I feel like people are going to be let down or battery life is going to be around 3 minutes ;).
 
TSMC 6nm? Guess that's better than Intel 10nm; then again, 14++++ is still better than 10nm ;). Also, is this Raja making these claims? I remember why I was super excited when he finally left AMD: he had all these grand claims that got everyone's hopes up, and then the truth came out at release. Hopefully Intel reins him in a bit and this doesn't get stupid. I think 3070 performance in a laptop is kind of extreme considering how much power 3070-class performance takes; 6nm should be a little more efficient, but still, the thermals. I'd love to be proven wrong, but I feel like people are going to be let down or battery life is going to be around 3 minutes ;).
Maybe for DTR (desktop replacement) systems.
 
TSMC 6nm? Guess that's better than Intel 10nm; then again, 14++++ is still better than 10nm ;). Also, is this Raja making these claims? I remember why I was super excited when he finally left AMD: he had all these grand claims that got everyone's hopes up, and then the truth came out at release. Hopefully Intel reins him in a bit and this doesn't get stupid. I think 3070 performance in a laptop is kind of extreme considering how much power 3070-class performance takes; 6nm should be a little more efficient, but still, the thermals. I'd love to be proven wrong, but I feel like people are going to be let down or battery life is going to be around 3 minutes ;).
I would agree; it looks like this generation both AMD and Nvidia have opted to ride the hot end of the efficiency curve with large dies. I can't see Intel addressing the same market with a laptop power envelope on a first-gen product.
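To put rough numbers on why a laptop envelope is so hard here, a back-of-envelope sketch: it assumes dynamic power scales roughly as f³ near the top of the voltage/frequency curve (voltage tracks frequency up there), and the 220 W / 1.9 GHz baseline is a made-up desktop-class figure, not a leak.

```python
# Back-of-envelope only: assumes dynamic power ~ V^2 * f, and that voltage
# tracks frequency near the top of the curve, so power scales roughly as f^3.
# The 220 W / 1.9 GHz baseline is an assumed desktop-class figure.
DESKTOP_POWER_W = 220.0
DESKTOP_CLOCK_GHZ = 1.9

for clock_ghz in (1.9, 1.6, 1.3):
    ratio = clock_ghz / DESKTOP_CLOCK_GHZ
    power_w = DESKTOP_POWER_W * ratio ** 3
    print(f"{clock_ghz:.1f} GHz ({ratio:.0%} clock) -> ~{power_w:.0f} W")
```

Point being, backing the clock off ~30% cuts power to roughly a third, which is how you get a desktop-class die into a laptop, at well under desktop performance.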
 
I would agree; it looks like this generation both AMD and Nvidia have opted to ride the hot end of the efficiency curve with large dies. I can't see Intel addressing the same market with a laptop power envelope on a first-gen product.
Technically it's third-gen: Xe would be first, DG1 second (and that's shipping to OEMs now), so DG2 makes it third-gen. Or maybe 2.5-gen?
 
Sure. Unless they stole some designs from nVidia or AMD, how did they make this giant leap in technology? Intel has been hyping their GPUs for decades and they always end up being duds.
 
Sure. Unless they stole some designs from nVidia or AMD, how did they make this giant leap in technology? Intel has been hyping their GPUs for decades and they always end up being duds.
Intel has no problem throwing engineers and money at problems. I believe that would be enough to bring a GPU to reality. And about duds: I'm a huge fan of the Intel Xeon Phi processors, as they have their role in HPC.
 
Holy crap...

Depending on the R&D that went into it, imagine if Intel put this on the market for $349! Nvidia & AMD would shit the bed.

Intel also likes to burn money and find the few markets where there is still some profit. I see datacenter and mobile parts. If a dedicated PCIe GPU materializes in the market, I'm sure it will be priced as high as Intel thinks they can go.
 
Intel also likes to burn money and find the few markets where there is still some profit. I see datacenter and mobile parts. If a dedicated PCIe GPU materializes in the market, I'm sure it will be priced as high as Intel thinks they can go.

That's true. If I were them, though, and the performance is indeed what they say it is, pricing it $100-150 under AMD and Nvidia would sell like hotcakes. Could you imagine greater-than-2080 Ti performance (based on 3070 rumors) for $350?! :eek:
 
That's true. If I were them, though, and the performance is indeed what they say it is, pricing it $100-150 under AMD and Nvidia would sell like hotcakes. Could you imagine greater-than-2080 Ti performance (based on 3070 rumors) for $350?! :eek:
I think such a situation may happen if Intel makes a firm entry into the dedicated GPU market and if AMD is successful this gen. Can you imagine a three-way race between three very capable companies, each actually hitting the best market position they're capable of?
 
I think such a situation may happen if Intel makes a firm entry into the dedicated GPU market and if AMD is successful this gen. Can you imagine a three-way race between three very capable companies, each actually hitting the best market position they're capable of?

It would be like the old Voodoo days... before Nvidia saw that scenario coming and bought them :/
 
Intel + Raja: RTX 3070 performance turns out to be more like RX 470; they missed a few numbers when leaking info :D

Yuup.

Slap it on a PCB, screw a cooler on it, and let's get it into people's desktops.

Come on Intel!

A 3070-class card for the holidays makes for a very merry Christmas.
Intel has literally never been able to do this very thing... and now they're going to hop into the market with a 3070-grade mobile part? :LOL:

:cautious:
 
I’m salivating at all this competition. PC hardware is entering a new era, breaking out of the stagnation Intel and Nvidia caused in CPUs and GPUs over the past 5-10 years. Feels like the mid-to-late 2000s again.
The end of Moore's Law just says no.
 
That's true. If I were them, though, and the performance is indeed what they say it is, pricing it $100-150 under AMD and Nvidia would sell like hotcakes. Could you imagine greater-than-2080 Ti performance (based on 3070 rumors) for $350?! :eek:
Why would they do that? It would be more of a paper launch than the 3080... you don't price yourself so cheap that you can't supply demand. You can only gain market share if you can produce product, and I highly doubt Intel's first chip is going to have the capacity needed to grab large parts of the market. They will price it in line with its performance vs. the competition; they already have name recognition, so there's no need to come in and undercut prices just to sell the small quantities they will have.
 
Good to see people still fall for Raja hype.

But hey! It’s Raja!

 
It was just taped out. "In the labs" means it could still be up to a year out.

3070 perf in a laptop would be sweet if it pans out.
 
Intel figured out there is so much more money in selling GPUs than CPUs. I can't wait to have a third party to compete.
 
They already make GPUs in a CPU. I can't see why they couldn't make a decent PCIe card that competes with a 3070 if the hardware is dedicated to being just a GPU.
We both know that integrated solution is pure rubbish. I was talking about dedicated GPUs.
 
I'm all for better graphics in laptops and SFF units!
I've said it before: whether it's Intel, AMD, or anyone else, if you can provide a bog-standard SOC-type solution (and it doesn't need to be a true SOC, just have that kind of all-in-one feel to the end user) with the power of an Xbox S, it will become the de facto standard laptop/SFF consumer power plant. It would be the Model T of PC hardware for five years straight.

It's why I'm surprised that AMD lowered the performance of the 4000-series integrated graphics relative to their previous generations.

I don't care who makes it, honestly.
 
With AV1 support and QuickSync, this GPU is going in my Plex server. That it has the capability of a 3070 makes me think I'll be gaming on it too.
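For the Plex angle, a minimal sketch of what that kind of transcode could look like, assuming an ffmpeg build with QSV enabled and a driver that actually exposes hardware AV1 decode on this part; the file names and bitrate are placeholders:

```python
import subprocess

# Hypothetical Plex-style transcode: hardware AV1 decode + HEVC encode
# via QuickSync. Assumes ffmpeg is built with QSV support and the driver
# exposes an AV1 decoder; input/output names are placeholders.
cmd = [
    "ffmpeg",
    "-hwaccel", "qsv",       # use QuickSync for hardware decode
    "-c:v", "av1_qsv",       # hardware AV1 decoder (assumption for this GPU)
    "-i", "input_av1.mkv",
    "-c:v", "hevc_qsv",      # hardware HEVC encoder
    "-b:v", "4M",            # placeholder target bitrate for the output
    "output_hevc.mkv",
]
subprocess.run(cmd, check=True)
```

Whether DG2's media block actually handles AV1 this way is still rumor territory, but it's exactly the workload a Plex box would care about.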
 
I've said it before: whether it's Intel, AMD, or anyone else, if you can provide a bog-standard SOC-type solution (and it doesn't need to be a true SOC, just have that kind of all-in-one feel to the end user) with the power of an Xbox S, it will become the de facto standard laptop/SFF consumer power plant. It would be the Model T of PC hardware for five years straight.

It's why I'm surprised that AMD lowered the performance of the 4000-series integrated graphics relative to their previous generations.

I don't care who makes it, honestly.
I’m pretty sure AMD gutted the 4000 series to hit the budget Asian markets; I doubt they planned to make it a large-scale launch. Probably more of a proof of concept they could capitalize on than a proper launch.
I expect big things from the Intel NUC platforms in the coming years though.
 
Sure. Unless they stole some designs from nVidia or AMD, how did they make this giant leap in technology? Intel has been hyping their GPUs for decades and they always end up being duds.

Kyle previously wrote about how Raja was trying to spin off AMD's graphics division with him at the helm of the new company, or heading the GPU unit of whatever other company bought his engineered departure.

After AMD finally spun him off, Raja was at Intel and potentially carrying whatever intel for Intel that he had.
 
Kyle previously wrote about how Raja was trying to spin off AMD's graphics division with him at the helm of the new company, or heading the GPU unit of whatever other company bought his engineered departure.

After AMD finally spun him off, Raja was at Intel and potentially carrying whatever intel for Intel that he had.
Wasn't there information that came out that Raja is trying to make himself CEO of Intel?
 
The only way I would be remotely excited for an Intel dedicated GPU is if they brought power efficiency back into the equation. I don't know why... but internally I don't care that a new NV/AMD card pulls 350 W... but I'd care if an Intel one did.
 
Kyle previously wrote about how Raja was trying to spin off AMD's graphics division with him at the helm of the new company, or heading the GPU unit of whatever other company bought his engineered departure.

After AMD finally spun him off, Raja was at Intel and potentially carrying whatever intel for Intel that he had.
You do remember, they also had our Kyle at one point.
One thing I would like to see at the high end is 2x M.2 slots, like AMD did with the Radeon Pro SSG.
This would also allow them to sell more Optane cards.
 
You don't normally go from one of the slowest discrete laptop GPUs available to outperforming desktop GPUs in a single cycle.
Admittedly, I haven't been following Intel's discrete graphics very closely, but I thought the parent designs were heavy-hitting number crunchers/on-the-fly video encoders for companies like Amazon and Netflix; there's got to be some potential to squeeze gaming performance out of that kind of framework.
 