Intel Demos DG1 GPU at CES

Kardonxt

I'm sure it will be a while before they release anything competitive, but I'm excited to see something actually running games. It may not just be Larrabee 2.0 after all

There were almost no details provided on the DG1, but Intel did showcase a live demo of Destiny 2 running on the GPU. The DG1 is based on the Xe architecture, the same graphics architecture that will power Intel’s integrated graphics on the upcoming 10nm Tiger Lake chips that it also previewed at its CES keynote.

https://www.theverge.com/2020/1/6/21054244/intel-dg1-discrete-gpu-first-announcement-ces-2020
 
That looks like a cool way to make a computer. I wonder how hot it will get.
 
i740 was a discrete card. Intel is pretending it didn't exist, evidently...either that or this card has the same level of performance (probably the latter)

Technically the i740 was discrete, but the industry was different back then. I remember having a few i740 cards for sale alongside 3dfx Voodoo and the first Nvidia boards while working at CompUSA. The term "discrete" wasn't used then; it was just onboard graphics versus a graphics card. There was zero integration with the CPU at all, and real gaming machines used motherboards that didn't have any onboard graphics. It wasn't until Intel started heavily pushing their GMA crap, first into motherboards and later into CPUs, that "discrete" started coming up to distinguish between a PC that was a gaming machine and a PC that was just an office/internet machine.

Thus, in this era, it technically is Intel's "first" discrete graphics card. It will probably still suck for the first gen or two, if they don't drop it and shovel it back into their CPUs like before.
 
Jen-Hsun Huang said that for something to be called a GPU, it had to have hardware transforms and lighting, which the GeForce 256 was the first card to provide (other than SGI/workstation cards).
So it all depends on how much you want to follow his guidance on that terminology :p

PS: if not, would the iSBX 275 count?
 
Anyone not impressed with the graphics?

They looked like low-settings and/or not much better than integrated/APU.

Maybe just a bad demo, I don't know.
 

I played on the laptop earlier tonight with the DG1 graphics. Destiny 2 was running at 1080p on low settings, and I'm not sure I'd say it was perfectly playable - the responsiveness felt fairly soft to me. They stressed that it's still an early version that should improve before actual release (FPS counters weren't allowed).
 
I see. Not sure why they would show that, it honestly looked pretty bad even compared to APU performance.
 
1080 low, and it still sucks at that? What is even the point of having it at that performance?
If it's faster than an APU and draws the same amount of power, then it has a lot of value. Not everyone wants a 10 lb gaming laptop with a 2080. Some people want fast, efficient, and light with good to excellent battery life. For a product like this, performance per watt is more relevant than raw performance.
There's a lot of room to play in the low-power segment between APUs and thirsty top-end parts.
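The trade-off above can be sketched as a quick perf-per-watt comparison. Every number below is a made-up placeholder for illustration, not a measured DG1 or APU figure:

```python
# Toy comparison: rank GPU options by performance per watt, not raw FPS.
# All FPS and wattage values here are invented placeholders.

def perf_per_watt(fps: float, watts: float) -> float:
    """Frames per second delivered per watt of GPU power draw."""
    return fps / watts

candidates = {
    "hypothetical APU":                  (30.0, 15.0),
    "hypothetical DG1-class dGPU":       (45.0, 25.0),
    "hypothetical high-end mobile dGPU": (120.0, 150.0),
}

# Sort best-first by efficiency: a slower part can still "win" this ranking.
for name, (fps, watts) in sorted(candidates.items(),
                                 key=lambda kv: -perf_per_watt(*kv[1])):
    print(f"{name}: {perf_per_watt(fps, watts):.2f} FPS/W")
```

With these placeholder numbers the big dGPU triples the frame rate but comes last in FPS per watt, which is the point being made about the low-power segment.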
 
I see what you are saying, and I agree, except I believe there are integrated GPUs that can do just as well.


The point of Intel doing this is that they won't have to do R&D around Nvidia or AMD. And yeah, right this second there are better options for graphics. Intel's hope is that their product competes at some level, but it's also their own product, which they don't have to share with anyone else.

At this point, they seem very focused on anchoring the design around mobile and then scaling it up from there. Which feels like it could be a real long time before they have a truly high-powered graphics card. However, they might have mobile graphics competitive with AMD in a couple of years. I mean, their recent collab with AMD feels like Intel is very interested in offering larger packages for mobile with expanded graphics capability - graphics which Intel owns, has more control over, and doesn't have to share any money on. I just spoke in circles, I know.
 
Yeah, but you have to come out of the gate swinging. This was not a good show.
At least you have to come out with a viable product, and this doesn't seem like that. If it performs the same as an APU, then it is much simpler to use an APU. To be worth having, a discrete GPU needs to wipe the floor with APUs.
 
Right, that is all I'm saying.

Obviously I wasn't expecting a 2080 Ti competitor (even AMD is struggling with that), but at least something noticeably better than current APUs from AMD, or even Intel's own integrated GPUs, which this doesn't seem to be.
 
Maybe something was wrong and it was using the onboard graphics. I'm sure it's confusing to switch from Intel graphics to Intel graphics.
 

Could be drivers too. I think we're all aware of just how crap Intel's graphics drivers can be. Either way, it's not a great showing. There is potential there if they can get performance up before launch, but I don't see this weak showing generating much excitement.
 

Absolutely, Intel has always had poor graphics drivers. I hope that's all it is.
 
At least you have to come out with a viable product, and this doesn't seem like that. If it performs the same as an APU, then it is much simpler to use an APU. To be worth having, a discrete GPU needs to wipe the floor with APUs.
It just means getting into the discrete GPU business is a lot harder than Intel thought. I do agree that if it performs as well as an APU, you might as well get an APU for gaming, but DG1 could still be viable if it performs well as a workstation or HPC card and is priced correctly. That being said, I do believe Intel will eventually get it right in the GPU market.
 
It will likely take a while before Intel even gets close to low-end cards. They are lacking a lot of the patents needed to make a functional high-end card, and it will be costly to come up with ways around that.
 
The patents are a non-issue for them. They used to license those patents from Nvidia for their graphics division. Nvidia was insanely expensive to license from, so... they licensed from AMD instead for a lot less. Intel can do pretty much anything they have a license for from AMD, and they can do it legally, so long as they maintain the license.

I'm fairly certain the Intel/AMD graphics licensing really came into the spotlight when they developed the NUC with integrated Radeon graphics together. I recall reading an article about all of it around that point in time.

I think it's more a matter of hardware/software maturity and building a "scalable" graphics architecture from scratch. I suspect they will release something viable, but it won't really be until the 2nd- or 3rd-gen parts launch. In the meantime, though, the tech will work wonders for their integrated graphics.
 
After doing some reading, the relatively low (suspected) performance of DG1 makes sense. It is a low-end 96-EU part (that can potentially be teamed with Tiger Lake's GPU to increase performance, but that may not be working yet). Xe HP is the big GPU.
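For rough context, a back-of-the-envelope peak-FLOPS estimate for a 96-EU part can be sketched like this. The 16 FLOPs per EU per clock assumes Xe-LP-style EUs (two 4-wide FP32 ALUs with FMA, so 8 lanes × 2 FLOPs), and the clock speeds are placeholders, not confirmed DG1 specs:

```python
# Rough theoretical peak FP32 throughput from EU count and clock.
# Assumption: 16 FP32 FLOPs per EU per clock (8 FMA lanes * 2 FLOPs),
# which matches Xe-LP-style EUs. Clocks below are placeholders.

def peak_fp32_tflops(eus: int, clock_ghz: float,
                     flops_per_eu_clock: int = 16) -> float:
    """Theoretical peak TFLOPS = EUs * FLOPs/EU/clock * clock (GHz) / 1000."""
    return eus * flops_per_eu_clock * clock_ghz / 1000.0

print(peak_fp32_tflops(96, 1.5))   # 96-EU DG1-class part at a guessed 1.5 GHz
print(peak_fp32_tflops(512, 1.5))  # a hypothetical scaled-up Xe HP config
```

At a guessed 1.5 GHz that works out to roughly 2.3 TFLOPS for 96 EUs, which would indeed put it in entry-level territory rather than anywhere near the big cards.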
 
Intel also offers this card, which is aimed at accelerating graphics workloads, just in an unusual way. The difference between that and DG1 in terms of functionality is that the DG1 has a direct video output. Both cards leverage existing chips and pretty much just disable the x86 side of things, leaving the GPU compute and transcoders. Handy for cloud desktops or something like Google Stadia, but nothing I can imagine consumers getting excited about.
 
After doing some reading, the relatively low (suspected) performance of DG1 makes sense. It is a low end 96 EU unit (that can potentially be teamed with Tiger's GPU to increase performance, but that may not be working yet). Xe HP is the big GPU.

bolded part, if true, tells me everything I need to know: it'll suck real bad
 
They should know that by now; this is their third or fourth try?
I believe it's the 4th attempt: i740 to i752 to Larrabee to DG1. I do feel like Intel is all-in this time rather than one-and-done, but I could be entirely wrong on this premise.
 
Everyone is forgetting how much faster the DG1 is vs the i740. They should have done a Knee-Deep in the Dead side-by-side at CES.
Shhh! We're trying to pretend this is our first try here.
 
The patents are a non-issue for them. They used to license those patents from Nvidia for their graphics division. Nvidia was insanely expensive to license from, so... they licensed from AMD instead for a lot less. Intel can do pretty much anything they have a license for from AMD, and they can do it legally, so long as they maintain the license.

That is completely wrong. There is nothing stating that Intel licensed AMD GPU patents (though some might be covered in the existing cross-license between the companies), nor would they need to.

As part of its settlement with Nvidia, Intel paid Nvidia $1.5 billion for a perpetual license to all their patents filed up until March 2017. Note the word "perpetual": they have a forever license to all Nvidia patents up until that date.

That is a massive base of IP to build a GPU from, on top of Intel's own GPU patents.
 
Done deal

We can now confirm the rumours that Intel has given up on Nvidia because it has written a cheque to license AMD's graphics.

It looks like veteran GPU editor Kyle Bennett was right when he first reported the rumor, however wild it sounded. We didn't contemplate it but wrote about it several times. Intel needs a GPU licence, and the Nvidia–Intel licensing agreement ended on March 17, 2017, so Intel doesn't have a licence. It is more likely that Intel has a licence from AMD, but neither company has officially announced it.

https://www.fudzilla.com/news/graphics/43663-intel-is-licensing-amd-graphics
 

Fudzilla being morons is not evidence, and posting in giant fonts doesn't make them correct.

They didn't understand what the end of the agreement meant.

It meant the end of payments, and it meant the end of new patents being added to the agreement.

Every patent in the agreement is for perpetual use.

Intel Puts the Kibosh on Reports It Will License AMD GPU Technology

Mark Hibben of Seeking Alpha has correctly pointed out that the actual patent license that Intel struck with Nvidia is perpetual. Intel hasn't lost access to any NV patents as a result of completing its license payments, and therefore doesn't need to sign an agreement with AMD to replace them. We regret not catching that properly the first time around, and possibly giving more life to a rumor in the process. The idea that Intel needed a deal with AMD or Nvidia is false.
Direct denial from Intel:
Intel Refutes Rumor of Licensing AMD Graphics Technology

In a statement sent to Tech Trader Daily by an Intel spokesperson, the company said "The recent rumors that Intel has licensed AMD's graphics technology are untrue."
A rumor had spread on Tuesday that AMD had won a deal to license its graphics technology to Intel, as related by Fuad Abazovic of Fudzilla, who wrote that an AMD-Intel deal was “confirmed,” without citing sources.
 