Intel Demos DG1 GPU at CES

Fudzilla being morons is not evidence. Posting in giant-sized fonts doesn't make them correct.

They didn't understand what the end of the agreement meant.

It meant the end of payments, and it meant the end of new patents being added to the agreement.

Every patent in the agreement is for perpetual use.
I posted in giant sized fonts to draw attention to where the rumor came from in the first place (Kyle).

So, is Kyle Bennett a moron by your logic?

I guess I will just wait for Kyle to chime in on this one. He has a habit of being correct when he reports rumors. His rumors are typically well founded and supported by all sorts of sources.

Aside from the above, you are simply stating that anything Intel designed under the Nvidia license until 2017 is still covered by those patents. So anything developed from March 2017 to 2020, after leaving the Nvidia portfolio behind, would not be covered, unless it's a totally new rendering technique that no one else in the market has...
 
If anything, Kyle will just close the thread, as you've completely taken it off topic.
 

Intel Puts the Kibosh on Reports It Will License AMD GPU Technology

Mark Hibben of SeekingAlpha has correctly pointed out that the actual patent license that Intel struck with Nvidia is perpetual. Intel hasn't lost access to any NV patents as a result of completing its license payments, and therefore doesn't need to sign an agreement with AMD to replace them. We regret not catching that properly the first time around, and possibly giving more life to a rumor in the process. The idea that Intel needed a deal with AMD or Nvidia is false.
Direct denial from Intel:
Intel Refutes Rumor of Licensing AMD Graphics Technology

In a statement sent to Tech Trader Daily by an Intel spokesperson, the company said "The recent rumors that Intel has licensed AMD's graphics technology are untrue."
A rumor had spread on Tuesday that AMD had won a deal to license its graphics technology to Intel, as related by Fuad Abazovic of Fudzilla, who wrote that an AMD-Intel deal was “confirmed,” without citing sources.
Given additional research, it would appear that Intel doesn't require patents from AMD or Nvidia to make GPUs, and that the Nvidia deal was for something else entirely. The AMD cross-license was only for a single GPU:

https://www.fudzilla.com/news/graphics/46065-nvidia-intel-licensing-was-never-licensing
 

You need to stop using Fudzilla as your main source of "research". He goes off on more wrong-headed tangents after being wrong the first time.

Intel does have a perpetual license to all those patents. They are very handy now that Intel is entering the GPU business, and they are the biggest reason Intel doesn't need to license any more patents.

Heck, Intel probably has access to a lot of Nvidia ray tracing patents as well.
 
Technically the i740 was discrete, but the industry was different back then. I remember having a few i740 cards for sale along with 3dfx Voodoo and the first Nvidia boards while working at CompUSA. The term "discrete" wasn't used then; it was just onboard graphics versus a graphics card. There was zero integration with the CPU at all, and real gaming machines used motherboards that didn't have any onboard graphics. It wasn't until Intel started heavily onboarding their GMA crap, first into motherboards and later into CPUs, that "discrete" started coming up to distinguish between a PC that was a gaming machine and a PC that was just an office internet machine.

Thus, in this era, it technically is Intel's "first" discrete graphics card. It will probably still suck for the first gen or two if they don't drop it and shovel it back into their CPUs like before.
Hah, I worked at CompUSA back in the day (well, Computer City, which got bought by/became CompUSA). I still have my old 3dfx Voodoo card. I think I had it paired with a Matrox GPU of some sort. Glad the days are gone where you had to have both your standard desktop graphics card and a separate "gaming" 3D upgrade card.
As far as the DG1 goes, I am still trying to figure out what their target audience is. Do they just want to be a third player behind AMD and Nvidia and try to use brand name to get sales? Maybe they can challenge at the top end, but I have my doubts. So target low- to mid-range performance and win on price?
I would rather Intel either focus on improvements to the built-in graphics on their processors (or create a new graphics co-processor), or put work into external GPU technology.
 
Back to DG1 specifically. It does seem like it will be disappointingly slow for a discrete part. They only seem to be aiming for double the performance of Intel's IGP.

But I guess you have to walk before you can run. It will probably be a cheap add-on in Intel-based laptops, and seen almost nowhere else. I can't see the card version selling to the public in any volume.
 
Unless it's paired with Intel systems in the OEM market that don't include a GPU. We might see it a lot more than expected in cases like that.
 

If the final hardware is in the same performance bracket as a GT 1030, I'd snap one up for my home server in a heartbeat. It's bound to have top shelf Linux support and drop-in QuickSync, which would make my DVD/Blu-ray transcodes a lot faster. What I'm anxious to hear about is the compute performance...
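
For anyone curious what that QuickSync path looks like in practice, here's a rough sketch of the kind of transcode job I'd run on a box like that, driving ffmpeg's QSV encoder from Python. It assumes an ffmpeg build with QuickSync (QSV) support and an Intel GPU visible to the OS; the file names and quality value are placeholders, nothing to do with DG1 itself.

import subprocess
from pathlib import Path

def transcode_qsv(src: Path, dst: Path, quality: int = 23) -> None:
    """Transcode a ripped disc to HEVC using Intel QuickSync via ffmpeg."""
    cmd = [
        "ffmpeg",
        "-hwaccel", "qsv",                 # decode on the GPU where possible
        "-i", str(src),
        "-c:v", "hevc_qsv",                # QuickSync HEVC encoder
        "-global_quality", str(quality),   # ICQ-style quality target
        "-c:a", "copy",                    # pass the audio through untouched
        str(dst),
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    transcode_qsv(Path("movie_rip.mkv"), Path("movie_rip_hevc.mkv"))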
 
I don't think this is all doom and gloom if we look at it as an early dev tool and a stopgap to keep Intel laptops on par with APU laptops for entry-level gamers (even if it takes an extra chip to do so).

We have no idea how much of that PCB is empty space. They may just be getting some real-world testing under their belts until something like 7nm lets them move it onto the CPU.

They certainly have a long way to go, but they need to start someplace. I think getting some real-world experience and feedback from devs beats sitting on their hands.
 

From everything I've read, this low-end part won't be a card product but a laptop product; it's only in card format now to make it easy to send to dev shops.
 
Maybe they'll make it PCIe 4.0 in order to target AMD dev shops.
 
They stressed that it's still an early version that should improve before actual release (FPS counters weren't allowed).

something about the word "stressed" and also "FPS counters verboten"
 
They also say not to look a gift horse in the mouth... :D

LOL ... Intel is no gift horse for me. They don't give non-corporate consumers any "gifts" (that I've ever heard of, anyway). Rather, they overcharge for their X-series CPUs that can't be OC'd properly because they chose to use inferior TIM, thereby DECEIVING customers and expecting them to DELID and lose the warranty. Had they offered people a refund if they chose to do that, it would have been fine and well ... but Intel didn't ;)
 
Laptop part or not, it's just not impressive. AMD can already do better, as the video I posted of Destiny 2 on high settings shows. Hell, even Intel integrated probably isn't far off. I get the idea of courting developers early, but they don't have the goods.
 
Are we sure that Intel didn't just order a "1070 mini" off of Wish.com and use whatever crap card they received for their demo?
 

They're not being aggressive -- they're playing it safe. They know they can build a discrete part that's simply 2x whatever they put in their IGPs. This isn't hard for them.

However, they do have to get discrete right, whatever that means from their planning perspective, and a large part of that is doing what Nvidia does right and AMD is starting to: building an ecosystem with developer support. Intel has come a long way with their IGPs, arguably competing well with AMD's APUs (who would have ever thought?), but now they have to get real developer attention going.
 
DG1 is exactly where I expected it to be. Laptop graphics in a dGPU package.

I'm half expecting this to simply be Tiger Lake but with the x86 cores disabled. The ability to do this has actually been in Intel SoCs since Haswell, as they use it for their VCA cards.

The performance increase would stem from higher clocks permitted by a much higher TDP in a discrete card. Performance per watt will naturally suffer versus the laptop-focused version because of the voltage ramp needed to reach the desired clocks. Nothing exactly new here, but it means this may not be entirely a disaster in a laptop.

What is disastrous is the performance they were showing off. Granted, drivers are immature for a new architecture, but even in that context things look bad, even at the low end. This wouldn't have been so bad had they kept things behind closed doors or under NDA. On the flip side, this does work, unlike the disabled GPU found in the rare-as-hen's-teeth Cannon Lake chips.
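
A back-of-the-envelope way to see the perf-per-watt point: dynamic power scales roughly with V^2 * f, so chasing clocks that need extra voltage costs power faster than it buys performance. The voltage/frequency points below are made up purely for illustration and aren't actual DG1 or Tiger Lake figures.

def relative_power(freq_ghz, volts, base_freq=1.1, base_volts=0.75):
    """Dynamic power relative to the baseline operating point (P ~ V^2 * f)."""
    return (volts / base_volts) ** 2 * (freq_ghz / base_freq)

# Hypothetical operating points: laptop-style baseline vs. a TDP-unconstrained card.
for freq, volts in [(1.1, 0.75), (1.4, 0.85), (1.7, 1.00)]:
    perf = freq / 1.1                      # assume performance tracks clock speed
    power = relative_power(freq, volts)
    print(f"{freq:.1f} GHz @ {volts:.2f} V: {perf:.2f}x perf, "
          f"{power:.2f}x power, {perf / power:.2f}x perf/W")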
 
They're not being aggressive -- they're playing it safe. They know they can build a discrete part that's simply 2x whatever they put in their IGPs. This isn't hard for them.

However, they do have to get discrete right, whatever that means from their planning perspective, and a large part of that is doing what Nvidia does right and AMD is starting to: building an ecosystem with developer support. Intel has come a long way with their IGPs, arguably competing well with AMD's APUs (who would have ever thought?), but now they have to get real developer attention going.

I think you are right. There is a lot that can be said about not playing your hand too early, and about risk appetites. But being Intel, I think they could have easily built a monster and sent it out to devs asking just how much they could wring out of it. I think that would have made a big splash and put some passion, drive, and incentive behind them from developers.
 

Haha, they can't build a monster GPU. One, they don't have years of experience building one, and two, they need an actual driver development team that can focus on getting that performance out of the hardware. Also, when trying something new you start off slow and try to wring all the performance you can out of it before moving on to more complicated hardware. If they continue to spend the resources to get better at it, then they will slowly improve and might one day be an option in discrete graphics, but for now they are learning and playing it safe.
 

Yes they can. Raja has already confirmed Big Xe.

In fact, they have 4 models in the works so far:


https://www.tweaktown.com/news/66756/intel-xe-gpus-teased-4-models-total-up-512-cores/index.html

I'm thinking things aren't going as well as planned though.
 

Raja does many things, and one of them is overhyping what is coming. Raja would be smart to keep his mouth shut and let his work speak for itself. It's a ton harder to build something from nothing than he is letting on, and this first generation from Intel is more of a learning experience for them; they just hope it's good enough that people will buy it.
 
Haha, they can't build a monster GPU.

Intel can build whatever they want -- just like AMD or Nvidia.

Big GPUs are exactly the same as small GPUs, just with more cores, just like CPUs, except GPUs are far less complex.

The bigger question for Intel is, would anyone buy it if it weren't based on tech more proven than their IGPs?

They're going to have the same problem that AMD does in general, and that AMD experienced acutely with Zen: Intel has to build confidence in their GPU products just to get developers to target them. It took time for software to attune to the unique challenges of CCXs, CCDs, and NUMA on the desktop, and it's going to take time for software to target AMD's hardware RT implementation.

That's a price everyone has to pay. By limiting their risk, Intel is also limiting the potential rewards -- so they're paying up front.
 