Xe looking EXTREMELY underwhelming

AzixTGO

Weaksauce
Joined
Feb 21, 2016
Messages
106
No, as Director of Enthusiast Engagement.


Wouldn't have done a worse job than the current crew, who have made "Intel" synonymous with "dishonest marketing". Their latest stunt is comparing the 1195G7 @ 28W with the 5800U @ 15W:

https://www.reddit.com/r/Amd/comments/noz2un/according_to_intel_core_i71195g7_is_faster_than/

You can find (older?) versions of this slide set floating around that didn't even mention the wattage (you'd have to look it up in the disclosure statement).
View attachment 361435
Does the wattage matter that much? For people who look at individual processors and benchmarks, maybe. But consumers generally buy a complete product; the details are for the OEM to tackle.
 

philb2

Weaksauce
Joined
May 26, 2021
Messages
88
Taco is surprised that people think it will be affordable. Why would it be affordable when everyone else is charging a premium and people keep buying?
Good points. On the one hand, a "reasonable" price strategy would enable them to penetrate the market as fast as they can produce GPUs. On the other hand, Intel has always maintained a "pricing umbrella" against AMD for CPUs.
 

AzixTGO

Weaksauce
Joined
Feb 21, 2016
Messages
106
Good points. On the one hand, a "reasonable" price strategy would enable them to penetrate the market as fast as they can produce GPUs. On the other hand, Intel has always maintained a "pricing umbrella" against AMD for CPUs.
If the GPUs are lower-performing, it would be a perfect opportunity to undercut and gain massive market share.

But aren't these being made at TSMC? Would their supply be good enough?
 

Jandor

Gawd
Joined
Dec 30, 2018
Messages
593
Intel will have a DG2 card that lands somewhere between the 3070 and 6800, and Raja recently said they are interested in AMD's FSR. So we already have quite a few details.
Fact is, for instance, RDNA2 doesn't have dedicated AI hardware and has poor ray tracing performance, but it's great at standard features. I'm not sure Intel won't be better at RT, and it may have a dedicated AI part.
AMD will probably put the AI features on CDNA2.
 

travm

[H]ard|Gawd
Joined
Feb 26, 2016
Messages
1,027
And maybe also South Korea, if that nutcase up north gets even crazier.

Here is a report about semiconductor manufacturing profits. Notice where the biggest companies are located. https://www.techrepublic.com/articl...74950570821122218&mid=13387000&cid=2391303006
Tubby the Third is less scary. I have complete faith that South Korea can handle anything he could throw at them. The ROC, on the other hand... Not convinced they could stand up to a strong breeze, much less the full weight of China.
 

philb2

Weaksauce
Joined
May 26, 2021
Messages
88
Tubby the Third is less scary. I have complete faith that South Korea can handle anything he could throw at them. The ROC, on the other hand... Not convinced they could stand up to a strong breeze, much less the full weight of China.
You're probably right. If TSMC has any smarts, they will do all future fabs somewhere other than ROC. And maybe try to get visas for some of their engineers to that "somewhere else."
 

travm

[H]ard|Gawd
Joined
Feb 26, 2016
Messages
1,027
You're probably right. If TSMC has any smarts, they will do all future fabs somewhere other than ROC. And maybe try to get visas for some of their engineers to that "somewhere else."
Fabs elsewhere, yes, but don't kid yourself: engineers don't make difficult manufacturing tasks work. It's technologists and technicians who make the world go round. All they need is the secret sauce.
 

Rizen

[H]F Junkie
Joined
Jul 16, 2000
Messages
9,471
Fabs elsewhere, yes, but don't kid yourself: engineers don't make difficult manufacturing tasks work. It's technologists and technicians who make the world go round. All they need is the secret sauce.
Who do you think designs the machines that build the chips?
 

Armenius

Fully [H]
Joined
Jan 28, 2014
Messages
25,743
You're probably right. If TSMC has any smarts, they will do all future fabs somewhere other than ROC. And maybe try to get visas for some of their engineers to that "somewhere else."
TSMC is building a 5nm fab in Arizona. They also own fabs in Washington and Singapore.
 

illli

[H]ard|Gawd
Joined
Oct 26, 2005
Messages
1,384
So in a 'round-about' way, lots of folks have no confidence that Intel (with its billions) can ever produce this card or be competitive with GPUs.
If that is the case, American chip manufacturing has failed... miserably.
Larrabee says hello. I think it was originally positioned as a gaming card by Intel, and all in all, Intel sank a billion or so into trying to make that work.
 

Jandor

Gawd
Joined
Dec 30, 2018
Messages
593
Intel's high-end gaming DG2 GPU has been shown by Raja, and it looks to be more than 400 mm². It's based on TSMC 6nm tech and so compares very well with AMD's Navi 21 (RX 6900 XT, 540 mm² but on 7nm TSMC and including plenty of cache) and Nvidia's GA102 (RTX 3090, 620 mm² but on 8nm Samsung). I wouldn't bet against it beating both: new arch built from scratch, better silicon tech, Intel + Raja.
 

NattyKathy

Gawd
Joined
Jan 20, 2019
Messages
783
It's only underwhelming if the expectation is 6900XT/3080Ti performance. If the expectation is 3070Ti/6800 performance, then 512EU Xe at >2GHz shouldn't disappoint, at least on paper. Performance consistency could be a big hangup, though, if the Xe iGPUs are any indication. I was strongly considering a Tiger Lake laptop last year for the iGPU, but game benchmarks showed some odd results, with 80/96EU Xe sometimes matching a GeForce MX350 and sometimes getting bested by Vega 8. That's questionable for an iGPU but completely intolerable for a dGPU: imagine if Xe-HPG benches at 3070Ti levels in some games but gets beat by a 6700XT in others... it would be a bloodbath in the reviews. Hopefully Intel can fix that before Xe-HPG launches (not holding my breath, though).
 

os2wiz

Gawd
Joined
Nov 20, 2011
Messages
640
"As fast as a 3070" won't mean as much by the time the cards are released.
Most people privy to inside sources are projecting performance midway between the 3070 and 3080, with the highest model in the lineup providing 16GB. It is supposed to be produced at TSMC, so it will probably be on a 6nm or 7nm TSMC process. If these projections pan out, I will buy one if AMD 6800 XT availability does not improve by March 2022 along with a significant price drop (to something like $799 to $899).
 

Wat

Limp Gawd
Joined
Jun 23, 2019
Messages
186
Most people privy to inside sources are projecting performance midway between the 3070 and 3080, with the highest model in the lineup providing 16GB. It is supposed to be produced at TSMC, so it will probably be on a 6nm or 7nm TSMC process. If these projections pan out, I will buy one if AMD 6800 XT availability does not improve by March 2022 along with a significant price drop (to something like $799 to $899).
What if it isn't released for another 2 years? 3 years? I remember the great-sounding predictions around Larrabee, so I'm skeptical.
 

mvmiller12

[H]ard|Gawd
Joined
Aug 7, 2011
Messages
1,077
I seem to remember the i740 being hyped as a premium product before it was released... Also, Intel is notorious for making poor video drivers for their iGPUs.
 

UltraTaco

Limp Gawd
Joined
Feb 21, 2020
Messages
471
I believe this time it will be different. Their CPUs are fantastic, so why should they have issues with GPUs performing on par? This is Intel, not some garage band! I believe in them.
 

pendragon1

Fully [H]
Joined
Oct 7, 2000
Messages
28,134
I believe this time it will be different. Their CPUs are fantastic, so why should they have issues with GPUs performing on par? This is Intel, not some garage band! I believe in them.
"Gaming wasn’t even mentioned as a use case in Larrabee’s initial announcement. However, almost immediately after, Intel started talking about Larrabee’s gaming capabilities, setting expectations sky-high. In 2007, Intel was several times larger than Nvidia and AMD put together. When Intel claimed Larrabee was faster than existing GPUs, it was taken as a given, considering their talent pool and resource budget.

Larrabee gaming expectations were hyped even further when Intel purchased Offset Software, months after buying the Havok physics engine. The studio’s first game, Project Offset, was demoed in 2007 and showcased unprecedented visuals. Unfortunately, nothing came out of the Offset Software purchase. Intel shuttered the studio in 2010, around the time it put Larrabee on hold."
https://www.techspot.com/article/2125-intel-last-graphics-card/
plus the i740
we've seen this before, twice...
 

philb2

Weaksauce
Joined
May 26, 2021
Messages
88
"Gaming wasn’t even mentioned as a use case in Larrabee’s initial announcement. However, almost immediately after, Intel started talking about Larrabee’s gaming capabilities, setting expectations sky-high. In 2007, Intel was several times larger than Nvidia and AMD put together. When Intel claimed Larrabee was faster than existing GPUs, it was taken as a given, considering their talent pool and resource budget.

Larrabee gaming expectations were hyped even further when Intel purchased Offset Software, months after buying the Havok physics engine. The studio’s first game, Project Offset, was demoed in 2007 and showcased unprecedented visuals. Unfortunately, nothing came out of the Offset Software purchase. Intel shuttered the studio in 2010, around the time it put Larrabee on hold."
https://www.techspot.com/article/2125-intel-last-graphics-card/
plus the i740
we've seen this before, twice...
pendragon1 So history will likely repeat itself. I guess I shouldn't hold out any hope that Intel will shake up the market.
 

pendragon1

Fully [H]
Joined
Oct 7, 2000
Messages
28,134
But how did they go from showcasing unprecedented visuals to nothing?
That was the game engine's "unprecedented visuals," and it died because Intel mothballed the whole thing. It (Larrabee) wasn't good enough, and they had wasted enough money.
 

griff30

Supreme [H]ardness
Joined
Jul 15, 2000
Messages
6,502
I've said it before: I will grab a low-end Intel GPU on day 1 just for transcoding Plex, if it's priced right.
The competition...

Nvidia Quadro P2000 = $200-$400 used
Nvidia Quadro RTX 5000 = $2,000+ used, for 4K transcodes

 

WorldExclusive

[H]F Junkie
Joined
Apr 26, 2009
Messages
11,174
With no driver experience and unknown partnerships with devs, I don't expect this GPU to be seriously competitive for at least three generations.

Nvidia can give us 3090 performance for $749 at any time if they feel threatened. Intel will still have to beat Nvidia at that price point in order to gain mindshare, as with the AMD 4000 series.
 

Sorameth

n00b
Joined
Oct 11, 2016
Messages
44
Twenty years ago 3dfx was lost, leaving just Nvidia and ATI. There were rumors of a super card from Matrox to give us another option. People had a lot of hope for that card, and the Parhelia was a horrible disappointment. This feels like that.

I feel old now.
 

philb2

Weaksauce
Joined
May 26, 2021
Messages
88
Twenty years ago 3dfx was lost, leaving just Nvidia and ATI. There were rumors of a super card from Matrox to give us another option. People had a lot of hope for that card, and the Parhelia was a horrible disappointment. This feels like that.

I feel old now.
Matrox? I used them, like 20+ years ago. But I started out with CP/M on a Z80 machine with 64KB of RAM and dual 8" floppy drives. I even met Gary Kildall once. That makes me old.
 

Direkt

Weaksauce
Joined
Jan 10, 2006
Messages
93
I would say that it looks pretty decent considering it's a first generation product. I'm all for a third player in the GPU game.
 

Sir Beregond

Limp Gawd
Joined
Oct 12, 2020
Messages
284
What if it isn't released for another 2 years? 3 years? I remember the great-sounding predictions around Larrabee, so I'm skeptical.
Skeptical as well. But hopeful. If it takes that long to release, then yeah, stick a fork in it, it's done. If it releases by the end of this year, that would be something else.
 

griff30

Supreme [H]ardness
Joined
Jul 15, 2000
Messages
6,502
I have a feeling that even if it's 3070-level and released this year, it's gonna get a lukewarm reception from gamers who are AMD or Nvidia fanatics.
Pre-made gaming rigs are where they will sell, and Intel knows that.
 

philb2

Weaksauce
Joined
May 26, 2021
Messages
88
I have a feeling that even if it's 3070-level and released this year, it's gonna get a lukewarm reception from gamers who are AMD or Nvidia fanatics.
Pre-made gaming rigs are where they will sell, and Intel knows that.
Sure, but what good is a pre-built rig if there isn't good support, really good support, for lots of games? That kind of support isn't just something you can buy off the shelf.
 

Wat

Limp Gawd
Joined
Jun 23, 2019
Messages
186
Intel is just one CEO away from killing the graphics division. As soon as they have a bad quarter or two, they will decide to "refocus on our core competencies"
 

Sir Beregond

Limp Gawd
Joined
Oct 12, 2020
Messages
284
The Prophecy of Pythia says "All of this has happened before. All of this will happen again." So say we all.
 

FrgMstr

Just Plain Mean
Staff member
Joined
May 18, 1997
Messages
51,793
Intel is just one CEO away from killing the graphics division. As soon as they have a bad quarter or two, they will decide to "refocus on our core competencies"

The Prophecy of Pythia says "All of this has happened before. All of this will happen again." So say we all.

Interesting point. Here is an even more interesting point that seems to have gone unnoticed. Put this under your tinfoil hat.

https://archive.is/cpvR1

PAT GELSINGER LEFT INTEL BECAUSE OF LARRABEE FIASCO?

SEPTEMBER 18, 2009


Last week, we learned that Patrick P. Gelsinger will leave Intel for EMC, and we tried to find out the reason for the move. From one side, the move made perfect sense. Pat was one of Andy Grove's men, and Paul Otellini did his best to surround himself with his own aces, thus the choice of Sean Maloney was logical.

But the underlying issue wasn't that Pat was one of "Andy Grove's men"; the issue was the war with nVidia and under-delivering on Larrabee.

As we all know, the Larrabee project has been problematic at best. Intel started hyping up Larrabee long before it was ready, and the project broke all deadlines. We read through roadmaps and watched Larrabee slip not by quarters, but by years. After we saw roadmaps push the introduction of Larrabee all the way back to 2011, and heard that a lot of key industry analysts were dismayed at Intel – Pat's maneuvering capability was cut to a single corner.

A lot of people we talked to were disappointed at Intel "starting a war with nVidia without a product to compete", and after hearing statements such as "Intel is a chip company, not a PowerPoint company", it was clear to us that Intel seriously "screwed the pooch" on this one.

There is no doubt in our minds that Intel is going to deliver Larrabee, as it is the future of the company. But Intel will probably spend an additional billion or so USD on making the chip work [because it is quintessentially broken in hardware; we haven't even touched the software side], and come to market with a complete line-up. But unlike the CPU division, which only missed the Lynnfield [Core i5-700, i7-800 series] roadmap by six months, the Larrabee project is now a year late, and according to documents we saw, it won't reach the market in the next 12 months. This will put a 45nm Larrabee against 28nm next-gen chips from ATI and nVidia, even though we know the caveats of using 45nm fabs for the job. According to our sources, in 2011 both ATI and nVidia will offer parts with around 5-7TFLOPS of compute power, surpassing 10TFLOPS on the dual-ASIC parts. According to information at hand, Intel targeted 1+ TFLOPS of compute power for the first generation, i.e. less number-crunching performance than the ATI Radeon HD 4870 and nVidia GeForce GTX 285. With Larrabee coming in 2011, the company did revise that number to raise available performance.

We learned about the estimated cost of the Larrabee project, and if it weren't for the best-selling Core 2 series, this project would seriously undermine Intel's ability to compete. To conclude this article – Larrabee was Gelsinger's baby, the project got seriously messed up, and somebody had to pay the bill. Patrick is staying in Santa Clara, though, almost at the same address. Given his new job, Patrick P. Gelsinger simply moved from 2200 Mission College Blvd [Robert N. Noyce building, i.e. Intel HQ] to 2831 Mission College Blvd [EMC HQ].
 

Wat

Limp Gawd
Joined
Jun 23, 2019
Messages
186
Yeah... Itanium sounded great at the time, too. I wonder how it would do with contemporary fabrication methods. Same for the DEC Alpha...
 

Denpepe

[H]ard|Gawd
Joined
Oct 26, 2015
Messages
1,824
Intel is just one CEO away from killing the graphics division. As soon as they have a bad quarter or two, they will decide to "refocus on our core competencies"
People also buried AMD after the launch of the mediocre RDNA 1, after countless iterations of GCN cards, and after the Bulldozer CPUs, and look where they are now. And they lacked the money that Intel has.
 

Shadowarez

Gawd
Joined
Jul 8, 2019
Messages
546
Well, we can hope Intel can weather the money pit until it becomes a decent product.
 

SamuelL421

Limp Gawd
Joined
Jun 3, 2016
Messages
362
People also buried AMD after the launch of the mediocre RDNA 1, after countless iterations of GCN cards, and after the Bulldozer CPUs, and look where they are now. And they lacked the money that Intel has.

My thoughts as well. I don't know what it is that prevents Intel from pushing a dedicated GPU project to completion. On paper, Intel checks all the boxes: it controls its own fabs (albeit not yet at the needed node), it's willing to throw massive amounts of money at the project, and it has now had several generations of quality iGPUs in its mobile parts (the caveat being the shit drivers).

It doesn't make sense to me that they can't/won't eventually see this through and produce a quality GPU. If they shoot themselves in the foot again, it will be from bad management decisions, bad software/drivers, or stupidity like artificially limiting compatibility or similar.
 