Xe looking EXTREMELY underwhelming.

Taco is surprised that people think it will be affordable. Why would it be affordable when everyone else is charging a premium and people keep buying?
Good points. On the one hand, a "reasonable" price strategy would enable them to penetrate the market as fast as they can produce GPUs. On the other hand, Intel has always maintained a "pricing umbrella" against AMD for CPUs.
 
If the GPUs are lower-performing, it would be a perfect opportunity to undercut and gain massive market share.

But aren't these being made at TSMC? Would their supply be good enough?
 
Intel will have a DG2 card that lands close to the 3070/6800, and recently Raja said they are interested in AMD's FSR. So we already have many details.
The fact is, for instance, RDNA2 doesn't have dedicated AI hardware and has poor ray-tracing performance, but is great at standard rasterization. I'm not sure Intel won't be better at RT, and it may have a dedicated AI block.
AMD will probably put the AI feature on CDNA2.
 
And maybe also South Korea, if that nutcase up north gets even crazier.

Here is a report about semiconductor manufacturing profits. Notice where the biggest companies are located. https://www.techrepublic.com/articl...74950570821122218&mid=13387000&cid=2391303006
Tubby the Third is less scary. I have complete faith that South Korea can handle anything he could throw at them. The ROC, on the other hand... not convinced they could stand up to a strong breeze, much less the full weight of China.
 
You're probably right. If TSMC has any smarts, they will build all future fabs somewhere other than the ROC. And maybe try to get visas for some of their engineers to that "somewhere else."
 
Fabs elsewhere, yes, but don't kid yourself: engineers don't make difficult manufacturing tasks work. It's technologists and technicians who make the world go round. All they need is the secret sauce.
 
Who do you think designs the machines that build the chips?
 
TSMC is building a 5nm fab in Arizona. They also own fabs in Washington State and Singapore.
 
So in a roundabout way, lots of folks have no confidence that Intel (with its billions) can ever produce this card or be competitive in GPUs.
If that is the case, American chip manufacturing has failed... miserably.
Larrabee says hello. I think it was originally positioned as a gaming card by Intel, and all in all, Intel sunk a billion or so into trying to make that work.
 
Intel's high-end gaming DG2 GPU has been shown by Raja, and it looks larger than 400mm². It's built on TSMC's 6nm process, so it compares very well with AMD's Navi 21 (RX 6900 XT: 540mm², but on TSMC 7nm and including plenty of cache) and Nvidia's GA102 (RTX 3090: 620mm², but on Samsung 8nm). I wouldn't bet against it beating both: a new architecture built from scratch, better silicon tech, Intel + Raja.
 
It's only underwhelming if the expectation is 6900 XT/3080 Ti performance. If the expectation is 3070 Ti/6800 performance, then 512 EU Xe at >2GHz shouldn't disappoint, at least on paper. Performance consistency could be a big hang-up, though, if the Xe iGPUs are any indication. I was strongly considering a Tiger Lake laptop last year for the iGPU, but game benchmarks showed some odd results, with 80/96EU Xe sometimes matching a GeForce MX350 and sometimes getting bested by Vega 8. That's questionable for an iGPU but completely intolerable for a dGPU: imagine if Xe-HPG benches at 3070 Ti levels in some games but gets beaten by a 6700 XT in others... it would be a bloodbath in the reviews. Hopefully Intel can fix that before Xe-HPG launches (not holding my breath, though).
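For a rough sanity check, here's a minimal back-of-envelope sketch of what 512 EUs at >2GHz would mean for peak FP32 throughput, assuming Xe-HPG keeps Xe-LP's layout of 8 FP32 ALUs per EU with one FMA (2 ops) per clock; both are assumptions carried over from the iGPUs, not confirmed specs:

# Back-of-envelope peak FP32 throughput for the rumored 512-EU part.
# alus_per_eu=8 and the FMA rate are carried over from Xe-LP; Xe-HPG
# may differ, so treat the result as a ballpark, not a spec.
def peak_tflops(eus, alus_per_eu=8, ops_per_clock=2, clock_ghz=2.0):
    return eus * alus_per_eu * ops_per_clock * clock_ghz / 1000

print(peak_tflops(512))                  # ~16.4 TFLOPS at 2.0 GHz
print(peak_tflops(512, clock_ghz=2.2))   # ~18.0 TFLOPS at 2.2 GHz

~16-18 TFLOPS on paper sits right around the RX 6800's ~16 TFLOPS, which lines up with the 3070 Ti/6800 expectation above.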
 
"As fast as a 3070" won't mean as much by the time the cards are released.
Most people privy to inside sources are projecting performance midway between the 3070 and 3080, with the highest model in the lineup providing 16GB. It is supposed to be produced at TSMC, so it will probably be a 6nm or 7nm TSMC process. If these projections pan out, I will buy one if AMD 6800 XT availability does not improve by March 2022, and with a significant price drop (like about $799 to $899).
 
What if it isn't released for another 2 years? 3 years? I remember the great-sounding predictions around Larrabee, so I'm skeptical.
 
I seem to remember the i740 being hyped as a premium product before it was released... Also, Intel is notorious for making poor video drivers for their iGPUs.
 
I believe this time it will be different. Their CPUs are fantastic; why should they have issues getting GPUs to perform on par? This is Intel, not some garage band! I believe in them.
"Gaming wasn’t even mentioned as a use case in Larrabee’s initial announcement. However, almost immediately after, Intel started talking about Larrabee’s gaming capabilities, setting expectations sky-high. In 2007, Intel was several times larger than Nvidia and AMD put together. When Intel claimed Larrabee was faster than existing GPUs, it was taken as a given, considering their talent pool and resource budget.

Larrabee gaming expectations were hyped even further when Intel purchased Offset Software, months after buying the Havok physics engine. The studio’s first game, Project Offset, was demoed in 2007 and showcased unprecedented visuals. Unfortunately, nothing came out of the Offset Software purchase. Intel shuttered the studio in 2010, around the time it put Larrabee on hold."
https://www.techspot.com/article/2125-intel-last-graphics-card/
plus the i740
we've seen this before, twice...
 
"Gaming wasn’t even mentioned as a use case in Larrabee’s initial announcement. However, almost immediately after, Intel started talking about Larrabee’s gaming capabilities, setting expectations sky-high. In 2007, Intel was several times larger than Nvidia and AMD put together. When Intel claimed Larrabee was faster than existing GPUs, it was taken as a given, considering their talent pool and resource budget.

Larrabee gaming expectations were hyped even further when Intel purchased Offset Software, months after buying the Havok physics engine. The studio’s first game, Project Offset, was demoed in 2007 and showcased unprecedented visuals. Unfortunately, nothing came out of the Offset Software purchase. Intel shuttered the studio in 2010, around the time it put Larrabee on hold."
https://www.techspot.com/article/2125-intel-last-graphics-card/
plus the i740
we've seen this before, twice...
pendragon1: So history will likely repeat itself. I guess I shouldn't hold out any hope that Intel will shake up the market.
 
But how did they go from showcasing unprecedented visuals to nothing?
Those "unprecedented visuals" were the game engine's, and it died because Intel mothballed the whole thing. It (Larrabee) wasn't good enough, and they'd wasted enough money.
 
I've said it before: I will grab a low-end Intel GPU on day 1, just for Plex transcoding, if it's priced right (see the sketch below).
The competition....

Nvidia Quadro P2000 = $200-$400 USED
Nvidia Quadro RTX 5000 = $2000+ used, for 4K transcodes.
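For context, a minimal sketch of the kind of hardware transcode Plex would hand off to an Intel card via ffmpeg's Quick Sync (QSV) path. The flags are standard ffmpeg QSV options; the file names are made up, and QSV support on a then-unreleased Intel dGPU is an assumption:

# Sketch: 4K HEVC -> 1080p H.264, decode/scale/encode all on the GPU via QSV.
# Assumes an ffmpeg build with QSV enabled and an Intel GPU the driver supports.
import subprocess

subprocess.run([
    "ffmpeg",
    "-hwaccel", "qsv",                    # decode on the GPU
    "-c:v", "hevc_qsv", "-i", "input_4k.mkv",
    "-vf", "scale_qsv=w=1920:h=1080",     # GPU-side downscale to 1080p
    "-c:v", "h264_qsv",                   # encode on the GPU
    "-global_quality", "23",              # ICQ quality mode, lower = better
    "-c:a", "copy",                       # pass audio through untouched
    "output_1080p.mkv",
], check=True)

If the media drivers are solid on day one, that alone would undercut the used Quadro prices above.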

 
With no driver experience and unknown partnerships with devs, I don't expect this GPU to be seriously competitive for at least three generations.

Nvidia can give us 3090 performance for $749 at any time if they feel threatened. Intel will still have to beat Nvidia at that price point in order to gain mindshare, i.e., what AMD did with the 4000 series.
 
Twenty years ago 3dfx was lost, leaving just Nvidia and ATI. There were rumors of a super card from Matrox to give us another option. People had high hopes for that card, and the Parhelia was a horrible disappointment. This feels like that.

I feel old now.
 
Matrox? I used them, like 20+ years ago. But I started out with CP/M on a Z80 machine with 64KB of RAM and dual 8" floppy drives. I even met Gary Kildall once. That makes me old.
 
I would say that it looks pretty decent considering it's a first generation product. I'm all for a third player in the GPU game.
 
I have a feeling that even if it's at 3070 level and released this year, it's going to get a lukewarm reception from gamers who are AMD or Nvidia fanatics.
Pre-made gaming rigs are where they will sell, and Intel knows that.
 
Sure, but what good is a pre-built rig if there isn't good support, really good support, for lots of games? That kind of support isn't just something you can buy off the shelf.
 
Intel is just one CEO away from killing the graphics division. As soon as they have a bad quarter or two, they will decide to "refocus on our core competencies"
 
The Prophecy of Pythia says "All of this has happened before. All of this will happen again." So say we all.
 
Interesting point. Here is an even more interesting point that seems to have gone unnoticed. Put this under your tinfoil hat.

https://archive.is/cpvR1

PAT GELSINGER LEFT INTEL BECAUSE OF LARRABEE FIASCO?

SEPTEMBER 18, 2009


Last week, we learned that Patrick P. Gelsinger will leave Intel for EMC, and we tried to find out the reason for the move. From one side, the move made perfect sense. Pat was one of Andy Grove's men, and Paul Otellini did his best to surround himself with his own aces, thus the choice of Sean Maloney was logical.

But the underlying issue wasn't that Pat was one of "Andy Grove's men"; the issue was the war with nVidia and under-delivering on Larrabee.

As we all know, the Larrabee project has been problematic at best. Intel started hyping up Larrabee long before it was ready, and the project broke all deadlines. We read through roadmaps and watched Larrabee slip not by quarters, but by years. After we saw roadmaps pushing the introduction of Larrabee back all the way to 2011, and after hearing that a lot of key industry analysts were dismayed at Intel, Pat's maneuvering capability was cut to a single corner.

A lot of people we talked to were disappointed in Intel "starting a war with nVidia without a product to compete," and after hearing statements such as "Intel is a chip company, not a PowerPoint company," it was clear to us that Intel seriously "screwed the pooch" on this one.

There is no doubt in our minds that Intel is going to deliver Larrabee, as it is the future of the company. But Intel will probably spend an additional billion or so USD on making the chip work [because it is quintessentially broken in hardware; we haven't even touched the software side], and come to market with a complete line-up. But unlike the CPU division, which only missed the Lynnfield [Core i5-700, i7-800 series] roadmap by six months, project Larrabee is now a year late, and according to documents we saw, it won't reach the market in the next 12 months. This will put a 45nm Larrabee against 28nm next-gen chips from ATI and nVidia, even though we know the caveat of using 45nm fabs for the job. According to our sources, in 2011 both ATI and nVidia will offer parts with around 5-7 TFLOPS of compute power, surpassing 10 TFLOPS on dual-ASIC parts. According to information at hand, Intel targeted 1+ TFLOPS of compute power for the first generation, i.e. less number-crunching performance than the ATI Radeon HD 4870 and nVidia GeForce GTX 285. With Larrabee coming in 2011, the company did revise that number to raise available performance.

We learned about the estimated cost of the Larrabee project, and if it weren't for the best-selling Core 2 series, this project would seriously undermine Intel's ability to compete. To conclude this article: Larrabee was Gelsinger's baby, the project got seriously messed up, and somebody had to pay the bill. Patrick is staying in Santa Clara, though, at almost the same address. Given his new job, Patrick P. Gelsinger simply moved from 2200 Mission College Blvd [Robert N. Noyce building, i.e. Intel HQ] to 2831 Mission College Blvd [EMC HQ].
 
Yeah... Itanium sounded great at the time, too. I wonder how it would do with contemporary fabrication methods. Same for the DEC Alpha...
 
People also buried AMD after the launch of the mediocre RDNA 1, after countless iterations of GCN cards, and after the Bulldozer CPUs, and look where they are now. And they lacked the money that Intel has.
 
Well, we can hope Intel can weather the money pit until it becomes a decent product.
 
My thoughts as well. I don't know what it is that prevents Intel from pushing a dedicated GPU project to completion. On paper, Intel checks all the boxes: it controls its own fabs (albeit not yet at the needed process node), it's willing to throw massive amounts of money at the project, and it now has several generations of quality iGPUs in its mobile parts (the caveat here being the shit drivers).

It doesn't make sense to me that they can't/won't eventually see this through and produce a quality GPU. If they shoot themselves in the foot again, it will be from bad management decisions, bad software/drivers, or stupidity like artificially limiting compatibility or similar.
 
So: management, unlike the previous CEO, letting engineers do what they do without corporate BS weighing them down, and not force-limiting features to eke out more $$$ by making consumers pay extra for RAM speeds above 2666 on CPUs, or cutting PCIe lanes down to pleb levels.
 
Intel only needs to put their GPUs in Dell and HP systems with Intel motherboards and CPUs, especially those for professional use, and they're going to be fine. Companies and professionals will buy those like hotcakes. Don't worry about Intel.
They don't even need to sell them on the open market; integration alone will keep them fine. Since they'll share the same TSMC fabs with AMD, they're not going to make a lot of them, because wafer production is dedicated to other uses first, like automotive.
 
Smart companies are not going to touch them until they're proven. Dell and HP are not going to jump on them untested.
 