This is about APUs though. Ultimately these are lowest common denominator parts. They need small dies to be cheap enough for high volume.
Of course you can make a specialized high end APU with a big GPU, those are what Microsoft and Sony get for their consoles.
But now you have a much more...
New tech releases with pent up demand always have shortages and rising prices. We don't need conspiracy theories for that to happen.
When supplies run short, it isn't proof of said conspiracy theory, just the normal case of demand that exceeds supply, that even happened for some tiers of...
Similar alarmist story when they sold to Softbank:
https://www.dailymail.co.uk/news/article-3709836/Sale-hi-tech-giant-Japan-risks-crushing-UK-innovation-Lobby-group-warns-24bn-takeover-benefits-Britain.html
They will make some noise about job guarantees, but it's mostly smoke. UK will approve...
Agreed. People argue way too much over NVidia naming, as if everything related to that name has to be set in stone.
Though this release does feel like a return to Pascal structure in many ways.
If this does follow Pascal model, then we will probably see a 3080 Ti many months after 3080, much...
Sometimes, but it's unlikely that's the case here. ARM fills in the missing pieces of NVidia's end-to-end computation solution.
NVidia is going to leverage the hell out of ARM. The first thing they plan is to try up-selling GPU and AI licenses to ARM licensees. This isn't speculation, this is what...
Because GPUs cost more to make these days, and it now costs over 100 million dollars up front to start manufacturing a new series, so they need decent margins to recoup those costs as quickly as possible.
Recoup costs too slowly and you are looking at a new GPU series before the old one...
This time the FE cards look the best value, if you can get your hands on one. New cooler is supposed to be much better than the last FE cooler which wasn't that bad itself.
If I wanted a 3080, I would definitely be trying for an FE card, but that might be difficult for quite some time, this...
That's because you are comparing across tiers: old x80 Ti vs. new x80.
Compare the old x80 to the new x80 and the performance gain is in the 70-80% range, and that is what people are excited about.
These kinds of gains were NOT normal before Turing. They happen maybe once in a decade...
GPUs these days tend to already be at diminishing returns for clock speed. You may end up increasing heat (and noise) by 30% to squeak out a 5% performance gain that will be undetectable playing games.
So mostly OC'ing a GPU makes things you can notice (heat and/or noise) worse, for a...
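The overclocking trade-off above is really a perf/watt calculation. A minimal sketch, using the 30%-heat-for-5%-performance figures from the post (all wattage and frame-rate numbers here are made up for illustration, not measurements of any real card):

```python
# Illustrative only: the +30% power / +5% performance figures come from
# the post above; the baseline numbers are hypothetical.
stock_power_w = 320          # assumed stock board power
stock_fps = 100.0            # assumed baseline frame rate

oc_power_w = stock_power_w * 1.30   # +30% power (and roughly +30% heat)
oc_fps = stock_fps * 1.05           # +5% performance

perf_per_watt_stock = stock_fps / stock_power_w
perf_per_watt_oc = oc_fps / oc_power_w

print(f"stock: {perf_per_watt_stock:.4f} fps/W")  # 0.3125
print(f"OC:    {perf_per_watt_oc:.4f} fps/W")     # ~0.2524
# Efficiency drops roughly 19% for a gain most players can't perceive.
```

Under these assumptions the overclock buys 5 fps at the cost of nearly a fifth of the card's efficiency, which is the point being made.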
We have to recognize that most of the Windows market is a laptop market, and I don't think Microsoft is ever backing down from Windows on ARM.
They are going to keep at it, while ARM SoCs keep getting better. Eventually they are going to start getting it right, at least in laptops.
It's a...
No, not the only one; that is what I said from the start, though I think Intel is in trouble too. But that's the long view, not the next couple of years.
x86 clings to its stronghold largely because of Windows. Everything else left is running fairly portable code that can be easily...
I think there is plenty of doubt about that. Apple's ARM Macs will soon be showing what ARM can do with a somewhat bigger power budget. IBM's RISC (Power) is irrelevant for the PC, for the same reason Apple abandoned it: most PCs are laptops, and perf/watt rules there.
So you also agree that it is only...
Nothing wrong with ARM core architecture. Yes legacy x86 SW is an issue, but it's far from an insurmountable one.
Now that NVidia owns ARM (subject to regulatory approval) they will be motivated to push it everywhere and that includes on the PC.
ARM making serious inroads into the Windows...
Which is why NVidia owning ARM Holdings, and pushing for better desktop-class ARM chips, makes Windows on ARM more viable, and in the long run a greater reward for NVidia.
It's not about any one thing, it's about all of them. If you acknowledge Microsoft is eventually going ARM, then NVidia owning ARM puts them in the CPU driver's seat for:
Mobile
Data Center
PC
And all the synergies in them. That is what makes this worth 40 Billion.
It has more of everything, which is what you would expect on a higher tier.
IIRC ROP counts tend to be heavily tied to the memory interface: when the bus width increases, ROPs increase.
The 3090 has 64 more bits in its memory bus.
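A rough sketch of that coupling. On older NVidia designs, ROP partitions were attached to 32-bit memory controller channels (Ampere actually moved ROPs into the GPCs, so the link is looser now). The 320-bit and 384-bit bus widths are public specs; the 8-ROPs-per-partition ratio is an assumed figure for illustration:

```python
# Hedged model: one ROP partition per 32-bit memory channel, as on
# pre-Ampere parts. ROPS_PER_PARTITION is an illustrative assumption.
CHANNEL_WIDTH_BITS = 32
ROPS_PER_PARTITION = 8

def rops_for_bus(bus_width_bits: int) -> int:
    """ROP count implied by bus width under the old channel coupling."""
    partitions = bus_width_bits // CHANNEL_WIDTH_BITS
    return partitions * ROPS_PER_PARTITION

print(rops_for_bus(320))  # 3080-style 320-bit bus -> 80 under this model
print(rops_for_bus(384))  # 3090-style 384-bit bus: 64 extra bits = 2 more partitions
```

Under this model the 64 extra bus bits translate directly into two extra ROP partitions, which is the mechanism the post is describing.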
Yes, but it can be very much for the PC Master race as well, and this is less obvious to many.
Most people are focusing on mobile. Because that is currently the main thrust of ARM. But ARM already saturates mobile, and this does not justify 40 Billion.
Data Center is the next most obvious...
If it's ATX and it's that small, likely they were aiming for cheapo, not small.
All the ATX cases I have been interested in could house a 3090 FE, and I am actually only looking at smaller "mid-tower" ATX cases.
Even the mATX cases I have been looking at (CM NR400 and S400) could house a 3090...
I agree. If they merely wanted to make more ARM chips, the license is a pittance. So if NVidia is really doing this, and spending all that money, it's going to be for a massive outcome, rough for everyone who isn't NVidia.
I speculated in one of the other story links about this yesterday (we need...
It has been rumored for quite a while that the Zen 3 CCX is a unified 8 cores. This should really help kill the latency advantage Intel has for gaming. I hope this one is true.
AIBs obviously don't pay cost though, so they are at a disadvantage competing against NVidia for card sales.
When NVidia sells their own card, they make revenue and profit for the Chip and the card. When they sell the chip to an AIB they only get chip revenue and profit.
Revenue and profit...
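The chip-vs-card margin argument above is simple arithmetic. A toy sketch; every price and cost below is invented for illustration and is in no way NVidia's actual figure:

```python
# Purely illustrative numbers: none of these are real NVidia prices/costs.
chip_cost = 150           # assumed cost to make the GPU chip
chip_price_to_aib = 350   # assumed price an AIB pays for that chip
card_build_cost = 150     # assumed cooler/PCB/VRAM/assembly cost
retail_price = 700        # assumed MSRP-style card price

# Selling the chip to an AIB: NVidia books only the chip margin.
nvidia_profit_via_aib = chip_price_to_aib - chip_cost   # 200

# Selling an FE card directly: NVidia keeps chip AND card margin.
nvidia_profit_fe = retail_price - chip_cost - card_build_cost  # 400

print(nvidia_profit_via_aib)
print(nvidia_profit_fe)
```

The AIB has to fund its own cooler, marketing, and margin out of the gap between the chip price and retail, which is why competing with the FE card at the same MSRP is hard.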
The thing is, they don't need to buy ARM Holdings for that. They could make all the ARM chips they want from now until the end of time, and it wouldn't cost them more than the interest on 40 Billion dollars.
I suppose if most of this is a stock swap, then sure, it's effectively "free", and NVidia...
I do think there was a lot of over-reaction on cooler costs. NVidia FE coolers likely do cost a bit more than AIB coolers of similar size. Remember the claims that card prices were going to be high because the coolers were so expensive? :rolleyes:
But I don't think the 3080 FE cooler is any...
Is that supposed to somehow obligate me to watch all the clickbait/rumor videos that mention AMD?
I made it clear in multiple posts in AMD and NVidia threads. I won't watch these kinds of videos, it doesn't matter what they purport to contain.
I said when his claims are in text, I will read...
Maybe take your time and breathe, and look more closely at what you are responding to.
A lot of people were upset by a performance leak that showed the 3080 (note the absence of Ti) only 33% faster than the 2080 Ti.
There was only one meaningful game in that leak, and since the leak was not...
How likely is it that anyone would be disappointed enough in a 3080+ to feel the need to upgrade again that soon?
Ampere looks great, and it appears buyers will be creating large demand for it. You could try to buy one and still find yourself watching the AMD event with no Ampere in...
How about not. Since, again, people say the same things about their favorite YT rumor channel they think is special. I did find text from January stating a Summer launch for Big Navi, so he has already missed that one, since Summer ends before October.
People who support these kinds of...
I get this all the time from fans of Moore's Law is Dead, Coreteks, Adored, etc....
Sorry, no, I am not wasting my time watching, nor am I supporting YT clickbait, with my views.
If it's in text somewhere, I will read it. No doubt if it has anything interesting to say, all the text rumor sites...