Twatter? Pallet? Where are you getting that from? Nvidia hasn't said so, and if you listen to Twatter, Pallet has 25 SKUs for 3050 Ti based cards.
> Twatter? Pallet?

Thanks, I try not to talk out my ass too much. And it's Palit, forgot how to spell it. Saw this:
I just wanna say here that I have found pendragon1 to be an astute observer of the scene. And hey, I'm originally from NY, and we don't have the time to say nice things that aren't really true.
> But they aren't available in the US.

Maybe not them, but IF they are making them, others will too.
There's certainly going to be a desktop 3050Ti
Keep in mind that with this GPU generation, NV (and AMD) are back on their BS of the mobile chips not actually being the same config (or even the same die) as the desktop parts with the same name. So, I'm confident there will be a 3050Ti for desktop, but it hasn't come up yet because it's going to be a different product than the mobile version (probably closer to the mobile 3060) and NV is focusing on filling their mobile stack at the moment.
> I looked at the Asus and MSI web sites to check out what they have in the RTX 3050. None are listed so I'm a bit confused as to some comments here.

They won't be listed yet since the 3050 on desktop hasn't been officially announced. We're all operating on info from the leak/rumor sites, which can be pretty accurate if you know what to trust, but won't match up with "official" information until NV makes the announcement.
On the other hand, Bitcoin has plunged lately from over $63K to over $36K today. It was under $10K less than a year ago. I imagine that would cause some miners to limp along if it went that low again.
> It makes no sense because higher core count GPUs actually get more performance/watt than cut-down versions. It's insulting to people who actually know the way this stuff works.

So, my post was actually not totally accurate... I rechecked the specs (since Ampere mobile details aren't quite seared into my brain yet like previous gens) and it's only the 3080 that's a bamboozle. The rest actually have a slightly higher core count than the desktop variants, which makes sense with the "wide and slow for efficiency" thing like you pointed out. I'm guessing that GA102 is just so power hungry that the frequency would have to be pushed so low as to f*ck up the efficiency curve: taking a ~220W GPU down to ~100-165W is one thing, but going from 320W to 100-165W is a pretty huge jump, and IME there's a point of massively diminishing returns when the power limit goes too low. Die size may play into that a bit too: TU104 was physically the largest chip to ever go into regular laptops, and that was already enormous at 545mm^2. But GA102 is also not that much larger than TU104 (compared to TU102 at least), so it's hard to say how much of a factor that was. It's certainly less expensive for NV to use a 392mm^2 die instead of a 628mm^2 one.
> The RTX 3060 notebook part uses the uncut cores. The reason the 3070 uses more cut cores is because they have already hit the power wall with the 3060 at 120W (so the 3070 at 150W is nuts).

Oops, yeah you're right, the 3070 is cut too. Geez, I can't keep this straight... at least previous generations were consistently cut or not cut across the entire stack (the "special" 980 version being the only exception IIRC), but Ampere is all over the place.
You can't use much magic to improve performance: when you take into account the 15% difference in core count, you'd be limited to something like a 5% improvement when already power-limited at 150W.
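The "wide and slow" tradeoff behind that estimate can be sketched with a toy Python model. This is my own simplification, not NVIDIA data: assume performance scales with cores × clock, and power with cores × clock³ (voltage roughly tracking frequency in the DVFS range). Under a fixed power cap, adding cores forces the clock down, so a 15% core advantage buys well under 15% more performance.

```python
# Toy DVFS model (an assumption for illustration, not measured GPU data):
#   performance ∝ cores * clock
#   power       ∝ cores * clock**3   (voltage scales roughly with frequency)
def perf_at_power_cap(cores, power_cap, k=1.0):
    """Clock the chip as high as the power cap allows; return relative perf."""
    # power_cap = k * cores * f**3  =>  f = (power_cap / (k * cores)) ** (1/3)
    f = (power_cap / (k * cores)) ** (1 / 3)
    return cores * f

base = perf_at_power_cap(cores=100, power_cap=150)
wide = perf_at_power_cap(cores=115, power_cap=150)  # +15% cores, same 150 W cap
print(f"perf gain from 15% more cores: {wide / base - 1:.1%}")
```

In this idealized model the gain works out to cores^(2/3), i.e. roughly +10% for +15% cores; real silicon has a minimum-voltage floor and leakage that shrink the gain further, which is consistent with the ~5% the post above estimates.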
Looks like they're doing laptops first. That makes sense to me given the current environment. Miners are less likely to snap up all the low end GPUs if they have to buy a whole laptop to get one, plus a GPU like a 3050 makes more sense to me in a laptop. I'm mostly a desktop guy so when I buy a laptop I want it portable. Less power draw & less heat -> more portable. I don't want a brick that comes with a folding cart so I can game for an hour or two on a laptop with a GPU that pulls 150W.

So while all this discussion is interesting, we have no way of knowing when we can actually buy one of these models (assuming that's what you want).
> Looks like they're doing laptops first. That makes sense to me given the current environment. Miners are less likely to snap up all the low end GPUs if they have to buy a whole laptop to get one, plus a GPU like a 3050 makes more sense to me in a laptop. I'm mostly a desktop guy so when I buy a laptop I want it portable. Less power draw & less heat -> more portable. I don't want a brick that comes with a folding cart so I can game for an hour or two on a laptop with a GPU that pulls 150W.

Excuse the noob question. What is a "big" chip? An "uncut" chip? I know what the bus and cores are.
Personally I think the 3080 using the "big" chip screwed up the stack for laptops. The last time NV had an "80" card that wasn't a Ti with a 384-bit bus was the 780. The 700 series was mostly rebadged 600 series cards except for the 780. 680->770, 670->760, etc. but the 780 used the big chip with a 384-bit bus. This time around they want a 3080 laptop chip but the desktop 3080 is using the big chip so laptops get an uncut version of the chip in the desktop 3070 as the laptop 3080 and it all cascades from there. The laptop 3070 has fewer cores than the desktop version because they wanted to leave some space between the laptop 3080 and the 3070.
> Oops, yeah you're right, the 3070 is cut too. Geez, I can't keep this straight... at least previous generations were consistently cut or not cut across the entire stack (the "special" 980 version being the only exception IIRC), but Ampere is all over the place.

It's a lot easier to avoid the power limits when your full-speed desktop parts already top out at 150W (then binning plus a 10% down-clock gets your 970 down to 100-120W!).
> Excuse the noob question. What is a "big" chip? An "uncut" chip? I know what the bus and cores are.

Just my description of the biggest chip NV comes out with in a given generation that actually ends up on a graphics card. So the 384-bit bus chip, typically cut down a bit for anything that isn't a Tesla card or Titan (aka the xx80 Ti), though in this latest generation we got a 3090 instead of a Titan.
> Just my description of the biggest chip NV comes out with in a given generation that actually ends up on a graphics card. So the 384-bit bus chip, typically cut down a bit for anything that isn't a Tesla card or Titan (aka xx80Ti), though in this latest generation we got a 3090 instead of a Titan.

So for the 256 or 192 bit bus chips, is the physical die size the same or smaller? If the same, how is the chip "cut down?"
> So for the 256 or 192 bit bus chips, is the physical die size the same or smaller? If the same, how is the chip "cut down?"

Cut down means parts of the chip are disabled. For example, a desktop 3080 and 3090 use the same chip, but some parts of the chip are disabled for 3080s. The 3090 has 10496 CUDA cores and a 384-bit memory bus. The 3080 has the same number of physical cores, and the chip could support 384-bit memory if it has no defects, but they disable some of the cores and 1/6th of the memory bus, so the 3080 ends up with 8704 cores and a 320-bit memory bus.
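As a quick sanity check on that post's numbers (10496 cores / 384-bit on the fully enabled die, 8704 cores on the cut-down 3080), the bus arithmetic works out like this:

```python
# Worked numbers from the post above: the 3090 and 3080 ship the same die,
# but the 3080 has parts of it disabled ("cut down").
full_cores = 10496        # RTX 3090: fully enabled core count
full_bus_bits = 384       # 12 x 32-bit memory controllers

# 1/6th of the memory bus (2 of the 12 controllers) is disabled for the 3080.
cut_bus_bits = full_bus_bits - full_bus_bits // 6
cut_cores = 8704          # cores left enabled on the 3080

print(cut_bus_bits)                       # -> 320
print(f"{cut_cores / full_cores:.1%}")    # -> 82.9% of the cores still enabled
```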
> So for the 256 or 192 bit bus chips, is the physical die size the same or smaller? If the same, how is the chip "cut down?"

edit: oops, ninja'd
> AMD used to sell 3 core chips. I think they were 4 cores with 1 defective core disabled. Is it possible some of these chips have a defective part? Do some people ever try to enable the disabled cores?

It's likely that there are defects on the lower binned dies, although sometimes fully-functional dies are binned low to fill demand for lower-end parts (dunno how much that's happening with Ampere).
> I haven't heard of anything about re-enabling disabled parts of chips in a long time either. Back in the day those 3 core Athlon 64 X3 chips wandplus was talking about could have the 4th core re-enabled. I never had one personally, but hacks to turn it back on were out there and plenty of people on [H] tried it. Results were mixed. If you were lucky you got a fully functional chip, but often the 4th core didn't work at all or wouldn't work well enough to want to keep it enabled. Maybe it could work but it killed your clock speed.

It only happened, and may happen again, when the competition is in the hole. There are no majik Radeons or Ryzens because they don't need to atm. Simple as that.