Lenovo confirms GeForce RTX 3060 12GB, RTX 3050 Ti 6GB, and RTX 3050 4GB

Marees

Lenovo has listed new, unreleased GeForce RTX graphics cards in the specifications of its Legion gaming desktop.


The RTX 3060 12GB & RTX 3050 Ti 6GB would be based on the same die, GA106.

The RTX 3050 4GB is based on the GA107 die.

https://videocardz.com/newz/lenovo-...6GYvXmjJabMgPqxs8_I5lfpSoCSB-DbCoG0LpLkHIY5g8



Lenovo Legion R5 28IMB05 GPU Options, Source: Lenovo

NVIDIA GeForce RTX 3060 & RTX 3050 Series (via VideoCardz)

|              | GeForce RTX 3060 Ti | GeForce RTX 3060 | GeForce RTX 3050 Ti | GeForce RTX 3050 |
|--------------|---------------------|------------------|---------------------|------------------|
| Architecture | NVIDIA Ampere       | NVIDIA Ampere    | NVIDIA Ampere       | NVIDIA Ampere    |
| GPU          | GA104-200           | GA106-400        | GA106-300           | GA107-300        |
| CUDA Cores   | 4864                | ~3840            | ~3584               | ~2304            |
| RT Cores     | 38                  | ~30              | ~28                 | ~18              |
| Tensors/TMUs | 152                 | ~120             | ~112                | ~72              |
| Memory       | 8GB GDDR6           | 12GB GDDR6       | 6GB GDDR6           | 4GB GDDR6        |
| Memory Clock | 14 Gbps             | 16 Gbps          | TBC                 | TBC              |
| Memory Bus   | 256-bit             | 192-bit          | 192-bit             | 128-bit          |
| TGP/TBP      | ~180W               | TBC              | TBC                 | ~90W             |
| Release Date | December 2nd, 2020  | January 2021     | TBC                 | TBC              |

Source: Lenovo
 
3060 gets 12GB when the 3060ti got 8GB?

:confused:

I find it interesting that they could release the 12 GB 3060 before the 6 GB 3050 Ti, even though both are based on the same die, GA106.

This could mean that the 12 GB 3060 is priced close to the 8 GB 3060 Ti.
 
3060 gets 12GB when the 3060ti got 8GB?

:confused:

Not entirely unprecedented. It used to be pretty common to see low-end GPUs like the GeForce GT 730 with 4GB of low-speed DDR3 when everything else mid-range had 2GB of GDDR5, for instance.

In this case it appears to be a function of the capacities available on the narrower memory bus and a need to differentiate from the 3050 Ti.
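As a rough back-of-the-envelope sketch (assuming one GDDR6 chip per 32-bit channel at the two common densities, 8 Gb = 1 GB and 16 Gb = 2 GB per chip; the actual chips Nvidia uses here are not confirmed):

```python
# Sketch: GDDR6 capacity options implied by bus width, assuming one
# chip per 32-bit channel at 8 Gb (1 GB) or 16 Gb (2 GB) density.
for bus_bits in (256, 192, 128):
    channels = bus_bits // 32                    # 32-bit channels on the bus
    options = [channels * gb for gb in (1, 2)]   # capacity at each density
    print(f"{bus_bits}-bit bus -> {channels} chips -> {options} GB")

# 256-bit -> 8 chips -> [8, 16] GB   (3060 Ti: 8GB)
# 192-bit -> 6 chips -> [6, 12] GB   (3060: 12GB, 3050 Ti: 6GB)
# 128-bit -> 4 chips -> [4, 8] GB    (3050: 4GB)
```

On a 192-bit bus the only clean choices are 6GB or 12GB, which is exactly the 3050 Ti / 3060 split.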

Funny, though, that it will have more memory than the cards all the early adopters overpaid for.
 
Good to see they made an RTX 3050 and not a GTX 3050. I gotta see the prices first before I give any opinion on them.
 
And this is why you do not buy the first release of NVIDIA cards, because then they do this.

By the time these cards come out I will still be able to sell my card near MSRP and would have enjoyed six months of a superior gaming experience.

I've bought flagship GPUs since 2003, and I just think of the cost lost when reselling as renting the best possible gaming experience. It generally costs $100-200 per generation, which is nothing compared to what people spend on movies/dinners. Most people don't realize this and simply look at the big initial cost.
 
If the 3050 is a 90W card, will this gen not have a <75W (i.e., no PCIe power plug required) model? If so, that might make 1650s highly desirable. (Those are the best PCIe slot-powered GPUs still, yes?)
 
What the hell is going on at Nvidia...

A lower-end 3060 with more video memory than their flagship card?

Go home Nvidia, you're drunk.
 
A 3060 with 12GB would be my go-to budget card for just about any graduating drafting, animation, or video-production student. The memory is large enough that the applications won't overflow it, and it should be fast enough that they would finish in a reasonable timeframe. It could also handle light gaming while still sipping power, which is important in many dorms because the power there isn't always great.
 
3060 gets 12GB when the 3060ti got 8GB?

:confused:
It is quite the messy lineup. Does the 256-bit bus have something to do with it? The chart also has a memory-clock row filled with Gbps instead of MHz, so maybe there is a typo in it.

If the typo is that "memory clock" is supposed to be memory bandwidth and the values are right, it would be even stranger: less RAM with a larger bus on the more expensive model, but ending up with lower bandwidth?
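For what it's worth, Gbps is the usual convention for a GDDR6 effective per-pin data rate rather than a typo, and peak bandwidth follows from rate times bus width. A quick sketch using the table's figures (the 3060's 16 Gbps is still unconfirmed):

```python
# Peak memory bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8
def bandwidth_gb_s(rate_gbps: float, bus_bits: int) -> float:
    return rate_gbps * bus_bits / 8

print(bandwidth_gb_s(14, 256))  # 3060 Ti: 448.0 GB/s
print(bandwidth_gb_s(16, 192))  # 3060 (rumored): 384.0 GB/s
```

So if those figures hold, the pricier 3060 Ti keeps the higher bandwidth despite the smaller 8GB capacity.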
 
I don't understand why Nvidia doesn't just release one version of their mid-range cards. Instead of, let's say, $299 for a 6GB and $330 for the 12GB (throwing out random numbers), why not an 8GB at $315? Why split a market that doesn't need to be split? You'd still make hella sales and it would be less confusing for the consumer. Nowadays 6GB is just not enough. Not even for 1080p.
 
I don't understand why Nvidia doesn't just release one version of their mid-range cards. Instead of, let's say, $299 for a 6GB and $330 for the 12GB (throwing out random numbers), why not an 8GB at $315? Why split a market that doesn't need to be split? You'd still make hella sales and it would be less confusing for the consumer. Nowadays 6GB is just not enough. Not even for 1080p.
6GB is more than enough for 1080p. For instance, my modded Skyrim with all 2K-4K textures only pulls a little over 4GB of VRAM, and that's at 1440p.

Nvidia likes to carpet-bomb the market.
 
Nowadays 6GB is just not enough. Not even for 1080p.
In the sense that it's not even safe for 1080p in the next few years, or right now?

Looking at an especially big game like Cyberpunk 2077 at ultra quality in 1080p, the 6GB 2060 Super is ahead of the 1080 Ti, and the GeForce 1660 is almost identical to the 8GB 1070; or Flight Simulator, where the 1660 is actually ahead of the 1070. Maybe that's not a good way to look at it, though.
 
Looks like the 3050 Ti will be the ideal budget gamer option. The 3060 doesn't look much faster, and for games, I think 12GB won't be all that useful given the speed of it.
 
I think a lot of these are placeholders because it sounds like Nvidia is preparing to dump their originally planned lineup for a Ti-overhauled lineup.

I doubt Nvidia is even 100 percent sure of what they'll have in 2021, let alone Lenovo.
 
Man, if that power draw is right on the 3050, it is going to need a PCIe plug :(. One of the nice things about the 1050 is you can get them with no external power, so if you have a low-power system like a Dell OptiPlex that needs a dGPU, you can add one even though they don't have any PCIe power connectors.
 
I think a lot of these are placeholders because it sounds like Nvidia is preparing to dump their originally planned lineup for a Ti-overhauled lineup.

I doubt Nvidia is even 100 percent sure of what they'll have in 2021, let alone Lenovo.

From the sound of things, the Tis are going to be higher-priced versions rather than direct replacements. At least initially.

In some parts of the world it is already 2021. They probably know their plans for Q1 at least.
 
Oh, I would absolutely be shocked if Nvidia removes SKUs and listings from their website, but I think they're going to zero in on a couple of price points, max out production on those parts, and fart out enough of the rest of whatever lineup they formally announce, just to say they actually went all-in top to bottom. And I expect Nvidia and its partners to stick with higher-memory models, not because of margins, but because that's what the market wants.

Anything else that gets made in quantities will probably be tailored for mining or the like and never hit the retail market in numbers.
 
Man, if that power draw is right on the 3050, it is going to need a PCIe plug :(. One of the nice things about the 1050 is you can get them with no external power, so if you have a low-power system like a Dell OptiPlex that needs a dGPU, you can add one even though they don't have any PCIe power connectors.
You gotta watch out, though: Dell is starting to put out systems with only 200W power supplies.
 
Good to see they made an RTX 3050 and not a GTX 3050. I gotta see the prices first before I give any opinion on them.

IIRC Nvidia confirmed over the summer that all of this generation's cards would support RTX.
 
If the 3050 is a 90W card, will this gen not have a <75W (i.e., no PCIe power plug required) model? If so, that might make 1650s highly desirable. (Those are the best PCIe slot-powered GPUs still, yes?)

This is the first GA107 card they've revealed. I'd expect a cut-down 75W version to show up a bit later. Presumably an RTX 3040.
 
This is the first GA107 card they've revealed. I'd expect a cut-down 75W version to show up a bit later. Presumably an RTX 3040.

Yeah, unfortunately, RT adds an additional 20% power consumption.

The only way to massively increase performance at the high end is to increase power by 50W; they will have to optimize the hell out of a 75W Ampere cut (it might take as long to surface as the 1650 Ti did).

I also expect the 3060 to maintain that 160W TDP, like the 2060 at launch... at least this time around we're actually getting an RTX 106 part!
 
The RTX 3050 would be a huge step up from the 1650 (896 CUDA cores), aside from the TDP as mentioned above.

The most interesting feature is how DLSS could make a low-end card such as the 3050 punch above its weight class.
 
So Nvidia is going to release the lower-end cards before AMD; AMD comes afterwards and walks all over them -> Nvidia scrambles yet again with even more permutations of cards :ROFLMAO:. From a Ti lineup to a third, Super lineup. The problem I see with AMD is producing enough dies to make sufficient cards for their PC GPUs. I wonder if Microsoft and Sony are hounding AMD for more, more, more . . . console GPUs? At least Nvidia is getting its card numbers up.
 
The RTX 3050 would be a huge step up from the 1650 (896 CUDA cores), aside from the TDP as mentioned above.

The most interesting feature is how DLSS could make a low-end card such as the 3050 punch above its weight class.

If you compare it at similar TDPs, it's competing with the 1650 Super. This is going to be, at best, a sidegrade.
 
I'll be honest, I'm really confused about the 3060. Are these specs just wrong? Is there really a universe where it has more memory than their flagship cards? This has to be an error.
It may be aimed at non-gamers who need video memory but not the raw processing power, which IMO is fair. It very well may not be marketed as a budget card in that sense; that's where the 3050 series comes into play.
 
I'll be honest, I'm really confused about the 3060. Are these specs just wrong? Is there really a universe where it has more memory than their flagship cards? This has to be an error.

It seems that NVIDIA no longer wants the complexity of its split memory buses (the GTX 660 and GTX 970 come to mind).

Ampere parts were all released on the tail end of 8-gigabit chip density, so there is going to be some overlap during that eventual transition/refresh. As to why the fuck this exists, the RX 6700 parts are also expected to be 12GB.

https://www.notebookcheck.net/The-A...2-GPUs-with-12-GB-of-GDDR6-VRAM.505945.0.html

With AMD starting to offer double-density across the board, you need to actually refresh today. The 3070 could really use the same memory they ship on the 6800, but the new 3060 parts will be easier to launch it on.

If you think the 3070 is choked for memory, just understand what a fucking constraint it would be dropping a 2080 down to 6GB of RAM! You can't even max out Doom Eternal at 1080p with only 6GB of RAM!
 
If you think the 3070 is choked for memory, just understand what a fucking constraint it would be dropping a 2080 down to 6GB of RAM! You can't even max out Doom Eternal at 1080p with only 6GB of RAM!

This all just makes me realize how much I can't wait for the 3080 Ti to come out. I will be upgrading immediately from my 3080. The value on that card is going to be such a big deal.
 
I'm guessing the RTX 3060 needs the extra memory to really justify the price point they want to slot it at. Not that the card needs it, but because it allows them to charge more for it. The 3050 Ti looks to be within a few percent of a 3060 on paper (93.3% of the raw shader count), so some vendor-OC'ed 3050 Tis might trade blows with a stock 3060.
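That 93.3% figure falls straight out of the rumored CUDA core counts in the table above (both counts are still unconfirmed):

```python
# Rumored CUDA core counts from the Lenovo/VideoCardz table.
cores_3060, cores_3050_ti = 3840, 3584
print(f"{cores_3050_ti / cores_3060:.1%}")  # -> 93.3%
```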
 
If the 3050 is a 90W card, will this gen not have a <75W (i.e., no PCIe power plug required) model? If so, that might make 1650s highly desirable. (Those are the best PCIe slot-powered GPUs still, yes?)

There are still numbers left. We could see a 3040 built if there was actually a market for one. The question is whether it would be fast enough over an integrated GPU/APU to get people to buy one.

One of the primary markets for lower-end cards used to be OEM machines. When they sold those cards by the millions in Dells and HPs for businesses, there was plenty of reason to make the card. Most of the OEMs I see now don't have discrete graphics, so the volume they've been selling those cards at has likely tanked over the past few years.

That said, just looking at this list, you see there is only one GA107 card. They could likely do a cut GA107-200 if they are already making those parts, so it does seem like there is a chance we'll see an RTX 3040. The thing that I've been saying since this series released is that they had to make room for more numbers at the top, so they pushed everything down. The 3060 Ti is really a 3070, so that would also lend itself to them making another card that slots below an xx50 card in this generation.
 
I'm guessing the RTX 3060 needs the extra memory to really justify the price point they want to slot it at. Not that the card needs it, but because it allows them to charge more for it. The 3050 Ti looks to be within a few percent of a 3060 on paper (93.3% of the raw shader count), so some vendor-OC'ed 3050 Tis might trade blows with a stock 3060.
Nah, the 3060 is poised to hit the perfect spot for enthusiast- or student-level production work. Pair it with the creator drivers and it has enough memory and all the features needed for all the major software suites, while being fast enough to get the job done in a reasonable time frame.
 