Of Nanometers and FinFETs

Do you have a manufacturer process preference?

  • I prefer 16nm FinFET GPU for next gen

    Votes: 4 4.1%
  • I prefer 14nm FinFET GPU for next gen

    Votes: 12 12.4%
  • Manufacturer process is not part of my buying decision

    Votes: 81 83.5%

  • Total voters
    97

Brent_Justice

Moderator
Joined
Apr 17, 2000
Messages
17,755
Just an interesting poll idea I had. Rumors are rumors, and should not be taken as fact. I do not know what the next gen GPUs will be based on or who is fabbing them.

There is a debate or question as to whether AMD and NVIDIA GPUs will use 16nm FinFET or 14nm FinFET. Given the fact that 14nm exists, one would assume it would be "better" for GPUs. Now, I know fab plants matter, and more mature processes can triumph; for example, a more mature 16nm process could potentially be more efficient than a badly executed 14nm process.

Current rumors suggest NVIDIA may be going with 16nm FinFET; with AMD it is up in the air, but rumors could point to 14nm FinFET. Again, very much rumor.

Given the choice between a 16nm FinFET GPU or a 14nm FinFET GPU would that sway your decision to buy a GPU?

A.) I would prefer a 16nm FinFET GPU in the next gen
B.) I would prefer a 14nm FinFET GPU in the next gen
C.) I do not include manufacturing process into my buying decision

It would be interesting to suss out what the general public thinks about process size, which could be different next gen. I said could; this is totally based on rumors.
 
I'd rather have a 14nm, but I'm not buying a shitty AMD card over that. AMD's so cheap that it probably wouldn't even be able to connect to my monitor! This is coming from a current 7970 owner. No more AMD cards for me.
 
Manufacturing process is not a part of my buying decision. It is not the only factor that determines the performance or power consumption. For example, a mature 16nm may result in better chips than a new problematic 14nm that suffers from leakage issues.

So it's not a very useful metric for us consumers. It's the end result that matters: actual performance and power consumption figures.

By the way, if I'm not mistaken, these companies have different ways of measuring these 16nm/14nm transistors, don't they?
 
For those that are actually going to state that one process is better than another, I'd really hope they are able to explain why, along with some type of source for the information they are using to reach that conclusion.

Hopefully by taking a moment to consider the above they will then realize that they cannot actually determine which is better and refrain from doing so.
 
I couldn't care less about what nanometre process the GPU is made from.

If 14nm performs worse than 16nm, then there is zero point in choosing 14nm.

Performance is my only metric
 
Performance is my main metric, with noise a distant second (both my kids are set up in the computer room, so headphones are a must unless I want Minecraft videos mixed with my games), followed up last by power.

Luckily I've moved on, so the need to drive a CRT is a non issue for me, but I look forward to watching rabidz lose his mind over next years cards as well.
 
I couldn't care less about what nanometre process the GPU is made from.

If 14nm performs worse than 16nm, then there is zero point in choosing 14nm.

Performance is my only metric

This.
 
I voted that process is not part of my buying decision, but on occasion it can delay it.
If I know a new process is coming out I might hold out for it, but price vs performance (sometimes how loud and how much power) and how much performance I need are the main factors.

My preference of 14nm to 16nm has no bearing at all because I won't get to make that choice on a particular card.
If the process varies I will still hold performance as the bar.
 
Doesn't sway me one way or the other. The card that provides the better experience over all (we need an experience / $ metric!! :p ) wins my vote.
 
It's not like there is a major difference between 16nm and 14nm anyway.

Like it wasn't with 20 and 22nm

That said, most rumors point (and most likely, IMO) to both NVIDIA and AMD going with 16nm for GPUs, and AMD going with 14nm for CPUs and NVIDIA for Tegra.
 
[Image: Cell-SizeComparison.png]


Are you talking about "16nm" aka 20nm + FinFet aka PR FUD?
 
Just an interesting poll idea I had. Rumors are rumors, and should not be taken as fact. I do not know what the next gen GPUs will be based on or who is fabbing them.

There is a debate or question as to whether AMD and NVIDIA GPUs will use 16nm FinFET or 14nm FinFET. Given the fact that 14nm exists, one would assume it would be "better" for GPUs. Now, I know fab plants matter, and more mature processes can triumph; for example, a more mature 16nm process could potentially be more efficient than a badly executed 14nm process.

Current rumors suggest NVIDIA may be going with 16nm FinFET; with AMD it is up in the air, but rumors could point to 14nm FinFET. Again, very much rumor.

Given the choice between a 16nm FinFET GPU or a 14nm FinFET GPU would that sway your decision to buy a GPU?

A.) I would prefer a 16nm FinFET GPU in the next gen
B.) I would prefer a 14nm FinFET GPU in the next gen
C.) I do not include manufacturing process into my buying decision

It would be interesting to suss out what the general public thinks about process size, which could be different next gen. I said could; this is totally based on rumors.

Actually, you got the answer there, Brent: as you said, a mature 16nm could potentially perform better. So that's my choice; I would prefer 16nm if that really means it will perform better than 14nm. This isn't always something I look at when buying a new card, but if your scenario turned out to be true, then I would always choose 16nm.
 
Manufacturing process is not a part of my buying decision. It is not the only factor that determines the performance or power consumption. For example, a mature 16nm may result in better chips than a new problematic 14nm that suffers from leakage issues.

So it's not a very useful metric for us consumers. It's the end result that matters: actual performance and power consumption figures.

By the way, if I'm not mistaken, these companies have different ways of measuring these 16nm/14nm transistors, don't they?

This matches my perspective. I'll look at the overall system-wide performance. The manufacturers themselves are privy to the process and how it will affect their design decisions. I look at the output.

And, yes, there are different metrics for node sizing. Gate size and Metal 1 pitch are the relevant ones. https://www.semiwiki.com/forum/content/3884-who-will-lead-10nm.html
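To illustrate why the "14nm" and "16nm" labels don't line up with a single measured feature, here is a sketch comparing roughly reported gate pitch and metal pitch figures for the competing FinFET nodes. The numbers are approximate values from public press and foundry disclosures, used purely for illustration; check the foundries' own documents for exact figures.

```python
# Approximate publicly reported dimensions for competing FinFET nodes.
# (Figures are illustrative, not authoritative.)
nodes = {
    # name: (contacted gate pitch in nm, minimum metal pitch in nm)
    "TSMC 16FF+":    (90, 64),
    "Samsung 14LPP": (78, 64),
    "Intel 14nm":    (70, 52),
}

for name, (gate_pitch, metal_pitch) in nodes.items():
    print(f"{name:14s} gate pitch ~{gate_pitch}nm, metal pitch ~{metal_pitch}nm")
```

Note that two "different" node names can share the same metal pitch while differing in gate pitch, which is exactly why the marketing name alone tells you little.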
 
I care only about result - technology is interesting in itself but can't make up for missing framerate.
 
As a technology enthusiast I find the different manufacturing processes interesting but ultimately I care about the end product and what it delivers.
 
Doesn't matter. What does matter is the three pees:

Price,
Performance,
Power.

I would add features to that list. These days a video card should be able to do more than just push an FPS meter.

It remains to be seen what the new manufacturing processes will offer.
 
Doesn't matter. What does matter is the three pees:

Price,
Performance,
Power.

I'm with this guy. Might as well tack on reliability too.

I don't care if it's something out of the ghetto mod thread; if it's superior, I'll buy it.
 
I buy based on what's best for my usage, based on real-world testing. That said, if there were a new cycle of cards coming out in 2-3 months or less that was going to be using something new, and I was just looking for an upgrade rather than something I needed at the moment, I would probably hold off to see if there were any major improvements. Worst case, I could get the card I was planning on all along at a possibly lower price.
 
Doesn't matter. What does matter is the three pees:

Price,
Performance,
Power.
Power = efficiency or watts used? Then yes, I completely agree.

I always thought size matters the most but Nvidia's Maxwell changed my mind about that.
 
I would add features to that list. These days a video card should be able to do more than just push an FPS meter.

It remains to be seen what the new manufacturing processes will offer.

Isn't that a given? The new cards have to have the same or better features than the old card, or they won't sell :D

I was referring specifically to the impact of the process node.
 
Power = efficiency or watts used? Then yes, I completely agree.

I always thought size matters the most but Nvidia's Maxwell changed my mind about that.

It's why I included the three together. They're not independent variables, so yes I mean performance/dollar, performance/watt (and any other combination of those).
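Those combined metrics are easy to compute once reviews publish the raw numbers. A minimal sketch, with made-up card names, FPS, prices, and board powers purely for illustration:

```python
# Toy illustration of combined metrics: performance/dollar and performance/watt.
# All names and numbers below are hypothetical, not real products.
cards = [
    # (name, avg_fps, price_usd, board_power_w)
    ("Card A", 60.0, 330.0, 165.0),
    ("Card B", 72.0, 430.0, 250.0),
]

for name, fps, price, watts in cards:
    print(f"{name}: {fps / price:.3f} fps/$  {fps / watts:.3f} fps/W")
```

With numbers like these, one card can win on fps/$ while the other wins on fps/W, which is the point about the variables not being independent.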
 
Manufacturing process is not a part of my buying decision. It is not the only factor that determines the performance or power consumption. For example, a mature 16nm may result in better chips than a new problematic 14nm that suffers from leakage issues.

So is it's not a very useful metric for us consumers. It's the end result that matters, actual performance and power consumption figures.

By the way, if I'm not mistaken, I think these companies have different ways of how they measure these 16nm/14nm transistors isn't it?

This. I don't care what process so long as the end result meets my wants, i.e. power/temp/noise/performance.
 
My metric is a combination of performance/price (with performance meaning the most) and applied features (a feature that no (or just one) developer uses is irrelevant for me).
 
Are the people who believe one process is "better" than the other, such as the 8 that voted for one of them, willing to and able to explain the rationale behind their choice?
 
Perf/price, with heavier weighting towards performance. 0 fucks given about nanometers.
 
I want it to be 14nm because I am a nerd, but I agree with the rest. It's all about performance.
 
Please explain.

OK, I'll break the simple statement down for you.
I like the advancement of technology, be it software or hardware. I love knowing we are moving forward regardless of what area of technology it may be.
Are we good now, buddy?
 
OK, I'll break the simple statement down for you.
I like the advancement of technology, be it software or hardware. I love knowing we are moving forward regardless of what area of technology it may be.
Are we good now, buddy?

More advanced how?

I'm sorry if this seems harsh, but I'm trying to illustrate why people's opinions on one or the other are extremely flawed.

Maybe there are a few industry people on here, but otherwise why do people feel they have the information to actually make any remotely informed comparison?
 
The reduction in size allows more transistors, which gives chip engineers more room to play with.
How they end up using that is beyond me. The basics of a smaller process node are not lost on me. And the reality that a smaller-process-node GPU could still suck is not lost on me either.
Also, if someone is excited by these things, then why judge?
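To put a rough number on that intuition: if the node names were literal linear dimensions (which, as this thread notes, they are not), area per transistor would scale with the square of the linear shrink. A back-of-envelope sketch:

```python
# Back-of-envelope: IF "16nm" -> "14nm" were a real linear shrink
# (it is not; the names are marketing), area would scale by (14/16)^2.
area_ratio = (14 / 16) ** 2          # each transistor takes ~77% of the area
density_gain = 1 / area_ratio - 1    # ~31% more transistors per mm^2
print(f"area ratio: {area_ratio:.3f}, density gain: {density_gain:.1%}")
```

That hypothetical ~31% density gain is the "more room to play with" argument; whether a real product sees it depends entirely on the actual process dimensions and yields.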
 
The reduction in size allows more transistors, which gives chip engineers more room to play with.
How they end up using that is beyond me. The basics of a smaller process node are not lost on me. And the reality that a smaller-process-node GPU could still suck is not lost on me either.
Also, if someone is excited by these things, then why judge?

But an immature 14nm process node can also suffer from heavy leakage, which can make it perform worse than a more mature 16nm. Of course, this is just theory.
 
But an immature 14nm process node can also suffer from heavy leakage, which can make it perform worse than a more mature 16nm. Of course, this is just theory.

The basics of a smaller process node are not lost on me. And the reality that a smaller-process-node GPU could still suck is not lost on me either.

I know you are just expanding on what I said, but I felt the need to bold that text in order to show that I am not clueless as to the realities behind process node shrinks.
 
I'm just going to state upfront this isn't personal but I'm just trying to drive a point across without being buried (which has already happened).

In your posts you basically express no certainty regarding how the actual characteristics of these two processes compare (which is fine; other than industry insiders, who would?). In that case, why do you feel that 14nm is more advanced? Just because of the public-facing name used? I think something that should be clear here is that 14nm and 16nm in this context are not from the same company, and so you cannot assume some sort of internal consistency.

As for your other point about people being excited about what they want, that is a completely fair point and I'm completely supportive of that. At the same time, however, I'd hope that they are perhaps open to being more informed. If one prefers 14nm because they feel 14nm > 16nm due to the name, they may want to re-examine that position and whether or not the reasoning behind it is sound.
 
As for your other point about people being excited about what they want, that is a completely fair point and I'm completely supportive of that. At the same time, however, I'd hope that they are perhaps open to being more informed. If one prefers 14nm because they feel 14nm > 16nm due to the name, they may want to re-examine that position and whether or not the reasoning behind it is sound.

Totally agree.
 
Do I really need to go over all that each time? Factum already posted about this exact thing.
 
We've been on 28nm so long that I won't be basing new GPU purchases on strictly 14 vs 16. What I will be basing it on is availability.

If each company goes a different direction, but one suffers stock issues due to yield, then that helps make my decision. There are few things more annoying than quantity issues.
 