AMD Demonstrates Revolutionary 14nm FinFET Polaris GPU Architecture

HardOCP News

AMD provided customers with a glimpse of its upcoming 2016 Polaris GPU architecture, highlighting a wide range of significant architectural improvements including HDR monitor support and industry-leading performance-per-watt. AMD expects shipments of Polaris architecture-based GPUs to begin in mid-2016. AMD's Polaris architecture-based 14nm FinFET GPUs deliver a remarkable generational jump in power efficiency. Polaris-based GPUs are designed for fluid frame rates in graphics, gaming, VR and multimedia applications running on compelling small-form-factor thin and light computer designs.

“Our new Polaris architecture showcases significant advances in performance, power efficiency and features,” said Lisa Su, president and CEO, AMD. “2016 will be a very exciting year for Radeon™ fans driven by our Polaris architecture, Radeon Software Crimson Edition and a host of other innovations in the pipeline from our Radeon Technologies Group.” The Polaris architecture features AMD's 4th-generation Graphics Core Next (GCN) architecture, a next-generation display engine with support for HDMI® 2.0a and DisplayPort 1.3, and next-generation multimedia features including 4K H.265 encoding and decoding. AMD has an established track record for dramatically increasing the energy efficiency of its mobile processors, targeting a 25x improvement by the year 2020.
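For a sense of scale on that 25x target: assuming it is measured against a 2014 baseline (the year AMD announced the goal; the release above doesn't say), a 25x gain by 2020 compounds to roughly 25^(1/6) ≈ 1.7x in performance-per-watt per year.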
 
'Bout bloody time.

I wonder if the 3D transistors will do better with overclocking than Ivy Bridge did.
 
Given AMD's PR history of promising insane GPUs that turn out not as good as claimed, I wouldn't take this at face value until it's proven by independent reviewers.
 
I'll believe it when they are reviewed by the most trusted 3rd party tech sites.
 
Given AMD's PR history of promising insane GPUs that turn out not as good as claimed, I wouldn't take this at face value until it's proven by independent reviewers.

They both do the same shit. E.g., the roughly 10x-faster-than-last-gen claim by Nvidia.
I'll wait until the reviews hit.
 
I'm willing to believe the claims.

The Nano was a taste of things to come, running on the first version of HBM and on 28nm. HBM2, 14nm, and the first real architecture change in years should show massive performance gains across the board.
 
"Demonstrates" does not mean "talks about".

This was my first thought too when reading this AMD PR release: I was wondering when the definition of "demonstrate" changed. Think I'll go "demonstrate" to my wife how I've fixed the leak in the sink, i.e. talk about how I might at some point go about the job -- that should go over well :D

Echoing others, I'll believe it when it ships. I have high hopes for FinFET (more due to the die shrink than any architectural magic), but have very, very low expectations for the delivery date. Wouldn't be surprised if we don't see anything hit the channel (from either the green or red team) until late 2016.
 
I like AMD, but I'm an Intel/Nvidia guy for life. I learned about FinFET technology from Maximum PC.
Didn't some of the brain power go to AMD recently, then back to Intel, then back to AMD?

Cut/paste:

In current usage the term FinFET has a less precise definition. Among microprocessor manufacturers, AMD, IBM, and Freescale describe their double-gate development efforts as FinFET[10] development, whereas Intel avoids using the term to describe their closely related tri-gate transistors.
 
In my dictionary, "demonstrates" means showing a working example, not speaking about what they hope it will be.
 
They both do the same shit. E.g., the roughly 10x-faster-than-last-gen claim by Nvidia.
I'll wait until the reviews hit.

At least when Nvidia says something, it mostly is true and lives up to the claims they make, unlike AMD. Case in point: the Fury X being 30% faster in FPS than the 980 Ti? Oh wait, that was true, I guess, if you used AMD's custom settings; kinda like what Apple used to do back in the day to claim their hardware was faster.
 
I like AMD, but I'm an Intel/Nvidia guy for life. I learned about FinFET technology from Maximum PC.
Didn't some of the brain power go to AMD recently, then back to Intel, then back to AMD?

Cut/paste:

In current usage the term FinFET has a less precise definition. Among microprocessor manufacturers, AMD, IBM, and Freescale describe their double-gate development efforts as FinFET[10] development, whereas Intel avoids using the term to describe their closely related tri-gate transistors.

Does it matter how people market/define a production process? What is important are the benefits. Clearly the process allows AMD to make better use of their technology; does it really matter how the manufacturing process is described in the smallest details?

AMD does not have its own fab, so any people moving around would not have anything to do with the process they are using. They have been dependent on GlobalFoundries or TSMC, and now Samsung...
 
At least when Nvidia says something, it mostly is true and lives up to the claims they make, unlike AMD. Case in point: the Fury X being 30% faster in FPS than the 980 Ti? Oh wait, that was true, I guess, if you used AMD's custom settings; kinda like what Apple used to do back in the day to claim their hardware was faster.

It seems you've all conveniently forgotten the *cough* 4GB *cough* Nvidia 970 already. And the 2MB L2 cache... wait, I mean 1.75MB L2 cache and 64 ROPs... No, hold on... 56 ROPs it contained? For 4½ months after release, until they were called on the carpet about it.

Try to keep it real, folks.
 
"Demonstrates" does not mean "talks about".

This was my first thought too when reading this AMD PR release: I was wondering when the definition of "demonstrate" changed. Think I'll go "demonstrate" to my wife how I've fixed the leak in the sink, i.e. talk about how I might at some point go about the job -- that should go over well :D

AnandTech - For their brief demonstration, RTG (Radeon Technology Group) set up a pair of otherwise identical Core i7 systems running Star Wars Battlefront. The first system contained an early engineering sample Polaris card, while the other system had a GeForce GTX 950 installed.

How was that not a demonstration?
 
Weird display to show power consumption. I wanna see a top-end card uncrippled... How does playing SWBF with vsync at 60Hz, medium preset @ 1080p show much of anything? That just demonstrates power-saving abilities. That doesn't demonstrate power consumption under a fully loaded (real-world) scenario.
C'mon AMD, get yer shit together.
 
Weird display to show power consumption. I wanna see a top-end card uncrippled... How does playing SWBF with vsync at 60Hz, medium preset @ 1080p show much of anything? That just demonstrates power-saving abilities. That doesn't demonstrate power consumption under a fully loaded (real-world) scenario.
C'mon AMD, get yer shit together.

It's easier to start with a smaller die on a new process node to test proof of concept. For years that was the standard. Trying to work out gating and signal-leak issues on a large, complex die takes much longer.

For a long time we were stuck at 28nm, and all those kinks were worked out, so successive generations went out full bore on size, and then later offered lower-powered cards with disabled compute units/shaders based on the full-size dies with defects.
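A rough sketch of why small dice come first, using the classic Poisson yield model (the defect densities below are made up for illustration, not AMD or foundry data):

```python
import math

def die_yield(area_mm2: float, defects_per_cm2: float) -> float:
    """Poisson yield model: fraction of dice with zero defects."""
    return math.exp(-(area_mm2 / 100.0) * defects_per_cm2)

# A hypothetical immature process at 0.5 defects/cm^2:
for area in (100, 300, 600):
    print(f"{area} mm^2 die: ~{die_yield(area, 0.5):.0%} defect-free")
# 100 mm^2 -> ~61%, 300 mm^2 -> ~22%, 600 mm^2 -> ~5%
```

Big dice fall off a cliff on a dirty process, which is also why the defective large dice get resold as cut-down parts once a node matures.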
 
It's easier to start with a smaller die on a new process node to test proof of concept. For years that was the standard. Trying to work out gating and signal-leak issues on a large, complex die takes much longer.

For a long time we were stuck at 28nm, and all those kinks were worked out, so successive generations went out full bore on size, and then later offered lower-powered cards with disabled compute units/shaders based on the full-size dies with defects.

According to the video, this was demoed live on 29 Dec 2015. AMD has stated somewhere that mid-2016 is the availability for these cards. If that is indeed the case, the high-end stuff has already taped out and is at OEMs for design and engineering samples. This information is what has me boggled. Either way, having these two cards running without vsync would have made more sense. Then again, that would have shown us some performance, and they don't wanna do that yet.
 
According to the video, this was demoed live on 29 Dec 2015. AMD has stated somewhere that mid-2016 is the availability for these cards. If that is indeed the case, the high-end stuff has already taped out and is at OEMs for design and engineering samples. This information is what has me boggled. Either way, having these two cards running without vsync would have made more sense. Then again, that would have shown us some performance, and they don't wanna do that yet.

If you are pegged at 60fps, why would you want to engage tearing and non-adaptive frame syncing?

That was not the target audience for the chip. They were demoing for laptop purposes, to show power efficiency.
 
And early engineering samples go out to OEMs all the time. It doesn't mean they have stable boards ready. The high-end parts are rumored to use HBM2. Maybe that isn't ready yet.
 
I bet they dropped a lot of the double-precision FP support, as no one is using it except compute workstations. That additional circuitry really eats into a power budget.
 
I bet they dropped a lot of the double-precision FP support, as no one is using it except compute workstations. That additional circuitry really eats into a power budget.

Yep. Hugely wasted. Heck, a lot of things could be done at half-FP precision.
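To put rough numbers on that trade-off, here's a minimal NumPy sketch (illustrative only, not tied to any GPU's actual FP units) comparing half, single, and double precision:

```python
import numpy as np

# Machine epsilon: smallest representable step next to 1.0.
for dtype in (np.float16, np.float32, np.float64):
    info = np.finfo(dtype)
    print(f"{np.dtype(dtype).name}: eps={info.eps}, max={info.max}")

# A color/shading value survives half precision with ~3 decimal
# digits of accuracy, plenty for display output:
c = np.float16(0.7071)
print(float(c))  # 0.70703125: rounding shows in the 4th digit
```

Double-precision units spend die area and power on accuracy that graphics output can't even display, which is why graphics-focused chips cut FP64 throughput way down.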

And from what I've read, the high-end Polaris chips will be a while in coming. Smaller dice to start. Only Intel has demonstrated dice close to or at the reticle limit at 14nm, so it may be a little while until Samsung/GF's and TSMC's processes are up to snuff.
 
Stupid question time: is this a name change? I thought AMD's next generation was going to be the Arctic Islands architecture with Greenland, Ellesmere and Baffin chips. Is Polaris the new name for Arctic Islands, then?
 
Stupid question time: is this a name change? I thought AMD's next generation was going to be the Arctic Islands architecture with Greenland, Ellesmere and Baffin chips. Is Polaris the new name for Arctic Islands, then?

Perhaps it is simply the name they are giving to the process itself?
 
Nah, Polaris is referring to how cold it is outside AMD's Commerce Valley Drive offices ;)
 
As an Nvidia user, I wish AMD the very best with Polaris. Having competition keeps everybody honest, especially with pricing. If AMD dies, Nvidia will raise prices again and use the same chip design for years, stagnating the market.
 
As an Nvidia user, I wish AMD the very best with Polaris. Having competition keeps everybody honest, especially with pricing. If AMD dies, Nvidia will raise prices again and use the same chip design for years, stagnating the market.

They can't raise prices in such a manner. Price gouging is illegal in the U.S., and I'm sure even more so in the EU. The market may become stagnant, but someone will buy the Radeon group if something does happen to AMD.
 
I believe that 2016 is going to be an exciting year in the computer technology industry! New video cards on new processes, new CPUs from both sides, DX12 advancements, storage advancements, and I am sure many more things I cannot think of at the moment.
 
They can't raise prices in such a manner.
Um, they already have. $1,000 Titans. $3,000 Titan Z. They can ask anything they like for their products. The only way to guarantee lower pricing is for people to stop buying them. No sales = no revenue = price reductions.
 
Price gouging is illegal in the U.S.

Emm... without competition, proving price gouging is a weeeee bit more difficult. Amongst the many other issues it creates, like R&D slowdown... etc.

Without competition, would Apple's very high profit margins be considered price gouging? In a competitive environment it isn't, so what would price gouging be objectively evaluated at?
 
Um, they already have. $1,000 Titans. $3,000 Titan Z. They can ask anything they like for their products. The only way to guarantee lower pricing is for people to stop buying them. No sales = no revenue = price reductions.

Ultra-top-end products don't count, and never have. That's like claiming price gouging on a Ferrari.
 
Emm... without competition, proving price gouging is a weeeee bit more difficult. Amongst the many other issues it creates, like R&D slowdown... etc.

Without competition, would Apple's very high profit margins be considered price gouging? In a competitive environment it isn't, so what would price gouging be objectively evaluated at?

It's evaluated on market pricing that can be trended, and the data is there. It's when you have competition, remove it, and then spike the price because you can.
 
It's evaluated on market pricing that can be trended, and the data is there. It's when you have competition, remove it, and then spike the price because you can.

How about a downward spike in costs without lowering MSRP?
 
Price gouging is a pejorative term referring to when a seller spikes the prices of goods, services or commodities to a level much higher than is considered reasonable or fair, and is considered exploitative, potentially to an unethical extent. Usually this event occurs after a demand or supply shock: common examples include price increases of basic necessities after hurricanes or other natural disasters. In precise, legal usage, it is the name of a crime that applies in some jurisdictions of the United States during civil emergencies. In less precise usage, it can refer either to prices obtained by practices inconsistent with a competitive free market, or to windfall profits. In the former Soviet Union, it was simply included under the single definition of speculation.

Video cards are hardly a necessity.
 
Nah, Polaris is referring to how cold it is outside AMD's Commerce Valley Drive offices ;)
With Polaris actually being a star, it must refer to how it can burn your house down with how hot it gets :p.
 