More details on AMD graphics refresh for 2015 and new tech in 2016.

If the next cards AMD rolls out aren't 20nm, they're losing me as a customer. End of story.
All of this bantering back and forth from the rumor mill isn't helping the situation.

AMD may indeed be preparing 20nm GPUs for 2015.
Didn't AMD themselves confirm this a long time ago?
"20nm in 2015" could mean 1 year from today. Obviously they're preparing 20nm for that time frame.
 
If the next cards AMD rolls out aren't 20nm, they're losing me as a customer. End of story.
All of this bantering back and forth from the rumor mill isn't helping the situation.


Didn't AMD themselves confirm this a long time ago?
"20nm in 2015" could mean 1 year from today. Obviously they're preparing 20nm for that time frame.
You realize that the GPU manufacturers' hands are tied with regard to process node availability, right? Neither AMD nor NVIDIA owns a fab, so they have to rely on what TSMC and GloFo make available, as well as compete with other vendors for production slots.
 
You realize that the GPU manufacturers' hands are tied with regard to process node availability, right? Neither AMD nor NVIDIA owns a fab, so they have to rely on what TSMC and GloFo make available, as well as compete with other vendors for production slots.
Sure, but if AMD puts out another 28nm series of cards it means Nvidia will be beating them to the punch. Unless you think AMD will launch the 300 series as 28nm, and then another 300/400 refresh on 20nm *before* Nvidia does anything?
I'm all for supporting underdogs but I refuse to support incompetence.

At this point (well, February) a new series on 3-year-old hardware is an insult.

We already knew the GTX 900 series would be a stopgap. And Nvidia has already beaten them by 5-6 months.
 
So if it turns out faster and cheaper but 28nm, you will still be disappointed. From my understanding GM200 will also be 28nm, so I'm not sure what the issue is. Both AMD and NVIDIA will be using 3-year-old technology.
 
If the next cards AMD rolls out aren't 20nm, they're losing me as a customer. End of story.
All of this bantering back and forth from the rumor mill isn't helping the situation.


Didn't AMD themselves confirm this a long time ago?
"20nm in 2015" could mean 1 year from today. Obviously they're preparing 20nm for that time frame.

So go get 20nm stuff from Nvidia.
 
AMD wanted to build Hawaii on GloFo's 28nm process, but it wasn't ready. It's my understanding that it's ready now and will offer 20-25% of what 20nm would give across the board, just from using it and not counting anything AMD adds to the design for speed or power savings.

Now think about the 290X offering 20-25% more performance and power savings on GloFo 28nm just by switching from TSMC, then scale that up to full-fat Hawaii, and you get the 380X from those leaked benchmarks. :D
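
A minimal back-of-envelope sketch of the arithmetic in that post, purely for illustration: it assumes the claimed 20-25% uplift and a normalized 290X baseline, and says nothing about what a real 380X would actually do.

```python
# Hypothetical back-of-envelope math for the claim above.
# All numbers are assumptions taken from the post, not measurements.

r9_290x_relative_perf = 1.00   # 290X as the baseline (normalized)
process_uplift_low = 0.20      # claimed low end of the GloFo 28nm gain
process_uplift_high = 0.25     # claimed high end

low = r9_290x_relative_perf * (1 + process_uplift_low)
high = r9_290x_relative_perf * (1 + process_uplift_high)

print(f"Hypothetical '380X' range: {low:.2f}x to {high:.2f}x of a 290X")
# -> 1.20x to 1.25x, before counting any architectural changes
```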
 
I think people get too hung up on the nuts and bolts... Really: if it runs my games faster and doesn't turn my room into a sauna, I'm happy! I don't care if it's made on 60nm; if it beats out the old tech, what does it matter?
 
I think people get too hung up on the nuts and bolts... Really: if it runs my games faster and doesn't turn my room into a sauna, I'm happy! I don't care if it's made on 60nm; if it beats out the old tech, what does it matter?

+1

If anything, let's say a 60nm chip performs better than a 28nm chip, generates less heat and eats less electricity, how would being made on 60nm actually matter?

Seriously, I don't understand why people even get hung up about the need to go to a lower process node. Sometimes optimising old tech gives better results than always shooting for new tech (we are going to hit a brick wall with silicon pretty soon; I'd rather they hit it later than earlier).

If the 3-year-old tech works better than the 6-month-old tech, why the rush to get the 6-month-old tech out?
 
The reason to rush out the 6-month-old tech is so hopefully I can have the performance of a 290X, the power usage of a 270X, and it will only be 6.7 inches long. I am starting to get tired of these super long, power-hungry graphics cards just so we can have some great graphics. You would think that with shrinking die sizes they could start implementing some serious power savings.
 
The reason to rush out the 6-month-old tech is so hopefully I can have the performance of a 290X, the power usage of a 270X, and it will only be 6.7 inches long. I am starting to get tired of these super long, power-hungry graphics cards just so we can have some great graphics. You would think that with shrinking die sizes they could start implementing some serious power savings.

Shrinking only gets you so far.
Trying to throw 2x more transistors into a GPU but only getting 20-30% less power due to the shrink means that power consumption increases. Not to mention the potential increase in leakage and just the general complications that come with a new process.

The key issue is cost right now. 20nm and lower is going to drastically increase the cost of designing and manufacturing ASICs until a major breakthrough happens.
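
Roughly the arithmetic behind the transistor-budget point above, as a hedged sketch; the 2x transistor count and 20-30% per-transistor power reduction are the post's assumptions, not real process data.

```python
# Sketch of why doubling transistors can outrun a node's power savings.
# Figures are the assumptions from the post, not real process numbers.

transistor_scale = 2.0              # "throw 2x more transistors into a GPU"
power_saving_per_transistor = 0.25  # midpoint of the quoted 20-30% shrink benefit

relative_power = transistor_scale * (1 - power_saving_per_transistor)
print(f"Relative power vs. the old chip: {relative_power:.2f}x")
# -> 1.50x: total power goes up even though each transistor uses less
```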
 
The reason to rush out the 6-month-old tech is so hopefully I can have the performance of a 290X, the power usage of a 270X, and it will only be 6.7 inches long. I am starting to get tired of these super long, power-hungry graphics cards just so we can have some great graphics. You would think that with shrinking die sizes they could start implementing some serious power savings.

Well, let's face it: the 270 already exists, and it's more powerful than the 6970 yet uses a lot less power. But AMD then dumped all that TDP headroom of the GCN arch into a card just as power-hungry (if not more), even though the 7870/270 is already faster. And that is why no die shrink will ever be 'the one' that eliminates the big, power-hungry cards. If you make an arch that is ~30% more efficient, you release the efficient card AND the monster that is 30% faster while using the same power, heat and size.

'But Nvidia!' Well, Nvidia have an ace up their sleeve, I just know it. There is no way they have all that TDP headroom and have no designs to take advantage of it.
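
A small sketch of the trade-off described above: a ~30% efficiency gain can be cashed in as lower power at the same performance, or as more performance at the same power. The 250W baseline is just an illustrative assumption.

```python
# Two ways to spend a hypothetical ~30% perf/W improvement.
# Baseline TDP is an illustrative assumption, not a real spec.

baseline_tdp_w = 250.0
baseline_perf = 1.00            # normalized performance
efficiency_gain = 0.30          # assumed ~30% better perf/W

# Option 1: same performance, lower power
efficient_card_tdp = baseline_tdp_w / (1 + efficiency_gain)

# Option 2: same power, higher performance ("the monster")
monster_perf = baseline_perf * (1 + efficiency_gain)

print(f"Efficient card: ~{efficient_card_tdp:.0f} W for the same performance")
print(f"Monster card:  ~{monster_perf:.2f}x performance at {baseline_tdp_w:.0f} W")
```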
 
Guys, forget about GPUs being on 20nm.

The only thing that will be 20nm is the PS4 and Xbox One stuff. Currently the yields SUCK on the larger dies, making it nearly impossible to mass-produce them.

AMD's new cards slated for 2015 will likely be on GlobalFoundries' 28nm FD-SOI. You heard correctly: an SOI-made GPU instead of bulk. Now, before you start ranting about it being 28nm: GlobalFoundries' 28nm FD-SOI has the yields of 28nm (really good) but performance and efficiency matching those of a 20nm process.


 
My question is: what are the limits of chip size, and will density go up?
Hawaii is extremely dense, but I want more.
 
Correct. It is said that 28nm FD-SOI has the fastest transistors out there, even faster than Intel's.

This will be a new era for AMD, and as they said, there will be no let-up on performance from here on. Nvidia will soon find out.
 
Rory Read bungled every single release during his tenure. It's hurt AMD's market penetration. I believe Lisa Su will prove to be far more adept. It's kind of a tired story and a sensitive subject for some, but for a woman to succeed in a male-dominated industry she needs to be really, really good. I hate to put the pressure of her gender on her shoulders, and that's not what I'm trying to do, but I'll be very surprised if she doesn't do the best job of any AMD CEO in recent times. Time will tell.
 
The reason to rush out the 6 mo old tech is so hopefully I can have the performance of a 290x, the power usage of a 270x, and it will only be 6.7 in long. I am starting to get tired of these super long power hungry graphic cards just so we can have some great graphics. You would think that with shrinking die sizes they could start implementing some serious power savings.

But as soon as they give us that, we start wanting to run three 4K displays at 60 Hz or better,

and we are back to big, hot cards again.
 
Rory wanted to totally ditch the GPU side when he first came to the helm; he was not a GPU-centric CEO. That is not to say that he did anything for the CPU side either during his tenure. As for its APU biz, that's another story.
 
My question is: what are the limits of chip size, and will density go up?
Hawaii is extremely dense, but I want more.

Reticle limit should still be around 600 mm², depending on the shape of the ASIC.

I believe there was talk of some density benefits compared to TSMC 28HPM, but that will also depend on the libraries used and the design itself.

The main benefit is the lower cost compared to 20nm while getting most of the power/performance benefits.
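
As a rough illustration of the reticle-limit point, here is a hedged sketch that checks a hypothetical die against the ~600 mm² figure quoted above; the die dimensions are invented for the example, and the real constraint also depends on the die's shape, not area alone.

```python
# Rough reticle-limit check for a hypothetical die.
# The ~600 mm^2 limit is the figure quoted above; the die sizes are invented.

RETICLE_LIMIT_MM2 = 600.0

def fits_reticle(width_mm: float, height_mm: float) -> bool:
    """Return True if the die area is under the assumed reticle limit."""
    return width_mm * height_mm <= RETICLE_LIMIT_MM2

# Example: hypothetical dies
print(fits_reticle(24.0, 24.0))   # True  (576 mm^2)
print(fits_reticle(26.0, 24.0))   # False (624 mm^2)
```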
 
Looks more like a typo; what kind of idiot would put an unannounced GPU on their LinkedIn profile?
 
If the next cards AMD rolls out aren't 20nm, they're losing me as a customer. End of story.
All of this bantering back and forth from the rumor mill isn't helping the situation.


Didn't AMD themselves confirm this a long time ago?
"20nm in 2015" could mean 1 year from today. Obviously they're preparing 20nm for that time frame.

http://www.fudzilla.com/news/graphics/36721-20nm-gpus-not-happening

Neither company looks like it is going to have 20nm GPUs.
 
Is that even something to brag about? :confused::confused::confused:
I thought they proposed LESS power consumption with the new chips.

Some people are going to need separate power supplies.

Well, if it's 2x as fast as a 290X, then 300W wouldn't bother many people.

Considering they are using 28nm, the really big question is how much performance it is going to put out vs. NV's offering. Most people will ignore power usage if the performance is high enough.
 
Is that even something to brag about? :confused::confused::confused:
I thought they proposed LESS power consumption with the new chips.

Some people are going to need separate power supplies.

That's why case fans and watercooling were invented. :)
 
They must have gotten back their fourth-quarter earnings, which they have to report in February. This will show what a dent Maxwell made in their bottom line in its first full quarter. Don't kid yourselves, they are teetering right now.
 
Is that even something to brag about? :confused::confused::confused:

If true, technically yes. But I doubt it is true, except for maybe a halo card with the hybrid cooler.
Remember, not all of the TDP is from the GPU chip itself; all the subsystems and power filters use power as well.

Some people are going to need separate power supplies.
Separate power supplies for a 300w TDP?
Remember Fermi? Remember Hawaii? Remember every dual GPU card ever made?
 
If true, technically yes. But I doubt it is true, except for maybe a halo card with the hybrid cooler.
Remember, not all of the TDP is from the GPU chip itself; all the subsystems and power filters use power as well.


Separate power supplies for a 300w TDP?
Remember Fermi? Remember Hawaii? Remember every dual GPU card ever made?

If the card is 300W and it's top-end, that's an improvement over their previous generation. It may not be winning any efficiency awards, but hey, that means the second tier down may only be 275W...
 
Wasn't there a Voodoo 5000 or something like that, towards the end of the 3dfx line, that had an external connector for power coming directly from an AC adapter, as at the time power supplies weren't really ramped up for that kind of wattage?

Anyway, even as a member of team green for years, I have been on team red since the 9700/9800 series. I'd love to see ATI come out swinging with something more powerful than Nvidia, not just on par for cheaper or whatever, to force Nvidia to counter and speed things up also.

Just better all around when both sides are one-upping each other more frequently.
 
Wasn't there a Voodoo 5000 or something like that, towards the end of the 3dfx line, that had an external connector for power coming directly from an AC adapter, as at the time power supplies weren't really ramped up for that kind of wattage?

Anyway, even as a member of team green for years, I have been on team red since the 9700/9800 series. I'd love to see ATI come out swinging with something more powerful than Nvidia, not just on par for cheaper or whatever, to force Nvidia to counter and speed things up also.

Just better all around when both sides are one-upping each other more frequently.

The 290X kinda did that, but let's face it: the 780 was a much better card for the time once prices dropped. The 780 Ti remains too overpriced.
 
That's why case fans and watercooling were invented. :)


lol seriously. People have become too babied these days to remember what we used to put up with in days of old. I mean god forbid we hear the slight whisper of a fan under constant full load for hours on end, or someone call the fire department if a chip designed for 98-105C runs hotter than 70C at any time, much less reaches 80C. Suddenly it's like the world is ending. :D
 
lol seriously. People have become too babied these days to remember what we used to put up with in days of old. I mean god forbid we hear the slight whisper of a fan under constant full load for hours on end, or someone call the fire department if a chip designed for 98-105C runs hotter than 70C at any time, much less reaches 80C. Suddenly it's like the world is ending. :D

I'm 46 and I've been "PC" gaming since the VIC-20. I bought some of the original "frag tape" from this site. I am all about cooling and modding to get the most out of my hardware.



If it can run off two six-pin connectors, I'll buy. Otherwise I'll wait till it's a little more efficient.
 
If it can run off two six-pin connectors, I'll buy. Otherwise I'll wait till it's a little more efficient.

300W doesn't mean it's inefficient if the perf/W is there.

I'll take a 300W card that performs close to 2x a 290X in a heartbeat.

Now make that happen, AMD...
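
A quick hedged calculation of the perf/W point above; the ~290W figure used for the 290X is an assumed round number for illustration, not an official spec, and the 2x performance is the poster's wish.

```python
# Hedged perf/W comparison for the wish above.
# The 290X board power here is an assumed round number, not an official spec.

r9_290x_power_w = 290.0   # assumption for illustration
r9_290x_perf = 1.00       # normalized baseline performance

wished_card_power_w = 300.0
wished_card_perf = 2.00   # "close to 2x 290X"

baseline_ppw = r9_290x_perf / r9_290x_power_w
wished_ppw = wished_card_perf / wished_card_power_w

print(f"perf/W improvement: {wished_ppw / baseline_ppw:.2f}x")
# -> ~1.93x: a 300W card is not inefficient if it delivers twice the work
```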
 