From ATI to AMD back to ATI? A Journey in Futility @ [H]

The sale of RTG to Intel wouldn't be that bad a thing to be honest. Yes Intel is a Shark Tank. HOWEVER

With declining CPU sales, Intel could be looking to expand their bottom line offerings to maintain profits. Intel has several advantages including advanced nodes that are at least a generation ahead of their competitors. Even if Intel only took back 50% of the market against NVIDIA, it would still generate competition and benefit us all.

But I thought the same thing with the AMD group...too bad AMD treated it like the bastard red-headed stepchild.
 
That's not how things work at Intel. You think getting wafer starts in the newest logic-process fabs is easy at Intel? Only if you are the x86 CPU guys. Funding, mindshare of management, resources of all kinds: it's hard as hell to get those at Intel if you aren't the main product line.

When was the last time an Intel acquisition didn't eventually just die or get sold off?

Their nodes are underused right now. Sales of new Skylake CPUs have been "disappointing."
 
It's clear that AMD is not firing on all cylinders, interesting to get a look into why that is.

Did you actually watch that video? The guy who made it doesn't even understand how the Turbo Boost 3 dynamic overclocking feature is designed to work and based on that ignorance makes a lot of very stupid statements.

Who cares what the guy is saying? Is the card hot or not? Is it throttling or not? Tom's is also seeing 80+ degrees and throttling after four minutes of gameplay unless the fan is at 100%. TPU also got similar results.

Anyway, this isn't supposed to be an nvidia thread and I did not mean for it to become one (i.e. to generate even one reply about the 1080). However, it's quite clear that the 1080 FE isn't the coolest of cards and does indeed throttle when put in a case. So, since Kyle specifically stated that Polaris is hotter than the 1080, and since we've seen that the 1080 (at least the FE) is quite a hot card, AMD is in big trouble if what Kyle heard is true.
 
I'll admit, I used to be a big AMD fan. From the K6 and K6-2, they were pretty decent, the K6-3, not so much. When the K7 (Athlon) came out, it was a game changer. They held on strong through the Athlon XP days, but then they started on the downward trend (imo). When I saw they bought ATi, I thought to myself, "this is going to be a clusterf*ck", and tbh, it has been. I loved Eyefinity, and had great hopes for it (still running a 7970), but with the lackluster performance of the past couple flagship cards compared to the nVidia monsters, I figured it was only a short amount of time before something like this came to light.

All I can hope is that something positive for ATi comes from this...I'd hate to see the last of the "big boys" of graphic cards fall and only nVidia remain. It's bad enough that it's a 2 party system (hmmm...sounds familiar elsewhere...), but to be down to a single, the consumer can only lose.
 
What happened to that apparent sample of Polaris running a Star Wars game at 86 watts vs. 140 watts when the two cards were capped at 60fps? There is no way I'd drop $350-600 for a video card, but $150 for a decently fast and highly efficient card to replace my 9600GT would be great. haha. If not AMD, hopefully Nvidia gets some lower end stuff out in their new manufacturing process relatively soon.
 
That's not how things work at Intel. You think getting wafer starts in the newest logic-process fabs is easy at Intel? Only if you are the x86 CPU guys. Funding, mindshare of management, resources of all kinds: it's hard as hell to get those at Intel if you aren't the main product line.

When was the last time an Intel acquisition didn't eventually just die or get sold off?
Well, I'm assuming Intel would be interested in replacing their iGPU with stronger ATI-based designs, so some form of ATI GPU would be produced attached to an x86 CPU at any given new process node. That GPU tech would then already be designed and working at that node, so expanding it out to a full discrete-sized chip would be less difficult, since a bunch of that work would already be done. Intel launched 14nm in 2014, so even giving the CPU team a full year of exclusive access to a new process node would mean we'd have seen 14nm GPUs some time last year. Hell, imagine if the 300 series and Fury X could have used Intel's old 22nm process instead of 28nm; that would have shaken some shit up.

But regardless of how possible it may or may not be politically inside Intel, a nerd can dream, and an Intel-fabbed GPU would be amaze-balls.
 
BTW that capsaicin video makes Jen Hsun look like a rockstar in comparison.
It's hard to watch. At least make it through the part where Lisa and Raja are "interacting."
 
No you do it only when you are angry, but hey it works just as well.
Angry is a very strong word. Trust me, if I have been angry about anything, it was the sink being clogged up in my bathroom that required me to get into the attic yesterday to drill into a vent pipe so I could get the damn snake down it. And then I was not angry, I was frustrated.

As for not being invited to the Macau trip, I could really give a shit less. I try my best NOT to go on those trips any more. I did Computex 11 years in a row...there is no business need for me to go back to China/Taiwan. Been there done that. Although I do love the Taiwanese food.
 
That's not how things work at Intel. You think getting wafer starts in the newest logic-process fabs is easy at Intel? Only if you are the x86 CPU guys. Funding, mindshare of management, resources of all kinds: it's hard as hell to get those at Intel if you aren't the main product line.

When was the last time an Intel acquisition didn't eventually just die or get sold off?

You are quite correct. I was laid off from Fab 11x last June (I was the guinea pig for this year's layoffs), but in my time at Intel it was damn near impossible to get silicon for anything other than production.

It was impossible to get sufficient numbers of wafers to even make process improvements.


I would be sad to see RTG fold up into the Intel wing, as Intel has a disastrous history of acquisitions. It would be the kiss of death.
 
This is terrible news.

Intel/Nvidia will surely have less reason to innovate and charge more for what they do deliver if AMD goes down. I hope they somehow right the ship!
 
I strongly question whether Intel gives a damn about discrete gaming GPUs. If they buy RTG, they'll use the IP and engineers to improve integrated graphics in their CPUs. And Nvidia will have zero competition.

You guys are making jokes about it being a one horse race now, but that really isn't true. AMD is poor competition, but it is competition. Losing AMD would be disastrous.
 
I strongly question whether Intel gives a damn about discrete gaming GPUs. If they buy RTG, they'll use the IP and engineers to improve integrated graphics in their CPUs. And Nvidia will have zero competition.

You guys are making jokes about it being a one horse race now, but that really isn't true. AMD is poor competition, but it is competition. Losing AMD would be disastrous.

If AMD falls, someone else will rise; the market is too attractive for just one player.
 
No matter, we will see soon enough either way. Keep in mind that P10 was NOT designed to compete with the GTX 1080 (officially or otherwise); many keep forgetting this part, hence all the "LOL" and "LMAO AMD sux" type comments. I very much doubt AMD would, or can afford to, release actual true information, and from what I have seen so far, Lisa Su and her team have been spot on with launches whenever they have officially stated anything since she took over the reins.


Spot on with launches???? The Fury X was supposed to crush the 980 Ti: fail. The Radeon Pro Duo was not even made available to reviewers, but PC Perspective found it to be a stuttery mess. They were supposed to be in the black a year ago and are STILL losing millions every quarter. Zen is now 2017, Vega 2017, Polaris has NO actual launch date, and now a reveal in MACAU???
Kyle may well be right!

So, where is the information you have about Vega and Zen being pushed to 2017? Link please or is that just FUD? Accuse me of being a Fanboy if you want but, I prefer proof, not just opinions, regardless of how much I may like reading them online.
 
What happened to that apparent sample of Polaris running a Star Wars game at 86 watts vs. 140 watts when the two cards were capped at 60fps? There is no way I'd drop $350-600 for a video card, but $150 for a decently fast and highly efficient card to replace my 9600GT would be great. haha. If not AMD, hopefully Nvidia gets some lower end stuff out in their new manufacturing process relatively soon.

Like Ivy Bridge, FinFET 3D transistors run efficiently at low voltage. But as soon as you try ramping the clocks, they get hot quickly, and power scales badly.
 
If AMD falls, someone else will rise; the market is too attractive for just one player.

Somebody would have to. I picture all the disenfranchised AMD fanboys roaming the landscape like a scene from The Walking Dead. I also think losing the ATi label was the beginning of the end, and I've owned ATi, Matrox, and nVidia cards FWIW.
 
Kyle, with NV coming out so strong out of the gate with 16nm... it raises a lot of questions about what exactly, in the name of (insert deity here), AMD/RTG was doing. My view is Lisa needs to get Raja and his staff in line... the ones that won't get in line should be shown the door.
 
Somebody would have to. I picture all the disenfranchised AMD fanboys roaming the landscape like a scene from The Walking Dead. I also think losing the ATi label was the beginning of the end, and I've owned ATi, Matrox, and nVidia cards FWIW.

Why would someone have to? If the cost of entry is too high and the profit is considered too low, why would someone enter a market that is, unfortunately, shrinking? (Not shrinking as in dying, but as in smaller than it once was.)
 
So, where is the information you have about Vega and Zen being pushed to 2017? Link please or is that just FUD? Accuse me of being a Fanboy if you want but, I prefer proof, not just opinions, regardless of how much I may like reading them online.

They confuse the datacenter Zen with the desktop Zen all the time ....
Kyle, with NV coming out so strong out of the gate with 16nm... it raises a lot of questions about what exactly, in the name of (insert deity here), AMD/RTG was doing. My view is Lisa needs to get Raja and his staff in line... the ones that won't get in line should be shown the door.

The problem is that this is the norm rather than something that almost never happens.
 
Intel acquiring a real GPU company that then gets help from Intel's engineers and direct access to Intel's top-of-the-line fabs would be fucking amazing. This would cause nvidia's asshole to seriously pucker hard.

Then we can see how far Nvidia/IBM will push their alliance until we have Skynet-grade AI controlling the market.
 
Can you clarify what you mean by Polaris being hotter?

Does Polaris 10 have higher power consumption than GP104? Or do you mean that it has worse performance-per-watt? (Or both?)

Or are you simply referring to the default target temperatures of the fan curves, and that Polaris runs hotter than the 82 degrees of the 1080 FE?
 
I guess going to the NVidia invite was a lot easier since you do not live far from there anyways.
I had to go since Brent had a prior family engagement to attend. But yeah, I drove down to Austin from Dallas. Lot faster than flying. :)
 
Can you clarify what you mean by Polaris being hotter?

Does Polaris 10 have higher power consumption than GP104? Or do you mean that it has worse performance-per-watt? (Or both?)

Or are you simply referring to the default target temperatures of the fan curves, and that Polaris runs hotter than the 82 degrees of the 1080 FE?
I am privy to more actual data than what is in the article but am not in a position to discuss it openly. So you will have to wait for a bit. Sorry.
 
There is way too much information here for anyone in their right head to claim this is just NVidia bias.

I was really pulling for AMD. I wanted the old days of competition again.


However, there is something interesting reading between the lines here.

A LOT of the issues seem to stem from AMDs graphics division. Let's say they pull off that coup and become ATI again, and then are acquired by Intel.

Then lets say (unless your sources tell you otherwise) that ZEN ends up being competitive for AMD - a success.


Could the actual net result be a surprise boon for AMD by cutting the dysfunctional dead weight and having a competitive processor?

And could that dysfunction that Intel then would bring in cause problems for their company, which could benefit AMD?

It's a stretch, but not an outrageous one. It's fathomable at least.
 
I thought this was a good article. Obviously a good reporter never reveals his/her exact sources, but "people who are in and have been around AMD" is good enough for me.

I wouldn't have a problem with AMD if they could dominate the medium/desktop market first and then put out the true enthusiast product, but if it runs hotter, slower, and uses more power than originally expected, this would definitely put a dent in this strategy and muck things up for the next while at least.

I have to say I'm disappointed, but I work with someone who used to work at AMD/ATI and he's pretty much confirmed a lot of what is said here.

Just asked him, and he says that this article has a lot of accuracy. Years ago when they outsourced design to Taiwan, most of the engineers they got rid of just ended up getting jobs at Team Green.

The end is NIGH

Sad.
 
BTW that capsaicin video makes Jen Hsun look like a rockstar in comparison.

Yah, I watched the whole 1080 presentation live. It may have been awkward and off the rails a few times but you could tell Jen was really proud of the 1080 and not just mouthing words.
 
But regardless of how possible it may or may not be politically inside intel, a nerd can dream, and an intel fabbed GPU would be amaze-balls.

But we have already seen what Intel does to the GPU market. When Micron had the Rendition Verite, Intel flooded the market with the i740 at a rather low price, and (you guessed it) Micron stopped making it. It seems they really don't give a flying fuck about graphics in general. If anything, it borders on what people would call monopoly behavior, and then Intel would get split up; the same thing would happen if AMD goes the way of the dodo. This works for Intel: weak competitor, happy investors...
 
The AMD ship started taking on water back in 2014. I personally broke free from AMD last year. With the 980ti and the 1080 I picked up this morning, I am never looking back.

I also heard a rumor that Microsoft is going with a Pascal 1060 variant for their Xbox One refresh. Microsoft wants to firmly take the performance crown back.
 
The AMD ship started taking on water back in 2014. I personally broke free from AMD last year. With the 980ti and the 1080 I picked up this morning, I am never looking back.

I also heard a rumor that Microsoft is going with a Pascal 1060 variant for their Xbox One refresh. Microsoft wants to firmly take the performance crown back.

Are you sure about the Microsoft thing? It wouldn't make much sense for them to completely redesign the hardware and rip out the APU.
 
There is way too much information here for anyone in their right head to claim this is just NVidia bias.

I was really pulling for AMD. I wanted the old days of competition again.


However, there is something interesting reading between the lines here.

A LOT of the issues seem to stem from AMDs graphics division. Let's say they pull off that coup and become ATI again, and then are acquired by Intel.

Then lets say (unless your sources tell you otherwise) that ZEN ends up being competitive for AMD - a success.


Could the actual net result be a surprise boon for AMD by cutting the dysfunctional dead weight and having a competitive processor?

And could that dysfunction that Intel then would bring in cause problems for their company, which could benefit AMD?

It's a stretch, but not an outrageous one. It's fathomable at least.


The dysfunctional dead weight is the CPU division.
 
I miss the long-gone days when you upgraded your computer every year because there was something new and exciting growing by leaps and bounds, when every new product cycle was like a clash of the titans and you waited with bated breath to see who would win a place in your build. Now the product cycles are dull. Who will win on the CPU side? Intel, beating out the previous reigning champ, Intel, by 5%. The video card side is a hair more exciting, if no less monolithic. Who will win? Nvidia, but at least by 25% or so. And so the cycle repeats, year after year.
 