AMD R&D Expenditure Down 40%

HardOCP News

AMD’s chief technology officer, Mark Papermaster, was in the hot seat at the Global Tech Leadership Forum. Questions about Moore's Law, competing with Intel, and the sharp drop in R&D spending all seem to have investors a bit concerned.

McConnell asked Papermaster about the sharp drop-off in R&D levels at AMD, stating “when I talk to investors about AMD, there’s some concern — I mean, we’ve seen a decline by close to 40% versus levels we were at in the beginning of the decade.”
 
It's not like AMD has much choice. Between a steep decline in revenue and profit margins plus an increase in debt expenditures, they are trying to get blood from a stone.
 
Definitely a sad thing. I'm thinking even with their 14nm GPUs planned for next year, that may still not be enough to stay in the game with Nvidia. This is going to be bad news down the road, when we're all paying pretty much double for incremental improvements from Nvidia cards with no serious competition.
 
This is awful, but not unexpected after the harm Intel did to them with their illegal business practices in the early 2000s.

Even with the billion dollar settlement Intel paid, the harm was done. That settlement should have been at least an order of magnitude higher, but Intel had AMD over a barrel, and they were forced to accept.
 
Definitely a sad thing. I'm thinking even with their 14nm GPUs planned for next year, that may still not be enough to stay in the game with Nvidia. This is going to be bad news down the road, when we're all paying pretty much double for incremental improvements from Nvidia cards with no serious competition.

AMD will exist in the GPU market, someone will buy them.

While the Fury was not what everyone had hoped, it's still relatively competitive with a stock Nvidia 980 Ti.

I don't think you're going to see something as extreme as what you're suggesting.

However, on the CPU side of things, I think they are in trouble; hell, their custom gaming system uses an Intel processor because their own CPUs are so underpowered.
 
This is going to be bad news down the road, when we're all paying pretty much double for incremental improvements from Nvidia cards with no serious competition.

Nvidia still wants to make good annual profits, they won't mark up their GPUs astronomically (if at all) when AMD goes under. They also will make sure it still has enough updates/improvements to make people want to upgrade; again, because they want good annual profits.
 
Nvidia spends more on R&D than AMD, and Nvidia only really does GPUs. Intel outspends both of them by like 10 times.
 
I'm completely shocked, in the unexpected non-shocked sense.
 
And the key here is: they've already dropped R&D, and they're still losing money.

What else can they cut?
 
At least it isn't a 40% cut since last year. The span of a decade is fine, the company needed to cut the fat. Hopefully Zen is a success so they can actually live.
 
While the Fury was not what everyone had hoped, it's still relatively competitive with a stock Nvidia 980 Ti.

Lol give it up already. A stock 980 Ti owns a stock Fury, and an overclocked 980 Ti owns an overclocked Fury. Unless AMD has significantly reduced the price of the Fury since launch, and Nvidia has not done the same with the 980 Ti, the Fury is a complete waste of money that is only bought by AMD fanboys. And yes, if it were the other way around, Nvidia fanboys would be acting the same way.
 
Nvidia still wants to make good annual profits, they won't mark up their GPUs astronomically (if at all) when AMD goes under. They also will make sure it still has enough updates/improvements to make people want to upgrade; again, because they want good annual profits.

Eh, I dunno. If a stale platform still rakes in money without having to increase employees, investment and R&D, and there is no one to compete with them...

Trust me. They'll get as lazy as a tree sloth. It's the way it works.

Anything happening to AMD would be majorly bad for the graphics industry. At least for consumers.
 
Eh, I dunno. If a stale platform still rakes in money without having to increase employees, investment and R&D, and there is no one to compete with them...

Trust me. They'll get as lazy as a tree sloth. It's the way it works.

Anything happening to AMD would be majorly bad for the graphics industry. At least for consumers.

No, you don't know. If the platform is stale, people won't upgrade because there is no incentive/need to, thus no revenue is generated. So there is a need to innovate and not raise prices astronomically (again, because people won't buy and revenue decreases), even without a competitor in the field.
 
If the platform is stale, people won't upgrade because there is no incentive/need to, thus no revenue is generated.

I mean, FFS, that is the exact position AMD is in right now as proof, lol. It's not going to happen with Nvidia when AMD goes away.
 
Nvidia still wants to make good annual profits, they won't mark up their GPUs astronomically (if at all) when AMD goes under. They also will make sure it still has enough updates/improvements to make people want to upgrade; again, because they want good annual profits.

I don't know about that. Here's the price history of Nvidia dual GPUs at launch:
Code:
$600	2006-8
$700	2011
$1000	2012-14
$3000	2014
$1000	2015 (single gpu)
 
I don't know about that. Here's the price history of Nvidia dual GPUs at launch:
Code:
$600	2006-8
$700	2011
$1000	2012-14
$3000	2014
$1000	2015 (single gpu)

Those are prices for their highest-end GPUs though (which always tended to be high) ... with market segmentation they produce a wide range of GPU products at multiple price points (including some very affordable high-performance and mainstream-performance cards) ... you can't judge them on just their highest price (any more than you can judge Intel by what they charge for their Extreme chips).
 
Nvidia still wants to make good annual profits, they won't mark up their GPUs astronomically (if at all) when AMD goes under. They also will make sure it still has enough updates/improvements to make people want to upgrade; again, because they want good annual profits.
I think they would, it would just spread out differently. Let's say Nvidia and AMD were neck and neck. You might expect the next generation to be something like 20-30% faster, with their high-ends being maybe $500-700. Now let's say AMD wasn't competing at all. You would expect their entire line-up to cost maybe 10-20% more, and maybe be 10-15% faster. Then at the very high end, they would have something maybe 20-25% faster and cost anywhere from $1000-2000.

I'm not saying Nvidia wouldn't keep making faster cards, I'm saying the performance gains would be a LOT slower to market and they would cost more than if there was competition. I mean hell, look at Intel. Intel isn't exactly blowing away its previous CPU generations each time; it's extremely incremental. I mean, how many threads have you seen with people bragging that their years-old Intel CPU is still good enough for them? The same thing would happen to Nvidia.
 
Nvidia still wants to make good annual profits, they won't mark up their GPUs astronomically (if at all) when AMD goes under. They also will make sure it still has enough updates/improvements to make people want to upgrade; again, because they want good annual profits.
Am I reading this right? That Nvidia won't mark up prices if AMD goes under? Nvidia will not increase prices because people are only willing to spend a certain amount of money. So instead the products may shift. For example a GTX 960 may be priced at 970 prices, but it won't be called a 960. Instead a new rebrand would occur and it'll be GTX Z60 with 8GB of VRAM and a minor clock speed increase.

Pray that AMD doesn't go under or if they do then someone will buy their graphics and make it competitive.

Nvidia spends more on R&D than AMD, and Nvidia only really does GPUs. Intel outspends both of them by like 10 times.
Except Nvidia has released their own tablet, console, and portable handheld, and don't forget the entire Tegra line of SoCs. All of those but Tegra are not doing well in the market.

Lol give it up already. A stock 980 Ti owns a stock Fury, and an overclocked 980 Ti owns an overclocked Fury. Unless AMD has significantly reduced the price of the Fury since launch, and Nvidia has not done the same with the 980 Ti, the Fury is a complete waste of money that is only bought by AMD fanboys. And yes, if it were the other way around, Nvidia fanboys would be acting the same way.
Neither the 980 Ti nor the Fury is realistic for most people to buy. It's just AMD and Nvidia fighting each other to see who has the overall faster GPU. Nobody is going to spend $500-$600 for a graphics card. But by being the "winner," Nvidia will sell more GPUs, because people are dumb. An R9 285 is faster than a GTX 960, but herp derp, buy Nvidia cause the 980 Ti is the winner.

So why own these cards? Obviously for 4K gaming, cause why else would you? And the Fury isn't bad at 4K. But at the price AMD wants for it, especially compared to the 980 Ti, yes, it's bad. Prices need to drop or drivers need to get a lot better. You know what the real winner is, though? The R9 295X2, cause it wins in all benchmarks. Not that you'd want to own that either.
 
I'm not saying Nvidia wouldn't keep making faster cards, I'm saying the performance gains would be a LOT slower to market and they would cost more than if there was competition. I mean hell, look at Intel. Intel isn't exactly blowing away its previous CPU generations each time; it's extremely incremental. I mean, how many threads have you seen with people bragging that their years-old Intel CPU is still good enough for them? The same thing would happen to Nvidia.

This is because we're hitting the limits of Instruction Level Parallelism (ILP) with x86, and the upper limits on frequency for a wide x86 core that can scale effectively from 5 W to 100 W.

They could clock it faster, but it would use far more power. And you'd have to balance longer pipelines with branch mispredictions, something Intel failed to do well with Prescott.

Intel has countered this by adding more cores and more threads (thread level parallelism), and new instruction sets like AVX, AVX2, and FMA that attempt to do more operations in a single clock by letting the compiler/programmer optimize things (similar to the gains promised by VLIW).

I'm sorry you're pissed off about it, but x86-64 is a serial architecture, with variable instruction sizes. It's hard to get ILP above a certain limit without tossing the entire instruction set out the window. And no, the rest of the world CANNOT agree on which instruction set is superior, which is why we have so many :D

Luckily for us, Nvidia has no such limitations when it comes to designing their GPUs. They design them to run Cuda well, but for the most part the architectures and instruction sets are free to change at any time. The only thing holding back GPUs at this point is Moore's law imploding.
 
So why own these cards? Obviously for 4K gaming, cause why else would you? And the Fury isn't bad at 4K. But at the price AMD wants for it, especially compared to the 980 Ti, yes, it's bad. Prices need to drop or drivers need to get a lot better. You know what the real winner is, though? The R9 295X2, cause it wins in all benchmarks. Not that you'd want to own that either.

I'd argue that at 4k, for anything but lighter / older titles, you are going to need at least two 980ti's if not more.

The opening scene of Metro 2033 - for instance - runs at ~40fps with two 980ti's in SLI at 4k. Most on here would not consider 40fps acceptable.

So, if you went with Furys instead, they're not going to be any faster than the 980 Tis, and you'd be stuck with Crossfire.

SLI is no walk in the park, and suffers from stuttering and other issues, but Crossfire makes SLI look like the second bloody coming of Christ. Crossfire is really for masochists only.
 
Am I reading this right? That Nvidia won't mark up prices if AMD goes under? Nvidia will not increase prices because people are only willing to spend a certain amount of money. So instead the products may shift. For example a GTX 960 may be priced at 970 prices, but it won't be called a 960. Instead a new rebrand would occur and it'll be GTX Z60 with 8GB of VRAM and a minor clock speed increase

Nvidia, like Intel, has to compete with themselves.

Reality of the graphics card business:

1. Most people don't care about features when deciding which card to buy. They just care about results.

2. The graphical improvement possible in each new generation of DirectX has shrunk, most notably after DX9.

3. People will not replace their graphics card unless you give them a need, and give them a tantalizing performance upgrade for their money. PEOPLE HAVE A SET LIMIT THEY WILL PAY, or they will wait to upgrade.

If you raise prices, you slow down or stop the upgrade treadmill. People who can't afford the new cards will pass, and game makers will stop targeting their games at higher fidelity, given the shrinking market for newer cards.

Nvidia knows what the graphics market will bear, and they know that it depends IMPLICITLY on periodic improvements being made available to the market as a whole. IF they restrict those improvements artificially, then cutting-edge game development will slow down. And what the hell else reason do people have to buy a new discrete GPU in 2015?
 
This is because we're hitting the limits of Instruction Level Parallelism (ILP) with x86, and the upper limits on frequency for a wide x86 core that can scale effectively from 5 W to 100 W.

They could clock it faster, but it would use far more power. And you'd have to balance longer pipelines with branch mispredictions, something Intel failed to do well with Prescott.

Intel has countered this by adding more cores and more threads (thread level parallelism), and new instruction sets like AVX, AVX2, and FMA that attempt to do more operations in a single clock by letting the compiler/programmer optimize things (similar to the gains promised by VLIW).

I'm sorry you're pissed off about it, but x86-64 is a serial architecture, with variable instruction sizes. It's hard to get ILP above a certain limit without tossing the entire instruction set out the window. And no, the rest of the world CANNOT agree on which instruction set is superior, which is why we have so many :D

Luckily for us, Nvidia has no such limitations when it comes to designing their GPUs. They design them to run Cuda well, but for the most part the architectures and instruction sets are free to change at any time. The only thing holding back GPUs at this point is Moore's law imploding.
I think you're misreading me, I'm not pissed off about the incremental improvements, christ I'm on an FX processor for that matter. But it's fair enough that maybe that's not a great example. I DO remember Intel's pricing and rate of development being a lot slower prior to the Athlon release, however. A better comparative example would be the GeForce 3 and 4. Those were more incremental upgrades; then ATI came in with the 9700 and blew everything out of the water, and it took Nvidia a while to catch back up again. They did, however, with a vengeance. Companies simply aren't as competitive without, you know, competition.

Ashbringer said:
Am I reading this right? That Nvidia won't mark up prices if AMD goes under? Nvidia will not increase prices because people are only willing to spend a certain amount of money. So instead the products may shift. For example a GTX 960 may be priced at 970 prices, but it won't be called a 960. Instead a new rebrand would occur and it'll be GTX Z60 with 8GB of VRAM and a minor clock speed increase.

Pray that AMD doesn't go under or if they do then someone will buy their graphics and make it competitive.
What you're describing is a sure thing in the absence of competition, but I think they would ALSO increase prices at the different tiers. So the top tier might get stupid pricey, and at the medium tiers you're getting less for the same money, as you described.
 
We have already been suffering the negative effects of Intel's market dominance. Some kind of real competition needs to exist or it will only get worse.
 
AMD hasn't been competition for NVIDIA in a while. Titan proved that. There is no way they would have been able to release Titan if they had actual competition from anyone. We already live in a world without AMD.
 
AMD hasn't been competition for NVIDIA in a while. Titan proved that. There is no way they would have been able to release Titan if they had actual competition from anyone. We already live in a world without AMD.
That's only true of the top end, however. For everything midrange and below, AMD has been in competition quite a bit (though they're starting to slip more and more). Even while Nvidia was ruling the roost, the AMD game bundles were having some impact. You say there's no competition, but you can bet it would get worse if that were literally true.
 
I think you're misreading me, I'm not pissed off about the incremental improvements, christ I'm on an FX processor for that matter. But it's fair enough that maybe that's not a great example. I DO remember Intel's pricing and rate of development being a lot slower prior to the Athlon release, however. A better comparative example would be the GeForce 3 and 4. Those were more incremental upgrades; then ATI came in with the 9700 and blew everything out of the water, and it took Nvidia a while to catch back up again. They did, however, with a vengeance. Companies simply aren't as competitive without, you know, competition.

But in that time the market wasn't saturated. People were still buying a PC for the very first time in the 1990s, and that didn't end until about 2003, when the desktop market began to shrink. The laptop market continued to grow, but it hit a wall a few years ago.

So yes, competition meant a lot in the 1990s, as it helped to shape the industry. The companies were raw, and had no idea what the right way to go was. But today the market is shrinking, so there's not room for multiple competitors (unless one consumes the other, like Intel is doing to AMD).

So after Intel is done consuming AMD, all they have to compete with is themselves. And they already have to do that for 3/4 of all new PC sales, since they already own 3/4 of the installed market. The vast majority of PC purchases are replacements, so unless you want to wait for it to break, you have to provide enticement to UPGRADE.

Also, Intel has competed with themselves in the past, even before the Athlon was a thing. The original Celeron 266/300, a cacheless POS, was slower at business applications than the ancient Pentium 233 MMX:

http://www.tomshardware.com/reviews/big-cpu-shoot,84.html

The name was only saved by Intel eating crow and releasing an on-die 128k cache version less than 6 months after the original release. Now THAT is fast turnaround, when you have egg on your face that bad :D
 
Zarathustra[H];1041788238 said:
This is awful, but not unexpected after the harm Intel did you them with their illegal business practices in the early 2000's.

Even with the billion dollar settlement Intel paid, the harm was done. That settlement shod have been at least an order of magnitude higher, but they had AMD over a barrel, and they were forced to accept.
This certainly didn't help, but AMD buying ATI was what really did them in. They couldn't afford them, and it will end up creating two monopolies.
 
This certainly didn't help, but AMD buying ATI was what really did them in. They couldn't afford them, and it will end up creating two monopolies.
I was going to reply with this.

I think AMD's two biggest missteps were buying ATI and spinning off their fabs. As much as I liked ATI GPUs and used them extensively - it was a ton of debt to take on for a company that wasn't equipped to pay it back. And lacking control over the production of their technology is hurting them pretty badly now. AMD is still selling 32nm CPUs while Intel has already ramped up 14nm production.
 
Neither the 980 Ti nor the Fury is realistic for most people to buy. It's just AMD and Nvidia fighting each other to see who has the overall faster GPU. Nobody is going to spend $500-$600 for a graphics card. But by being the "winner," Nvidia will sell more GPUs, because people are dumb. An R9 285 is faster than a GTX 960, but herp derp, buy Nvidia cause the 980 Ti is the winner.

So why own these cards? Obviously for 4K gaming, cause why else would you? And the Fury isn't bad at 4K. But at the price AMD wants for it, especially compared to the 980 Ti, yes, it's bad. Prices need to drop or drivers need to get a lot better. You know what the real winner is, though? The R9 295X2, cause it wins in all benchmarks. Not that you'd want to own that either.

As far as I am concerned, every card you mentioned is not worth buying, by anyone, for any reason, and the same goes for the Titan X. Right now 4k is an inferior technology, it's unplayable on a single card, and SLI/crossfire are still broken, as they always have been. Worse image quality and features on 4k monitors, far worse FPS, scaled down graphics are often required, lack of 120/144hz, lack of gsync/freesync, etc...
 
Nvidia still wants to make good annual profits, they won't mark up their GPUs astronomically (if at all) when AMD goes under. They also will make sure it still has enough updates/improvements to make people want to upgrade; again, because they want good annual profits.

Let me fix what you said: "They also will make sure it still has enough planned obsolescence to force people to want to upgrade; again, because they want good annual profits."

They've proven that they can and will do that.
 
No, you don't know. If the platform is stale, people won't upgrade because there is no incentive/need to, thus no revenue is generated. So there is a need to innovate and not raise prices astronomically (again, because people won't buy and revenue decreases), even without a competitor in the field.

Nice vision, but no competition = high prices.
 
Nice vision, but no competition = high prices.

That is only true if you are in a market where your product is not optional ... as much as we (the performance PC crowd) might like separate GPUs, those are not necessary commodities for most businesses or consumers ... NVidia must compete with Intel on the chipset front (even without AMD) and with other ARM manufacturers on the tablet/phone front (so there is competition there with or without AMD) ...

on the PC front they must persuade the user that they need more power than the chipset can offer so they will pay for a separate GPU ... this calculation will involve cost and capability so they will still have competition to get users
 
That is only true if you are in a market where your product is not optional ... as much as we (the performance PC crowd) might like separate GPUs, those are not necessary commodities for most businesses or consumers ... NVidia must compete with Intel on the chipset front (even without AMD) and with other ARM manufacturers on the tablet/phone front (so there is competition there with or without AMD) ...

on the PC front they must persuade the user that they need more power than the chipset can offer so they will pay for a separate GPU ... this calculation will involve cost and capability so they will still have competition to get users
Yeah, that's a nice story, but it doesn't really hold up. In markets with no competition where your product is optional, you STILL get soaked, because they know the people who DO buy it want it. Your arguments may be valid for the low end, but high-end graphics have long sold themselves in computing. You're not going to be running high-end games with integrated Intel graphics. It would mean that if you wanted high performance in PC gaming, you have to fork over cash to Nvidia, the end.
 
Yeah, that's a nice story, but it doesn't really hold up. In markets with no competition where your product is optional, you STILL get soaked, because they know the people who DO buy it want it. Your arguments may be valid for the low end, but high-end graphics have long sold themselves in computing. You're not going to be running high-end games with integrated Intel graphics. It would mean that if you wanted high performance in PC gaming, you have to fork over cash to Nvidia, the end.

Except most of us on the high end are not as price conscious as those on the low end ... I used to buy the latest video cards when they were in the $600-700 range so I would have no issue if they returned to that price (as long as the performance continued to be satisfactory) ... competition forcing the high end cards down to $300-400 is not realistic and is probably one reason that AMD is failing ... competition is nice but I don't want competition that bankrupts companies either ;)
 
ATI used to be such a dominant player back in the day. I really hope someone else buys out AMD's assets so we're not going to be at the mercy of Nvidia taking advantage of the lack of competitive landscape.
 
ATI used to be such a dominant player back in the day. I really hope someone else buys out AMD's assets so we're not going to be at the mercy of Nvidia taking advantage of the lack of competitive landscape.

I would not worry about it. However, I am more concerned about what would happen in the console space. Also, AMD created the x86-64 extensions; if AMD goes out of business, I hope Intel will have to buy those extensions or stop using them. Otherwise, AMD made a horrendous deal with Intel and they deserve to die. :eek:

This is speaking as a person who would prefer AMD over Intel or Nvidia most of the time.
 