Phil Rogers AMD Fellow Jumps Ship to NVIDIA

I would maybe agree in a general sense, but oil is a commodity. High-end video cards are not. Not everyone can just start up a company that can compete with Nvidia, Intel, or AMD. Slightly different type of product. I agree with you that none of these companies is going to just take over the market, jack prices into the stratosphere, etc., but I think it's more possible than it would be with your example.

The same can be shown for just about anything; that is how the market works. The ONLY times it happens and stays in place is when a company has control of the government regulation, and the government is what is enforcing its power and control of the market. Intel has been under no pressure from AMD for some time, yet accounting for inflation, it has dropped prices on CPUs and given us more powerful and less power-hungry CPUs even though we are getting very close to the limits of current tech. If they do not push forward, or if they keep prices above what the market will bear, people will not buy. It's even worse than a raw commodity like oil, because people don't have to buy it at all. There are also many, many companies already out there that have the ability and funds to jump into that market if the profit margin were to jump up; you don't have to have a new startup to get into it. And in this day and age we have seen so many startups come into the fold with billions in backing, many of them not even having a real functional plan to monetize their service or product. Where there is money to be made and large profit margins, there will always be investors.
 
These ideas about grabbing the market and holding it hostage with prices are a myth. People hold in their minds that that's how things were before the government jumped in to save us, but in any professional economic debate you would be laughed at for suggesting it, as no actual proof has ever been shown.

Even the man who coined the phrase "the invisible hand" believed that.

Adam Smith said:
People of the same trade seldom meet together, even for merriment and diversion, but the conversation ends in a conspiracy against the public, or in some contrivance to raise prices.

What is your explanation for the positive effect of the breakup of AT&T?
 
Intel could just throw some billions into R&D and compete with Nvidia without buying ATI. The Intel brand also has a better image than anything branded ATI. I hope Intel starts getting serious about graphics whether they buy ATI or not. If anyone could come out smashing all competition, it'd be Intel.

Intel 740 ( https://en.wikipedia.org/wiki/Intel740 ) and Intel Larrabee ( https://en.wikipedia.org/wiki/Larrabee_(microarchitecture) ) would like a word with you.
 

The 740 was actually not THAT bad. The hardware was half-decent, but their drivers were horrible. We had a few in a lab that I worked in to do some compatibility testing before they were released. It didn't perform nearly as well as its main competitors, but it supported most of the popular graphical features of the time. If they had developed a higher tier version, and worked on the drivers, it could have been at least viable. As it was though, it wasn't. Point being, with a little more effort behind it, they could have easily pulled it off I think. I believe we tested them with Incoming a bit, and it played fairly well on them, which was fairly graphically intense at the time if I remember correctly. (that was a long time ago...)
 
Fellas, look on the bright side: with the money saved from this guy's probably very large salary, AMD can now divert it to much-needed areas like Lisa Su's bank account. :p Just a sign of the times, AMD is finished.
 
I would maybe agree in a general sense, but oil is a commodity. High-end video cards are not. Not everyone can just start up a company that can compete with Nvidia, Intel, or AMD. Slightly different type of product. I agree with you that none of these companies is going to just take over the market, jack prices into the stratosphere, etc., but I think it's more possible than it would be with your example.

While prices would go up, they won't go too high. No one will buy $500 video cards if that is what you need to play games at medium-high settings. No one will pay $800 for a card to max out 1080p. They'll kill their own market because everyone will go for consoles instead. At the least, prices will rise and innovation will stall. This means a GPU will cost more but might last longer, making the overall price/cost ratio pretty even. But if there are few performance/graphical improvements, again, more people will turn to consoles.

Desktops are niche as it is, and a massive price increase won't help change that.
 
AMD has already spun off the old ATI parts to form the Radeon Technology Group. IMO, this is in preparation for the inevitable ... a sale of the graphics division to someone (Samsung or Microsoft).

If you ask me, it was stupid for AMD to buy ATI in the first place and try to fight a "two front war" with BOTH Intel and nVidia.

Intel's R&D budget is larger than AMD's entire sales. No way they can compete.

I hope AMD breaks up and the graphics division is bought out SOON. In the long run, if that happens and a company like Samsung or Microsoft (both of whom have very deep pockets) buys them and makes investments, then we will finally get REAL competition with nVidia.

With the cash from the sale, and no longer having to split their advertising and other overhead, they can finally focus on Intel, and maybe AMD CPUs will finally stop sucking.
 
What is your explanation for the positive effect of the breakup of AT&T?

No real positive effect; the reason for AT&T (Ma Bell) in the first place was capture of the regulators by AT&T. The monopoly enjoyed by AT&T for decades was purely the creation of government intervention. When the initial patents (whole other discussion there) expired in 1893 and AT&T no longer had full control, they had dozens of competitors, who captured over 5% market share almost instantly. By 1900 they had thousands of competitors, and by 1907 those competitors had captured 51% of the market; not only did they give options, they drove prices to rock bottom, and AT&T was still losing market share. There was absolutely no evidence of the economies of scale that AT&T, the regulators, and the politicians argued for in favor of a natural monopoly, the kind we still have today in many copper services and in everything from water to electricity. The creation of the telephone monopoly was the direct result of AT&T buying politicians, who pushed "universal telephone service" as a pork-barrel entitlement to their constituents. They started by denouncing competition as wasteful and harmful, and even went as far as to claim "There is nothing to be gained by competition in the local telephone business". Yeah, wonderful Congress for you there.

They had the explicit intention of creating the monopolistic telephone industry through the government, and they succeeded when the government used WWI as an excuse to nationalize the industry in 1918. AT&T still operated its telephone system, but it was controlled by a government commission headed by the postmaster general. Once this happened, AT&T only had to capture those regulators, and it used the whole apparatus to eliminate competitors: "By 1925 not only had virtually every state established strict rate regulation guidelines, but local telephone competition was either discouraged or explicitly prohibited within many of those jurisdictions."

Now, the breakup itself was good, but it only had to be done because the monopoly was enforced and carried out by the government in the first place. Had the government not been involved and the market been allowed to run as it was, AT&T might not even exist today; had trends continued, they would have been pushed out of the market, and this breakup would never have needed to happen. It is like a firefighter who sets a house on fire, then comes and puts it out, and then expects a pat on the back for his "good deed". As I said, abusive monopolies do not happen or stay in effect unless they are put there or enforced by government.
 
AMD has already spun off the old ATI parts to form the Radeon Technology Group. IMO, this is in preparation for the inevitable ... a sale of the graphics division to someone (Samsung or Microsoft).

If you ask me, it was stupid for AMD to buy ATI in the first place and try to fight a "two front war" with BOTH Intel and nVidia.

Intel's R&D budget is larger than AMD's entire sales. No way they can compete.

I hope AMD breaks up and the graphics division is bought out SOON. In the long run, if that happens and a company like Samsung or Microsoft (both of whom have very deep pockets) buys them and makes investments, then we will finally get REAL competition with nVidia.

With the cash from the sale, and no longer having to split their advertising and other overhead, they can finally focus on Intel, and maybe AMD CPUs will finally stop sucking.

If Microsoft or Samsung bought out RTG, you can bet they would stop making desktop cards and instead use those teams to build integrated chips. So no, there wouldn't be any competition for NVIDIA.
 
There are no DX12 or Vulkan games. So the 390 is better in games that don't exist on the market yet?
I know right. I mean who's looking forward to games coming out soon? Fucking losers.
The 390 uses 100+ more watts than the 970 while not giving any significant performance advantage. You are literally wasting money and electricity by buying a 390 over a 970.
It's only 100W+ when gaming; while idle it consumes 2W+ over the 970. The amount of electricity would barely be worth mentioning unless you game 24/7.

But also consider that the 970's 3.5GB handicap might actually come into play one day when more games push for the 4GB limit. Also, DX12.

[attached image: fable-fps.gif]
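To put a rough number on the electricity claim (illustrative figures only: ~100 W extra under load, 20 hours of gaming a week, $0.12/kWh): 100 W × 20 h = 2 kWh per week, call it 8-9 kWh a month, which works out to roughly a dollar a month or somewhere around $12 a year. Hardly a deciding factor unless you really do game around the clock.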
 
I know right. I mean who's looking forward to games coming out soon? Fucking losers.

It's only 100W+ when gaming; while idle it consumes 2W+ over the 970. The amount of electricity would barely be worth mentioning unless you game 24/7.

But also consider that the 970's 3.5GB handicap might actually come into play one day when more games push for the 4GB limit. Also, DX12.

[attached image: fable-fps.gif]

That's 100w+ when gaming, meaning you're going to need better cooling. I like how when power consumption is brought up, it's usually mentioned just for the cost of electricity; but it's much more than that. Like how much better a heatsink + case fan setup is needed to dissipate that extra 100w, how much more stress you place on the VRMs, or the impact of that extra 100w of heat on the surrounding components.
 
Well, there goes any chance of AMD creating a viable alternative to CUDA. Yes, yes, OpenCL, but CUDA is the game to beat.
 
In terms of just market share, CUDA is by far and away the most prevalent API. They've invested the money and the time and made a viable product.
 
Basically, you just said there are forces outside their control that are forcing their prices to remain where they are and not go higher. However, if they could raise their prices without hurting them, they would do so in a single heartbeat.
Yes, every company including AMD would raise their prices further if they could; that is what business is about. That doesn't change the fact that there are other factors beyond competition that determine how far they can raise their prices.

For example, both nVidia and AMD rely on users upgrading their hardware to sell their desktop products. Imagine if both sold their GPUs at $10k. Performance wise, both products are very competitive. But is that enough to make people buy their product? Of course not. Very few will spend $10k on a gaming PC. They'll just settle for older hardware or move to console.

So with or without competition, both companies need to have products that are priced at a level where people can afford them. Otherwise, there are plenty of other options. And I think this is especially true for PC gaming, where the hardware these companies are selling is only as useful as the software allows. If no one can afford a discrete GPU, the software industry will not aim for something no one has. They will just aim for something like Intel embedded graphics.

Yesterday != tomorrow.
I think embedded graphics are still important tomorrow. As we move towards compact devices, embedded graphics makes more sense in terms of cost, power consumption, cooling requirements, etc.
 
The 740 was actually not THAT bad. The hardware was half-decent, but their drivers were horrible. We had a few in a lab that I worked in to do some compatibility testing before they were released. It didn't perform nearly as well as its main competitors, but it supported most of the popular graphical features of the time. If they had developed a higher tier version, and worked on the drivers, it could have been at least viable. As it was though, it wasn't. Point being, with a little more effort behind it, they could have easily pulled it off I think. I believe we tested them with Incoming a bit, and it played fairly well on them, which was fairly graphically intense at the time if I remember correctly. (that was a long time ago...)

My memory is different... they came, flopped (performance below the competition), made the Intel 752, and then imploded with a vengeance.


Yesterday != tomorrow.

You think it's easier to design a GPU today than it was 10 years ago?
You are jesting, right?
 
Is it? Serious question.

Yes.

NVIDIA has had great success getting CUDA out in the educational environment... aka it's the thing people study in class... not OpenCL.
(And thus future programmers think "CUDA", not "OpenCL".)

That and the MASSIVE ecosystem built around CUDA.

OpenCL has a long way to go before it can claim parity with CUDA.
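For anyone who hasn't seen CUDA, here's a minimal sketch of the kind of thing those classes start with: a hypothetical vector-add, illustrative only (the kernel and numbers are made up, not taken from anything in this thread).

#include <cstdio>
#include <cuda_runtime.h>

// Each thread adds one element; this is the "hello world" of CUDA.
__global__ void vecAdd(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);
    float *a, *b, *c;
    // Unified memory keeps the sketch short; the classic version uses cudaMalloc + cudaMemcpy.
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    int block = 256;
    int grid = (n + block - 1) / block;
    vecAdd<<<grid, block>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]);  // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}

The OpenCL version of the same kernel isn't much longer, but the host-side boilerplate and the tooling built up around CUDA are where the ecosystem gap really shows.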
 
Are standard Intel processors less than 1,000 dollars?
Yes, because Intel does not wish to commit hara-kiri. Not because AMD products are forcing them to keep prices low. AMD does not compete with Intel.

AMD could go under tomorrow and neither the CPU market nor the discrete desktop GPU market would change one bit. AMD has been a total non-competitor on those fronts for years. ARM Holdings competes somewhat with Intel, Nvidia has no competition.
 
Those markets would change quite a bit.
This idea that all CPUs would be 1K is stupid, but so is the idea that prices would not rise one bit.
 
That's 100w+ when gaming, meaning you're going to need better cooling. I like how when power consumption is brought up, it's usually mentioned just for the cost of electricity; but it's much more than that. Like how much better a heatsink + case fan setup is needed to dissipate that extra 100w, how much more stress you place on the VRMs, or the impact of that extra 100w of heat on the surrounding components.

Well, a lot of the problem is that most Nvidia cards use aftermarket coolers while a lot of AMD cards were using stock coolers. I'm the kind of person that will remove the stock cooler immediately and plop on my water block to cool the GPU, because it doesn't matter if it's AMD or Nvidia; both companies have graphics cards that run at 80C+.

Yes.

NVIDIA has had great success getting CUDA out in the educational environment... aka it's the thing people study in class... not OpenCL.
(And thus future programmers think "CUDA", not "OpenCL".)

That and the MASSIVE ecosystem built around CUDA.

OpenCL has a long way to go before it can claim parity with CUDA.
Yea no, CUDA is a joke. Whatever problems there were with OpenCL, it will match and surpass CUDA with SPIR-V, which arrives with OpenCL 2.1. Part of the power of SPIR-V is that compilers can take code written in CUDA as well as HLSL/GLSL and convert it into SPIR-V. SPIR-V is amazing.

It's not like Nvidia isn't going to support it, because SPIR-V is part of Vulkan.
 
Those markets would change quite a bit.
This idea that all CPUs would be 1K is stupid, but so is the idea that prices would not rise one bit.

Technically they have. We just don't think about it because how many of us need 8-core CPUs? The i7-5960X is over $1k and we don't bat an eye because it has nearly zero effect in gaming. But it's technically a proper evolution of CPU design, unlike the Skylake CPUs, which for the most part perform nearly the same as the Haswells and cost a bit more.
 
Well, a lot of the problem is that most Nvidia cards use aftermarket coolers while a lot of AMD cards were using stock coolers. I'm the kind of person that will remove the stock cooler immediately and plop on my water block to cool the GPU, because it doesn't matter if it's AMD or Nvidia; both companies have graphics cards that run at 80C+.


Yea no, CUDA is a joke. Whatever problems there were with OpenCL, it will match and surpass CUDA with SPIR-V, which arrives with OpenCL 2.1. Part of the power of SPIR-V is that compilers can take code written in CUDA as well as HLSL/GLSL and convert it into SPIR-V. SPIR-V is amazing.

It's not like Nvidia isn't going to support it, because SPIR-V is part of Vulkan.

Are we living in the same universe?
Comparing CUDA to Vulkan...you are joking right?
 
This idea that all CPUs would be 1K is stupid, but so is the idea that prices would not rise one bit.

Even without ATI as competition, Nvidia competes against their own older cards. I'm still rocking my GTX580 until something better seems cost effective enough for my needs.
 
Even without ATI as competition, Nvidia competes against their own older cards. I'm still rocking my GTX580 until something better seems cost effective enough for my needs.

AMD hasn't been competition for NVIDIA in years. Outside of forums like this, where AMD marketing makes a lot of noise, most people don't even know AMD makes video cards. NVIDIA's 82% market share is proof of that. AMD gets 18% mostly from selling to HP and a handful of diehard fans who are slowly turning their backs on AMD.
 
Yes, because Intel does not wish to commit hara-kiri. Not because AMD products are forcing them to keep prices low. AMD does not compete with Intel.

AMD could go under tomorrow and neither the CPU market nor the discrete desktop GPU market would change one bit. AMD has been a total non-competitor on those fronts for years. ARM Holdings competes somewhat with Intel, Nvidia has no competition.

Yeah, I assure you, it will be one week after AMD goes under that Intel will add the letter 'b' to all its processors, call it a new chip, double the price, and release a $1500 'top end' chip that is 100MHz faster than the current top end.
AMD 'is no competition' because Intel's prices are reasonable; if they were as insane as they used to be, AMD would be growing overnight.
 
Yeah, I assure you, it will be one week after AMD goes under that Intel will add the letter 'b' to all its processors, call it a new chip, double the price, and release a $1500 'top end' chip that is 100MHz faster than the current top end.
AMD 'is no competition' because Intel's prices are reasonable; if they were as insane as they used to be, AMD would be growing overnight.

This is absolute economic myth presented as absolute truth and "matter of fact", when nothing exists to prove this stance.

"It is no crime to be ignorant of economics, which is, after all, a specialized discipline and one that most people consider to be a 'dismal science'. But it is totally irresponsible to have a loud and vociferous opinion on economic subjects while remaining in this state of ignorance."
 
Yeah, I assure you, it will be one week after AMD goes under that Intel will add the letter 'b' to all its processors, call it a new chip, double the price, and release a $1500 'top end' chip that is 100MHz faster than the current top end.
AMD 'is no competition' because Intel's prices are reasonable; if they were as insane as they used to be, AMD would be growing overnight.
I'll take that bet. How about $1500?
 
Yeah, I assure you, it will be one week after AMD goes under that Intel will add the letter 'b' to all its processors, call it a new chip, double the price, and release a $1500 'top end' chip that is 100MHz faster than the current top end.
AMD 'is no competition' because Intel's prices are reasonable; if they were as insane as they used to be, AMD would be growing overnight.

Intel's top-end chip is already far beyond $1500 - the INTEL XEON 18C E5-2699V3 (2.3GHz/9.6GT/s/45MB) is a $4235.00 CPU. Top-end prices in the mainstream tier have been around $1,000 for a long time now. I remember my Pentium 90 was about that -- along with a 1GB SCSI hard drive that was also $1000.
 
Considering his stated experience at AMD and new title at NVIDIA, it sounds like he will be the guy in charge of writing drivers for the interaction between CPU & GPU over NVLINK.
 
Are we living in the same universe?
Comparing CUDA to Vulkan...you are joking right?

CUDA to SPIR-V, not Vulkan. SPIR-V just comes with Vulkan. You should really read up about it. It's basically your one-stop shop for GPU compute, thus making CUDA obsolete.

SPIR-V is the base not only for OpenCL-C, but also for OpenCL-C++, OpenCL-Python, OpenCL-C#, OpenCL-Julia, OpenCL-Java and many more. Creating a single-source language is a simple next step, by making the split part of the compile phase or even using JIT.

[attached image: spir-v_in_out.png]
 
SPIR-V is interesting but it doesn't make CUDA a "joke", nor does it suddenly make it no longer the most-used framework for GPGPU computing, which is what the original question was about.

In fact, I'll be surprised if it'll even be used much outside of games.
 
Well a lot of the problem is that most Nvidia cards use aftermarket coolers while a lot of AMD cards were using stock coolers. I'm the kinda of person that will remove the stock cooler immediately and plop on my water block to cool the GPU cause doesn't matter if it's AMD or Nvidia cause both companies have graphic cards that are running 80C+.

Do you not know how energy works? 100w more power = more heat, which even an aftermarket cooler isn't going to magically remove from your environment. The only way this wouldn't be a factor is if you have A/C running 24/7 maintaining the same temperature; otherwise the 390 will heat up your room quicker than the 970. Even if both were to run at 80C, the 390 would still be dumping more heat into your room than the 970.
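For a sense of scale (again just illustrative numbers): an extra 100 W of draw is an extra 100 W of heat that ends up in the case and then the room no matter how good the cooler is; the cooler only moves it off the die faster. In other units that is 100 × 3.412 ≈ 341 BTU/h, about the same as leaving an old 100 W incandescent bulb burning inside your case for the whole gaming session.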
 
Just like Mantle was going to take over the world :rolleyes:

You missed the part where Mantle is now DX12 and Vulkan. More specifically Vulkan, since AMD had more direct involvement in that API's development. Think about the name used for Vulkan: the mantle is the layer inside the Earth, and volcanism is the release of that layer in what we call a volcano.

Volcanism
Vulkanism
Vulkan

You really that dense not to see Mantle has moved onto Vulkan?

Do you not know how energy works? 100w more power = more heat, which even an aftermarket cooler isn't going to magically remove from your environment. The only way this wouldn't be a factor is if you have A/C running 24/7 maintaining the same temperature; otherwise the 390 will heat up your room quicker than the 970. Even if both were to run at 80C, the 390 would still be dumping more heat into your room than the 970.
I know how it works, but you're making a big deal out of something that your graphics card is going to be doing FOR AS LONG AS YOU'RE GAMING. Otherwise, if it's idle, it's consuming nearly the same watts as a 970. So unless you're playing Counter-Strike for hours a day or running Folding@home on your GPU, this isn't a problem. Also, the 390 is going to be considerably faster in DX12, so the extra 100w is justified. After all, Async Compute is a big deal for DX12 and might be the reason why AMD chips run hotter. Kinda like how Intel owners think their Haswells are super cool-running chips until they fire up Prime95 and kill their CPU due to AVX, which super-heats it and breaks it. Except since Nvidia GPUs don't have Async Compute, they generate less heat.

less compute
less heat

Though it does mean the 970 is a really well-optimized DX11 graphics card, just not really meant for DX12. But hey, Ashes of the Singularity will be out publicly as an Alpha in a week, and the benchmark will be publicly available. Feel free to remind me that I'm wrong.
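For what "Async Compute" actually means here: DX12 and Vulkan let the GPU run graphics and compute queues concurrently instead of strictly one after the other. A very loose analogy from the CUDA side (an illustrative sketch only, nothing to do with the 390 vs 970 numbers above) is overlapping independent work with streams:

#include <cstdio>
#include <cuda_runtime.h>

// Trivial kernel that scales its slice of data.
__global__ void scale(float* data, int n, float factor) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] *= factor;
}

int main() {
    const int n = 1 << 20;
    float *a, *b;
    cudaMallocManaged(&a, n * sizeof(float));
    cudaMallocManaged(&b, n * sizeof(float));
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 1.0f; }

    // Two streams: work submitted to different streams is allowed to run
    // concurrently on the GPU rather than being serialized.
    cudaStream_t s1, s2;
    cudaStreamCreate(&s1);
    cudaStreamCreate(&s2);

    scale<<<(n + 255) / 256, 256, 0, s1>>>(a, n, 2.0f);
    scale<<<(n + 255) / 256, 256, 0, s2>>>(b, n, 3.0f);

    cudaDeviceSynchronize();
    printf("a[0] = %f, b[0] = %f\n", a[0], b[0]);  // expect 2.0 and 3.0

    cudaStreamDestroy(s1);
    cudaStreamDestroy(s2);
    cudaFree(a);
    cudaFree(b);
    return 0;
}

Whether a given GPU actually runs those jobs concurrently, and how well, is exactly the hardware scheduling question the 390 vs 970 argument is about.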
 