NVIDIA Next-Generation GPU Rumored to Be Called “Ampere”

Hynix spilled the beans about a 384-bit GDDR6 GPU due in Q1 ages ago. Razor1 is far more likely to be right than you are. (Insert jumping-to-conclusions mat here.)
Interesting that Samsung won a CES 2018 Innovation Award for its GDDR6, which is still in development.
 
https://videocardz.com/newz/samsung-16gb-gddr6-memory-receives-ces-innovation-award

Even VGZ says this:

CES takes place next January, and NVIDIA has already announced a keynote for January 7th. It is either there or in March (at the GPU Technology Conference) that Jensen will unveil Volta-based GeForce models. GDDR6 is likely to make an appearance with the GeForce 2000 series.

It's also worth noting that Samsung, for example, now has 16 Gbps chips instead of 14 Gbps. The future is bright for GDDR6.
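For context on what those per-pin rates would mean on the rumored 384-bit bus mentioned above, here is a quick sketch (the bus width and card configuration are the rumor, not confirmed specs):

```python
# Peak memory bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8 bits per byte.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Return peak memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

configs = [
    ("Rumored 384-bit GDDR6 @ 14 Gbps", 384, 14.0),
    ("Rumored 384-bit GDDR6 @ 16 Gbps", 384, 16.0),
    ("GTX 1080 Ti, 352-bit GDDR5X @ 11 Gbps (for reference)", 352, 11.0),
]

for name, width, rate in configs:
    print(f"{name}: {peak_bandwidth_gb_s(width, rate):.0f} GB/s")
# 384-bit GDDR6 at 14-16 Gbps works out to roughly 672-768 GB/s,
# versus ~484 GB/s on today's GTX 1080 Ti.
```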
 
It's just like all the people preaching doom and gloom for Intel when Ryzen came out, claiming Intel was in a panic. Yet Intel still owns 80%+ of the market almost a year later.

Which honestly is not a good thing. Intel is one of those companies that is truly terrible for the consumer and makes NVIDIA look like a saint. I wish someone with deep pockets would take over AMD and crush Intel, but unfortunately US regulators are too afraid of foreign companies or countries controlling a company like AMD. If that weren't the case, I'd guarantee a Chinese company would have taken over AMD entirely and funded it really well.
 

For what? To pour billions into a black hole? There needs to be a valid business case, and even gaming graphics has gone sour now with Intel entering the discrete segment.
 

Money would go a long way toward speeding up CPU design cycles, and they'd be able to expand the GPU design teams to compete with NVIDIA and Intel in AI/deep learning. It's no secret that AMD is severely hampered by funding. I know everyone shits on Raja and RTG, but he basically took an older architecture and built it out on a budget in China, and while it doesn't match the 1080 Ti it still does well enough. More funding would definitely help them reach parity with NVIDIA.
 

Compete in both CPU and GPU? So let's see the calculation. Let's, for the sake of argument, say AMD is flat in profit/loss over the period.

Buy AMD for $12-13B.
$1.5B+ per year in graphics R&D for the next 3-4 years, and now you not only have to compete with NVIDIA but also Intel.
$3-4B+ per year in CPU R&D for the next 3-4 years, with the only growth segments being DC and IoT.

So you're $25-30B out of pocket before you even begin, and then you get a company with ~10% share in both segments and on a declining trend. And that's assuming they even want to stay in the current markets.

A 50-year plan?

There is a reason why nobody bought AMD when it was trading at $2. Even just for the graphics.
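A rough tally of those figures (purely illustrative; the price tag, the per-year R&D spend and the 3-4 year horizon are the assumptions from this post, not published numbers):

```python
# Back-of-the-envelope tally of the hypothetical AMD buyout scenario above.
# All figures are this post's assumptions, in billions of USD.
buyout = (12, 13)            # purchase price range
gpu_rnd_per_year = 1.5       # graphics R&D per year (lower bound)
cpu_rnd_per_year = (3, 4)    # CPU R&D per year
years = (3, 4)               # investment horizon

low = buyout[0] + gpu_rnd_per_year * years[0] + cpu_rnd_per_year[0] * years[0]
high = buyout[1] + gpu_rnd_per_year * years[1] + cpu_rnd_per_year[1] * years[1]

print(f"Total outlay: ${low:.1f}B to ${high:.1f}B")
# -> Total outlay: $25.5B to $35.0B, i.e. at least the ~$25-30B quoted above
```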
 

They are already competing well with a fraction of the budget of both Intel and NVIDIA; they don't need to scale to the same level to do better. They have the foundation to improve in the high-margin markets and just need the extra cash now, especially given the projected growth of AI/deep learning; a long-term investment wouldn't be a bad thing and certainly wouldn't be much for a state-owned company from China or Korea to pick up. Unless ARM can scale to x86 server levels soon, AMD still makes a nice target for countries like South Korea and China. Even Arab countries like Saudi Arabia want to diversify into the tech market for the long haul. If the US got rid of its xenophobic restrictions, I can almost guarantee AMD would be gone in a heartbeat.
 
Don't forget AMD's liabilities of $3.07B, leaving shareholder equity of only $520M. For those of you unfamiliar with what any of that means: if AMD were liquidated (sold off at what they think their assets are worth), there would only be $520M left over after paying off all debt, and people say someone should swoop in and pay over $10B for AMD!
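In case the balance-sheet identity behind that is unclear, here is a minimal sketch using only the figures quoted above (the asset total is implied from them, not taken from a filing):

```python
# Shareholder equity = total assets - total liabilities.
liabilities = 3.07e9   # ~$3.07B total liabilities (figure quoted above)
equity = 0.52e9        # ~$520M shareholder equity (figure quoted above)

assets = liabilities + equity          # implied total assets, ~$3.59B
left_over_after_debt = assets - liabilities

print(f"Implied assets: ${assets / 1e9:.2f}B")
print(f"Left over after paying off all debt: ${left_over_after_debt / 1e9:.2f}B")
# Which is why a >$10B offer would be a huge premium over book value.
```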
 
Didn't the AI-driven vehicles have 4 GPUs, and weren't 2 of them supposed to be highly advanced?
 
Joke
—————
Your head.

And the Fermi NPP was named after Enrico Fermi; Pascal did work in mathematics and barometric pressure, hence the unit named the pascal.

Sue me. I thought you were serious. Hard to tell with all the nerd rage in this thread and sometimes on this forum LOL.
 
Pretty sure this will be a day-one buy for me to keep up with 4K gaming. Everyone with 1080 Tis will love to SLI by then, lol.
 

They are competing so well that their market share keeps plummeting, and they were the only one to post a decline in Q4, even after the RR launch.

"If only someone would bail them out." If it were such a great deal, someone would have done it ages ago.

You seem to blame something else for holding back a bailout, but there isn't anything. Arab oil money has already been tried, and in the end they wanted nothing from AMD besides cashing in.
 

So you think CFIUS would let AMD be taken over by a foreign company? That would never happen. Also, the x86 cross-license deal with Intel is unfortunately another factor that has kept a noose around AMD. They have the talent, just not the money.
 

Yes, I do. Just as the GloFo/AMD fabs were sold. Or how ARM chipmakers are sold left and right, and foreign companies occupy a much greater share of that space.
 

ARM is UK-based, so US regulators have no say over it. Fabs, especially GloFo's, aren't as important as the technology behind these x86 processors. There's no chance in hell that CFIUS and the Trump administration would let China get their hands on AMD.
 

What makes x86 so precious to the US? Also, there are x86 manufacturers outside the US. So what part of AMD would be an issue for US national defense?
 
Well, the fact that there are x86 manufacturers outside the US wouldn't stop CFIUS. More vital than x86 is memory, and CFIUS would not allow a Chinese company to buy a memory maker (they tried to buy Micron two years ago, and Micron didn't even entertain it due to regulatory concerns), and the biggest memory manufacturers are not in the US.
 
Soon?

It will be OVER two years since the introduction of the 1000 series GPUs.


Wow... has it really been that long?


Time flies.

I remember back in college, though: I went from a GeForce 2 GTS to a GeForce 3 Ti 500 to a GeForce 6800 GT in three years, and I SKIPPED TWO GENERATIONS (4 and FX).

Things certainly have slowed down a bit. It just doesn't feel like it, as time passes more quickly when you get older.
 
Ampere will be the follow-up to Volta, i.e. the next chip in the line: Volta, Ampere, Ohm, Watt... LOL, I love that one. Everyone chant OHM!!! OHM!!! OHM!!!
And notice the cadence: Volt x Amp = Watt... Amp^2 x Ohm = Watt... so Watt would be last in this line. At least that's what I think.
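For what it's worth, that cadence does check out; plugging arbitrary numbers into the standard power relations confirms the two forms agree (nothing NVIDIA-specific here, just a sanity check):

```python
# Sanity check: volts x amps = watts, and amps^2 x ohms = watts.
V, R = 12.0, 3.0          # arbitrary example: 12 V across a 3-ohm load
I = V / R                 # Ohm's law -> 4 A
print(V * I, I**2 * R)    # both print 48.0 (watts), so V*A and A^2*Ohm agree
```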

Since the next chip wasn't laid out at the last GTC, we don't know what comes after Volta, so that has to come out during this GTC. Not necessarily saying there will be a product right away, seeing as they just got Volta out the door.

And we may not see 7 nm if yields are low, so think about 10 nm as a possibility since it would be more mature; maybe even a refined 10 nm line.


My guess is we see consumer Volta next, and maybe a fall Ampere for HPC.


And don't fall for the idea that they are scared of Intel and AMD together... please don't make me laugh.

NVIDIA grew its share of the market with a 29.53 percent quarter-to-quarter increase in total GPU shipments, according to industry-tracking firm Jon Peddie Research. AMD shipped 7.63 percent more and Intel was up 5.01 percent during the same period.

in [H] news

really????.... scared???..... pffftttt!!!!!

If NVIDIA is scared of anything, it would be the HPC market and more players entering that field.
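To illustrate why growing shipments roughly 30% quarter-over-quarter while the others grow single digits means gaining share, here is a sketch; the starting shares are made-up placeholders, and only the growth rates come from the JPR figures quoted above:

```python
# Hypothetical starting unit shares (placeholders, NOT JPR data), grown by the
# quarter-over-quarter shipment increases quoted above.
start_share = {"NVIDIA": 15.0, "AMD": 13.0, "Intel": 72.0}        # assumed %
qoq_growth = {"NVIDIA": 0.2953, "AMD": 0.0763, "Intel": 0.0501}   # from JPR

units = {v: s * (1 + qoq_growth[v]) for v, s in start_share.items()}
total = sum(units.values())

for vendor, u in units.items():
    print(f"{vendor}: {start_share[vendor]:.1f}% -> {100 * u / total:.1f}%")
# The vendor growing ~30% QoQ gains share even though all three shipped more.
```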



And an update:

--------------------------------
New NVIDIA computing architecture after Volta, 11/21/2017. All the latest information as of 11/21/2017.

In the first half of next year, the not-fast-enough GDDR5 memory and the more expensive GDDR5X will give way to a new generation of chips: GDDR6. Samsung and SK Hynix are already producing the latter, and mass deliveries should begin in the coming months. Nevertheless, GDDR6 will appear in production video cards only in the spring, with the first announcements of NVIDIA GeForce and/or TITAN adapters based on 12 nm Volta chips. Earlier, Samsung Electronics announced plans to produce GDDR6 chips with data rates of 14 to 16 Gbps per pin (versus a maximum of 9 Gbps for GDDR5 and 10-11.4 Gbps for GDDR5X), and SK Hynix announced 8 Gb (1 GB) GDDR6 chips with data rates of up to 16 Gbps per pin.

The limited bandwidth of GDDR5 graphics memory in its day led to the appearance of video cards and HPC accelerators with HBM (HBM1) and, later, HBM2 buffer memory. Connecting the graphics core and the High Bandwidth Memory dies through a silicon interposer made it possible both to increase the bandwidth of the memory subsystem and to significantly reduce the area occupied by the key elements of the video card. At the same time, HBM/HBM2 solutions have many shortcomings: high production costs and, as a result, limited memory capacity; practically no possibility of swapping VRAM chips within a single GPU generation (again due to the additional costs); and a strong dependence on contractors. All this is why high-end video cards with GDDR5 and GDDR5X buffer memory continued to be released in parallel.

The next graphics architecture after NVIDIA Volta will be Ampere, named after the physicist André-Marie Ampère. According to the source of this information, NVIDIA is preparing to announce the architecture at the GPU Technology Conference next year. The first real product based on Ampere will be announced later and, in all likelihood, will be intended for supercomputers.

It's worth noting that NVIDIA has not yet released consumer video cards based on the Volta architecture, so only Pascal-based devices, introduced a year and a half ago, are available. If NVIDIA continues to use the four-digit numbering scheme, then, following the GeForce 10 Pascal family, we can expect a GeForce 20 Volta series and a GeForce 30 Ampere series. In the meantime, we can only guess. Against the background of the not-very-attractive price/performance ratio of AMD Radeon RX Vega adapters (especially in the first months after the announcement), GeForce 10 products are in great demand among gamers, so much so that NVIDIA, for example, decided to offer fans the TITAN Xp accelerator in a new design without changing its specifications.
 