cageymaru

AMD unveiled the world's first lineup of 7nm GPUs for the datacenter, which will utilize an all-new version of the ROCm open software platform for accelerated computing. "The AMD Radeon Instinct MI60 and MI50 accelerators feature flexible mixed-precision capabilities, powered by high-performance compute units that expand the types of workloads these accelerators can address, including a range of HPC and deep learning applications." They are specifically designed to tackle datacenter workloads such as rapidly training complex neural networks, delivering higher levels of floating-point performance and greater efficiency.

The new "Vega 7nm" GPUs are also the world's first GPUs to support the PCIe 4.02 interconnect which is twice as fast as other x86 CPU-to-GPU interconnect technologies and features AMD Infinity Fabric Link GPU interconnect technology that enables GPU-to-GPU communication that is six times faster than PCIe Gen 3. The AMD Radeon Instinct MI60 Accelerator is also the world's fastest double precision PCIe accelerator with 7.4 TFLOPs of peak double precision (FP64) performance.

"Google believes that open source is good for everyone," said Rajat Monga, engineering director, TensorFlow, Google. "We've seen how helpful it can be to open source machine learning technology, and we're glad to see AMD embracing it. With the ROCm open software platform, TensorFlow users will benefit from GPU acceleration and a more robust open source machine learning ecosystem." ROCm software version 2.0 provides updated math libraries for the new DLOPS; support for 64-bit Linux operating systems including CentOS, RHEL and Ubuntu; optimizations of existing components; and support for the latest versions of the most popular deep learning frameworks, including TensorFlow 1.11, PyTorch (Caffe2) and others.
 
Welcome to the age of trickle-down innovation.
AMD, Intel, and NVidia are focusing on products for the 0.00001% (Google, Facebook, Amazon), who don't care what the chips cost.
Maybe some day, we peons will benefit from these super-mega-ultra-chips, but don't hold your breath.
 
Welcome to the age of trickle-down innovation.
AMD, Intel, and NVidia are focusing on products for the 0.00001% (Google, Facebook, Amazon), who don't care what the chips cost.
Maybe some day, we peons will benefit from these super-mega-ultra-chips, but don't hold your breath.

99.9999% of people have no use for this product. Or any other datacenter specific product.
 
Welcome to the age of trickle-down innovation.
AMD, Intel, and NVidia are focusing on products for the 0.00001% (Google, Facebook, Amazon), who don't care what the chips cost.
Maybe some day, we peons will benefit from these super-mega-ultra-chips, but don't hold your breath.

You do understand that we do, right? And it has ALWAYS been like this. Lots of consumer CPUs etc. have been cut-down server designs, and back in the day people were even BIOS-modding consumer GPUs into their workstation equivalents, since everything else was just about the same. How you think we have not benefited from enterprise-level tech and designs, I have no idea. Consumer-level stuff puts little demand on the hardware; the server side has a lot of factors that actually need to be improved. The people on the [H] are NOT normal consumers, and make up a super small fraction of the consumer base.
 
Additionally, don't we see the trickle-down effect in literally every consumer GPU we have right now?
 
Welcome to the age of trickle-down innovation.
AMD, Intel, and NVidia are focusing on products for the 0.00001% (Google, Facebook, Amazon), who don't care what the chips cost.
Maybe some day, we peons will benefit from these super-mega-ultra-chips, but don't hold your breath.

You just mentioned 3 companies that almost everyone on the planet with an internet connection uses. Google, with all the schools that converted to Chromebooks and Google Docs, could consume entire data centers of capacity, and it's only growing. Facebook is the junker of the list, but there's no arguing how much impact it has today and how much traffic it moves. Amazon has AWS, the small-business lifeline of the present and the future. AMD and Intel know who will buy their cutting-edge gear by the crate, and it's not the Johnny Best Buy tablet-computer family.
 
[Attached photo of the card shown on stage.]


Man, that thing looks bigger than a Raspberry Pi
 
You just mentioned 3 companies that almost everyone on the planet with an internet connection uses.

I don't use any of them, except for having an Android phone, and that's not using Google's servers.

The point is, don't expect any of what was announced today to produce any improvement in gamer's rigs any time soon.
 
I don't use any of them, except for having an Android phone, and that's not using Google's servers.

The point is, don't expect any of what was announced today to produce any improvement in gamer's rigs any time soon.

That's it everyone, stop ALL R&D work, it's not going to affect gamers anytime soon, you heard it here first, people!
 
I'd be OK if this makes it into the consumer-side GPUs (like the Nvidia tech did).

There are a lot of data scientists having to use Nvidia equipment because AMD is lagging in the ML/AI compute field.
 
Welcome to the age of trickle-down innovation.
AMD, Intel, and NVidia are focusing on products for the 0.00001% (Google, Facebook, Amazon), who don't care what the chips cost.
Maybe some day, we peons will benefit from these super-mega-ultra-chips, but don't hold your breath.

I sense entitlement in your statement; we've benefited a shit ton from their innovations for a long time now. Or are you typing on [H] with a Cyrix chip right now?
 
I love new hardware because in a few years I normally get to play with it. And of course, for anyone who didn't know, you can buy an IBM POWER9 workstation that supports PCIe 4.
 
Welcome to the age of trickle-down innovation.
AMD, Intel, and NVidia are focusing on products for the 0.00001% (Google, Facebook, Amazon), who don't care what the chips cost.
Maybe some day, we peons will benefit from these super-mega-ultra-chips, but don't hold your breath.

Well we will see consumer cards based on this soon enough.

Having said that, who do you think will be using all the cloud services powered by this hardware? Do you not use Google? I'm sure you are one of the few not on Facebook.

It isn't trickle-down... it's powering the services real people use.
 
Welcome to the age of trickle-down innovation.
AMD, Intel, and NVidia are focusing on products for the 0.00001% (Google, Facebook, Amazon), who don't care what the chips cost.
Maybe some day, we peons will benefit from these super-mega-ultra-chips, but don't hold your breath.

You may be surprised how cheaply this hardware can sometimes be obtained. For example, in a few years when AMD releases another gen, they often liquidate all the stock they have for DIRT cheap; if you know the people who get first dibs, you can get parts for pretty dang cheap. Computers move very fast; it may be possible to obtain one of these for a reasonable price within the year if it's a popular product.
 
I don't use any of them, except for having an Android phone, and that's not using Google's servers.

The point is, don't expect any of what was announced today to produce any improvement in gamer's rigs any time soon.

Cloud computing for games and visuals.

While this is not big news for most of us, it's where most of the profits are. And AMD being a more profitable & competitive company does benefit us in the end one way or another.
 
Not like relational databases were built to keep track of the parts of the Saturn rocket, or like thread handling and a tonne of other shit was first invented for the LEM guidance computer, all stellar examples of everyday consumer hardware.

It's actually a change from the recent exception of consumer gaming hardware pushing basically everything. Nvidia crushed its parent SGI; now the circle is complete.
 
Welcome to the age of trickle-down innovation.
AMD, Intel, and NVidia are focusing on products for the 0.00001% (Google, Facebook, Amazon), who don't care what the chips cost.
Maybe some day, we peons will benefit from these super-mega-ultra-chips, but don't hold your breath.

PC tech has ALWAYS been trickle-down in this manner because the Enterprise/Business segments are the bulk of sales volume and revenue for manufacturers.

The consumer segment pales in comparison.

...the enthusiast portion (most of us here) of that consumer segment is but a tiny sliver.
 
The point is, don't expect any of what was announced today to produce any improvement in gamer's rigs any time soon.
Ha! Give it five years or less and we will be seeing the benefits.
As the others have stated, it has always been this way, even since the 1990s.

I remember in the late 1990s, only the military had access to 1000Base-T NICs and gear, and by 2001, it was fairly mainstream in mid-range desktops.
Even CPUs - VT-x and VT-d were both only offered on high-end Xeon CPUs from Intel, and within two years they were available on Core 2 CPUs and a bit later even on Atom CPUs.

So, by you saying "any time soon", that really means within a few years, at least going by history of computing, software development, and feature sets within the last 25 years.
 
Ha! Give it five years or less and we will be seeing the benefits.
As the others have stated, it has always been this way, even since the 1990s.

I remember in the late 1990s, only the military had access to 1000Base-T NICs and gear, and by 2001, it was fairly mainstream in mid-range desktops.
Even CPUs - VT-x and VT-d were both only offered on high-end Xeon CPUs from Intel, and within two years they were available on Core 2 CPUs and a bit later even on Atom CPUs.

So, by you saying "any time soon", that really means within a few years, at least going by history of computing, software development, and feature sets within the last 25 years.
Maybe he doesn't realize the Vega chip in workstation/datacentre currently is the same as the Vega in the desktop gaming version...
 
Then the consumer gaming variant of this GPU and memory stack (like Vega Frontier to Vega 64) would be, what, a 20-30% improvement over Vega 64 and on par with a 2080?
 
Then the consumer gaming variant of this GPU and memory stack (like Vega Frontier to Vega 64) would be, what, a 20-30% improvement over Vega 64 and on par with a 2080?
This is about what I estimated too.
They reckoned they wouldn't release it though... I think they may at the price the 2080 is going for...
 
This is about what I estimated too.
They reckoned they wouldn't release it though... I think they may at the price the 2080 is going for...

they need to price it like the 2070 if it has any hope. It's missing the RTX extensions.
 
Welcome to the age of trickle-down innovation.
AMD, Intel, and NVidia are focusing on products for the 0.00001% (Google, Facebook, Amazon), who don't care what the chips cost.
Maybe some day, we peons will benefit from these super-mega-ultra-chips, but don't hold your breath.

10 years from now you'll get one on Ebay for $100.
 
they need to price it like the 2070 if it has any hope. It's missing the RTX extensions.

Gimmick marketing.

Nvidia has had "RTX" tech since their volta chips. They can call there tensor units "RTX ray tracing McGimmick" all they like it doesn't change anything. Its just 16 bit tensor hardware. Nvidia wanted to find a use for it as it is baked into volta and beyond and finding a consumer use for it is logical. (They are not ever designing a real ground up game chip ever again, on one is NV, AMD, Intel. Money is in the server market.) No one wants to leave 20-30% of a die doing nothing when they use those same chips in consumer parts, or worse fuse it off so people can't use consumer cards over their big expensive server tensor enabled parts. So you have to hand it to NV Marketing they got out there and spun a great story. (I have ZERO doubt the game developers that have been talking about RTX... have also had their games running on beta AMD hardware.)

From AMD's MI60/50 release:
"ROCm has been updated to support the TensorFlow framework API v1.11.....
These low-level instructions implement compute operations all the way from single bit precision to 64-bit floating point. The most beneficial instruction for the acceleration of deep learning training is a float 16 dot product which accumulates into a 32-bit result, maintaining the accuracy of the operation."
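(A toy numpy sketch of why accumulating a float16 dot product into a 32-bit result matters; this only illustrates the numerics, it is not what the hardware instruction literally does.)

    import numpy as np

    rng = np.random.default_rng(0)
    a = rng.random(100_000).astype(np.float16)
    b = rng.random(100_000).astype(np.float16)

    # Accumulate entirely in float16: once the running sum gets large, the small
    # per-element products round away and the total stalls far below the truth.
    acc16 = np.float16(0)
    for x, y in zip(a, b):
        acc16 = np.float16(acc16 + x * y)

    # Multiply in float16 but accumulate in float32: the sum stays accurate.
    acc32 = np.float32(0)
    for x, y in zip(a, b):
        acc32 = np.float32(acc32 + np.float32(x * y))

    ref = float(np.dot(a.astype(np.float64), b.astype(np.float64)))
    print(acc16, acc32, ref)  # acc32 lands close to the float64 reference; acc16 does not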

AMD can call their machine learning (TensorFlow) bits super compute units or whatever they want. Bottom line, it's the same type of hardware you find on Nvidia's Volta-and-beyond server parts as well. The bottom line is that TensorFlow is a Google thing... they open-sourced it and it became a major standard for machine learning. Both NV and AMD have built hardware to accelerate it. (Don't let marketing fool you... NV isn't doing anything new or unique; they didn't invent TensorFlow... they are simply building to Google's API like everyone else is / will be. Intel is on the way as well, and no doubt their cards will also be aimed at the server market and the TensorFlow API.)

When AMD releases their next consumer card, don't worry, all the partial real-time ray tracing developers are talking about will work just fine... and most likely via Vulkan for either vendor.
 
I don't use any of them, except for having an Android phone, and that's not using Google's servers.

The point is, don't expect any of what was announced today to produce any improvement in gamer's rigs any time soon.

What does soon mean for you, 5 days?

Ha! Give it five years or less and we will be seeing the benefits.
As the others have stated, it has always been this way, even since the 1990s.

I remember in the late 1990s, only the military had access to 1000Base-T NICs and gear, and by 2001, it was fairly mainstream in mid-range desktops.
Even CPUs - VT-x and VT-d were both only offered on high-end Xeon CPUs from Intel, and within two years they were available on Core 2 CPUs and a bit later even on Atom CPUs.

So, by you saying "any time soon", that really means within a few years, at least going by history of computing, software development, and feature sets within the last 25 years.

I'm sure we'll see it at the consumer level (probably limited to the prosumer level, but consumer level nonetheless) within 5 months of its actual launch.
 
Where is this picture from? I can't find it in any of the press releases.

She showed it on stage and everyone went apeshit trying to take a picture of it... I'd be surprised if anyone other than the people in the front row, and the only 2 people there using real cameras, actually got a good picture of it, lol.
 
Right off the bat, if this read as a co-CPU giving you all that it talked about... without video at all... I'd buy one and put it in my PC.

But if this is a video card, then it either runs on par with an RTX 2080 Ti or... I'm sticking with Nvidia.

So if anyone out there at AMD is listening, start making AI coprocessing cards and I'll buy one of those.
 
Right off the bat, if this read as a co-CPU giving you all that it talked about... without video at all... I'd buy one and put it in my PC.

But if this is a video card, then it either runs on par with an RTX 2080 Ti or... I'm sticking with Nvidia.

So if anyone out there at AMD is listening, start making AI coprocessing cards and I'll buy one of those.

These ARE GPUs, but not video cards; they are used as accelerators like you wanted. If all you care about is speed for AI processing, then Nvidia also offers accelerators for that (already available).
 
Funny how AMD is all over the news when the stock dropped more than 40% in two weeks.

I feel sorry for whoever owned this stock.
 
Depends on where you purchased the stock. Although, with that said, it's over $20 again.
 