AMD Next Horizon Event Scheduled for November 6

AlphaAtlas ([H]ard|Gawd, Staff member)
Joined: Mar 3, 2018 · Messages: 1,713
A page for the "AMD Next Horizon" event has popped up on AMD's investor relations website. At their Q3 conference call last month, Laura Graves said that AMD will discuss technology "designed for the datacenter on industry-leading 7-nanometer process technology" at the event, and AMD unveiled Ryzen at their last "Horizon" event. While further specifics are scarce, there's a good chance AMD will discuss their upcoming Rome server CPUs and/or 7nm Vega at the event.

I would like to highlight some important dates for you. AMD's next Horizon event is scheduled for Tuesday, November 6, 2018, where we will discuss innovation of AMD products and technologies, specifically designed for the datacenter on industry-leading 7-nanometer process technology.
 
Methinks the next AMD GPU will have much better clock speeds when produced by TSMC. GlobalFoundries has been dragging AMD down, and the change is long overdue.
What AMD has achieved with Ryzen is impressive considering their R&D budget vs. Intel's, and the inferior GlobalFoundries process.
 
Who else read this as "AMD Next Event Horizon" and got a little excited?

 
Looking forward to seeing what the boys will be presenting on stage; hoping a little of it is something I can use and get excited about.
 
"Where we're going, we don't need ray tracing..."

I like the idea of an industry standard when it comes to ray tracing. Graphics fidelity has hit diminishing returns lately. I thought PhysX would be a nice way to move things forward, but that fizzled out (although in-game physics have come a long way). I would love to see some implementation of ray tracing in the new AMD hardware. At this point it would be an initial step, so nothing mind-blowing, but if the industry accepted it, I think it would move forward quite a bit. I love the eye candy, and there are times when I'm admiring the visuals and get blown away.

More cores and more speed are definitely great. But when they add more visual options, I'm always turning them up while sacrificing frame rate. I remember cranking things up in 3DMark just to see how great things could look, even at 2 FPS. T&L, bump mapping, pixel shading, etc. all helped increase visual quality, and I like seeing more of those kinds of things being added. Ray tracing seems like a nice addition at this point.
 
All roads lead to Rome

I heard rumors of a big announcement at the end of October with regards to next gen. I assumed it was a Zen 2 preview. It could be Rome/Epyc, or both. Plus I expect a preview of 7nm Vega. I'm betting it would get us up to 1080 Ti speeds. Price it at about $700 and it will put pressure on the 2080 to bring prices down.
 
THANK GOD!!! Something to read about next Tuesday besides election coverage!

Why do you think they picked this time slot for the demo? It's a great place to bury what could well be less-than-stellar news.

Just imagine Lisa Su striding confidently out onto the stage to speak in glowing terms about how AMD's newest 7nm GPU offerings will be almost competitive in raw processing power with Nvidia's product stack from 2014 or 2015, at only twice the power consumption.
 
Why do you think they picked this time slot for the demo? It's a great place to bury what could well be less-than-stellar news.

Just imagine Lisa Su striding confidently out onto the stage to speak in glowing terms about how AMD's newest 7nm GPU offerings will be almost competitive in raw processing power with Nvidia's product stack from 2014 or 2015, at only twice the power consumption.

I've got news for you: unless you dole out $1,200 for a 2080 Ti, there have been ZERO improvements in speed and power consumption since 2014/2015 from NVIDIA as well.

In other words, there is more than enough room to compete in the current product stacks. If 2080 prices don't come down, Vega at 7nm and $700 could be a very compelling option.

Tomorrow's announcement might also just be a demo of 12nm Vega for notebooks, which would be a huge disappointment.
 
You may have noticed from my post history that I'm not exactly a fan of Nvidia's extreme price gouging across their mid-range and higher product stack either.
 
What the fuck happened to their stock? Everything went to hell a few days after I told my wife how much money we have in stock. We don't have nearly as much now. No house for this big-mouth guy here.
 
What the fuck happened to their stock? Everything went to hell a few days after I told my wife how much money we have in stock. We don't have nearly as much now. No house for this big-mouth guy here.

Even though AMD's health in the CPU market and its overall financial picture have improved, they did not improve at a rate greater than the arbitrarily elevated rate Wall Street analysts expected. That, plus the decline in GPU sales due to the third giant crypto bust, caused AMD's stock to crater.
 
Even though AMD's health in the CPU market and its overall financial picture have improved, they did not improve at a rate greater than the arbitrarily elevated rate Wall Street analysts expected. That, plus the decline in GPU sales due to the third giant crypto bust, caused AMD's stock to crater.

Nvidia shed about $100. Were both companies making that much money from crypto?
 
The entire stock market pretty much took a violent dump, not just this tech sector... I won't state any theories on why, because it would probably lead to this thread being closed down... but let's just say a lot of people went for a Taco Bell fourth & fifth meal the night before...
 
What the fuck happened to their stock? Everything went to hell a few days after I told my wife how much money we have in stock. We don't have nearly as much now. No house for this big-mouth guy here.

There's a rough valuation that says the stock market will return a little over 7% per year, including an average of one recession every 10 years with an average 20% loss that year. Depending on how widely the market swings (variance factored into a Monte Carlo simulation), statistically you are better off keeping it in a broad variety of companies over a 10-year period. If you sell after a loss, you risk missing out on the gains that commonly appear a couple of years later.
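That back-of-the-envelope claim is easy to sanity-check with a quick simulation. Here's a rough Monte Carlo sketch in Python; the specific parameters (a ~9% mean / 15% volatility normal year, and a 10%-per-year chance of a recession year with a flat 20% loss) are my own illustrative assumptions, chosen to land near the cited ~7% long-run average:

```python
import random

def simulate_decade(n_trials=10_000, seed=1):
    """Rough Monte Carlo of the 'stay invested for 10 years' claim.
    Assumed parameters (illustrative only): normal years return
    ~9% with 15% volatility; each year has a 10% chance of being
    a recession year with a flat 20% loss."""
    random.seed(seed)
    total = 0.0
    for _ in range(n_trials):
        value = 1.0  # start with $1
        for _ in range(10):
            if random.random() < 0.10:
                value *= 0.80                        # recession year: -20%
            else:
                value *= 1.0 + random.gauss(0.09, 0.15)
        total += value
    return total / n_trials  # mean growth multiple over the decade

# Averaged over many trials this lands near 1.061^10, roughly 1.8x,
# i.e. a bit over 6%/yr even with the occasional 20% drawdown.
print(simulate_decade())
```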
 
nvidia shed off about $100. were both companies making that much money from crypto?

Profit per unit is a function of the number of units sold, because of fixed costs. So even a 1% difference in volume can matter a lot once you pull away from the baseline fixed cost.

What's a fixed cost? R&D, plus the cost of doing business (support, distribution, power, etc.).

The fixed cost per unit goes down the more units you sell:

Profit per sale = Sale Price - (MFG Cost + Fixed Cost / # of units sold)

Past break-even, per-unit profit climbs quickly toward the sale price minus the manufacturing cost.
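As a sanity check on that arithmetic, here's a tiny Python sketch of the per-unit formula; the $500 price, $200 manufacturing cost, and $300M fixed cost are made-up numbers purely for illustration:

```python
def profit_per_unit(price, mfg_cost, fixed_cost, units_sold):
    """Per-unit profit with fixed costs amortized over volume:
    Sale Price - (MFG Cost + Fixed Cost / units sold)."""
    return price - (mfg_cost + fixed_cost / units_sold)

# Hypothetical GPU: $500 price, $200 to build, $300M of R&D/overhead.
for units in (1_000_000, 2_000_000, 5_000_000):
    print(f"{units:>9,} units -> ${profit_per_unit(500, 200, 300e6, units):.2f}/unit")
# 1,000,000 units is exactly break-even ($0.00/unit); doubling volume
# to 2,000,000 yields $150/unit, and 5,000,000 yields $240/unit,
# approaching the $300 price-minus-build-cost ceiling.
```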
 
I like the idea of an industry standard when it comes to ray tracing. Graphics fidelity has hit diminishing returns lately. I thought PhysX would be a nice way to move things forward, but that fizzled out (although in-game physics have come a long way). I would love to see some implementation of ray tracing in the new AMD hardware. At this point it would be an initial step, so nothing mind-blowing, but if the industry accepted it, I think it would move forward quite a bit. I love the eye candy, and there are times when I'm admiring the visuals and get blown away.

More cores and more speed are definitely great. But when they add more visual options, I'm always turning them up while sacrificing frame rate. I remember cranking things up in 3DMark just to see how great things could look, even at 2 FPS. T&L, bump mapping, pixel shading, etc. all helped increase visual quality, and I like seeing more of those kinds of things being added. Ray tracing seems like a nice addition at this point.

Industry-standard ray tracing is fine; proprietary BS that is supported on only one brand is not. However, I am more interested in performance with "standard" graphics, not in manufacturers sacrificing valuable die space for a feature that may not even get implemented by the majority of developers.
 
Industry-standard ray tracing is fine; proprietary BS that is supported on only one brand is not. However, I am more interested in performance with "standard" graphics, not in manufacturers sacrificing valuable die space for a feature that may not even get implemented by the majority of developers.

I don't like NVIDIA, but I have to admit, their technology isn't proprietary. It's part of DX12 and AMD can implement it.
 
I don't like NVIDIA, but I have to admit, their technology isn't proprietary. It's part of DX12 and AMD can implement it.

I had assumed that NVIDIA's ray tracing implementation was proprietary. If it's part of DX12, then I'm happy. AMD (I still want to type ATI) will support it eventually to be "fully" DX12 compliant, especially with more games supporting the technology and it being fairly open (well, not proprietary).
 
RTX is proprietary; the DX12 version of ray tracing (DXR) is an open standard.

And most developers will target DX12, which will then call the RTX extensions in the driver.

It's not too dissimilar to Vulkan or Metal. You could exploit RTX directly with an extension if you needed to, but since there is a standard API, it's advantageous to use it to ensure the broadest compatibility, unless you need bleeding-edge performance and are willing to limit your target audience. As we have seen, only a limited number of titles exploit bleeding-edge Metal, since it takes too many developer resources to cater to one market. Even with AMD's extensions on consoles and a large installed base, it really hasn't helped them make much headway in the PC space. The number of games that used PhysX was also limited, and mostly for non-critical visuals.

I mean, PhysX still exists and is used in Unreal, but AMD cards still work with Unreal through standard calls, and they remain competitive on the Unreal engine.
 
RTX is proprietary; the DX12 version of ray tracing (DXR) is an open standard.

Also, technically, all of Nvidia's and AMD's Windows drivers are closed-source and proprietary, and they don't necessarily have to be. If you talk to any Linux devs, they will most definitely reiterate that point.
 
Profit per unit is a function of the number of units sold, because of fixed costs. So even a 1% difference in volume can matter a lot once you pull away from the baseline fixed cost.

What's a fixed cost? R&D, plus the cost of doing business (support, distribution, power, etc.).

The fixed cost per unit goes down the more units you sell:

Profit per sale = Sale Price - (MFG Cost + Fixed Cost / # of units sold)

Past break-even, per-unit profit climbs quickly toward the sale price minus the manufacturing cost.

That actually makes sense. I should be paying more attention to the economic side of things.
 
Looks like Team Red will be dominating on November 6th. Rapid Packed Math will make short work of those pesky NPCs.
 
I heard rumors of a big announcement at the end of October with regards to next gen. I assumed it was a Zen 2 preview. It could be Rome/Epyc, or both. Plus I expect a preview of 7nm Vega. I'm betting it would get us up to 1080 Ti speeds. Price it at about $700 and it will put pressure on the 2080 to bring prices down.
You are fantasizing. There will be NO 7nm Vega consumer card. That will come with Navi, around July or August.
 
You are fantasizing. There will be NO 7nm Vega consumer card. That will come with Navi, around July or August.

You could possibly be correct. They may talk about mobile Vega, the 590, or nothing at all.

But AMD wants to stay relevant in investors' eyes. They couldn't possibly sell enough 7nm Vegas in the professional space alone; it wouldn't be worth the investment.
 