AMD Presents New Horizon

It is possible. "Neural network" is just a fancy term for several inputs being fed into a larger processing machine for an overall decision. Each of these inputs represents a wavelet or linear equation that can be stuffed into an AVX unit and evaluated. Where the scalar lands after some training determines whether the answer is true or false.
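For anyone curious what that "several inputs into one decision" idea actually looks like, here's a minimal single-layer perceptron sketch in Python. The weights and threshold below are made-up illustration values, not anything from the presentation:

```python
# A single-layer perceptron: several inputs feed one weighted sum,
# and a threshold turns the resulting scalar into a true/false decision.
# Weights and threshold here are hand-picked for illustration, not trained.

def perceptron(inputs, weights, threshold=0.0):
    # The weighted sum (this dot product is the part a SIMD unit like AVX
    # can evaluate across several lanes at once)
    scalar = sum(x * w for x, w in zip(inputs, weights))
    # Where the scalar lands relative to the threshold decides the output
    return scalar >= threshold

# Example: two inputs with weights/threshold chosen to implement logical AND
and_weights = [1.0, 1.0]
print(perceptron([1, 1], and_weights, threshold=1.5))  # True
print(perceptron([1, 0], and_weights, threshold=1.5))  # False
```

Training would just mean nudging those weights until the scalar lands on the right side of the threshold for the examples you care about.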

Yeah, I know what neural nets are, and I respectfully disagree. I really don't see how conflating this with neural networking is acceptable in this context. It's BS.
 
Getting a bit nervous about AMD making a habit of comparing against a $1000 chip...


Like, seriously: AMD has a history of making the worst possible decisions when handed golden tickets.

The ONLY way they are going to make buckets of cash from this new CPU is if it is so cheap that it's a plain, simple, no-other-choice move against buying the competition. Just like AMD's video cards: people will ALWAYS buy the competition when performance and price are equal. The general consensus is that MOST people will only jump the Nvidia ship when the price/performance is close to 60% on a comparably quick AMD card. In other words, AMD has to offer the same speed at 2/3 the price, or offer 30% faster performance on a similarly priced card.
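To put numbers on that claim: here's a quick price/performance sketch, with fps and prices entirely made up for illustration, assuming perf/$ is simply frame rate divided by price:

```python
# Illustrative price/performance math for the claim above.
# All fps numbers and prices are made up purely for the example.

def perf_per_dollar(fps, price):
    return fps / price

baseline  = perf_per_dollar(fps=100, price=600)  # the competition's card
amd_cheap = perf_per_dollar(fps=100, price=400)  # same speed, 2/3 the price
amd_fast  = perf_per_dollar(fps=130, price=600)  # 30% faster, same price

# Same speed at 2/3 the price works out to 1.5x the baseline's perf/$;
# 30% faster at the same price works out to 1.3x.
print(round(amd_cheap / baseline, 2))  # 1.5
print(round(amd_fast / baseline, 2))   # 1.3
```

Note the two options aren't actually equivalent: the 2/3-price route is a bigger perf/$ jump (1.5x) than the 30%-faster route (1.3x).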

This puts AMD at a HUGE disadvantage. Nvidia's mindshare is so strong that AMD needs to outright surpass Nvidia by a huge, obvious, and undeniable margin before average Joe Gamer will even consider moving from the USDA-approved, family-friendly Nvidia brand.

And my guess is that this is even MORE severe in the CPU space. My guess is that average 'buy their stuff from Best Buy' people won't even know about Ryzen CPUs unless they breach a 50% price/performance ratio. And that is just knowing about it: imagine what it's going to take to get them to even CONSIDER buying some strange, alternative, 'best value' plastic-bag-cereal brand called 'ATM or something' versus the family-friendly, 100% American, plays-ads-on-TV-with-a-cute-jingle, has-their-product-names-embedded-in-every-consumer's-mind brand Intel.

The fact that AMD is trying to set the expectation that Ryzen is worth $1000 makes me REALLY worried that they are going to shit the bed and ACTUALLY price Ryzen CPUs in the X99 CPU range.

If the biggest-dick-in-the-room 3.4GHz 8-core unlocked mega chip is a single penny over $399.99, it will collect dust on shelves. It's not fair, but it's the truth of the market right now.



Also, AMD put out the demo of Titan X SLI at 4K, one rig running on an Intel 6900K, the other on Ryzen, and said "you can't tell the difference!"
...
This is AMD here: if their CPU was ACTUALLY FASTER in games, they would milk that PR teat so dry that it would be ALL we would be talking about. Instead, AMD's CPU is shown to perform the same as an already-inferior CPU for gaming in a heavily GPU-bottlenecked workload. Titan X SLI at 1440p 144Hz would have been a MUCH more realistic and CPU-dependent workload.
Stop with the mindshare nonsense. If AMD's chip is genuinely competitive, they don't have to undercut by a massive amount. The chip will sell.
 
Blender render: 1:14.51, so 1 minute and 14.51 seconds, almost 1 minute and 15 seconds to complete on my i7 4770K @ 4.6GHz / DDR3 1600
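Blender's render timer reads as minutes:seconds.hundredths, which is easy to misread as hours:minutes:seconds. For anyone comparing results in this thread, a small helper to turn that format into plain seconds (the format assumption is based on the times people are posting here):

```python
def blender_time_to_seconds(t):
    # Blender's timer shows M:SS.hh (minutes, seconds, hundredths),
    # so "1:14.51" means 1 minute 14.51 seconds, not 1h 14m 51s.
    minutes, seconds = t.split(":")
    return int(minutes) * 60 + float(seconds)

print(round(blender_time_to_seconds("1:14.51"), 2))  # 74.51
print(round(blender_time_to_seconds("2:37.58"), 2))  # 157.58
```

Handy for working out ratios between chips instead of eyeballing the timestamps.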
 
He's not wrong.

Let's imagine Ryzen is legitimately faster in games compared to a 6900k.

How do you think that would go down, press-conference-wise? Wouldn't you think AMD would show off a number of games, at a variety of resolutions, showing that Ryzen tops the Intel chip every time?

Or would they show a single game at a GPU-bound resolution and say 'you can't tell the difference'?
It doesn't have to be faster, and a Titan X is the best you can get, so showing it does not slow it down is all they have to do, especially if the chip is much cheaper.
 
It doesn't have to be faster, and a Titan X is the best you can get, so showing it does not slow it down is all they have to do, especially if the chip is much cheaper.


Not really.

First off: you didn't answer my question. How would YOU show off that your chip is faster? And if it's NOT faster, why are you excited? A 6700K is already faster than the 6900K for every type of gaming workload, and AMD did not show their chip against the 6700K. In other words:

Why are you excited about AMD showing off a chip 'making no difference' against a less-than-optimal CPU in a GPU-bound gaming workload?


also:

It doesn't have to be faster, and a Titan X is the best you can get, so showing it does not slow it down is all they have to do, especially if the chip is much cheaper.

all they have to do especially if the chip is much cheaper.


if the chip is much cheaper.


 
A PCB teaser of the Gigabyte GA-AX370 Gaming K3.

oifngigvpd3y.jpg
 

I assume you're running the 64-bit benchmark? Well, my Ivy Bridge i5-3350P at stock speeds finishes the render in 2:37.58 on Windows XP x64 with a 32-bit Blender binary! (64-bit Blender wouldn't run for some reason; it should be a lot faster on x64):

blender.PNG


A CPU with half the cores, running at stock speeds and on a 32-bit binary instead of 64-bit, beats yours. I think you, sir, need an upgrade. :)
My OS choice probably makes a difference, but it shouldn't be that much faster than newer Windows versions.
 
Half the games I've played this year are stuttering on my 4670K, which is hitting 100% load.
Maybe a 6600K at 4.5+ GHz would resolve that but my next upgrade will be 4c/8t minimum.
That's a quad. We are finally getting to the point where more cores are generally better, especially if you run a lot of stuff. So far my pick for more cores (Intel(R) Core(TM) i7-4930K over here) has paid off big time. I'd gladly give up 5 FPS here or there in return for more programs / more VMs.
 
Hey Z, what was your memory speed there?

1866MHz, I think?

I haven't messed with it in a while. I believe that's where I've been running it.

I've had difficulty overclocking the RAM on this chip, and I didn't find that it made much of a performance difference in any of my normal tests, so I just left it at 1866 last I recall.
 
1866MHz, I think?

I haven't messed with it in a while. I believe that's where I've been running it.

I've had difficulty overclocking the RAM on this chip, and I didn't find that it made much of a performance difference in any of my normal tests, so I just left it at 1866 last I recall.

It seems to make a big difference in Blender?
 
He's not wrong.

Let's imagine Ryzen is legitimately faster in games compared to a 6900k.

How do you think that would go down, press-conference-wise? Wouldn't you think AMD would show off a number of games, at a variety of resolutions, showing that Ryzen tops the Intel chip every time?

Or would they show a single game at a GPU-bound resolution and say 'you can't tell the difference'?
For a press release? One or two. I can't remember anyone running an entire suite during a press release. The only time that has happened is when the product has been otherworldly, like Westmere or the Athlon 64. Zen doesn't need to be an Athlon 64; it just needs to be actually competitive more often than not. If that happens, it will be good enough. This isn't the GPU market. In the CPU market, every time Intel releases a new product, it inevitably means a new motherboard and a new socket, which almost always means new cooling. People are much more apt to switch when they have to throw out what they have anyway to upgrade. If this chip is competitive in perf/$ and perf/W, people will make the switch.
 
I get 1:04-1:05 on that benchmark on my 6700K @ 4.7GHz. RAM is still at 2133; it might get faster if I turn on XMP and run the RAM at 2666.
 
I wonder where everyone else was. Looks like Lisa put her face on this presentation. I am sure prior marketing failures led her to say, "just let me do this shit." She did look pretty damn confident in the IPC numbers when she was mentioning them, so I am sure they over-delivered here. They could use some of that after under-delivering with Polaris. Let me rephrase that: after fucking up Polaris so badly. lol


Lisa is, by FAR, the best presenter AMD has.
 
Getting a bit nervous about AMD making a habit of comparing against a $1000 chip...


Like, seriously: AMD has a history of making the worst possible decisions when handed golden tickets.

The ONLY way they are going to make buckets of cash from this new CPU is if it is so cheap that it's a plain, simple, no-other-choice move against buying the competition. Just like AMD's video cards: people will ALWAYS buy the competition when performance and price are equal. The general consensus is that MOST people will only jump the Nvidia ship when the price/performance is close to 60% on a comparably quick AMD card. In other words, AMD has to offer the same speed at 2/3 the price, or offer 30% faster performance on a similarly priced card.

This puts AMD at a HUGE disadvantage. Nvidia's mindshare is so strong that AMD needs to outright surpass Nvidia by a huge, obvious, and undeniable margin before average Joe Gamer will even consider moving from the USDA-approved, family-friendly Nvidia brand.

And my guess is that this is even MORE severe in the CPU space. My guess is that average 'buy their stuff from Best Buy' people won't even know about Ryzen CPUs unless they breach a 50% price/performance ratio. And that is just knowing about it: imagine what it's going to take to get them to even CONSIDER buying some strange, alternative, 'best value' plastic-bag-cereal brand called 'ATM or something' versus the family-friendly, 100% American, plays-ads-on-TV-with-a-cute-jingle, has-their-product-names-embedded-in-every-consumer's-mind brand Intel.

The fact that AMD is trying to set the expectation that Ryzen is worth $1000 makes me REALLY worried that they are going to shit the bed and ACTUALLY price Ryzen CPUs in the X99 CPU range.

If the biggest-dick-in-the-room 3.4GHz 8-core unlocked mega chip is a single penny over $399.99, it will collect dust on shelves. It's not fair, but it's the truth of the market right now.



Also, AMD put out the demo of Titan X SLI at 4K, one rig running on an Intel 6900K, the other on Ryzen, and said "you can't tell the difference!"
...
This is AMD here: if their CPU was ACTUALLY FASTER in games, they would milk that PR teat so dry that it would be ALL we would be talking about. Instead, AMD's CPU is shown to perform the same as an already-inferior CPU for gaming in a heavily GPU-bottlenecked workload. Titan X SLI at 1440p 144Hz would have been a MUCH more realistic and CPU-dependent workload.



People think I am crazy for thinking this, but... I think AMD should price this chip at $350-399. Precisely so that it WILL blow out sales for them and get them some much-needed revenue and profit. A fast nickel is better than a slow dime. They need the word to get out in a BIG way that the AMD chip is the way to go, and being similar to an Intel part for 15% less is not going to cut it with the massive mindshare Intel has. Relatively low pricing will amplify the success of this chip and help cut through YEARS of negative press and word of mouth regarding AMD CPUs.
 
Just for shits and giggles, and since I can't test my 6600K at home and my work PC's i5-650 would be pitiful, I tested this out on the server I keep here at work, running a Xeon E5-2670 at stock speeds.

Not too shabby for something from around the Sandy Bridge era.

Blender bench.png
 
I get 1:04-1:05 on that benchmark on my 6700K @ 4.7GHz. RAM is still at 2133; it might get faster if I turn on XMP and run the RAM at 2666.

Nope, increasing RAM speeds didn't change my result, so forget that idea.
 
Hmmm, would LOVE to see what my Xeon X5670 @ 4.6GHz could do... blah, I'd have to wait until I get home.
 
I wonder when samples will ship to reviewers. If AMD actually has a good product on their hands, they'll want to get them out as soon as they're stable.

The fact that they're not proactively backpedaling like they did with Polaris is a good sign (the RX 480 was meant to compete with the GTX 1080 and missed that mark by a mile).
 
While there's probably some spin on this, the 3.4GHz clock speed is encouraging (that's pretty darn good for an 8-core). At the very least, there appears to be little evidence of a Bulldozer-sized fail this time around. I'd be pretty surprised at this point if it doesn't at least match the 6800K in real-world threaded tasks. Considering Bulldozer at launch couldn't match the 2600K at anything, and struggled to separate itself from the 2500K even in threaded tasks, things look a lot better for Zen.

What I'm really interested in is what clocks the 4- and 6-core chips are going to run (if they even exist; I assume so). Let's be honest: even if the 8-core matches a 6900K, it's still not a chip the majority of us need. Most of us aren't sitting around encoding all day long, hammering on 8 cores. We're much better served by higher-clocked chips with lower core counts. The 6700K/7700K will still likely be the best gaming + all-around usage chip, barring some higher-than-expected clocks on the 4- or 6-core Zens.

My curiosity is definitely piqued.
 
I assume you're running the 64-bit benchmark? Well, my Ivy Bridge i5-3350P at stock speeds finishes the render in 2:37.58 on Windows XP x64 with a 32-bit Blender binary! (64-bit Blender wouldn't run for some reason; it should be a lot faster on x64):

View attachment 12525

A CPU with half the cores, running at stock speeds and on a 32-bit binary instead of 64-bit, beats yours. I think you, sir, need an upgrade. :)
My OS choice probably makes a difference, but it shouldn't be that much faster than newer Windows versions.
Yeah, that was 64-bit, and it kinda made me sad. Then I decided to try the 32-bit:

32bit run 1
32.PNG


32bit run 2
32_2.PNG


32bit run 3
32_3.PNG



Then I reloaded 64-bit and got this:

64bit run 1
Capture2.PNG


WTFMFFF?!!?!
So my CPU barely beat yours in 32-bit, which I kinda expected, being an FX and all, but what the hell's wrong with my 64-bit?!
 
Hmm, there's definitely something odd about the times they're getting in their bench. Even their 6900K put up a way faster time than people here are getting on similar chips. Well, considering AMD's history, I guess I shouldn't be surprised if there's some misdirection going on here. That said, she definitely did seem very confident when saying their new chip was a match for Intel's offering. I'm lowering my enthusiasm to "cautiously optimistic" until we get some real silicon reviewed in a month or two.
 
Well, we will see if this is all smoke and mirrors when we get the reviews. I am looking for similar-to-6900K performance at 2/3 the price or less. I don't even play games anymore, but my workload really benefits from more cores and threads; I wouldn't mind buying AMD again if it's really competitively priced.
 
This is reminding me all too much of the Bulldozer launch. They spent all their marketing effort trying to show how the 8150 'kept up' with the 980X in gaming, and how AMD was offering a '$999 chip for less than $400', forgetting to mention that the 8150 could hardly keep pace with a (then) current-generation i3 2100 in gaming.

Deja bullshit.
 
This is reminding me all too much of the Bulldozer launch. They spent all their marketing effort trying to show how the 8150 'kept up' with the 980X in gaming, and how AMD was offering a '$999 chip for less than $400', forgetting to mention that the 8150 could hardly keep pace with a (then) current-generation i3 2100 in gaming.

Deja bullshit.
Hahahahahaaaaa, oh, ummm, like my chip... :( Oh well, I don't use it for any of that shit anyway. I game, and it games just fine for now, and I surf, watch movies, and that's about it, so it's fine for that. But yes, anything rendering-heavy or very CPU-intensive and the poor thing just crumbles.
 
Please let this chip not be a dismal failure.

It would be great to see Intel get a good swift kick right in the ass.
 
For a press release? One or two. I can't remember anyone running an entire suite during a press release. The only time that has happened is when the product has been otherworldly, like Westmere or the Athlon 64. Zen doesn't need to be an Athlon 64; it just needs to be actually competitive more often than not. If that happens, it will be good enough. This isn't the GPU market. In the CPU market, every time Intel releases a new product, it inevitably means a new motherboard and a new socket, which almost always means new cooling. People are much more apt to switch when they have to throw out what they have anyway to upgrade. If this chip is competitive in perf/$ and perf/W, people will make the switch.

Pin compatibility is actually easy for them to do. The reason Intel didn't do pin compatibility with their higher-core chips was to increase margins by making the people who wanted more cores buy more expensive motherboards.
 