AMD Cinebench Benchmark Demo at CES 2019 Buries the Current Intel Lineup

Referencing the previous generation (Ryzen 7 2700X) as evidence of the performance of the generation following it is ignorant and a fool's errand. I'm not even going to entertain this topic, since you don't seem to understand how completely irrelevant that is.

Who isn't paying attention? You? All you have to do is look to the past, my friend. Here, I'll help you.

https://cpu.userbenchmark.com/Compare/Intel-Core-i9-9900K-vs-AMD-Ryzen-7-2700X/4028vs3958

People are not "salty" ... they are "skeptical" ... big difference between the two.

We heard the same shit with Zen+ about AMD "mopping" the floor with Intel.

And who really gives an actual fk about "power savings"? Listen, if I own a Lambo, I'm gonna put in $4-a-gallon high-octane fuel and expect to get 5 miles a gallon. I seriously doubt kids are so concerned with saving a few pennies here and there.

But, I have my money ready. I'm a fan for performance, not a brand name.

Let's see how this thing overclocks and benches ...
 
You know what I like?

Computers. They're pretty neat.

The salt is pretty clear on this forum. Bunch of fanboys that overpay for hardware and have to justify it with flawed reasoning. Pay your $500 for an extra 5 fps. Everyone needs to stop pushing their buyer's remorse on all of us that just want to enjoy new tech and see what every company has to offer.
 
I wouldn't be surprised. Clearly, if something can't stand up to scrutiny, the next logical play is irrational fear mongering. Or FUD; that's a nice play Microsoft has perfected over the years.

You realize Intel has paid shills all over the net, right?

That DEFINITELY explains some of the "salt" and "naysaying".
 
He was pretty much spot on regarding his Ryzen 3 predictions.


He never said Ryzen 3 would be released at CES.

Yes you are correct. His slides say "TBA...To be announced" at CES.


Nostradamus made a career of astrology: spew ambiguous bullshit and leave it to people to distort reality enough to fit some of it. Predicting actual market events and product specifications is a tad more grounded than that. It's not fortune telling, it's educated guessing.

Exactly. We're all Nostradamus when it comes to future product launches. :)

AdoredTV makes two kinds of predictions: his own, and those he copies/steals from elsewhere. The second kind is usually correct, because it isn't made by him. The first kind is plain wrong, and it's the reason he has been changing his nickname over the last decade. AdoredTV is his fourth identity?

Are you still around?
 
If they refresh the cores to go with the DDR5 transition, they could also refresh this platform with the modular design by keeping this I/O chip.

It gives a lot of flexibility and should allow them to keep the APU side bang up to date.

AMD are still way smaller than Intel, and even in the best case will be for a long time. This lets them be agile with minimal resources.

Intel will struggle to keep up at high core counts on a monolithic design. Less so at low core counts, though.
 
I'm pretty sure AMD made Intel shit their pants and will make big gains; this will be the 2003-2006 period all over again.

The thing is, Intel 10nm is finally due to drop this year, which will put them back on top again. Not by that much, though.
 
First of all, it's VERY important that any potential customer of these AMD chips knows that AMD uses a multi-chip setup in these CPUs, so the I/O, the interconnects, the data pathways between these chips are slower than Intel's. There is an inherent latency that AMD faces; the numbers are very low, but still higher than Intel's. This is why Intel still beats AMD in many benchmarks, especially games. In fact, nearly all games.
Intel will eventually have to take a page out of AMD's book when it comes to making newer chips with more cores. Why? Because it comes down to complexity: the more cores you try to add to a chip, the harder it is to get a good enough batch of chips to sell. You can see this in how expensive the 9900K is. I have no doubt it's because Intel either has to throw a lot of imperfect chips away or end up disabling cores to make lesser i5s and even lesser i3s.
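The yield argument above can be made concrete with the textbook Poisson defect model: the probability that a die has zero killer defects falls off exponentially with die area, so a big monolithic die yields much worse than a small chiplet. The defect density and die sizes below are illustrative assumptions, not real foundry data.

```python
import math

def die_yield(area_mm2: float, defects_per_mm2: float) -> float:
    """Poisson defect model: probability that a die of the given
    area contains zero killer defects."""
    return math.exp(-area_mm2 * defects_per_mm2)

# Hypothetical defect density, for illustration only.
D0 = 0.002  # defects per mm^2

big_die = die_yield(180, D0)   # one large monolithic 8-core die
small_die = die_yield(75, D0)  # one small chiplet

print(f"large die yield: {big_die:.1%}")   # ~69.8%
print(f"chiplet yield:   {small_die:.1%}") # ~86.1%
```

Dies that fail outright get scrapped, while dies with a defect confined to one core can be sold as a lower bin, which is exactly the i5/i3 harvesting described above.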
 
Back on top... you sure of that?

They've been waiting for their 10nm process to mature for what, four years now? And AMD has just barely caught up to Skylake in that time?

Intel isn't always the best bet, but they're very rarely the worst.
 
Silicon is at its limits. There isn't a better way to go.
This. The more complex a system becomes (i.e., more cores), the more chances there are of something going wrong during the manufacturing of the chip. The days of monolithic chips are over; we need to transition to multi-module chips (chiplet-based packages), or the cost of making the chips of the future is going to be absolutely hideous.
 
Proof of this is in the Xeon line. They aren't expensive just because Intel likes to charge so damn much; it's because chips with that many cores in a monolithic package are [H]ard as fuck to manufacture. I'd not be surprised to see a failure rate of more than 65%, and that's being optimistic.
 
Proof of this is in the Xeon line. They aren't expensive just because Intel likes to charge so damn much; it's because chips with that many cores in a monolithic package are [H]ard as fuck to manufacture. I'd not be surprised to see a failure rate of more than 65%, and that's being optimistic.
No, the proof is in the Zen line. Nice try.
 
Ignoring the bickering in this thread, I only managed to watch the video now, and ho lee fuk, if what he is saying is true (meaning this was just a 65W-TDP lower-midrange non-X model), then the X models are going to be fricking awesome for both productivity and gaming. And the high-end models with more cores are going to be absolutely bonkers for productivity work. Poor man's semi-Threadrippers for everyone!
 
Proof of this is in the Xeon line. They aren't expensive just because Intel likes to charge so damn much; it's because chips with that many cores in a monolithic package are [H]ard as fuck to manufacture. I'd not be surprised to see a failure rate of more than 65%, and that's being optimistic.
You're right: monolithic is old school.
 
You're right: monolithic is old school.
Yep. If we want chips with more cores, the industry is going to have to adopt a multi-package or chiplet design, or you can kiss the idea of more cores goodbye.

Either that, or enjoy $1000 chips, and I'm sure nobody is going to like that.
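To put a rough number on the "$1000 chips" point, here is a back-of-envelope cost-per-good-die comparison under the same Poisson yield approximation. The wafer cost, usable wafer area, and defect density are all made-up illustrative values; the sketch only shows why two small, individually tested chiplets can come out cheaper than one big die of the same total area.

```python
import math

def cost_per_good_die(area_mm2, defects_per_mm2,
                      wafer_cost=10_000.0, usable_wafer_mm2=70_000.0):
    """Spread the wafer cost over the defect-free dies it yields.
    Ignores edge loss, test cost, and packaging for simplicity."""
    dies_per_wafer = usable_wafer_mm2 / area_mm2
    good_dies = dies_per_wafer * math.exp(-area_mm2 * defects_per_mm2)
    return wafer_cost / good_dies

D0 = 0.002  # hypothetical defects per mm^2

mono_cost = cost_per_good_die(360, D0)         # one big 16-core die
chiplet_cost = 2 * cost_per_good_die(180, D0)  # two known-good 8-core chiplets

# With these assumptions, the chiplet package comes out ~30% cheaper.
print(f"monolithic: ${mono_cost:.0f}   chiplets: ${chiplet_cost:.0f}")
```

The gap widens as dies get bigger or the process gets dirtier, which is the whole economic case for chiplets.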
 
Back on top... you sure of that?

Oh yeah. The shrink will be good for them, but more importantly the arch is getting some substantial improvements too. It'll push them back ahead. Ryzen 2 won't exactly be bad then, but it won't be a home run either.
 
Oh yeah. The shrink will be good for them, but more importantly the arch is getting some substantial improvements too. It'll push them back ahead. Ryzen 2 won't exactly be bad then, but it won't be a home run either.
It's already back and forth. Sorry, I don't see it happening. Should it happen, it will be short-lived.
 
It's already back and forth. Sorry, I don't see it happening. Should it happen, it will be short-lived.

AMD is going back and forth with Skylake. AMD's going to need more than the small tweaks from Zen through Zen+ to Zen 2 to take the lead.
 
AMD is going back and forth with Skylake. AMD's going to need more than the small tweaks from Zen through Zen+ to Zen 2 to take the lead.

No, I see Zen 2 besting Skylake even if the latter were on 10nm. It's the improvements in Ice Lake that will make the difference.
 
No, but I gave names of people who published the leaks/predictions before him.

Just to substantiate this point for all members: thanks for the mention; I did not know K.H. Chia. But it made me read his Twitter posts... pretty good, btw.
So I found this chiakokhua/status/1060526913536458753, dated earlier than the graph you posted as damaging to AdoredTV's Jim.

It's pretty clear K.H. Chia knows and follows Jim. In another tweet he basically uses a previous AdoredTV video as a source for new educated guesses. At the least, the mentioned graph could be collaborative, or inspired by it...

By the way, K.H. Chia has a whole thread on latency between chiplets and the I/O die.
The summary is this quote "The idea that Rome will experience a huge increase in latency just because the memory controller is moved to the I/O die needs to be debunked."
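Chia's point can be sanity-checked with a back-of-envelope latency budget. Every number below is an illustrative assumption, not a measured figure for Rome or any real part; the only point is that one extra on-package hop is a modest slice of an end-to-end DRAM access.

```python
# All numbers are illustrative assumptions, not measurements.
dram_access_ns   = 45.0  # row/column access inside the DRAM itself
controller_ns    = 15.0  # memory-controller queuing and scheduling
on_die_fabric_ns = 10.0  # core -> controller when they share a die
extra_hop_ns     = 8.0   # assumed cost of crossing to a separate I/O die

baseline = dram_access_ns + controller_ns + on_die_fabric_ns
chiplet  = baseline + extra_hop_ns

print(f"on-die controller: {baseline:.0f} ns")
print(f"via I/O die:       {chiplet:.0f} ns "
      f"(+{(chiplet - baseline) / baseline:.0%})")
```

Under these assumptions the added hop is roughly a 10% hit, not a doubling, which is consistent with the "needs to be debunked" framing; the real penalty depends on fabric clocks and topology.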
 
The thing is, Intel 10nm is finally due to drop this year...

And Intel's 10nm was due (according to Intel) to drop in 2018, and in 2017, and in 2016, and originally in, what, 2015?

At this point, what credence can be given to Intel's claims about when credible 10nm products, or 10nm products that can combat Ryzen, will actually release?

 