AMD Ryzen Threadripper 1950X and 1920X CPU Review @ [H]

I guess the question depends on how well TR overclocks, because I would not call the 16C a clear winner in this chart at all. I mean, 583 versus 589 is not that big of a deal, although the 589 is overclocked, and I expect a few hundred extra MHz from TR once it's overclocked too.
I'd say 4.1-4.2 GHz is looking to be the normal max under a heavy water OC. Also note that TR does not appear to have much optimisation yet, given the lower utilisation and lower scores than Ryzen in a few tests even at higher clocks. Considering it did this well anyway, in a few months it should do even better in those tests once the optimisations and compiler tweaks land.
 
Did people just read the same review I did? Intel stitched up Threadripper's ass with steel string in all but Blender, Cinebench, and POV-Ray. Games and Premiere prefer Intel. Hell, MP3 encoding and compression still prefer Intel. As a desktop processor, which I believe it is marketed as, the 7900X is the better chip. Is the AMD bandwagon so full of shit that people can't see? I'm glad that Threadripper exists to force innovation and competition, but get real, men (and women).

For Premiere, when looking at all the benchmarks today, the nod still goes to Threadripper 1950X over SKL-X 7900X, if just barely. As a workstation CPU, 1950X appears to be anywhere from roughly matching the 7900X to trouncing it by 30-40%. As they are the same price, and the 1950X gives you more lanes and ECC memory support, I'd say it wins as a rendering/video workstation CPU, except in a few edge cases. From what I saw at Anand and Toms, it looks to make a decent data workstation as well.

In gaming, there is no beating Intel's superior IPC, except in a few edge cases where the higher core count helps. But from a gaming standpoint, HEDT CPUs are a waste of money anyway. Criticizing the Ryzen 5 and Ryzen 7 CPUs for lackluster 1080p gaming performance was very fair, as they were marketed in that price bracket. But Threadripper really isn't in that bracket. It's still good to see gaming performance benchmarked against its competition here, but it's much more academic in this market.
 
For Premiere, when looking at all the benchmarks today, the nod still goes to Threadripper 1950X over SKL-X 7900X, if just barely. As a workstation CPU, 1950X appears to be anywhere from roughly matching the 7900X to trouncing it by 30-40%. As they are the same price, and the 1950X gives you more lanes and ECC memory support, I'd say it wins as a rendering/video workstation CPU, except in a few edge cases. From what I saw at Anand and Toms, it looks to make a decent data workstation as well.

In gaming, there is no beating Intel's superior IPC, except in a few edge cases where the higher core count helps. But from a gaming standpoint, HEDT CPUs are a waste of money anyway. Criticizing the Ryzen 5 and Ryzen 7 CPUs for lackluster 1080p gaming performance was very fair, as they were marketed in that price bracket. But Threadripper really isn't in that bracket. It's still good to see gaming performance benchmarked against its competition here, but it's much more academic in this market.
Or just game at 4K and the difference vanishes.

I am saddened by the 1440P numbers - but I suppose I can't claim to be that surprised.

For workstation and for 4K gaming it's a win overall, and it might just drive Intel prices down a little. Maybe.
 
Great review as usual, Kyle. Good to see some real competition for Intel, and at the right price!
 
Did people just read the same review I did? Intel stitched up Threadripper's ass with steel string in all but Blender, Cinebench, and POV-Ray. Games and Premiere prefer Intel. Hell, MP3 encoding and compression still prefer Intel. As a desktop processor, which I believe it is marketed as, the 7900X is the better chip. Is the AMD bandwagon so full of shit that people can't see? I'm glad that Threadripper exists to force innovation and competition, but get real, men (and women).

I'd still go AMD for more PCIe lanes and ECC memory support. I don't really care about losing 5% or so performance in games (looks like that's all it is, on average at higher resolutions). If Intel didn't gimp their HEDT platform compared to their server counterparts, then it would be a tougher choice...
 
Almost a hundred posts and no mention of how highly binned AMD said these chips are? I am disappointed. Did any other sites mention this? I was not sure if that was exclusive information or not.
 
Um TR won in Premiere.
[attached chart: Premiere Pro benchmark results]

Essentially, if it's productivity, TR is pretty much the winner, which is exactly why you buy such a platform. Throw in ECC, which is pretty much mandatory when you have your project sitting in memory, and I would question your understanding of the market for these chips. They are not for games. You want to game? Get a 7700K.
What are you reading? This shows Intel's 10-core CPU beating AMD's 12-core CPU in this test, and keeping up with the 16-core part when both are overclocked.
 
Fixed that for you. ;)
Oh, I agree, but if you want the most FPS and are willing to get an EOL socket, then the 7700K is still the best for pure gaming. For me, an OC'd R5 1600 would be just fine, as I am a price-to-performance guy. I'm GPU-bound anyway with a 1070 at 3440x1440, so it doesn't really matter.
 
If the few OC results we have continue to come in similar, then the top-5% binning statement likely has some substance to it. Didn't see this coming, to be honest, but it's a great move: make HEDT truly HEDT right down to the silicon pick, instead of the same dice roll as every other SKU. Reminds me a little of the early Opterons; they would OC so hard and were worth the slight premium.
 
Nice review. Glad to see there is some competition in CPUs again.

Looks to me like this thing should kick ass in servers. Lots more VMs in your hosts; blades using this would likely be RAM-limited, really... But for those types of blades, the $500 price difference isn't a big percentage change...

I like the extra PCIe lanes... I've been annoyed that the lane counts on Intel CPUs haven't really changed since forever. I suppose there's less need for that since SLI configs are on the decline... But in a server, wouldn't this allow for extra 10Gb NICs and HBAs? (Rough lane math below.)
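To put rough numbers on it, here's a back-of-the-envelope sketch in Python, assuming the commonly reported 64 CPU lanes on TR with 4 reserved for the chipset; the device list is just an example loadout:

# Rough X399 lane budget; device lane widths are example values.
budget = 64 - 4  # 64 CPU lanes, 4 reserved for the chipset link
devices = {
    "GPU (x16)": 16,
    "10GbE NIC (x8)": 8,
    "HBA (x8)": 8,
    "NVMe SSD (x4)": 4,
}
used = sum(devices.values())
print(f"Used {used} of {budget} lanes, {budget - used} to spare")

The same loadout on a 44-lane Intel part leaves far less headroom for a second GPU or more storage.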

Any chance we could get some IPC comparisons? Force clocks to 4 GHz and run some single-threaded tests (forcibly turn off the other cores, maybe) in those various benchies? My quick estimate was that it's 12 to 30% slower single-threaded...
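For what it's worth, here's roughly how I'd force a single-core run on Linux; a sketch only, where ./bench and the CPU number are placeholders, and os.sched_setaffinity is Linux-specific:

import os
import subprocess

# Pin this process (and any children it spawns) to logical CPU 0,
# so the benchmark can only ever execute on a single core.
os.sched_setaffinity(0, {0})

# Placeholder benchmark command; swap in whatever single-threaded
# test you want to compare at matched clocks.
subprocess.run(["./bench", "--threads", "1"], check=True)

Lock both chips to the same clock in the BIOS first, and the score ratio is essentially the IPC ratio.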

You mentioned these were binned. Is that all the CPUs going on sale being the top 5%, or just the ones used for reviews? If it's the latter, we should see more overclocking reviews using retail-purchased CPUs.

I get the feeling that for gamers, Intel is still the way to go for now. But if AMD can make inroads into the server market, which is much more lucrative, then Intel might finally get off its ass.
 
[attached image]


Surprising :eek:

I know that they compete head to head in the GPU market, but I've always gotten the feeling that Nvidia and Intel don't like each other very much. I would say that good AMD CPUs would be welcomed by Nvidia.
 
Thanks for the review. I turned to YouTube later to see other reviewers' opinions. Sifted through a lot of shit (personalities showing their deep Blue or Red hate) where 80% of the video was about flinging crap at one another before any of the benchmarks even started.

Glad to see TR; looking at getting one in a few months once new-platform issues, drivers, BIOS revisions, etc. have been worked out. Well, that all depends on how well it runs with Proxmox and handles hardware passthrough.
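For anyone else planning the same, the usual Proxmox passthrough recipe goes like this; these are the standard documented steps, the PCI address is just an example, and I haven't verified any of it on X399 yet:

# /etc/default/grub - enable the IOMMU on an AMD host, then run update-grub
GRUB_CMDLINE_LINUX_DEFAULT="quiet amd_iommu=on iommu=pt"

# /etc/modules - load the VFIO modules at boot
vfio
vfio_iommu_type1
vfio_pci

# /etc/pve/qemu-server/<vmid>.conf - hand the device at 01:00 to the guest
machine: q35
hostpci0: 01:00,pcie=1

Whether X399's IOMMU groups are sane enough for clean passthrough is exactly the part I'm waiting to see.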

As a primarily team Blue guy here, it's pretty cool to see AMD shake up the market; it will make Intel re-evaluate its core tax and its enterprise pricing options on consumer motherboards.

For those arguing in the thread: AMD finally has something to compete with, something that will end up making Intel innovate/evolve/adjust prices. If not, and AMD keeps making leaps in its processor technology, it's going to be a repeat of the earlier years when Intel was on the bottom and AMD was on top.

Either way, buy the processor that meets your expectations/needs, and STFU about the other side.

Besides, Intel DOMINATED AMD in Unigine...
 
I know that they compete head to head in the GPU market, but I've always gotten the feeling that Nvidia and Intel don't like each other very much. I would say that good AMD CPUs would be welcomed by Nvidia.

Ehhh... I do not see it that way. It seems the last few generations Nvidia has been on its own level for mid to high end; hell, the profits alone show that. But we shall see what Vega does; hopefully it brings some much-needed competition to the high-end market. While $800-$1200 every couple of years isn't so bad, I'd still rather not hand company X $800 in profit on my purchase due to a lack of competition.
 
Once again, you are spot on the money. I was wondering why the top 5% of the silicon still does not do more than 4 GHz. Didn't want to bring it up, as I don't see many people wanting to OC a workstation where more threads would be useful.
Am I the only one that thinks AMD is still catching up? They still don't have a product that can outrun an i7-7700 in games.

The way that AMD built the Ryzen stack, it's obvious that their goal was to create a base to scale on. Threadripper is an HEDT contender, no doubt; it excels at what it's designed for. I would be very happy to see one of these chips in the iMac Pro coming out at the end of the year, but sadly Apple's product description seems to indicate Xeon-only. :banghead:
 
What are you reading? This shows Intel's 10-core CPU beating AMD's 12-core CPU in this test, and keeping up with the 16-core part when both are overclocked.
Oh, I don't know, maybe I'm looking at the processor that's actually competing with the 7900X: the 1950X. Both are priced the same, therefore they compete with each other. Which is lower, 624 or 714? This isn't that hard of a math problem.
 
I know that they compete head to head in the GPU market, but I've always gotten the feeling that Nvidia and Intel don't like each other very much. I would say that good AMD CPUs would be welcomed by Nvidia.

Nvidia and Intel were at loggerheads a good few years back; Nvidia used to basically compare how many transistors were in their GPUs versus Intel's CPUs. Not sure if that was ever resolved.
 
Nice review, Kyle. If you do any follow-up, I'd be interested in seeing a little more detail on the power draw; Tom's showed pretty high idle power consumption, which isn't something I want in my next workstation build.

Also, I miss the review index at the bottom. I usually skip the gaming benches, and having to load every page, even the ones I don't care about, on my phone is kind of a PITA.
 
What are you reading? This shows Intel's 10-core CPU beating AMD's 12-core CPU in this test, and keeping up with the 16-core part when both are overclocked.
What price list are you reading? This shows Intel's $999 CPU losing (albeit slightly) to AMD's $999 CPU... Based on price, the competition for the i9-7900X is the 1950X, and the only thing Intel has in the "win" category is performance per core, which is not much of a win if you're not constrained by per-core software licensing.

X299 is (at best) completely pointless: other than more PCIe lanes (on the top-end CPUs only), it's no different from Intel's consumer Z270 chipset, and it's trounced by the features available on X399.
Unless Intel drops the price of the 7900X to $799, or you have one or two single-threaded apps you use constantly (while still using heavily threaded apps often enough to justify 10 cores), it's a tough sell.
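To put a number on that per-core licensing caveat, here's a back-of-the-envelope Python sketch; the $200/core/year fee is made up, so plug in whatever your software actually charges:

# Back-of-the-envelope per-core license cost; the fee is hypothetical.
LICENSE_PER_CORE = 200
for name, cores in {"7900X": 10, "1950X": 16}.items():
    print(f"{name}: {cores} cores -> ${cores * LICENSE_PER_CORE:,}/yr in licenses")

If you do pay per core, the 7900X's six fewer cores save real money every year; if you don't, that "win" evaporates.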
 
Nice review, Kyle. If you do any follow-up, I'd be interested in seeing a little more detail on the power draw; Tom's showed pretty high idle power consumption, which isn't something I want in my next workstation build.

Also, I miss the review index at the bottom. I usually skip the gaming benches, and having to load every page, even the ones I don't care about, on my phone is kind of a PITA.
As noted in the review, we are not done. IIRC, 100W idle with a 1080.

Yes that is coming back.
 
Now, if that top part had scored below the 7900X or cost more, then it would be a win for the 7900X. Until that happens, I'm looking at AMD's top SKU, because it costs the same as the 7900X.

I think a lot of people will purchase the 7900X for the same reason. I mean, the price being the same, it outperforms the 16C TR in the applications they want.

With that said, I want ECC, so sometime later this year I expect to purchase the 16C TR.
 
This reminds me of the Socket 939 days. I think the next few years should bring some nice advances in CPUs. Intel has enough money to get back to the drawing board and pull off another C2Q now that AMD's got something competitive.
 
Glad to see this is a stable platform (as compared to the Ryzen roll-out). Very curious to see how the 1900 performs. Hoping to build another system soon that will have the usefulness and longevity of my current build.
 
Even if you can OC a 7900X to be comparable to the 1950X, you are losing out on 20 PCIe lanes and ECC memory support. Anyone planning on using this for gaming, or gaming and streaming, is just wasting their money.

It is overkill for my current usage scenarios, but if I buy now and spread the cost over 5-7 years like I have with my i7-920, it's really not that bad.

Tough call either way, but I'm looking forward with a desire to future-proof a little, and AMD's offering has the clear price-to-performance win under its lid.
For now, anyway... by the time I save enough to buy a full system, who knows what will be out, but it'll mean Intel needs to get their ass in gear and finally compete instead of taxing consumers.
 
I think a lot of people will purchase the 7900X for the same reason. I mean, the price being the same, it outperforms the TR in the applications they want.

With that said, I want ECC, so sometime later this year I expect to purchase the 16C TR.
ECC is a big deal for me because my all-in-one powers the whole house, and now I can add a third video card. So now people can game in three rooms.
 
Newegg sells one (1) ECC DDR4 product: DDR4-2133 8GB CL15. Amazon has ECC DDR4-2666 8GB CL19. Seems like ECC options are sorely limited?
 
AMD needed to go with the Lego strategy they did, with each chip being a module. It lowers cost, even if it does introduce latency.
 
Once again, you are spot on the money. I was wondering why the top 5% of the silicon still does not do more than 4 GHz. Didn't want to bring it up, as I don't see many people wanting to OC a workstation where more threads would be useful.
Am I the only one that thinks AMD is still catching up? They still don't have a product that can outrun an i7-7700 in games.

My perception of the binning is that Ryzen is built on a process optimized for efficiency, not frequency headroom; therefore the binning is probably related to voltage (and thus heat). It makes logical sense considering there are four CCXs crammed in there.
 
Nice review, Kyle. If you do any follow-up, I'd be interested in seeing a little more detail on the power draw; Tom's showed pretty high idle power consumption, which isn't something I want in my next workstation build.

Also, I miss the review index at the bottom. I usually skip the gaming benches, and having to load every page, even the ones I don't care about, on my phone is kind of a PITA.

It was moved to the top of the page; took me a bit to find it, too.
 
Nice review, Kyle. Looks like I might have two builds in my future: one for work, the other for play. A TR4 and an X299. Two months should be long enough to firm up those decisions.

Now where did I leave that blow-up doll...
 
I probably missed it...

Is the motherboard used an E-ATX layout, since the CPU is giant? Or does all this fit in a standard ATX form factor?
 
If you have a case with a good place to put it, I have seen an inexpensive PCIe-to-PCI adapter on Newegg. I, too, have a need for PCI slots...
I've been eyeing that, but I still can't figure out the best way to mount it. I haven't seen any pictures or examples online anywhere.
 