3rd gen Threadripper, big fail and missed opportunity for AMD.

A monopoly has a specific definition. It is possible to illegally influence a market without being a monopoly. Bulldozer was a poor design, and there are no excuses to be made for it. AMD competed just fine - in fact DAMNED WELL - even against earlier shady Intel behavior from 1999 to 2006 or so. But you will recall that when they had the fastest products on the market and began approaching that 50% market penetration number, they charged a premium price for them because they could.

Monopoly - the exclusive possession or control of the supply of or trade in a commodity or service.

Emphasis on control. Just because spot sales at one point reached 50% does not change how much of the market Intel possessed. None of that has anything to do with the topic at hand. A judgment was made against Intel for using illegal practices to artificially constrain AMD's growth.

Again, this isn't about the merits of the technology; it's about illegal practices, which you seem to gloss over, marginalize, ignore, etc.

Again, you're making a category error. You know, I see this shit all the time these days: people trying to put a moral and ethical spin onto a corporation. It's like when Nike endorses some dude making a popular political statement and people think "OMG Nike is such a good company because they support [some dude]." No, they don't. They're just trying to sell shoes to somebody, and gullible customers are buying into this spin. Intel, and to a lesser extent Nvidia, have done shady things because they are in dominant positions and wish to stay there, not because they are more or less moral than the competition. I tell you this because if AMD continues to excel, at some point they may establish a dominant market position...

...and then you will find the shoe is now on the other foot.

Nice irrelevant example; almost an appeal to popularity. Making a marketing mistake by coming out with a Jeremy Epstein "Just Do It" line is not illegal in any way.

Long and short of it: you could be right, and someday AMD will corner the market and screw us all over. At this point it has not happened, so arguing as if it already has is disingenuous.
 
I wouldn't say a 16-core is redundant; TRX40 should have had a less expensive option in some form, IMO.

Personally, the lack of a cheaper entry point to TRX40 means I may stay on X399. I am an edge case, but I actually use all those PCIe lanes that X399 gets me (2x GPU, 2x x16 SSD cards, additional M.2 SSDs). To move down to the X570 mainstream platform would mean less bandwidth or less storage. Zen 2 would be great - as would PCIe 4.0 - but TRX40 and a 24c/48t CPU are overkill for my processing needs.

Fair point!
Lookee here... you might be in luck:
https://www.amd.com/en/products/cpu/amd-epyc-7302p
They have a 16c Epyc Rome already....
Edit: 128MB of cache for 16 cores, holy jeebus!! 8MB per core... that's a lot of SuperPi.

So I'd say it's a matter of time, capacity, and launch-benchmark curb-stomping (24-32c overkill) as to why there is no 16-core option yet.


And Intel dropping prices on their HEDT parts is still almost irrelevant, because they already get smoked by a 12-core desktop CPU... a 24-core on HEDT is a mutilation. Outside of edge cases like yours where you need the I/O, it's a no-brainer. A 16c TR3 may be more up your alley if you are patient. Nothing wrong with waiting if you're on X399 already...
 
Long and short of it: you could be right, and someday AMD will corner the market and screw us all over. At this point it has not happened, so arguing as if it already has is disingenuous.

Correct. The fact is, tech journalists could settle this debate relatively easily if they were honest.

I'll break it down. Intel's largest performance gains come in the area of cache performance. It is no accident that all of the vulnerabilities in the Intel architecture have centered on the cache.

The gaming performance that you see on the Intel platforms comes from the ridiculous speed of Intel's cache. The problem for Intel as it moves to smaller and smaller nodes is that the cache performance takes a hit.

It is a fact that, when it comes to manufacturing processors, no one makes transistors that switch as fast for processor cache as Intel does.

The problem is that, at the nodes we are now talking about, Intel has to face the reality that added latency is needed to make dies manufacturable past the 14nm node.

If they introduce the required latency, single-thread performance will drop and will look far more like everyone else's. This is an obvious problem across the board. It is why it has taken Intel so long, and it's the only explanation that makes sense.
 
I believe that Threadripper is targeted at people with professional workloads, not enthusiasts. If you do algo trading for a living, or are a video editor, you will pay $2K for a CPU that gets the job done 50-70% faster than Intel every single time, because your time is literally money. The price is justified.

If you are just an enthusiast who likes to stream and do occasional video work, then the 3950X is the CPU that is right for you, not Threadripper.
 
AMD has the performance crown. They can now charge a premium for having the fastest product. They could easily charge Intel prices, but the fact that they're charging 5% more per core compared to the previous gen, on a much faster architecture that is more expensive to produce, is great for us. They could have easily pulled an Intel and charged Cascade Lake prices considering the performance, but they didn't, causing Intel to drop their prices by 50%. Judging by the performance of the 3950X, AMD's pricing could be a steal for those who can actually make use of all that horsepower. Have you guys considered that alternative? Instead you want more for cheaper. What world do you live in? They have shareholders and employees as well.

Remember, Threadripper is a niche market; it accounts for what, 3%, of their total CPU sales? Volume is low, and the target segments are streamers, content creators, and people that have an assload of money and want the best. The first two have potential income, and the last don't have a budget. If you're complaining about the price, deal with it. You're not in the target segment even if you think you are.

One last point: these are Epyc cast-offs being repurposed for consumers. Enjoy the technology and R&D at a fraction of the cost, with higher clocks, because you don't need the thermal envelope of a server environment.
 
I believe that Threadripper is targeted at people with professional workloads, not enthusiasts.

Threadripper, and HEDT as a whole, sit somewhere in between. 'Professionals' are likely going to want Epyc or Xeon from a major vendor for the support, if time really is money, except where cost really is an issue; thus the market for HEDT will remain fairly small.

And with 8+ cores available in consumer sockets at much lower platform costs... it's likely to stay that way.
 
Threadripper, and HEDT as a whole, sit somewhere in between. 'Professionals' are likely going to want Epyc or Xeon from a major vendor for the support, if time really is money, except where cost really is an issue; thus the market for HEDT will remain fairly small.

And with 8+ cores available in consumer sockets at much lower platform costs... it's likely to stay that way.

HEDT is for the prosumer market, or the very small-business professional. The big-time professionals will use Epyc or Xeon, yes. However, where I disagree somewhat is the notion that HEDT will remain a small market. I think there's a lot of potential here. Everyone from GoPro owners who need to edit footage, to people with decent drones doing real-estate flybys, to lifestyle vloggers, to dudes with CNC machines in their garages doing test renders, is a potential customer for these things. I think HEDT is actually a market with serious expansion potential for this reason.

However, I also think that high-end AM4 CPUs will partially satisfy this need too. AMD is kind of self-cannibalizing in that respect. But whatever. Prosumer is a thing, in everything from cameras to audio equipment, and I think it's a market that will only grow bigger.
 
However, I also think that high-end AM4 CPUs will partially satisfy this need too. AMD is kind of self-cannibalizing in that respect. But whatever. Prosumer is a thing, in everything from cameras to audio equipment, and I think it's a market that will only grow bigger.

Honestly that's the coin-flip. The challenge I see for HEDT, regardless of vendor, is that consumer platform performance is increasing too. Given that people have been doing all of these things with far less powerful systems, it seems to me that they'd be weighing cost against the amount of speedup -- and in that, you're limited to those who can justify the cost in the first place, i.e. they have the disposable income like many of us, or they actually make money from their 'hobby' work.

Also note that for much of what you mention, especially the video stuff, people will need a very compelling reason to invest in more CPU over more GPU. A 3950X, a striped pair of PCIe 4.0 SSDs, and whatever GPU has the best transcoding block at the moment will likely be near indistinguishable in both output and time per job for most.

So, let me refine my point to take into account yours: I see the HEDT market growing, just not so much as a percentage of 'prosumer' desktops, and certainly not as a percentage of consumer desktops. As cheap as AMD's 7nm chiplet products can make cores, most people - including most prosumers - don't have a need for them.


As an example: so what if a render of whatever takes the length of a Simpsons episode instead of a run to the gas station? Is that worth ~2x the investment? I realize that for some that answer will be an unequivocal yes, I just don't expect it to be more than that.
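To make the "more GPU over more CPU" point concrete, here's a minimal sketch of handing an encode to a hardware block instead of x264 on the CPU. This assumes an ffmpeg build with NVENC support and an NVIDIA card present; the file names and bitrate are made up for illustration, not from the post.

import subprocess

# Re-encode a clip on the GPU's NVENC block rather than on CPU threads,
# leaving the cores free for whatever else the machine is doing.
subprocess.run(
    ["ffmpeg", "-y",
     "-i", "input.mp4",        # hypothetical source clip
     "-c:v", "h264_nvenc",     # hardware H.264 encoder exposed by ffmpeg
     "-b:v", "6000k",
     "output.mp4"],
    check=True,
)

For most jobs the output and wall time will be close enough to a CPU encode that the extra cores never earn their keep.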
 
I think the extra PCIe lanes over the extra cores might drive a few people to TR over AM4.
 
Honestly that's the coin-flip. The challenge I see for HEDT, regardless of vendor, is that consumer platform performance is increasing too. Given that people have been doing all of these things with far less powerful systems, it seems to me that they'd be weighing cost against the amount of speedup -- and in that, you're limited to those who can justify the cost in the first place, i.e. they have the disposable income like many of us, or they actually make money from their 'hobby' work.

Also note that for much of what you mention, especially the video stuff, people will need a very compelling reason to invest in more CPU over more GPU. A 3950X, a striped pair of PCIe 4.0 SSDs, and whatever GPU has the best transcoding block at the moment will likely be near indistinguishable in both output and time per job for most.

So, let me refine my point to take into account yours: I see the HEDT market growing, just not so much as a percentage of 'prosumer' desktops, and certainly not as a percentage of consumer desktops. As cheap as AMD's 7nm chiplet products can make cores, most people - including most prosumers - don't have a need for them.


As an example: so what if a render of whatever takes the length of a Simpsons episode instead of a run to the gas station? Is that worth ~2x the investment? I realize that for some that answer will be an unequivocal yes, I just don't expect it to be more than that.
I read a review about streaming while gaming, and basically any x264 preset beyond fast caused stutters. The higher-core-count processors (above 8) exhibited steady frame times without dropped frames at higher settings.

They also looked at streaming using the GPU, and even there the best setups were the high-core-count ones.
 
To complete the thought: we are already being pushed to find uses for the extra cores, even as regular gamers. I wouldn't be surprised if software were upgraded to account for multi-monitor setups, and per our last discussion, ray-traced games do use more CPU power, not just the GPU.

So streaming ray-traced games will require even more cores once the games that support the feature are ubiquitous.

Basically, even at the fast preset for x264, you'll need more than 8 cores to do it without stutter. You'll need at least 12 cores just to pull it off, which puts gamers who want to do this very close to HEDT, which means the number of people who need that many cores is a lot greater than before.
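If you want to reproduce the preset claim yourself, here's a minimal sketch of the idea: time how long x264 takes to chew through the same synthetic clip at each preset. It assumes ffmpeg with libx264 on your PATH; the resolution, bitrate, and preset list are illustrative, not the review's exact setup.

import subprocess, time

# Time an x264 software encode of the same 10-second synthetic 1080p60 clip
# at several presets; slower presets need proportionally more CPU headroom.
for preset in ["veryfast", "fast", "medium", "slow"]:
    start = time.perf_counter()
    subprocess.run(
        ["ffmpeg", "-y", "-loglevel", "error",
         "-f", "lavfi", "-i", "testsrc2=size=1920x1080:rate=60:duration=10",
         "-c:v", "libx264", "-preset", preset, "-b:v", "6000k",
         "-f", "null", "-"],   # discard the output; we only care about wall time
        check=True,
    )
    print(f"{preset:>9}: {time.perf_counter() - start:.1f}s for 10s of video")

If a preset can't keep up in real time on the cores a game leaves spare, that's where the dropped frames come from.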
 
No matter which way you lean on the arguments in this thread... back on topic somewhat: there's a reason I buy the high end of last year's model. Maybe that's "anti-enthusiast", but it works for me.
 
I read a review

Link a few.

You'll need at least 12 cores just to pull it off, which puts gamers who want to do this very close to HEDT, which means the number of people who need that many cores is a lot greater than before.

If you need to stream, you'll use another system to prevent local impacts to gaming regardless of the CPU you're running.

I wouldn't be surprised if software were upgraded to account for multi-monitor setups, and per our last discussion, ray-traced games do use more CPU power, not just the GPU.

They do -- and they need single-core performance to boot. Luckily AMD has finally released a Zen SKU that's nearly equal to the six-year-old Skylake CPUs.
 
No matter which way you lean on the arguments in this thread... back on topic somewhat: there's a reason I buy the high end of last year's model. Maybe that's "anti-enthusiast", but it works for me.

Built the wife a system. I had assorted Intel parts lying around, like a board and memory older than Ryzen, so I used an i3-9100F: the lowest-power quad-core for the current socket, but otherwise similar to the i5-6500 of yore, sans IGP. And I was enthusiastic about making it quiet in a Node 202 ITX enclosure ;)

It did get the RX 460 4GB too...
 
First- and second-gen Threadripper were a great opportunity for AMD to get enthusiasts onto their HEDT segment.

Many, many enthusiasts were waiting for a Threadripper with good single-core and multi-thread performance,
so the 3rd gen was the CPU everyone was waiting for, BUT...

Pricing is now stupidly high:
mainstream gained a lot more cores for the same money,
while HEDT is getting the same number of cores at a sharply increased price.

In the HEDT segment we don't expect more performance at the same price; we expect more cores at the same price!

Now that Intel is selling the 18-core i9-10980XE at only $999, an Intel core costs about $55, while the crappy AMD 3960X pushes that to about $58 per core at a whopping $1,400.
Never mind the stupidly priced 3970X at $2,000.
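If you want to check that per-core math, here's a throwaway snippet using the prices as quoted above (the post's numbers, not verified MSRPs):

# Back-of-envelope dollars-per-core, using the prices quoted in the post.
chips = {
    "i9-10980XE": (999, 18),   # (price in USD, core count)
    "TR 3960X":   (1400, 24),
    "TR 3970X":   (2000, 32),
}
for name, (usd, cores) in chips.items():
    print(f"{name}: ${usd / cores:.2f} per core")
# Prints roughly $55.50, $58.33, and $62.50 per core respectively.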

AMD is missing a big opportunity here; no enthusiast will spend that much on a dying platform like TRX40,
a platform that must die soon with the arrival of USB4 and DDR5.

BIG FAIL AMD

For the first time in a while, AMD will have a legitimately better HEDT platform. Why sell cheap when you are good? People will buy it like fresh hotcakes on a Saturday morning.
 
Is it a rip-off? People paid money for Intel HEDT for a specific reason. Were they ripped off, or, when you have what people need, do you have the right to get paid what it is worth?

Have you seen the AMD forum here?

"Finally, AMD will save us after Intel ripping us off all these years!"

And then, as predicted by literally anyone not sworn to team green, AMD proceeded to price like Intel. Just as they did last time.
 
Have you seen the AMD forum here?

"Finally, AMD will save us after Intel ripping us off all these years!"

And then, as predicted by literally anyone not sworn to team green, AMD proceeded to price like Intel. Just as they did last time.
16 cores for $749?!? Dude, once again you aren't even being serious. For Threadripper you're likely talking about 64 cores under $2,000. My Ivy Bridge six-core was $600.

Do you have a mode where you talk realistically? I guess you're good for entertainment.
 
You're speaking of 'now' versus 'then'. We can go further back if you like, to when we paid more than that for a single core.
Um, since all of us live in spacetime, I think 'now versus then' always applies. The problem is that my example was only four years ago, and even today Intel charges more per core, so pick your battles.
 
Um, since all of us live in spacetime, I think 'now versus then' always applies. The problem is that my example was only four years ago, and even today Intel charges more per core, so pick your battles.

And they generally have faster cores, so?
 
Yup. Which is why it should be done on a GPU if it's not done on another system.

If you had actually watched the video, they covered that. You can't stream with a webcam at that resolution using NVENC; only streaming on the CPU works in that setup.

So again, more cores mattered more than just high clocks. The extra cores are needed; there's no getting around that.

That's why I said a while ago that streaming RT-based games creates an additional need for more cores, as CPU utilization is much higher. So something like an Intel HEDT chip, a 3900, or even a 3950 would be considered the new minimum as people stream more RT-based content.

With AMD and Intel releasing these high-core-count processors, more and more scenarios that rely on them are coming into play.

I didn't make that claim.
You can be pedantic about this, but essentially ST performance and per-core performance are synonymous enough.
 
If you had actually watched the video, they covered that. You can't stream with a webcam at that resolution using NVENC; only streaming on the CPU works in that setup.

Which is why you'd want a separate machine if you're serious about it.

You can be pedantic about this, but essentially ST performance and per-core performance are synonymous enough.

Specificity isn't being pedantic.
 
Which is why you'd want a separate machine if you're serious about it.
So build another machine and increase TDP dramatically, instead of dropping in a better CPU? That's pretty... wasteful... lol. In the video, a 12-core was enough to solve the problem.


Specificity isn't being pedantic.
You're not being specific; you're just being pedantic and difficult, probably for no other reason than some sort of issue found within the DSM.

Can't help you with that, but I can provide benchmarks that prove a point, if you're willing to comprehend them.
 
So build another machine and increase TDP dramatically, instead of dropping in a better CPU? That's pretty... wasteful... lol.

And when another streaming-centric game comes along that starts dropping frames?

Right tool for the job. Encoding on the CPU is a stop-gap for not having the right tools in the first place.

You're not being specific; you're just being pedantic and difficult, probably for no other reason than some sort of issue found within the DSM.

You can keep your insults.

Can't help you with that, but I can provide benchmarks that prove a point, if you're willing to comprehend them.

That don't support your point? Cool.
 
And when another streaming-centric game comes along that starts dropping frames?

Right tool for the job. Encoding on the CPU is a stop-gap for not having the right tools in the first place.
That's why we have these things called "upgrades". A software package is released that requires more power, so people upgrade. Maybe you've heard of it.


You can keep your insults.
Not an insult, nor a diagnosis; just an observation.

That don't support your point? Cool.
If it didn't, you wouldn't have to keep arguing against the facts found in that link.

In fact, the information is so salient that you're now down to recommending people build an entire additional system instead of just using a 12-core processor. It's a good thing people are making high-core-count processors, so we don't need to double our electric bills.

It's a blessing, truly. Hopefully one day you'll celebrate with us. I pray you find peace in this new generation of CPU innovation.
 
That's why we have these things called "upgrades". A software package is released that requires more power, so people upgrade. Maybe you've heard of it.

Good, now you have the base for a second system!

In fact, the information is so salient that you're now down to recommending people build an entire additional system instead of just using a 12-core processor. It's a good thing people are making high-core-count processors, so we don't need to double our electric bills.

If they're serious about streaming, yeah.
 
Have you seen the AMD forum here?

"Finally, AMD will save us after Intel ripping us off all these years!"

And then, as predicted by literally anyone not sworn to team green, AMD proceeded to price like Intel. Just as they did last time.

If you are getting 64 cores for $1.5K, you are getting more than Intel offered with 18 cores, for less, and you are now getting Xeon core counts at a fraction of the price. That is not really a rip-off.
 