AMD XT Round Up

And here I was thinking AMD did all of this just to make more money.....silly me.

I'm with N4CR here.

The XT launch is clearly meant for people buying a NEW system. You'd be insane to "upgrade" to a 3600XT if you already have a 3600 or 3600X.

This launch coincides with the B550 launch, so people buying B550 are going to ponder this deal. Not X570 owners. But of course the old ones are still available, and available for a great price too, for anyone who wants value.

If anything, AMD is helping their other CPUs go down in price by basically competing with themselves. It's a great deal and a really exciting time for CPUs. But again, upgrading from an older version of a 3000 series chip to an XT is ludicrous. You'd have to be rich AND stupid to think this is an upgrade a 3000 CPU owner should actually buy. This is for NEW systems for long-term usage.

This is what you buy when you're replacing a PC that's already more than five years old and you want a refined Ryzen experience.
 
The only reason I've actually retired systems lately has been due to not having a role for them, or due to something like the motherboard degrading stability.

Not for being too old; I have an old Sandy laptop that will still do everything that a desktop user needs, so long as the grinding fan doesn't drive them away first :D
I still have a Core 2 Quad and a P4 system kicking around... I did finally retire my Phenom II X3 720 BE; I just didn't have a use for it anymore. The P4 system is probably next, since my Core 2 Quad is taking over the job of running my desktop CNC machine.
 
I looked at the benchmarks and it didn't seem like there was a big difference.

But, yeah, if you are building a new system, why not get an extra 100 - 200 MHz guaranteed?
 
Well if they changed the model number I could see that being a problem.

They just added an XT to the end, so they are clearly implying it's a better bin of the same product line, which is fine to me.
 
I looked at the benchmarks and it didn't seem like there was a big difference.

But, yeah, if you are building a new system, why not get an extra 100 - 200 MHz guaranteed?

3900X vs 3900XT, you pay ~$70 more for the same base clock and +100 MHz boost clock. The performance difference is in margin of error territory.

Why throw money at worthless parts? Who cares if the system is new or old; the XT parts are stupid and a terrible value. The extra money spent on literally no performance gains for an XT processor is better spent on literally anything else: motherboard, RAM, SSD, case, or lighting your cigar.

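To put rough numbers on that trade, here's a quick back-of-the-envelope sketch. The ~$430 3900X street price is an assumption implied by the ~$70 gap against the XT's $499 list price; the boost clocks are the rated 4.6/4.7 GHz figures.

```python
# Rough cost-vs-clock math, 3900X vs 3900XT. Prices are assumptions
# (street vs. list); boost clocks are the rated 4.6 GHz / 4.7 GHz.
x_price, xt_price = 430, 499      # USD, assumed 3900X street price vs. 3900XT list
x_boost, xt_boost = 4600, 4700    # MHz, rated single-core boost

price_premium = (xt_price - x_price) / x_price * 100   # ~16% more money
clock_uplift = (xt_boost - x_boost) / x_boost * 100     # ~2.2% more clock, best case

print(f"price premium: {price_premium:.1f}%")
print(f"best-case boost uplift: {clock_uplift:.1f}%")
# The boost bump only applies to lightly threaded bursts and the base clock
# is unchanged, so measured gains land well under that 2.2% ceiling.
```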
 
3900X vs 3900XT, you pay ~$70 more for the same base clock and +100 MHz boost clock. The performance difference is in margin of error territory.

Why throw money at worthless parts? Who cares if the system is new or old; the XT parts are stupid and a terrible value. The extra money spent on literally no performance gains for an XT processor is better spent on literally anything else: motherboard, RAM, SSD, case, or lighting your cigar.

Same argument for the 10900K in most cases. With good cooling you might get 3-5% at most, but that's something I'd rather have than not for $70; at that point it's a sunk cost. Each to their own.
 
10900K has 2 more cores for the same price as the 9900K.
I was meaning the 10900K is in a very similar product position and the single-core performance gap is effectively nullified stock for stock. It has two fewer cores than the 3900XT, though.
 
I was meaning the 10900K is in a very similar product position and the single-core performance gap is effectively nullified stock for stock. It has two fewer cores than the 3900XT, though.
Well, if you're in the market for building a PC and looking at a CPU then it would make no sense to choose the 3900XT over the 3900X as it offers no advantages for the higher cost. It does make sense to get the 10900K over the 9900K because Intel is charging the same amount of money for 2 more cores with a 100 MHz increase in base clock and a 300 MHz increase in boost.
 
Well, if you're in the market for building a PC and looking at a CPU then it would make no sense to choose the 3900XT over the 3900X as it offers no advantages for the higher cost.

There is a slight performance advantage and you know some people don't care about paying extra. The T is added value, and I'm not really joking.
 
There is a slight performance advantage and you know some people don't care about paying extra. The T is added value, and I'm not really joking.
Where is the performance advantage? From the reviews I've seen it's within 0.2%, which is a rounding error. The data shows no statistical performance difference between the X and XT.
 
Extra boost. Even if the real-world benefits are slim today, it's there, and it's better, so people will pay for it.
 
Extra boost. Even if the real-world benefits are slim today, it's there, and it's better, so people will pay for it.
While I agree people will pay for it, I don't see that boost as actually being realized. The measurable performance differences (both in number crunching and in power draw) are so minimal that they themselves approach the margin of error. I wonder if that difference would even be consistent between, say, 1,000-unit batches of each.
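For what it's worth, the kind of sanity check being described would look roughly like this; it's only a sketch, and the benchmark scores are invented for illustration, not measured data.

```python
# Sketch of checking whether an X-vs-XT benchmark gap stands out from
# run-to-run noise. The scores below are invented for illustration only.
from statistics import mean, stdev

x_runs  = [100.1, 99.6, 100.4, 99.8, 100.2]   # hypothetical 3900X scores
xt_runs = [100.3, 100.0, 100.5, 99.9, 100.4]  # hypothetical 3900XT scores

gap = mean(xt_runs) - mean(x_runs)
spread = max(stdev(x_runs), stdev(xt_runs))

print(f"average gap: {gap:.2f}, run-to-run spread: {spread:.2f}")
# When the gap is on the order of the spread (as here), one review's numbers
# can't separate the two chips, let alone whole production batches.
```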
 
I'm confused. Doesn't seem like a problem to me. Intel released binned chips with the 8086K and the 9900KS. If you want a binned chip, you pay a little extra for it.
 
While I agree people will pay for it, I don't see that boost as actually being realized.

There's also the better overclocking headroom and its associated benefits.

But AMD could put a black anodized heatspreader on these and change nothing else and still charge more.

That's a good idea. AMD should get me to do marketing.
 
I'm confused. Doesn't seem like a problem to me. Intel released binned chips with the 8086K and the 9900KS. If you want a binned chip, you pay a little extra for it.
Those are provably binned higher; the XT line falls short of the standard Intel set.
 
You got it in one; I would go XT if I had to do a non-TR build for the long term. It's just a process ramp, the same reason my late-production V64 can do what a full-power one does (and more) at 160-190W: simply better bins later in the manufacturing run.

I have a feeling that you're going to get the same process improvements in the regular lineup until the chips are EOL. These are just the ones binned for the specific single-core speeds.
 
Well, if you're in the market for building a PC and looking at a CPU then it would make no sense to choose the 3900XT over the 3900X as it offers no advantages for the higher cost. It does make sense to get the 10900K over the 9900K because Intel is charging the same amount of money for 2 more cores with a 100 MHz increase in base clock and a 300 MHz increase in boost.

I looked at this for my upgrade, but I have no idea why I'd want to pay a power and money penalty for some crap onboard graphics, or why the hell they decided to include onboard graphics at all. Maybe this time around they will give us an actually lower price on a KF version.
 
It can come in handy. I've had to fall back on integrated graphics on a few occasions including during parts swaps and stuff. If it doesn't affect the performance or tremendously change the price, then, meh.
 
It can come in handy. I've had to fall back on integrated graphics on a few occasions including during parts swaps and stuff. If it doesn't affect the performance or tremendously change the price, then, meh.

Same.

As to the rest, this is clearly addressing the new-purchase market segment. This most likely wouldn't appeal to someone upgrading; if not for cost, then simply because it isn't worth my time to take my machine apart for such marginal differences. That being said, if you're getting those brand-new boxes of gear... why not?
 
I looked at this for my upgrade, but I have no idea why I'd want to pay a power and money penalty for some crap onboard graphics, or why the hell they decided to include onboard graphics at all. Maybe this time around they will give us an actually lower price on a KF version.
Intel's GPUs are certainly not compute monsters, but I'll be damned if they're not extremely functional; more than needed for a desktop user, at that.

If you're a one or two monitor person, I get it; honestly the only thing keeping me from replicating and likely expanding on the four monitor setup I had at my last place is the WAF involved, and that plummets alarmingly when monitor count exceeds two!
 
It can come in handy. I've had to fall back on integrated graphics on a few occasions including during parts swaps and stuff. If it doesn't affect the performance or tremendously change the price, then, meh.

Are there any tests of Intel CPUs with integrated graphics that show whether the onboard GPU still drives the wattage up when a dedicated card is installed? I'm just not in love with 125W instead of 95W. If the chip isn't using the onboard graphics, is power consumption down? Because then I'd be more inclined.
 
None that I can think of, but that being said, it's always going to consume a little bit of power. It's going to take a couple of watts just to run enough power to it to keep it shut off, if that makes sense.

Probably not enough power to affect overclocking or boost when not in use.

And it might actually improve power delivery across the chip in theory depending on how it's connected.
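If anyone wants to poke at this themselves rather than wait for a review, one rough way on Linux is to watch the RAPL energy counters. This is just a sketch, assuming the kernel exposes the powercap zones and that the iGPU shows up as the usual "uncore" subzone; reading the counters generally needs root.

```python
# Rough per-domain power readout via the Linux powercap (RAPL) interface.
# Zone names vary by CPU; on client Intel chips the iGPU usually appears
# as the "uncore" subzone. Reading energy_uj typically requires root.
import glob, time

def read_energy_uj():
    zones = {}
    for zone in glob.glob("/sys/class/powercap/intel-rapl:*"):
        try:
            name = open(zone + "/name").read().strip()
            zones[name] = int(open(zone + "/energy_uj").read())
        except (OSError, ValueError):
            pass  # zone not readable (usually a permissions issue)
    return zones

window = 5                      # seconds, sampled while the dGPU drives the display
before = read_energy_uj()
time.sleep(window)
after = read_energy_uj()

for name, start in before.items():
    watts = (after[name] - start) / window / 1e6   # microjoules -> watts
    print(f"{name:12s} {watts:6.2f} W")
# If the graphics ("uncore") domain sits near zero with a discrete card
# installed, the iGPU isn't adding meaningfully to package power.
```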
 
Are there any tests of Intel CPUs with integrated graphics that show whether the onboard GPU still drives the wattage up when a dedicated card is installed? I'm just not in love with 125W instead of 95W. If the chip isn't using the onboard graphics, is power consumption down? Because then I'd be more inclined.
If you've looked at reviews of either CPU, you'll have had the opportunity to note that '95W' and '125W' are pretty arbitrary when you take the training wheels off of the CPUs. Something worth going back and looking at just for your own awareness on the subject.
 
3600XT and 3800XT are the worst offenders of this release. A 3700X costs $20-30 more than the 3600XT and will walk all over it in non-gaming tasks while being very similar in gaming. Same with 3800XT vs 3900X. At least the 3950X is quite a bit more expensive than the 3900XT, so it’s not as easy to make the jump. This release just seems like a waste, I’m not sure who exactly is buying these either since none of them perform meaningfully better than the Non-XT/Non-X versions.
 
3600XT and 3800XT are the worst offenders of this release. A 3700X costs $20-30 more than the 3600XT and will walk all over it in non-gaming tasks while being very similar in gaming. Same with 3800XT vs 3900X. At least the 3950X is quite a bit more expensive than the 3900XT, so it’s not as easy to make the jump. This release just seems like a waste, I’m not sure who exactly is buying these either since none of them perform meaningfully better than the Non-XT/Non-X versions.

My guess is the yields are so good now that they could justify doing something like this, because they have an excess of what's considered the top percentage of dies. I do agree the 3800XT and 3900XT seem to be a waste, but meh, might as well fill the price gaps if you have the dies to spare. I still blame the rumor mills and Reddit for overhyping these processors before AMD even announced the clock speed differences.
 
I still blame the rumor mills and Reddit for overhyping these processors before AMD even announced the clock speed differences.
AMD themselves marked them with '4.7 GHz boost'; had that been an attainable all-core overclock target, the hype would have been justified.

...it isn't. Not even close. Even the all-core gains reported for the XT CPUs, which are higher, don't result in corresponding real-world increases in performance, and they don't alter the competitive landscape.
 
If you've looked at reviews of either CPU, you'll have had the opportunity to note that '95W' and '125W' are pretty arbitrary when you take the training wheels off of the CPUs. Something worth going back and looking at just for your own awareness on the subject.

I have no plans to overclock. This is just a rendering computer for me. The reviews I've seen show a massive 295W at full-core load for the 10900K, while the 3900X is around 225W. But I wanted to know if some of that overage is due to the on-chip graphics, or if that power draw is insignificant and the Intel chip is just power hungry.
 
I have no plans to overclock. This is just a rendering computer for me. The reviews I've seen show a massive 295W at full-core load for the 10900K, while the 3900X is around 225W. But I wanted to know if some of that overage is due to the on-chip graphics, or if that power draw is insignificant and the Intel chip is just power hungry.

Intel is on 14nm++++, AMD is on TSMC 7nm. There's the difference. Go with whichever gives you the best performance for your use case.
 