> Heh, I remember that craziness. P3 at 1 GHz vs. P4 at 1.4 GHz. And the ~1.7 GHz chips needed a new mobo months later. That led me to a Duron, then an Athlon XP.

Socket 423 was never supposed to be long-lived. It was entirely transitional.
> You wonder if people are still going to buy this? Might be a few thousand reviews saying it's the best thing since grilled cheese. Raptor Lake isn't very desirable because of the voltage issue, so people might be more than happy to pick this up.

I think at this point you would have to be an Intel employee to not understand the competition is better in all metrics.
> You wonder if people are still going to buy this? Might be a few thousand reviews saying it's the best thing since grilled cheese. Raptor Lake isn't very desirable because of the voltage issue, so people might be more than happy to pick this up. Eventually Raptor Lake is going to be discontinued. OEM builders will have no choice but to use this chip if the customer wants an Intel build and doesn't want AMD.

They definitely still will buy it. Some because they only buy Intel, others because perhaps the pricing can make them attractive. I won't even mention the uninformed; they always make up a solid % as well, I'm sure.
> Looking at Gamers Nexus's power draw charts now, as he properly hooked up the 12 V lines (apparently Intel is now drawing CPU power from there): Jesus, this is really bad for Intel. Looking at MIPS per watt in zip compression, the 285 is only 25% better than the 14900, which would be great if the 14900 weren't so terrible in this metric. In that metric the 285 is pushing 1184 MIPS at 161 watts, while the 7950X is pushing 1936 MIPS at 133 watts. The one thing Intel said they fixed... eh, sort of. If you care about power efficiency, AMD is not just dominant, it's not even a contest.

Intel is at least going in the right direction with power draw now after it has been out of control.
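The efficiency gap quoted above is easy to check with the arithmetic itself. A quick sketch, using the MIPS and wattage figures from the post (chip names and numbers come from that post, not independent measurement):

```python
# Perf-per-watt comparison using the zip-compression figures quoted above.
def mips_per_watt(mips: float, watts: float) -> float:
    """Throughput per watt: higher is better."""
    return mips / watts

arrow_lake = mips_per_watt(1184, 161)  # Core Ultra 9 285, per the post
zen4 = mips_per_watt(1936, 133)        # Ryzen 9 7950X, per the post

print(f"285  : {arrow_lake:.2f} MIPS/W")
print(f"7950X: {zen4:.2f} MIPS/W")
print(f"7950X advantage: {zen4 / arrow_lake:.1f}x")
```

Run with those numbers, the 7950X lands at roughly double the MIPS per watt of the 285, which is why the post calls it "not even a contest."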
Did you watch the Tech Jesus review? Intel is now drawing power from the 12 volt lines. I doubt Phoronix was properly measuring power use. I doubt many people have been measuring it properly.
This is the setup Tech Jesus used... a smart way to get proper power results:
View: https://www.youtube.com/watch?v=nmK1rCyKbgQ
> Intel is at least going in the right direction with power draw now after it has been out of control.

While true, they went from crazy town to somewhat-less-crazy town. And they're hiding some of it by pulling more from the 24-pin connector... sneaky.
> I think at this point you would have to be an Intel employee to not understand the competition is better in all metrics.

At this point I'm curious to see if the people who were declining to use Ryzen's first three or so generations because the top Intel chips had 2-3 more FPS are going to switch to X3D chips.
> At this point I'm curious to see if the people who were declining to use Ryzen's first three or so generations because the top Intel chips had 2-3 more FPS are going to switch to X3D chips.

To me, that argument is ridiculous. Sadly, though, as a much lower-volume producer, the whole AMD line is scarce by definition, and the coveted class, like the X3D parts, have gone up in price due to supply and demand.
> If you "can't tell", is it worth the (sometimes) much, much higher price?

I definitely remember people here during the Zen 1-2 time frame insisting on the top Intel CPUs over any Ryzen one because they had higher FPS, even when it was down to under 1% in reviews, and regardless of price, because they wanted the absolute best. That's why, of course, anyone spends money on a 90-series video card for gaming. And there are people who are willing to buy the top chip as it comes out to get a few more FPS. Maybe not every generation, but certainly at least some of the time. So that would imply they should be willing to (say) replace a 14900K with a 9800X3D for an extra couple of percent, regardless of how much it'll cost to buy a new CPU and motherboard... unless that was just a cover for "I only buy Intel". Which is fine, as long as they are willing to acknowledge it.
> I definitely remember people here during the Zen 1-2 time frame insisting on the top Intel CPUs over any Ryzen one because they had higher FPS, even when it was down to under 1% in reviews, and regardless of price, because they wanted the absolute best. That's why, of course, anyone spends money on a 90-series video card for gaming. And there are people who are willing to buy the top chip as it comes out to get a few more FPS. Maybe not every generation, but certainly at least some of the time. So that would imply they should be willing to (say) replace a 14900K with a 9800X3D for an extra couple of percent, regardless of how much it'll cost to buy a new CPU and motherboard... unless that was just a cover for "I only buy Intel". Which is fine, as long as they are willing to acknowledge it.

There can be "other reasons" having to do with buses, lanes, USB, etc. That is, tech advances that can be important. But most gamers don't care so much about that. One that did matter is storage. With the move to SSDs, and fast SSDs, as game and texture load times, shader compiles, etc. came into play, sometimes that storage path became a factor. But YMMV (greatly) on that one. GPU cards that have to have Resizable BAR or they suck... etc... can make that upgrade a necessity. Lower-end cards that assume higher-end buses (what??!!??), so that they nuke the available lanes assuming your lanes are faster, etc.
> I definitely remember people here during the Zen 1-2 time frame insisting on the top Intel CPUs over any Ryzen one because they had higher FPS, even when it was down to under 1% in reviews, and regardless of price, because they wanted the absolute best. That's why, of course, anyone spends money on a 90-series video card for gaming. And there are people who are willing to buy the top chip as it comes out to get a few more FPS. Maybe not every generation, but certainly at least some of the time. So that would imply they should be willing to (say) replace a 14900K with a 9800X3D for an extra couple of percent, regardless of how much it'll cost to buy a new CPU and motherboard... unless that was just a cover for "I only buy Intel". Which is fine, as long as they are willing to acknowledge it.

I remember doing the testing on this at the time. The difference between Zen 1/2 and Intel CPUs in games was a lot bigger than that... at 1920x1080. The argument was that the Zen CPUs were often better at everything else once their core counts went up, and of course when you were in a more GPU-limited scenario such as 4K gaming. Even Zen 3 didn't achieve parity with Intel in gaming, though it was a lot closer.
> There can be "other reasons" having to do with buses, lanes, USB, etc. That is, tech advances that can be important. But most gamers don't care so much about that. One that did matter is storage. With the move to SSDs, and fast SSDs, as game and texture load times, shader compiles, etc. came into play, sometimes that storage path became a factor.

I have to be honest, it took MUCH longer for games to make use of NVMe drives than I thought.
> How long did it take for AMD to resolve those issues back in the day? By the Ryzen 3000 series, I feel?

Zen 2 was the first chiplet gen (3000), so AMD pretty much nailed chiplets out of the gate. To be fair, I think they were experimenting with Zen+ Threadripper.
> Zen 2 was the first chiplet gen (3000), so AMD pretty much nailed chiplets out of the gate. To be fair, I think they were experimenting with Zen+ Threadripper.

Totally agree.
Zen3 was a home run.
> I have to be honest, it took MUCH longer for games to make use of NVMe drives than I thought.

It's not enough for a machine to just have an NVMe drive; the drive needs to have a controller with some specific functionality.
> Zen 2 was the first chiplet gen (3000), so AMD pretty much nailed chiplets out of the gate. To be fair, I think they were experimenting with Zen+ Threadripper.

I was talking about the Infinity Fabric latency issue... I imagine that's the closest AMD tech to the new Intel interconnect, both using a package substrate, but that could just be two companies using similar words for different techs (I really don't know much about CPU internals): connecting tiles vs. CCXs.
> Looking at many benchmarks, there is not just strange deviation from run to run (and/or chip to chip); there are also many cases where the 285K, 265K, and 245K have the exact same performance on tasks where other CPU SKU families scale naturally, indicating an important bottleneck (probably all three CPUs have a very similar latency issue). That seems to show why the launch was so difficult (pushed back many times, last year's version just fully cancelled), and maybe why Lunar Lake has its RAM soldered as close as it can be, to counterbalance the added memory-controller-tile latency.
>
> How long did it take for AMD to resolve those issues back in the day? By the Ryzen 3000 series, I feel?
>
> https://techporn.ph/wp-content/uploads/AMD-Ryzen-5-1600X-Review-5.png
>
> Intel had better reduce that issue faster, by the next launch, and it had better be in 2025.

AMD had a good handle on the problem by the 3000 series. Prior to that, not so much. The Threadripper CPUs in particular were horrible in that regard.
> I remember doing the testing on this at the time. The difference between Zen 1/2 and Intel CPUs in games was a lot bigger than that... at 1920x1080. The argument was that the Zen CPUs were often better at everything else once their core counts went up, and of course when you were in a more GPU-limited scenario such as 4K gaming. Even Zen 3 didn't achieve parity with Intel in gaming, though it was a lot closer.

100% agreed on the importance of minimum frames. Gaming-focused reviews really need to focus on the min FPS, as that's the only thing that really matters. It takes more work, since some games will noticeably stutter from 0.1% dips and others will seem smooth even at 1% dips. Average FPS in gaming is almost completely meaningless, yet it's the only number most people focus on. When HardOCP was a review site, they recognized this problem and reported maximum playable settings instead of meaningless averages and benchmarks. I really wish a modern reviewer would take up the mantle and use that method, or at least focus most of the review on minimum FPS and leave the averages for a graph at the end.

Now, I do remember a lot of sites showing similar average FPS numbers, etc. on Zen 2 CPUs and even Threadripper. But I had a 2920X and it was garbage even at 4K gaming. When I dug into that, I realized that Zen was giving me much higher maximum frames but also much worse minimum frames than the 9900K did at the time. The average reports would have led you to conclude parity, but in the actual gaming experience this wasn't the case. This is why I went back to Intel despite otherwise really liking my Threadripper 2920X.
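The "1% low" and "0.1% low" metrics discussed above come from per-frame timing captures rather than an FPS counter. A minimal sketch of the usual method (averaging the slowest slice of frames, then converting back to FPS); this is a generic illustration, not any particular reviewer's pipeline:

```python
# Compute a "percentile low" FPS figure from per-frame render times (ms).
def percent_low_fps(frametimes_ms: list[float], pct: float = 1.0) -> float:
    """Average the slowest pct% of frames and convert that to FPS."""
    worst = sorted(frametimes_ms, reverse=True)      # slowest frames first
    n = max(1, int(len(worst) * pct / 100))          # size of the slow slice
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# 99 smooth frames at ~16.7 ms plus one 50 ms stutter: the average FPS
# barely moves, but the 1% low drops sharply and exposes the hitch.
frames = [16.7] * 99 + [50.0]
print(percent_low_fps(frames, pct=1.0))
```

This is exactly why two CPUs can show near-identical average FPS while one of them stutters: the average buries the slow frames, and the percentile lows surface them.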
> To me, that argument is ridiculous. Sadly, though, as a much lower-volume producer, the whole AMD line is scarce by definition, and the coveted class, like the X3D parts, have gone up in price due to supply and demand.

IMO, the lack of Zen 4 X3D supply is artificial.
> So I'm with "the rest" who say to wait for the 9000-series X3Ds and see if the whole market stabilizes. I mean, even today, for single-percentage gains, the cost of Zen 4 vs. older Zen 3 may not be worth it. That is, there might be huge savings in going Zen 3 if you're on older hardware (older than Zen 3) vs. going with newer (Zen 4) yet soon-to-be-displaced gens. Plenty of data out there; graphs make things "look bigger". Consider the cost of those few extra fps, especially when they are fps where you can't really notice the difference anyhow.

Zen 5 X3D will be expensive at launch, but prices will fall quickly. Margins on desktop Zen 5 are really high. I don't think it costs much more (if anything extra at all) to produce Zen 5 vs. Zen 4. Zen 5 non-X3D prices have already dropped a lot, and it's only a month old. I bet we see a $350 9800X3D by tax season.
> I definitely remember people here during the Zen 1-2 time frame insisting on the top Intel CPUs over any Ryzen one because they had higher FPS, even when it was down to under 1% in reviews, and regardless of price, because they wanted the absolute best.

Two words: memory controller. Zen's IMC/IF arrangement has continued to be the showstopper for me, since I have too many apps that scale with memory speed (image and video processing, data analysis, local LLMs). I'd hoped AMD would finally improve it with Zen 5, but nope; they decided they're comfortable coasting and that only enterprise dies would get that needed improvement.
> I have too many apps that scale with memory speed (image and video processing, data analysis, local LLMs).

Not updating the I/O die and infinity cache speed for Zen 5 desktop was so stupid.
> Not updating the I/O die and infinity cache speed for Zen 5 desktop was so stupid. That said, some of this stuff is a software issue. For example, Photoshop recently updated, and now AMD has clear wins in just about everything in Photoshop, whereas Intel had held the top position in Photoshop for a long time.

Assuming they do it for Zen 6, then it'll replace my current 14900K systems, and probably my Zen 3 systems. And I give big props to AMD for continuing to carry forward a design decision that Intel abandoned on consumer dies after 10th/11th gen: PCIe bifurcation down to 4x4x4x4 (i.e., quad-NVMe cards). The misconception is that PCIe bifurcation is a motherboard design decision, but no, it's a CPU die design decision that trickles down.
> Not updating the I/O die and infinity cache speed for Zen 5 desktop was so stupid. That said, some of this stuff is a software issue. For example, Photoshop recently updated, and now AMD has clear wins in just about everything in Photoshop, whereas Intel had held the top position in Photoshop for a long time.

There's a huge amount of room to optimize for the 9000-series CPUs, and I expect we will see more and more of this.
> OOF. It's almost as if they were so hyperfocused on AI they forgot about traditional performance. How on earth does it perform worse than previous gen? That's a new one.

Not as new as you think... I can think of two other instances of this:
1) When Intel switched from Pentium III/P6 to Pentium 4/NetBurst
2) AMD's FX series compared to Phenom II
> Not as new as you think... I can think of two other instances of this:
> 1) When Intel switched from Pentium III/P6 to Pentium 4/NetBurst
> 2) AMD's FX series compared to Phenom II

Us old folk...
> Us old folk...

Ugh...
> Two words: memory controller. Zen's IMC/IF arrangement has continued to be the showstopper for me, since I have too many apps that scale with memory speed (image and video processing, data analysis, local LLMs).

So in other words, probably not games, which I had hoped was implicit in my mentioning FPS. And that's fine! I'm just saying that the people who wouldn't buy Ryzen before because Intel was a few FPS faster (and I guess now I'm talking closer to the Zen 3/4 timeline) should be willing to upgrade to the 9800X3D *now*, because it's almost assuredly going to be faster than Intel. It only makes sense if you have to have the best, right? (Admittedly, you could argue around the edges about 1% lows, I guess.)
> Being budget-focused, I actually see the AMD 9000 and Intel Core Ultra series as a positive. Not much performance gain for a high price that people will still upgrade to, and all the rest of the market will issue price drops for the second-to-latest generation, with the used market getting better deals as well.

I've said it elsewhere on this board, but the Zen 5 launch was so comparatively bad that it drove up the price of Zen 4. If it were not for the degradation and power concerns of the prior generation and its refresh, then Arrow Lake would probably re-inflate the price of 13XXX and 14XXX chips.
> I've said it elsewhere on this board, but the Zen 5 launch was so comparatively bad that it drove up the price of Zen 4. If it were not for the degradation and power concerns of the prior generation and its refresh, then Arrow Lake would probably re-inflate the price of 13XXX and 14XXX chips.

Eh, prices have been up and down and up again. On September 5th, I posted Amazon links for the 7700X at $210 and the 7950X at $410. That was one month after the Zen 5 launch.
> Ugh...

Damn, I forgot all about being able to buy a 3-core chip. Lol. It feels like it was just yesterday I was unlocking the 4th core on my Phenom II X3 720 BE...
> The issues are not necessarily performance-related, or even necessarily an explicit engineering challenge, but sometimes they do impact users. In the case of AM4+, there wasn't enough room on most BIOS ROMs to support all the CPUs that were electrically compatible with the boards. Not to mention all the cases where you'd need to keep your old CPU around to do the BIOS update before installing the new one. These problems weren't an issue on the high end, but with mid-range and more budget-type boards they were.

I'm having good luck with my mobo: 1st gen, and now running a 5700X3D.
> Eh, prices have been up and down and up again. On September 5th, I posted Amazon links for the 7700X at $210 and the 7950X at $410. That was one month after the Zen 5 launch.

That was a good deal. AMD had the same growing pains with AM5, in that needing a new board made it a hard sell. June 2023 is when I bought my combo from Best Buy with a 7600X, because the B650 board was free in a Starfield combo: $289 for everything but the RAM, and by that time 32 GB of G.Skill Trident Z5 Neo RGB DDR5-6000 CL30 was cheap at $109.
> I have to be honest, it took MUCH longer for games to make use of NVMe drives than I thought.

Planetside 2 was the first game where I actually saw improvements with a SATA 3 SSD. An HDD took a good 2 minutes to load.
> MLD actually did a flashback to Zen 1 within his Ultra 9 285 review. It's cool to see how performance was back then. I forgot that Ryzen's lows in gaming were often better; this was why there was such a nuanced view of it.

I remember upgrading my FX-6300 to a Ryzen 1600. It was a pretty good CPU.