I agree that the 7970's MSRP was too high at launch; they wanted some $ before Nvidia's launch 3 months later. I also agree that the 680 was on average around 7% faster at launch, with the margin shrinking at 1440p. But I guess it's debatable whether the 7970 and 680 were the respective flagships at...
The fact that we're debating whether the 7970 should be in the list is interesting because, had most people known at launch just how far ahead the GCN arch was of the GTX 680, it would have drastically changed the sales ratio of those two products. Most reviewers just failed to see that, ignored the...
So you've changed your argument from "Ryzen 7 doesn't beat the i5-13600K in ONE benchmark" to "the i5 evenly matches the 7700X in some benchmarks and beats it in others" because you got proven wrong on several of your points. Price adjustments are so common in the industry that they have nothing to do with...
Don't touch ASUS with a ten-foot pole for AM5. They dropped the ball hard. Look at the $200 ASUS Prime vs the Gigabyte Aorus Elite AX. The Prime looks like a $50 board and performs like one too, with CPU throttling, suboptimal timings even with EXPO, and a borked BIOS causing issues with X3D chips...
You know this was AMD's roadmap all along, right? They can't magically change it gen on gen. CPU architectures take a few years to go from design to actually being materialized. When they designed the CCDs, they built everything around them, from laptops to EPYCs, many years back. And the way the...
Yeah, it's kinda sad that even with 40% generational uplifts in recent years, image quality doesn't really change much. RT does little to nothing as well in most games. I think my last blown-away moment was when I first ran Crysis and wanted a GPU upgrade. But it still pales in comparison to what I...
Yeah, your admittedly mediocre devices seem to have been beating Intel for a number of years now, and they came out of nowhere to almost match Nvidia last gen. That other thread about GPU history got me thinking - they also soundly beat Nvidia in the three generations starting from the 9700 Pro all the way till...
Yeah man, the 290X was an absolute beast. No idea why they launched it with a terrible af cooler, but that's AMD for you, especially back in the day. I put it on water at launch and it was the best GPU I've owned, and it's approaching its 10th anniversary. It was the best balanced GCN architecture among...
The HD 7970 scaled for way longer than it had any reason to. If people had known how their card would perform two years down the line, the 680s wouldn't have sold nearly as much as they did. Great card; I had an ASUS Matrix 7970 back when they actually made decent AMD cards.
Worst is some random tech channels releasing articles like "the 7950X3D is a scam" and "it's only 8 cores" bla bla. Reviewers used to check for accuracy for the sake of their reputation. Now a decent chunk of them are just incorrect data worded a certain way to create clicks...
I think the point of testing at 1080p, aside from trying to utilize those 480Hz panels, is simply to extrapolate what might happen with a GPU upgrade down the road. Sure, there's merit to testing 4K too, as it shows how low a CPU you can go at present without being CPU bottlenecked at that...
Man, I wish Heatkiller had launched their blocks a bit sooner. Their build quality is so far above EK's that it's strange they don't sell more than they do (I guess regional availability plays a part). Also maybe cos they don't make stuff filled to the brim with RGB lights.
Their rads are also...
Techradar review
When the cache works, holy hell does it strike hard. Factorio, Returnal, and Microsoft FS, just to name a few, show the X3D some 30-40% faster. That's actually bonkers.
When it loses, it's by a slim margin but when it works, it destroys everything else out there. Only difference this...
Cool, latch on to the mistake I made and ignore the rest of my comment, and all the other people who corrected your argument that AMD and Intel have the same power/performance. Let me correct myself: on power-normalized graphs, Intel has way less performance for the same power. Or vice...
AMD and Intel aren't even close to equal on performance/power right now. There's a lot of data on the net, even on power-normalized graphs outside of the Cinebench and AT game benchmarks you showed (where it's entirely GPU limited, btw), showing that Intel consumes way more power for the same amount of...
Check out HW Unboxed's latest review: the 7xxx cards actually stomp Nvidia's in raster, and RT performance is back in the usual pecking order, with the 7900 XTX matching the 3090 Ti and Arc way behind.
Arc isn't all that great tbh; it has good RT performance, but raster is nowhere close to even the rx...
AMD pushes it to 95°C because the chiplets are small and can't transfer as much heat to the cooler, so at 230W it's really pushing past an unnecessary point. Running it at 180W causes no real performance loss while keeping temps at 85°C or less. Also, while der8auer's testing was great, he used a...
Funny how some people are already deciding to get RPL after hearing about the cache being on one CCD, or are making it a big deal. Even if Windows messes up the scheduling for certain games and sends threads to the other CCD, it won't make much of a difference anyway. It'll be a Zen 4 CCD, which are...
Great read; it was fascinating to see the 7950X's insane power scaling. These new 7045HX CPUs are going to kick some major butt in that 55W range.
AT article on power scaling
Great article over at Chips & Cheese, the dual issue part corresponds with what I mentioned earlier. It'll be interesting to see the performance of the XTX with newer games.
7900 XTX deep dive
This is typical after a mining bust; not sure what the big deal is, as the numbers are all over the place. Next quarter they'll gain a few % on Nvidia and it'll stabilise a bit.
Arc 750 at 4K with RT on? For budget gamers, what matters more is 1080p and, at a stretch, QHD screens, where the 6600...
Went into BB Amherst yesterday in the hope of finding a 7900 XTX, and they tried to sell me a 4080 for the longest time. I think what finally stopped the rep from talking was me telling him I've been playing a lot of MWII recently, where the 7900 XTX brutally murders the 4080, and showing...
When you say "due to their negligence" you dismiss the power connector's issues entirely and write off everyone who built PCs with it. See, there's no tactile or audible click when the power connector is fully seated. When the connection needs to be that tightly plugged in, you...
Really doubtful that there are any hardware issues, but the number of changes in the arch, especially the dual-issue compute units and the need to extract ILP, is causing performance to be all over the place in benches. I do agree with the others - they should really have waited to launch this after...
And what real-world title exhibits similar behaviour? I asked earlier; you just seem really keen on reposting this, which makes zero difference to anyone using this card outside of running this bench, which obviously has bugs.
I don't understand the broken product part. Sure, it performs rather erratically with some random synthetic benchmarks. But do games exhibit this behaviour? What real world use case exhibits similar behaviour?
They really should fix the multi-monitor power consumption issue and release drivers...
It's the most apt comparison I could think of, and this is actually very similar. Another one that springs to mind is the R9 290X vs the GTX 780. The 290X was slightly faster than the GTX 780 when it launched, but Nvidia countered with the GTX 780 Ti, which was 10% faster. Looking at results from 2 years...
Correct. It's still 14% over the stock XTX, and going by launches in the last few years, that's a lot (ASUS was 15%). These seem to be great overclockers provided they're fed by 3x8 pins. The reference card can only muster around 8% and gets power limited rather quickly. So I'm torn between getting...
I think you misunderstood me. As stated in the AT article linked above, AMD moved away from relying on extracting instruction-level parallelism with RDNA, something it had with GCN. With RDNA 3 they re-introduced, albeit differently, the need to extract ILP in order to have maximum...
Ah yes, thanks: 20% more one way, 16.7% less the other. Either way, I can't justify the extra and would rather take the extra raster performance, overclocking headroom, and future driver improvements of the 7900 XTX over the added RT performance of the 4080. DLSS3 is useless for me as most of the games I play are...
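For anyone confused by how "20% more" and "16.7% less" describe the same price gap, it's just a matter of which card you use as the baseline. A quick sketch (using the $999 XTX and $1199 4080 launch MSRPs as assumed round numbers):

```python
# Same dollar gap, two different percentages depending on the baseline.
xtx = 999    # 7900 XTX launch MSRP (assumed)
rtx = 1199   # RTX 4080 launch MSRP (assumed)

more = (rtx - xtx) / xtx * 100   # 4080 measured against the XTX
less = (rtx - xtx) / rtx * 100   # XTX measured against the 4080

print(f"4080 costs {more:.1f}% more than the XTX")   # ~20.0%
print(f"XTX costs {less:.1f}% less than the 4080")   # ~16.7%
```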
Okay, so I expected the 7900 XTX to be faster, but it's very apparent that in titles that can extract ILP it's much, much faster than the 4080. In titles that can't, it's about the same or slower. On average, it's about 5% faster in raster and 15% slower in RT compared to the...
Same. Or a Heatkiller block for the reference GPU. I'm sure there will be a Liquid Devil, which are usually great, but I just don't like the 'devil' on the front of the GPU. Either way, I'm getting rid of the 3090 and going 7900 XTX. Jensen can eat shit with the stupid pricing of the 4090 and the...
Faulty 8-pin? Sure. Out of the gazillion of those, there will be some that melted, etc. However, that's not even remotely comparable to the terrible new connector and adapter on the 4090. It's been a month and there are already 100 cases. So don't even bring melting 8-pins into the equation.
You know...
That 15% safety margin has to account for QC variance, end-user connection errors, abnormally high ambient temps, sharp bends, etc. With 8-pins, we were used to a 100% safety margin, and GPUs were pulling 300W from 2x8 pins that could theoretically carry >= 600W, so...
From what I understand, the 12VHPWR standard itself, which Nvidia surely had a major hand in designing, runs closer to the limit of what the cables are rated for. The classic 8-pins are officially rated at 150W but can theoretically carry 300W, giving a safety margin of 100%. This...
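The margin math above is simple to sketch. The 8-pin numbers (150W rated, ~300W theoretical) come from the post itself; the 12VHPWR ceiling of ~690W is an assumption back-calculated from the ~15% figure mentioned earlier in the thread, not a spec number:

```python
def safety_margin(rated_w, max_w):
    """Headroom above the official rating, as a percentage of the rating."""
    return (max_w - rated_w) / rated_w * 100

# Classic 8-pin: rated 150W, can theoretically carry ~300W (per the text).
print(safety_margin(150, 300))   # 100.0

# 12VHPWR: rated 600W; if the ~15% margin figure holds, the theoretical
# ceiling would be only around 690W (assumed for illustration).
print(safety_margin(600, 690))   # 15.0
```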
5% faster in games, but at lower res it lays the smackdown on Comet Lake pretty hard
Easily 15% faster in single core and what, 100% faster when all cores are utilized? What's Rocket Lake going to do again?
The 6800 and 3070 aren't even in the same class. There's a 10%+ performance gap and double the VRAM. I don't even know how that argument dragged on for so long
A lot of the performance uplift comes from having DRAM so close to the chip, as well as a huge 12MB L2 cache. They also devoted a ton of transistors to the large CPU cores, to the point where the total transistor count is 60% higher than Renoir's.
I bet if AMD decided to go full on SOC (which wont...
Umm, no. The people who buy stronger CPUs use them mostly for encoding/rendering/folding etc., where Intel gets trampled on. That statement of yours is so ignorant. Higher-end CPUs make very little difference to gaming; they're essentially for the use cases mentioned before, where CPU...