That would imply a 16GB 5080 at the price of the current 4090 with slightly more performance, which is just...ugh. But expected if AMD won't be doing anything other than midrange cards next gen.
Run Timespy Extreme. There are all kinds of factors that could cause a lower Timespy score that have nothing to do with a "tired" RTX 4090, since the overall GPU load isn't very high.
This used to be an issue on some AMD systems because USB was fighting for bandwidth with the PCI-E lanes, and the solution was to drop the GPU's PCI-E link down one generation (e.g. from Gen 4 to Gen 3).
I prefer the sharpening effect of DLSS, so I usually turn it on when it's available rather than running at native resolution. However, if DLAA is available as a native-resolution anti-aliasing option, I'd take that over DLSS, since it doesn't tend to have DLSS's added graphical artifacts.
MSI cards have had fan grinding issues ever since the new cooler design introduced with the 30 series. After hearing what a 3090 Suprim sounded like, I intentionally avoided MSI when looking for a 4090, even though in "theory" they have the lowest acoustics per TPU reviews.
If you are going SFF you probably ought to power limit your CPU first and foremost. As for 4080 or 4090 class cards, due to the size of the coolers they are not going to fit in most SFF cases unless your plan is to deshroud. In which case you can target a card with a smaller heatsink and pair it...
Neither the Iris Xe nor the fastest iGPU (the 7940HS) comes anywhere close to meeting those minimum specs (either a GTX 970 or an RX 480).
He would need a laptop with a discrete GPU, or an eGPU.
It seems like the problem with Cablemod is they cheaped out on the 90 degree connectors. Their first pre-order run all used the 4-pin NTK connector, which is recommended now by Intel. As far as I can tell, not a single 4-pin NTK connector has ever been shown to cause a 12vhpwr connector failure...
There are multiple generations of RedMere cables, each with its own specs, because they are active cables. As I recall, the first-generation RedMere cables were labeled as capable of 4K60, but in my experience they could only pass 4K60 at 8-bit with 4:2:2 chroma. I generally avoid...
Since the latching mechanism isn't fully secure, the cable can also wiggle out over time from vibrations in the case, and that's speculated to be the cause of a lot of these GPU connectors burning up after months or even half a year of use. That's why MSI's yellow-tipped 12VHPWR cable...
Yep, NAND and RAM prices are crashing like most PC components. The only thing holding up is GPU prices, and that's because neither AMD nor Nvidia is particularly concerned about gaming GPU revenue at this time, so it's a low priority for them and allows them to keep supply low and prices...
I think the issue is Intel can't keep its power consumption/temps under control with more P cores, though. A 10P + 24E or 12P + 16E configuration would probably be impossible to cool without a redesigned IHS plus an insane liquid cooling system, and a 16P configuration would probably...
I turn it on with DLSS if available, unless it causes weird graphical glitches. Not sure if it's improved now, but Witcher 3 Next-Gen at launch had all sorts of weird visual artifacts with RT on.
He also said he had over 250 4090s and 4080s waiting to be repaired, so if anyone thinks 8 failures tied to CableMod adapters means there's something uniquely bad about CableMod, that's an overreaction for sure.
Here you go, have fun:
https://www.dell.com/en-us/shop/nvidia-rtx-6000-ada-generation-graphics-card/apd/ac442879/parts-upgrades
Technically it's been out for 6 months, just as a pro SKU. Nvidia will launch it whenever they feel like it. (Which, given the current lack of demand in gaming and the ample demand in AI research and ML, is probably a while.)
Once you have enough physical cores to cover all workloads, HT becomes a detriment. If you try gaming on a 13900K, for example, you'll find that in 95% of cases you see a 2-5% framerate regression with HT on vs. off. Most games only use 8 threads, but some (like Frostbite-engine games) scale up to...
It only makes sense if money is not a concern for you. Upgrading to a CPU refresh generation isn’t great bang for the buck. The other thing is there is even less of a difference while using DDR4, and upgrading to a new platform with DDR5 just to buy a CPU refresh seems kinda crazy. Arrow Lake in...
Depends on the era of the game. The 6700K was the fastest CPU when the 1080 Ti was released, and it was perfectly fine for the games of its generation; in fact, even a Haswell CPU wasn't really a bottleneck back then. It depends on the game engine and how many draw calls it really throws at the CPU...
Well, your choices are to DIY a fan setup or slap an AIO hybrid cooler on it at this point. Buying things that aren't fully functional isn't necessarily a mistake; overpaying for them certainly is.
Depends on the age of the games. Remember the GTX 1080 Ti and the 4770K certainly co-existed in many gaming systems, and the aftermarket 1080 Tis are about 33% faster than a stock RTX 3060. When the 1080 Ti came out the fastest gaming CPU was the 6700K, which was only about 10% faster than a...
Or his case fans are so loud that they completely mask the sound of coil whine. It reminds me of the high-frequency whine of the gas turbine engine in the M1A2 Abrams. It sounds like nails on a chalkboard until the vehicle starts moving, at which point you don't really notice it anymore...
It means he has a golden sample card more likely than not. Frame Chasers (whether you like him or not) tested a bunch of 4090s running Furmark. AFAIK one card pulled like 460W, another pulled 430W, but one of them only pulled 320W, all at 100% power target with out of the box settings. 70% power...
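To put those power-target percentages in concrete terms, here's a quick sketch of the arithmetic. It assumes the 4090's common 450 W reference power limit (an assumption on my part; partner cards vary, and the comment above doesn't state a limit):

```python
# Rough arithmetic for GPU power targets.
# ASSUMPTION: a 450 W reference power limit, typical for a stock RTX 4090;
# partner cards can ship with higher or lower limits.
REFERENCE_LIMIT_W = 450


def power_cap(target_percent: float, limit_w: float = REFERENCE_LIMIT_W) -> float:
    """Return the wattage cap implied by a power-target percentage."""
    return limit_w * target_percent / 100


print(power_cap(100))  # 450.0 -- the stock cap
print(power_cap(70))   # 315.0 -- roughly what the "golden" card drew at 100%
```

So a card that only draws ~320 W in Furmark at a 100% power target is behaving as if it were already capped near 70%, which is why it reads as a golden sample.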