I was running a lower power target when I first got my 4090, but I cranked it back up to its default settings. These cards aren’t drawing nearly as much power as I thought they would, and they’re among the coolest-running GPUs I’ve ever owned. Outside of benchmarks and stress tests it’s pulling...
Hmmm… what resolution? I performed literally the same upgrade: a 1080 Ti paired with a 3770K, carried over to a 3900X. Performance was certainly better, but none of my frame rates doubled, much less all of them. That’s at 1440p.
That was a great video! Thanks for posting
I ordered one for the same reasons. If my AIO springs a leak or the pump takes a shit, this will be a more than adequate interim cooler.
The card had multiple known problems; why would you spend $500 on a card you knew was trash? It should be no surprise if you find more issues. Heck, I stay away from cards from sellers on here if they replaced the cooler with a water block, even when the stock cooler is included.
It was declared (kinda). OP was under the impression there was a bad HDMI port, but it looks like it was actually a DisplayPort instead. Ultimately this is OP's fault through and through. He knew he was buying a defective item from the start and chose to roll the dice. Kind of baffling if you ask me. If I...
I would recommend editing the post to reflect the actual make/model of the hardware you’re selling. No one wants to click on multiple links to figure out what they’re buying.
Everybody or nobody? Says who, exactly? What country and form of government do you imagine these companies are operating under that could force this bad take?
Asking the same questions in multi-quote format will get you the same answers. Scroll up. I mean, as far as Apple and Nvidia are concerned, their profits are doing the exact opposite of what you predict, so your entire argument is based on something that not only doesn’t exist, but is trending in...
Well, they’ve already done it several times and they aren’t any smaller. Just like Nvidia has raised prices on all their cards and people are still buying. Customers don’t appear to be nearly as fickle as you might imagine.
Because Apple has shown, time and again, that they are not afraid of taking their business elsewhere, even if it means moving to a completely different processor and rewriting their OS. They did it to IBM, they did it to Intel, and they’ll do it to ARM if ARM gives them a reason.
I *think* I had this happen today. I just chalked it up to Windows doing Windows things in the background. Didn’t bother checking the Processes tab to see what was using those CPU cycles.
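For anyone who does want to check next time without opening Task Manager, here’s a minimal sketch using Python’s psutil (assuming it’s installed; the two-second sampling window and the top-ten cutoff are arbitrary choices) that lists the heaviest CPU users:

```python
# Minimal sketch: list the processes using the most CPU over a short window.
# Assumes psutil is installed (pip install psutil); the 2 s window is arbitrary.
import time
import psutil

# First pass primes the per-process CPU counters.
for proc in psutil.process_iter():
    try:
        proc.cpu_percent(None)
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

time.sleep(2)

samples = []
for proc in psutil.process_iter(['name']):
    try:
        samples.append((proc.cpu_percent(None), proc.info['name']))
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

# Print the ten heaviest CPU users measured over the sampling window.
for cpu, name in sorted(samples, key=lambda s: s[0], reverse=True)[:10]:
    print(f"{cpu:5.1f}%  {name}")
```

The Processes tab shows the same thing; this is just handy if you want a quick log instead of watching the window.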
I don't think using two GPUs instead of one qualifies as KISS; quite the opposite. You also risk running into HDCP issues if you play protected content and the audio device doesn't see a compatible HDCP display attached to it.
If you're gaming on all three monitors and your reaction time is so good that passthrough latency on a non-primary display will make a difference, you're the most impressive specimen I've heard of. Heck, you'll probably introduce more latency from driver overhead with what you're trying to do.
The real question is: why not use the actual video card? I get not connecting your primary display via the passthrough, especially if you have G-Sync/HDR, but your secondary displays shouldn’t be an issue. It’s all digital, so it’s not like you’re going to lose quality.
https://www.theverge.com/23538558/nvidia-rtx-4070-ti-review-gpu-graphics-card-benchmark-test
Out of 18 4K tests, the 3090 did better on 3 of them. Link to what you looked up?
The 4070 Ti, even if it’s $100 more. Even at 4K, 12 GB holds up fine. It performs better, runs cooler, uses far less power, takes up less space, and has better features. There’s nothing the 3090 does better unless you’re doing production work that actually needs the VRAM.
Every UPS I’ve ever had that did this simply needed new batteries. I’ve gotten into the habit of setting reminders to replace them every two years. An uninterruptible power supply isn’t much good if it gets interrupted when you need it.
The Gaming X Trio has a 450 W vBIOS, which can be cross-flashed to the Suprim (520 W) or even the Strix (600 W), but you lose a DisplayPort output with the Strix BIOS. After a reasonable amount of research, I determined it’s a negligible performance boost.
I was monitoring power usage closely. Other than stress tests like FurMark and static benchmarks like Time Spy Extreme, my 4090 only pulled about 20 watts more on average than my 3080 in actual gaming. The 750 W unit was plenty. Even at a 106% power target and maxed-out core voltage it pulled about 460 watts...
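If you want to watch power draw yourself, a rough sketch like this works (assuming nvidia-smi is on your PATH; the one-second interval and output format are just placeholders, not what I actually ran):

```python
# Rough sketch: log GPU power draw once per second using nvidia-smi.
# Assumes nvidia-smi is on PATH; interval and formatting are arbitrary choices.
import subprocess
import time

while True:
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    # One line per GPU; take the first (or sum them if you run more than one card).
    watts = float(result.stdout.strip().splitlines()[0])
    print(f"{time.strftime('%H:%M:%S')}  {watts:.1f} W")
    time.sleep(1)
```

HWiNFO or GPU-Z will give you the same numbers with nicer graphs; this is just the quick-and-dirty way to get a text log.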
I don’t think there’s a practical reason for *most* people with a 5800X/5800X3D/5900X/5950X to upgrade, tbh. I was pretty excited for this release, but the X3D compromises, potential scheduling issues, and the less-than-amazing performance increase have dampened my excitement.