Some users have recently had their accounts hijacked. It appears the now-defunct EVGA forums were compromised, exposing passwords there, and many members reused the same password here. We suggest you UPDATE YOUR PASSWORD and TURN ON 2FA for your account here to secure it further. None of the compromised accounts had 2FA turned on.
Once you have enabled 2FA, your account will soon be updated to show a badge, letting other members know that you use 2FA to protect your account. This should be beneficial for everyone who uses FSFT.
Yes, fixing it at the source with ReShade is the correct approach. The display should be left in HGIG mode whenever possible.
Actually, even if the game outputs up to 10000 nits, it should still look correct in HGIG mode on an 800-nit display. You should lose detail only in the extreme...
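To sketch why that works (assuming the game tone-maps its output into the peak you set, and HGIG simply hard-clips anything above the panel's capability — the function name is just for illustration):

```python
# Hedged sketch: HGIG does no tone mapping, so the display hard-clips
# anything above its own peak. Values at or below 800 nits pass through
# untouched; only highlights above the panel's peak lose detail.
def hgig_clip(scene_nits, panel_peak=800):
    """Return the luminance actually shown on an HGIG display."""
    return min(scene_nits, panel_peak)

# Mid-tones and most highlights are unaffected:
assert hgig_clip(400) == 400
# A 10000-nit specular highlight clips to the panel peak:
assert hgig_clip(10000) == 800
```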
HDR10 static metadata is reliably sent only by Blu-ray players. I'm not aware of any other sources which can be trusted to always send HDR10 static metadata. LG TVs will use the 4000-nit curve for these sources, wasting most of the peak brightness. Most Dolby Vision sources correctly send...
My only concern is my browser profile with cookies being stolen. Firefox profiles are unprotected and should be moved to an encrypted drive. Edge and Chrome cookies (though not the whole profile) are encrypted with the Windows password, so they are relatively safe.
You can buy a device like the HDFury 8K Arcana or 8K VRROOM. It takes a single input from the GPU and splits the audio off to the AVR while retaining full 4K 120 Hz VRR to the display.
The Zen 4 desktop CPUs don't support bifurcation on any existing motherboard. You can split the first x16 lanes from the CPU into two physical x8 / x8 slots, but you can't split the second physical x8 slot into x4 / x4 to add say two NVMe drives. You have to use a PCIe adapter for whatever...
There's no benefit to using a non-admin account. Everything runs with standard account privileges unless you approve the UAC prompt. If your account is non-admin, the UAC prompt asks for an admin username and password. An admin account does not display any fewer UAC prompts than a standard...
I had to increase the power and current limits on my 7950X before it hit 95 °C. Out of the box it was power-limited. Prime95 SSE hits 5.4 GHz all-core; Prime95 AVX hits 4.9 GHz all-core. Any kind of undervolting or negative Curve Optimizer results in crashing at idle, so I left the voltage at stock.
Overclocking also means raising power limits and boost settings, even if you don't increase the peak clock speed. That still provides a performance boost without requiring stability testing. You need a high-end board for that.
There is absolutely a difference even in stock turbo boosting capability. Just setting the power limit to 200 W allows a Ryzen 7950X to boost to 5.4 GHz all-core which you cannot do on a board with a weaker power stage.
Today, expensive high-end boards are typically loaded with useless features and add-on cards. A true high-end board like the EVGA X299 DARK could be had for < $300. Quality- and OC-wise, it was the best of the best. Older no-nonsense high-end boards like the Rampage IV Formula were also in that...
The leaked Sapphire Rapids SKUs show `X` models, which are likely unlocked like the old Xeon W-3175X. The new Xeon W5-3435X is a 16-core model and would be a great HEDT part even if it doesn't have the full 80 PCIe lanes. AMD did the same thing with the DIY-market Zen 3 Threadripper PRO...
I too upgraded from an Optane 900p 280 GB to an Optane DC P5800X 800 GB. Unfortunately the real-world performance of the P5800X in games and on the desktop is identical to Phison E18 PCIe 4.0 drives like the FireCuda 530 and Corsair MP600 XT.
The 900p destroyed my old Samsung 950 Pro in...
Play through your backlog of games and appreciate the classics; that will take a couple of years.
For the first time in console history, PS5 and XSX games are 85% as good as my 5-figure priced PC with an RTX 3090, while costing 1/20th the price. The only games which are unplayable are FPS...
Yeah, I also randomly found a 3090 in stock at Best Buy at MSRP soon after it launched and casually checked out! Now they are never in stock even if you use alerts. At the time I thought the price was insane over the 3080 but now it's the best purchase I made in the last 1.5 years.
Refresh rate increases smoothness independent of pixel response time. I use a G1 OLED as my primary desktop monitor. My 240 Hz TN monitor is way smoother than my G1 when just moving the cursor on the desktop. There is definitely more smearing between frames on the 240 Hz, but there are more...
The Sony has a better motion interpolation algorithm. It is generating in-between frames instead of holding a single 24 fps frame for 5 refreshes. That's all it is. If you crank the slider up to 10 you get the so-called soap opera effect with maximum smoothness.
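A toy illustration of the difference, treating each frame as a single brightness value (a crude linear blend, not Sony's actual algorithm):

```python
# Hedged sketch: showing 24 fps content on a 120 Hz panel. Plain
# sample-and-hold repeats each source frame 5 times (5 * 24 = 120);
# motion interpolation synthesizes in-between frames instead.
def hold(frames, repeat=5):
    """Each source frame is held for `repeat` refreshes."""
    return [f for f in frames for _ in range(repeat)]

def interpolate(frames, steps=5):
    """Linear blend between consecutive frames (a crude stand-in
    for the TV's motion estimation)."""
    out = []
    for a, b in zip(frames, frames[1:]):
        for i in range(steps):
            out.append(a + (b - a) * i / steps)
    out.extend([frames[-1]] * steps)
    return out

src = [0, 10]           # two source frames, e.g. an object moving
print(hold(src))        # [0, 0, 0, 0, 0, 10, 10, 10, 10, 10]
print(interpolate(src)) # [0.0, 2.0, 4.0, 6.0, 8.0, 10, 10, 10, 10, 10]
```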
That's smearing. It's less accurate even if you prefer it. The correct solution to sample-and-hold blur is to increase the frame rate of the content. Sample-and-hold is neither good nor bad - it's an exact representation of the source frames over time. Non-sample-and-hold displays smear frames...
PQ is absolute. The only HDR displays that follow PQ are the LG OLEDs in HGIG mode, the G-SYNC Ultimate monitors, and $30k reference monitors. Everything else is non-standard tone mapping and is inaccurate. You actually prefer the inaccurate, oversaturated colours of the Sony OLED over accurate PQ. It...
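For reference, the PQ curve itself (SMPTE ST 2084) is easy to write down. This is what makes PQ absolute: a given code value always decodes to the same number of nits, regardless of the display:

```python
# SMPTE ST 2084 (PQ) constants, exactly as defined in the standard.
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_to_nits(e):
    """Decode a normalized PQ signal value (0..1) to absolute nits (EOTF)."""
    ep = e ** (1 / M2)
    return 10000 * (max(ep - C1, 0) / (C2 - C3 * ep)) ** (1 / M1)

def nits_to_pq(nits):
    """Encode absolute nits (0..10000) to a normalized PQ value (inverse EOTF)."""
    y = (nits / 10000) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

# Full-scale PQ is exactly 10000 nits, by definition:
assert abs(pq_to_nits(1.0) - 10000) < 1e-6
# Round trip through encode/decode:
assert abs(pq_to_nits(nits_to_pq(800)) - 800) < 1e-3
```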
You can toggle between SDR and HDR in Windows with the SDR content brightness slider set to 10 which is guaranteed to be 120 nits in HGIG mode (0 is 80 nits). It looks about the same as SDR with OLED Light set to 35. This is consistent across my CX, C1, and G1.
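The values quoted in these posts (slider 0 → 80 nits, 10 → 120 nits, 30 → 200 nits) all fit a simple linear mapping. Note this formula is inferred from those data points, not an official Microsoft formula:

```python
def sdr_white_nits(slider):
    """Windows SDR content brightness slider (0-100) -> nits,
    assuming the linear mapping implied by the values in this thread."""
    return 80 + 4 * slider

assert sdr_white_nits(0) == 80    # slider at 0
assert sdr_white_nits(10) == 120  # slider at 10, i.e. 120-nit SDR white
assert sdr_white_nits(30) == 200  # slider at 30
```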
All VRR monitors stay synchronized down to 1 fps with LFC. LFC is done on the GPU. There is no advantage to hardware LFC in the G-SYNC module either, because once the module begins to double a frame, it cannot respond to new frames until the doubled frame is completely drawn, resulting in the same...
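A rough sketch of the frame-multiplication logic (the VRR window values are examples; real drivers add hysteresis that this ignores):

```python
# Hedged sketch of GPU-side LFC: when content fps drops below the
# panel's minimum VRR rate, the GPU resends each frame N times so the
# effective refresh rate stays inside the VRR window.
def lfc_multiplier(content_fps, vrr_min=48, vrr_max=120):
    """Smallest repeat count that lifts the refresh into the window."""
    n = 1
    while content_fps * n < vrr_min:
        n += 1
    if content_fps * n > vrr_max:
        raise ValueError("fps cannot be mapped into the VRR window")
    return n

assert lfc_multiplier(60) == 1   # in range, no repetition needed
assert lfc_multiplier(30) == 2   # 30 fps -> 60 Hz refresh
assert lfc_multiplier(1) == 48   # even 1 fps stays synchronized
```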
The OLEDs are effectively hardware G-SYNC, just not made by NVIDIA. They need LG's custom controller to drive the individual pixels with G-SYNC certified VRR, unlike LCD monitors from manufacturers who make half-baked VRR firmware for existing controllers. NVIDIA's hardware G-SYNC module just...
Use Win + P to disable the C1 and move all windows to the other display. The problem is the C1 doesn't disconnect its HDMI input after powering off - it takes a few minutes.
The SDR content brightness slider has an effect on how Auto HDR looks with Intensity at 0. You may want to try switching off the Auto HDR checkbox completely and increasing SDR content brightness to 30 (200 nits) or above. If you use Auto HDR, zero or low values on the SDR content brightness...
You're losing detail in the highlights because the Max Luminance is too high - the game thinks your display can hit 1500 nits when it's actually hard clipping below that. Reduce it to 1200 nits for the X27 or 800 nits for an OLED and use HGIG.
DTM On compresses and expands different parts of...
By DTM I assume you mean DTM Off versus HGIG. DTM On is incorrect in all situations. I use HGIG mode for everything unless I know the content is hardcoded to 1000 / 4000 / 10000 nits, in which case I use DTM Off and override the HDMI Mastering Peak on the TV. If you set the Max Luminance to 800...
No. The initial DDR5 models have unknown overclocking and timing-tightening potential. It's not clear how overclocking will work with the on-die voltage regulator, and early motherboards won't have this ironed out. I would even wait a year and buy the 2nd-generation mainstream platform or 10 nm HEDT...
I can confirm that Max luminance in CRU limits the peak brightness when Auto HDR Intensity is set to 100%.
I used Max luminance = 130, Max frame-avg = 56, and Min luminance = blank, which gave 798 nits peak and 160 nits frame-average brightness.
Blank for Min luminance resulted in a PQ value...
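For reference, the CTA-861.3 encoding these CRU fields use can be decoded as below. The spec formula gives ~835 nits for code value 130, a bit above the 798 nits reported above, so Windows may round or interpret the fields somewhat differently:

```python
# Hedged sketch of the CTA-861.3 HDR static metadata luminance encoding.
def max_luminance(cv):
    """Desired Content Max Luminance: 50 * 2^(CV/32) nits."""
    return 50 * 2 ** (cv / 32)

def min_luminance(cv_min, cv_max):
    """Desired Content Min Luminance: Lmax * (CV/255)^2 / 100 nits."""
    return max_luminance(cv_max) * (cv_min / 255) ** 2 / 100

print(round(max_luminance(130)))  # ~835 nits for code value 130
print(round(max_luminance(56)))   # ~168 nits for code value 56
```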
The 7.1 base layer in Atmos for games is rendered using the spatial sound engine on Windows whether or not you have height speakers. Even in movies the base layer already contains all the sounds present in the height layer. The AVR uses the object metadata to subtract these sounds from the base...
If you don't have height speakers, you don't need an AVR even for Atmos or bitstreaming. The software renderer in games will render up to 7.1 quite well even with spatial audio. For upmixing stereo content with Dolby Surround, you can capture the impulse responses and use them to upmix on...
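The impulse-response upmixing idea boils down to convolution. A naive pure-Python sketch (a real renderer would use FFT-based partitioned convolution for speed):

```python
# Hedged sketch: once you've captured an impulse response for a channel,
# upmixing is just convolving the source signal with it.
def convolve(signal, impulse_response):
    """Direct-form FIR convolution, for illustration only."""
    out = [0.0] * (len(signal) + len(impulse_response) - 1)
    for i, s in enumerate(signal):
        for j, h in enumerate(impulse_response):
            out[i + j] += s * h
    return out

# A pure delay-and-attenuate impulse response: 0.5x gain, 2-sample delay.
ir = [0.0, 0.0, 0.5]
print(convolve([1.0, 0.0, 0.0, 0.0], ir))  # [0.0, 0.0, 0.5, 0.0, 0.0, 0.0]
```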
You need HDMI to an AVR only for bitstreaming or Atmos games. For anything else just use analog 7.1 output and run room correction on your PC with Dirac.
Buy a 120 Hz TV and use eARC. That’s the only solution apart from those HDMI sound cards.
A cloned display has no issues if your AVR supports the resolution at 120 Hz over HDMI.
I am even considering the PA32UCG which has HDMI 2.1. The professional version will probably have superior QC as well. I wonder if VRR works with NVIDIA cards over HDMI 2.1 or DisplayPort. With Dolby Vision support, I could plug it into an Apple TV or media player for watching shows.
Extensive...
Has anyone gone from a PG27UQ / X27 to an OLED and then to the PG32UQX? I went from an X27 to a CX 48" and could never go back to the X27. HDR even with lower peak brightness is superior on the OLED because of the pinpoint brightness in small areas.
I recently upgraded to a G1 65" but am now...
Incorrect. Cloning a drive works at the logical bit level, which is above all the hardware and firmware abstractions. There's no way to clone a drive at the NAND chip bit level with any of the software mentioned here - this obviously wouldn't work on anything but identical hardware / firmware.
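A logical-level clone really is just a byte-for-byte copy of the stream the OS exposes. Shown here on ordinary files; cloning tools do the same thing against block devices:

```python
# Hedged sketch: the clone reads and writes the byte stream the OS
# sees, with no knowledge of the NAND chips, controller, or firmware
# underneath - which is why it works across different drive models.
def clone(src_path, dst_path, chunk_size=4 * 1024 * 1024):
    """Byte-for-byte copy, chunked so large sources don't fill RAM."""
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            dst.write(chunk)
```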
This is the issue. Windows is trying to boot with the Intel RST driver, whereas the new drive uses the generic Microsoft driver. Uninstall the Intel controller from Device Manager and choose to delete the driver software. Uninstalling the RST software is not enough as it won't uninstall the...