Logitech G733 - Higher bitrate and audio format compatibility

Himalayas
n00b · Joined: Jan 14, 2023 · Messages: 8
I bought a G733 a few days ago, and I can see that the default audio format maxes out at 16 bit, 48000 Hz (CD Quality).
I use VoiceMeeter to modify the audio output, which can increase the bit depth and sample rate up to 24 bit, 192000 Hz, as you know.
However, I want to be sure that doing so carries no risk of wearing out or damaging the headset. Can someone confirm whether it does or not?
 
No, it won't hurt anything, but you also won't hear any difference. The headset can't actually reproduce sounds that high. Like most headsets, it rolls off hard in the high frequencies and is down about 30 dB by 20 kHz. It isn't reproducing any ultrasonic frequencies at levels that could be audible, even if your ears could hear them.
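For a sense of scale, here's a quick sketch (not a measurement of the G733 itself) of what a 30 dB drop means in linear terms, using the standard 20·log10 amplitude convention:

```python
# Convert a dB figure to a linear amplitude ratio (20*log10 amplitude convention)
def db_to_amplitude(db):
    return 10 ** (db / 20)

# -30 dB means roughly 3% of the reference amplitude is left at 20 kHz
print(round(db_to_amplitude(-30), 4))  # 0.0316
```

So even before you get into ultrasonics, the top of the audible band is already down to a few percent of its nominal level.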

Likewise, most if not all of your source material doesn't have any ultrasonic content. It'll all be 44.1 kHz or 48 kHz, meaning it cuts off at 22.05 kHz or 24 kHz (the sample rate is double the highest frequency you can represent; that's the Nyquist limit). If you run a device at a higher sample rate than the content, it doesn't introduce any frequencies that weren't there before, as long as the resampling is done properly.

Also, you say you are using VoiceMeeter to "modify the audio output." Unless you actually change the output setting for the device itself in Windows, you may just be having VM do resampling. In that case it takes your audio, resamples it to 192 kHz, mixes it, then resamples it back down to 48 kHz before outputting to the device.
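That round trip is harmless, by the way. A sketch of it with the same idealized FFT resampling (a stand-in for whatever resampler VM actually uses) shows 48 kHz → 192 kHz → 48 kHz is effectively lossless for band-limited audio:

```python
import numpy as np

fs = 48_000
n = 4800
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 1_000 * t)  # a 1 kHz tone at 48 kHz

# Up to 192 kHz: zero-pad the spectrum out to 4x the bins
X = np.fft.rfft(x)
X_up = np.concatenate([X, np.zeros(3 * n // 2)])
y192 = np.fft.irfft(X_up, n=4 * n) * 4

# Back down to 48 kHz: truncate the spectrum to the original band
X2 = np.fft.rfft(y192) / 4
x_back = np.fft.irfft(X2[: n // 2 + 1], n=n)

# Maximum sample-level error after the round trip (floating-point noise)
print(np.max(np.abs(x - x_back)))
```

With a clean resampler, what comes back down at 48 kHz is the same audio you started with, to within floating-point error.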

Finally, it would not surprise me at all if, even where Logitech does allow you to set higher sample rates, it resamples the audio before sending it to the headset itself. Higher sample rates take more bandwidth, and these are wireless. I wouldn't be surprised if all audio were sent at 48 kHz regardless of the setting.

Just don't worry about ultrasonics. They aren't something your setup can usefully reproduce, and even if it could, you can't hear them. Hence the "ultra" in ultrasonic: it means "above the limit of hearing."
 
Set everything at 16-bit, 44.1 kHz Red Book standard and forget about it.

The only people who say they can tell the difference are boomers who spent $50,000 on their hi-fi but haven't heard anything over 15 kHz since 1995.
 
Even if you are younger and still have your high-frequency hearing, I doubt you can hear the difference at all. When I was in my early 20s, I interned at the recording studio on campus. It was a really good one with genuinely high-end gear that could legitimately reproduce signals way up in the ultrasonic range; I can't remember precisely, but it was flat out to something like 60-80 kHz. I got to hear recordings done at 192 kHz and A/B them against lower sample rates... and I sure couldn't tell. Now, I'm certain my hearing isn't the best, but still: if a 21-year-old sitting in front of six figures of high-end equipment can't easily tell the difference, then what is it going to matter at home?

This goes double for headphones, since nearly all of them roll off their high end pretty hard, as the G733s do. For a long time this was done simply because it sounds good, but Harman did research into why, and it has been formalized as the Harman curve. They've done more precise research over the years and tweaked it, but in every version the response at 20 kHz is down by about 15 dB or more relative to 1 kHz, which is the zero reference. Headphones just don't do ultra-high frequencies, for a number of reasons.
 