Dolby Digital Live vs 6 Channel Analog for Games

Jimb0

From what I've read, no PC game is programmed to produce a Dolby encoded bitstream via its own software - this is a separate and independent encoding function of your soundcard known as "Dolby Digital Live".

Now, since "Dolby Digital" is a compressed and "lossy" digital audio format I'd like to know how much information is lost (if any) during this encoding process, or otherwise altered from the original intent of the game designers in order to conform to the Dolby spec?

Doesn't the prevailing audiophile wisdom state to always aim for least amount of conversions taking place from your source to your speakers?

As far as I can tell, the only reason you would want to digitally encode an analog source is to conveniently carry the signal over a single optical/coaxial cable, or for receivers that lack multichannel analog input.

Also, (and this is a different case altogether) - if you own a quality soundcard and a below-average receiver, when playing a natively encoded Dolby Digital source (such as a DVD) would it not be better to let your soundcard decode the Dolby bitstream and then pass this as an analog signal directly to the receiver, thereby bypassing the receiver's own Dolby decoding completely?

Perhaps there are some factors in Dolby's favour here though, such as analog signals being degraded by interference during transmission while digital is immune. I'm really not enough of an expert to answer my own questions here, so any other opinions would be greatly appreciated:)

Cheers.
 
first of all, "audiophile wisdom" is kind of like saying Bush knew what he was talking about, just kindly put that out of your mind

regarding "will DDL kill my games quality oh noes its gonna kill us all", no, because video games have crap sound quality to start with, and have massively compressed audio data to start with (because PCM is not practical for storing voice overs or whatever other junk (background music for example))

so you'll lose nothing

I have heard accounts of the stock Creative X-Fi drivers having some quirks with EAX/CMSS enabled into DDL (and DDL only, not DTS:C, which is the other encoding option); I have not confirmed this, but I do know the Auzen X-Fi drivers have no such issue, and the Asus/HT Omega drivers have no problems either

as far as advantages of using DDL: given that the only multi-channel data you're likely to see from a personal computer in a home setting is going to be from movies (which is already AC-3 or DTS encoded - oh god forbid, we're all gonna die, compression has happened (yeah, I have an issue when people whine about compression or go all audiophile snob about it)) or games, it makes more sense to carry all of that over a single line if possible, hence S/PDIF, and just decode the AC-3 at the receiver/processor/whatever

getting into which device has a better D/A stage really depends on which devices you're talking about; basically every soundcard since the Audigy 2 will easily do >100 dB SNR into multi-ch, and can also pass bit-perfect stereo/multi-ch (including DDL), and most inexpensive modern receivers will also post ~100 dB SNR or higher for their D/A stage (because that's what almost any production DAC is capable of, if not far better) - and once both sides are at that level, the spec difference means next to nothing audibly, however much people want a simple linear relationship or an "opinion" about it
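
if you want to see why ~100 dB SNR is already past the point of mattering, here's a quick back-of-the-envelope sketch (this is just the standard ideal-converter rule of thumb, nothing specific to these cards):

Code:
# effective bits implied by a quoted SNR figure, using the
# ideal quantization rule of thumb: SNR_dB ~= 6.02*N + 1.76
def effective_bits(snr_db):
    return (snr_db - 1.76) / 6.02

for snr_db in (100, 110, 120):
    print(snr_db, "dB ->", round(effective_bits(snr_db), 1), "bits")
# ~100 dB already corresponds to ~16.3 bits, i.e. more resolution
# than the 16-bit program material being played through it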

in summary:
DDL is fine, it's a very nice feature when you've got the hardware to support it (a receiver/processor external to the system), and it doesn't really "cost" you anything in performance; multi-ch analog will be comparable, although in some cases it's less than ideal (because many receivers will not allow you to apply any adjustments, or only very minimal adjustments, to multi-ch pre-ins)

oh, and get the whole "analog signals being degraded by interference while digital is immune" audiophile crap the hell out of here
 
Perhaps there are some factors in Dolby's favour here though, such as analog signals being degraded by interference during transmission while digital is immune.
Digital isn't as immune to errors as you might think, but these errors (jitter) generally fall under the classification of "inaudible". Contrary to popular belief, sound cards don't fare poorly with respect to digital clocking.

As far as degradation over analog cables, yeah, it's possible. Line level signals are indeed susceptible to degradation via EMI, but it shouldn't be an issue unless you're routing cables inappropriately or using excessively cheap interconnects with effectively zero shielding. Any halfway-decent interconnect will be fine for short and medium-length runs.

Go ahead and use DDL if you find that it's more convenient.
 
as for the original question, the general rule is

If you own a sound card with good DACs (like X-Fi Prelude) and a cheap receiver, it is better to use analog cables.
If you own a high-end receiver with superior DACs, it is better to use a digital stream using SPDIF or DDL.

Personally I'm using an X-Fi Prelude and analog cables. I tried DDL initially and it did not work correctly (it produced crackling, audio artifacts and weird noises).
 
Jimb0:

You have some pretty specific questions that I'm not sure I can answer clearly. For instance, how much is lost using live encoding to DDL?

Firstly...the source that gets encoded is PCM digital audio.

The soundcard does _digital_ processing of the audio data available to it...generally, this means doing fun stuff with the sound data for games that use hardware accelerated audio, and if not, it just gets PCM audio direct.

Either way, once live encoding starts, it's all a matter of converting PCM to DDL. Now, DDL is 16bit, 48 kHz when it tops out...according to what I can find. Since we're dealing with 6 channels for 5.1 surround, that's 16 * 48000 * 6 bits per second, or a total of 4.608 megabits/sec uncompressed.
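
(A quick sanity check on that multiplication, in Python:)

Code:
# raw bitrate of 6 channels of 16-bit, 48 kHz PCM
raw_bps = 16 * 48000 * 6
print(raw_bps / 1e6, "Mbit/s")  # -> 4.608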

Now, Jean Michel Jarre, audio wizard extraordinaire, released an audio DVD with some of his music in 5.1 AC3, which is Dolby Digital - as it so happens, I was able to grab it and look at the total size. It has literally 10 frames of video content, making that portion negligible. The size of the track divided by the playtime gave 0.488 megabit/second. Now, this has been greatly compressed from the original PCM audio it was based on in order to be transmitted over S/PDIF, because S/PDIF is bandwidth limited. There are likely better codecs around for transmitting over S/PDIF, and the bandwidth cap of S/PDIF is not necessarily this low - but since DDL must meet the same format spec as DD, it'll be constrained by roughly the same bandwidth.
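
If anyone wants to repeat that measurement on another AC-3 track, it's just file size over playtime (the filename and duration below are made up for illustration, only the method matters):

Code:
import os

def avg_bitrate_mbit(path, duration_seconds):
    # average bitrate = size in bits / playing time
    return os.path.getsize(path) * 8 / duration_seconds / 1e6

# hypothetical call, not the actual disc:
# avg_bitrate_mbit("jarre_51_track.ac3", 3600)  # gave ~0.488 for mine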

Now what does this mean, exactly? A good modern codec can losslessly compress stuff down a lot.
http://flac.sourceforge.net/comparison.html

Now, Apple Lossless was designed to be hardware decoded, as one of the few formats listed - so using its ratio is a good estimate for how far you could compress something meant for a receiver without going lossy. My calculation comes out to about 2.53 megabits a second...so DD is 9.4:1 compressed using uncompressed as the benchmark, and using lossless it's only 5.2:1.

To give you some idea of how bad that is, regular CDs are at 16bit * 2 channels * 44.1 kHz, or 1.411 megabits/sec uncompressed. Apple currently encodes all their audio at most at 0.256 megabits per second. The compression is, in other words, 5.5:1. Compared to lossless, it's 3:1.

However, in the good old days, Apple compressed all the way down to 0.128 megabit/sec, so it's pretty clear that you can get good audio in less space than what DDL allows.
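
Pulling all of those ratios together in one place (the 0.55 lossless factor is my estimate from the FLAC comparison page above, so treat these as rough numbers):

Code:
raw_51 = 16 * 48000 * 6 / 1e6   # 4.608 Mbit/s, 5.1 PCM
raw_cd = 16 * 44100 * 2 / 1e6   # 1.411 Mbit/s, CD PCM
lossless_ratio = 0.55           # est. Apple Lossless compression

dd = 0.488    # Mbit/s, measured from the AC-3 track above
aac = 0.256   # Mbit/s, current iTunes encoding rate

print("DD vs raw 5.1 PCM:  %.1f:1" % (raw_51 / dd))                   # ~9.4
print("DD vs lossless 5.1: %.1f:1" % (raw_51 * lossless_ratio / dd))  # ~5.2
print("AAC vs raw CD:      %.1f:1" % (raw_cd / aac))                  # ~5.5
print("AAC vs lossless CD: %.1f:1" % (raw_cd * lossless_ratio / aac)) # ~3.0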

There's one big caveat to DDL that isn't revealed by these calculations: the environment in which encoding and decoding are done. Back in the day when DD was introduced, receivers didn't do both video and audio, and asking users to manually lip-sync their devices wasn't an option, so DD has to be played back as a bytestream. Similarly, DDL has to be encoded as a bytestream to avoid delay...now, obviously, this puts a bigass strain on both ends of the equipment, meaning it needs to be done really fast...so DDL is likely not as effective at encoding as an offline DD encoder. As such...I wouldn't be surprised if DDL turned out to yield only as good results as 128kbit mp3s...but then, games aren't based on high quality audio in the first place, and hey, perhaps DDL gives more bandwidth to the front channels than the rest.
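
One concrete number on the timing pressure, for what it's worth: an AC-3 frame is 1536 PCM samples per channel, so at 48 kHz a live encoder has to turn a frame around every 32 ms just to keep up, before the receiver even starts decoding:

Code:
# minimum buffering per AC-3 frame (6 blocks of 256 samples each)
samples_per_frame = 6 * 256   # = 1536
sample_rate = 48000
print(samples_per_frame / sample_rate * 1000, "ms")  # -> 32.0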

There are clearly some other advantages to DDL over analog - for instance, if I used analog in my setup, sure, I might avoid compression, but I would impose an analog stage from the computer to the receiver, which would then convert back to digital to do its fancy digital processing, and then convert back to analog for the speakers...so I'd get two DAC stages rather than one (which people insist is one of the most important parts of audio processing), plus an ADC stage on top of that. That won't be the case if you only use amps, though, and not a receiver.

Anyway, this whole thing is the reason I ended up springing for an HDMI connection instead... it cuts down mixing, compression, etc. to one single digital stage - something very near the game - and the rest just bubbles along at high bandwidth without compression. Feel free to call me silly though - in the end, my game sound might not even audibly sound better.
 
Aemsere, Multichannel PCM over HDMI, I never even thought about it!

This technology would seemingly solve any issues at all with transferring audio generated from PC games to a receiver.

Here's a few questions that immediately spring to my mind though:

1.) Since we're keeping the signal PCM all the way, does that eliminate the DAC stage of the soundcard? It would be great if it did.

2.) My videocard (Nvidia 8800 GT) supports HDMI while my soundcard does not, can I pass PCM audio generated by my soundcard (Creative X-Fi Elite Pro) over the HDMI output of my videocard?

Cheers.
 

oh you're confusing quite a few things there
even via S/PDIF output you aren't using the DAC, just digital to digital; HDMI has no great advantage in that regard

as far as "massively improving the sound of the game", why? crap is still crap, it just has a wider highway to drive on (the game won't magically have lossless audio or anything like that)

and your X-Fi is only going to pass S/PDIF to your 8800's HDMI, so yeah, multi-ch PCM that way is not gonna happen

seriously, y'all need to get off this PCM/HDMI hysteria bandwagon
 
Jimb0:

Currently, the only solutions that allow you to do HDMI audio would be the ASUS HDAV (either Deluxe or Slim) or the Auzentech HomeCinema card...

It is technically possible to pass audio out through the HDMI of your graphics card, but it'll only be repackaged S/PDIF, so that doesn't clear up the problem, and you need a weird internal cable for it and stuff... but this route, plain S/PDIF, and an HDMI soundcard will all bypass the DAC of the soundcard.

You're in for a fair bit of money with either soundcard option, and there are all sorts of things you need to clear up before going that route, like ensuring that you have an HDCP capable receiver, that the graphics card is on the QVL for the soundcard, and that you have a spare DVI or HDMI out port on your graphics card...

So it's a pain in the ass to go this route, but I have it working now =]

As for whether it's worth it for games...the highway is _a lot_ wider...so if anybody wanted to, they could definitely make a game with better audio. Obobski is obviously of the opinion that no one has, and I honestly don't know...so who knows?
 
Game developers still use uncompressed PCM audio, albeit sparingly. The difference between DTS Connect and pure, unmolested output via HDMI would be so negligible as to be almost entirely worthless, however. That's not really an avenue worth chasing unless an HDMI output would benefit you in some other way.
 
Obobski is obviously of the opinion that no one has, and I honestly don't know...so who knows?

not so much opinion, more experience/factual knowledge
and please don't misunderstand me as attacking you - you do make a good point regarding having HDMI, I'm just trying to explain that it isn't required, and going S/PDIF isn't going to somehow "cramp" everything; presently there's little to no advantage in having HDMI for L-PCM over S/PDIF for DTS:C or DDL, or analog - it would all ultimately sound the same, although for OTHER reasons it may make a positive difference (read below)

Game developers still use uncompressed PCM audio, albeit sparingly. The difference between DTS Connect and pure, unmolested output via HDMI would be so negligible as to be almost entirely worthless, however. That's not really an avenue worth chasing unless an HDMI output would benefit you in some other way.

and yeah, I'm sure PCM audio does exist at some points, but you gotta consider: if you have 4-9GB at the top end for storage and you use half of that for all of your sfx, you'd better have some pretty tight code and ridiculous texture compression schemes, and since most people judge games on visual quality, not sound quality (consider when HL2 came out, how much press Source's audio system got vs how much press Havok got, or how much press HDR got, etc), the storage budget goes to the visuals

I'm not saying there's anything wrong with L-PCM via HDMI, I'm just saying there's no need, especially when DDL and DTS Connect do everything so well

NOW
if you're going to be passing other L-PCM or MLP type sources via HDMI, sure, grab the card, and have your lossless multi-ch that actually takes some advantage of that bandwidth

if I had to take a guess, I'd say in 10 years or less we'll actually see a shift towards higher quality in-game audio, in terms of PCM for most everything (BD will help with this, if it ever really comes to PC, at least that's my guess), and with HDMI being more and more common on CE devices, it isn't like the tech will just exist for a while and then be passed over - the mass of things just isn't "there" yet
 
First...I actually do have other reasons for having hdmi - some of them are clearly silly and, well...hopetimistic, at best. Some of them actually make sense (I enjoy hi-def movies with great audio, for example). Probably the nicest feature is that I don't have to switch between spdif passthrough and ddl depending on my audio source...this may sound stupid, but when your music is in flac, your games are in surround, and your movies have AC3 soundtracks...it's a real bother =P
Also, with hdmi, everything simply sounds like lots of pretty.

Obobski: No worries, I didn't mean to say that you don't have credible sources to back you in your assessment - just that I can't actually tell, from all the way over here, what you base it on :)

I don't see your post as an attack either, but I hope you'll forgive me for thinking you haven't personally compared music from all the newest titles out there using both interfaces :) If you have some other source you can point your finger at, great; otherwise, I'll figure that you probably came to your conclusion through a mix of listening and the word of people who are trustworthy...so I will take your word that it'll be impossible to hear a difference between the spdif and hdmi interfaces in a lot of games, probably most. I'm not sure it goes for all of them tho...

I have doubts about DDL being as good because I don't know what channels get priority, if any. If all 6 channels receive equal priority, and games use lossless audio, then with spdif you downmix all the way from a possible 96khz 24bit audio stream at 2304kb/s/ch, well beyond regular CDs, down to something like 48khz/16bit at roughly 75kb/s/ch (448kbit/s split six ways)....
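
To put numbers on that downmix (using the 448 kbit/s rate mentioned below - DDL can reportedly also run at 640 kbit/s, so take this as the pessimistic case):

Code:
master_per_ch = 96000 * 24 / 1000   # 2304 kb/s per channel in the studio
ddl_per_ch = 448 / 6                # ~75 kb/s per channel after AC-3
print(master_per_ch, "->", round(ddl_per_ch, 1), "kb/s per channel")
print("roughly %.0f:1" % (master_per_ch / ddl_per_ch))  # ~31:1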

Being part of a small-time game developer myself, I know our music is mastered in this format...as is most music while it's still in the studio, I believe. It likely never gets published in this format, but since you can compress the audio pretty well using the open source ogg format very easily, it doesn't end up all _that_ big. An hour's worth of music comes out to something like 1.1 gigabytes....Clearly, if games came out on bluray, there would be no doubt that almost all game music would be in this format. As it stands, with Steam being the most popular way of getting games, game music probably isn't at this level at all. However, CD level, losslessly compressed, resides at about 400 megabytes/hour, which certainly could be used in some titles, and a lot of people will swear they're able to tell the difference between even good mp3s and CDs...75kb/s/ch, live stream-encoded, probably isn't anywhere near a good mp3, especially since stream encoding cannot favor certain frequencies and audio levels and still remain neutral...
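
(And the per-hour sizes check out roughly, again assuming a ~0.55 lossless compression ratio - my estimate, not a measured figure:)

Code:
def gb_per_hour(rate_hz, bits, channels, ratio=1.0):
    # storage for one hour of PCM audio, optionally lossless-compressed
    return rate_hz * bits * channels / 8 * 3600 * ratio / 1e9

print(gb_per_hour(96000, 24, 2, 0.55))  # ~1.14 GB -- the "1.1 GB" above
print(gb_per_hour(44100, 16, 2, 0.55))  # ~0.35 GB -- near the CD figure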

So yeah, I honestly think hdmi is poised to make a difference compared to DDL if game developers want it to...I can't compare to DTS Connect, since I don't know its bandwidth and so forth, but since DTS 96/24 is also an SPDIF format, I wouldn't be horribly surprised if DTS Connect turned out to be better than DDL...
 
yeah, as you've posted (aemsere), HDMI does have alternate uses, and I do know that disabling/enabling DDL gets very old (if only it were as transparent as Pro Logic IIx, eh?)

as far as the quality differences (obviously I'm more credible if I give you a subjective opinion than factual information, eh ;)), I'm speaking JUST to modern games: even titles like Fallout 3 do not have any ridiculous MLP audio track or similar; yeah they have positional audio, and yeah they have good sound quality, but it isn't approaching what an HD-DVD soundtrack is capable of (it's barely approaching what a DVD soundtrack is capable of), so the 448kbit/s 6ch solution isn't going to be the end of the world (and I'm gonna say, and probably be right even if I get nitpicked a bit, that there is no bandwidth preference for any channel, as AC-3 does not allow this, and decoders don't know the difference between "pre-encoded" AC-3 and DDL - DDL only matters on the encoder side (although there are probably a number of clever tricks in there, it is Dolby after all :)))

and yeah, if you have 24/96 audio from the studio, that's great, but consider that you're talking about a full hour of audio at 1GB - that's MASSIVE when you're talking about a DVD release title, which needs at least an hour of soundtrack, thousands of lines of voiceover acting, hundreds of sound effects, FMV/cutscene audio, and any other game specific stuff (for example noises when you select things in a menu), along with all of your textures and the application itself; you're talking about 1/4th to 1/8th of the entire distribution media just for the background music, which starts to get problematic. like I said, if BD goes anywhere, a 50GB disc does have advantages (in other words, simply reducing the compression level of textures and audio in existing games would make a big difference, quality wise)

regarding DTS:
DTS 5.1 is a 1.536mbit/s stream, so it has considerably more bandwidth, although it requires the DTS licence and associated hardware, which is missing from a lot of somewhat older equipment

as far as 96/24, I've never gotten a clear answer from DTS regarding the specifics; physically they cannot pass a raw 6ch 96/24 stream via S/PDIF unless it was running something in the range of 9:1 compression (the samples are otherwise too large for the link), so the best theory I've heard regarding 96/24 is that it's essentially an upsampling flag - the decoding end is instructed to upsample from 48/16 or 48/24 to 96/24 as some sort of smoothing operation (or some equally clever transform is taking place, to "unfold" some extra data, or remove aliasing) - but DTS 96/24 is considered supported on any device which supports DTS 5.1, which is the standard 48/24 6ch stream
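
for what it's worth, the arithmetic behind that ratio (raw 6ch 96/24 against the full-rate DTS stream):

Code:
raw = 96000 * 24 * 6 / 1e6   # 13.824 Mbit/s of raw samples
dts = 1.536                  # Mbit/s, full-rate DTS 5.1 stream
print("needs about %.0f:1 compression to fit" % (raw / dts))  # ~9:1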

in other words:
DTS Connect is a better option than DDL in terms of available bandwidth (and most any equipment which implements DTS has better channel separation when decoding DTS, usually by about 20 dB; I'll admit, DTS tracks from DVDs and HD-DVDs sound considerably better than their Dolby counterparts (excepting TrueHD/MLP)), although support is generally less widespread

and like you said, it's really on consumers and developers to push for higher quality audio in situations like this, as the technology clearly exists (either an encoding scheme like DDL, or just going with straight L-PCM) and isn't any absurd burden on existing processors (it's not like the X-Fi would have issues handing off 8ch L-PCM for external decoding); it's simply a lack of demand/support for features (and this is slowly beginning to change); hopefully a few years from now we'll actually see games with CD quality sound to match the high quality visuals we've got today
 
Thanks for all the info on DTS !

And yes...music is massive in size by comparison to a lot of other content, if it's layered in high quality.

It made for a good read...I'm definitely better informed on this stuff now =]
 
Regarding HDMI for use in a HTPC setup:

Thanks everyone for your informative replies in my thread, I've just got a couple more questions if you would be so kind.

You might be aware that the Radeon HD 5870 videocard has been released, which seems to be an all-in-one solution to my HTPC problems, supporting 8-channel LPCM, TrueHD & DTS-HD MA bitstreaming over HDMI, see here for details: http://www.anandtech.com/video/showdoc.aspx?i=3643&p=10

What I'm not sure about is whether this device can actually function as a soundcard for general audio tasks on the PC (like games, mp3s etc.) or if it can only pass the bitstream produced by a blu-ray player, in which case you would still need a stand-alone soundcard.

Vice-versa with HDMI soundcards: can they pass any video at all, or do they throw it all back to the videocard to be output?

Another issue I have is that the receiver I intend to purchase (Yamaha RX-V663) "clips" the HDMI video signal to 16-235, rather than just passing the full range of 0-255 - this issue is explained quite well in the following article: http://www.audioholics.com/tweaks/ca...vels-xvycc-rgb

Ideally what I'd like to do is have one HDMI cable connected to the receiver, responsible for passing audio only, then pass video over a separate DVI or HDMI cable connected directly to my PC monitor at the same time.

There's so many variables at play here that I thought I should get your advice before I sink my money in.
 
If you're on windows vista or 7, audio can always be done in software, as far as I'm aware.

As such, yes, the 5870 seems to be the solution to everything.

If you _do_ go for a dedicated hdmi soundcard, then the way it works is that you use a dvi or hdmi out port on the back of your graphics card, feed that via an external cable into the hdmi in port on the soundcard, then feed your receiver from the hdmi out port on the soundcard...at least, that's how it works on the asus hdav cards. The Auzentech HomeCinema card might be different...

It's important that everything supports the same hdcp version tho - all the way from your graphics card, to your receiver, to even your monitor...

Once things are set up, what happens is that the audio/video feed is eaten up by the receiver - you don't have to actually use the video feed you send to the receiver if you don't want to, but the video _does_ need to be there, as far as I'm aware (I'll check later to make sure)...

In other words, right now windows thinks I have two displays connected: an Onkyo 705 a/v receiver and my lg w2600hp-bf monitor. I can't get the bloody thing to _not_ output video to the receiver in my current setup, so I've actually had to align my desktop in such a way that the mouse doesn't disappear off into the receiver's part of the desktop, by putting it to the lower-right of my screen's desktop in the screen resolution control panel widget...

So...certainly possible, and it's the only workable solution, since my receiver is unwilling to spit out a 1920x1200 video feed, so I can't just route the video signal through it on the way to the display.
 