Planning Home Theater

Gen.Ben (Limp Gawd, joined Aug 1, 2007, 411 messages)
Hey guys,

I am in the early stages of planning a home theater for the new house we will be moving into. It has quite a large media-capable room that will be perfect for this application. I would love some critique on the components of my system, mainly the speakers, HTPC, and receiver.

Thanks.


For the receiver I was thinking of the Onkyo TX-SR606.

For my desktop speakers I currently have the X-LS from www.av123.com; their products have been nothing but gold to me, so I was thinking:

x-statiks for the fronts
x-voce for the center
x-omnis for the rears and maybe sides
MFW-15 subwoofer to top it all off :)


For the many damn cables required, I'm thinking of hitting up monoprice.com because their stuff is great quality and value.

To check out all the Decora plates, cables, and connectors I will be using to make my setup a clean, simple-looking one, go here: http://benkipper.com/files/monoprice/cart.asp.htm


For my HTPC I already have the case, HDD, optical drive, and RAM. I am going to order the CPU and motherboard soon.

Case - Awesome little HTPC case I got for $5 brand new
dsc8776.jpg


HDD - Hitachi 160GB SATA
PSU - Case has FSP 275w pre-installed
Optical - DVD-RW for now, Blu-Ray down the road
RAM - 2GB of Crucial Ballistix 1066
CPU - AMD Athlon 64 X2 5050E 45w
Mobo - ASUS M3A78-EM AMD 780G mATX AM2+ HDMI

Are the CPU and mobo a good choice? The CPU looked appealing because it is pretty fast, doesn't put out much heat, and is cheap. I picked the mobo because it supports the proc and memory and has a respectable HDMI output. Comments please.



This is the layout I was thinking of
(sorry for the very crude MS Paint drawing)
hometheatre.png


If you guys could give me some feedback it would be very appreciated!

Thanks.
 
For my desktop speakers I currently have the X-LS from www.av123.com; their products have been nothing but gold to me, so I was thinking:

x-statiks for the fronts
x-voce for the center
x-omnis for the rears and maybe sides
MFW-15 subwoofer to top it all off :)
Just don't buy anything that's not in stock and ready for immediate shipment. Oh, and be prepared for some potential drama.
CPU - AMD Athlon 64 X2 5050E 45w
Mobo - ASUS M3A78-EM AMD 780G mATX AM2+ HDMI

Are the CPU and mobo a good choice? The CPU looked appealing because it is pretty fast, doesn't put out much heat, and is cheap. I picked the mobo because it supports the proc and memory and has a respectable HDMI output. Comments please.
I'd go with
Intel Pentium E5200 Wolfdale 2.5GHz 2MB L2 Cache LGA 775 65W Dual-Core Processor
GIGABYTE GA-E7AUM-DS2H LGA 775 NVIDIA GeForce 9400 HDMI Micro ATX Intel Motherboard
instead
 
CPU - AMD Athlon 64 X2 5050E 45w
Mobo - ASUS M3A78-EM AMD 780G mATX AM2+ HDMI

Are the CPU and mobo a good choice? The CPU looked appealing because it is pretty fast, doesn't put out much heat, and is cheap. I picked the mobo because it supports the proc and memory and has a respectable HDMI output. Comments please.

I recommend swapping out the mobo for this one instead:
Asus M3N78-VM GeForce 8200 HDMI mATX Motherboard - $100CAD

Nvidia's 8200/8300/9300/9400 chipsets support 7.1 LPCM audio, whereas the 780G/790G chipsets support at most 5.1 audio.

@ Stereodude
The OP is from Canada. So those parts you recommended are significantly more money.
 
I recommend swapping out the mobo for this one instead:
Asus M3N78-VM GeForce 8200 HDMI mATX Motherboard - $100CAD

Nvidia's 8200/8300/9300/9400 chipsets support 7.1 LPCM audio, whereas the 780G/790G chipsets support at most 5.1 audio.
Yes, but in order to get the best video quality from the 8200/8300 for HD content you need a CPU that supports HT3.0 (Phenom). I think the same is true of the 780G chipset the OP suggested. By the time you add a Phenom to the mix you've got a more expensive, higher power consumption setup.
@ Stereodude
The OP is from Canada. So those parts you recommended are significantly more money.
Sort of, but they have better performance. They're cheaper or on par with an AMD solution with similar performance, while consuming less power.
 
Yes, but in order to get the best video quality from the 8200/8300 for HD content you need a CPU that supports HT3.0 (Phenom). I think the same is true of the 780G chipset the OP suggested. By the time you add a Phenom to the mix you've got a more expensive, higher power consumption setup.

WTF? You don't need a Phenom CPU or a CPU that supports HT 3.0 for HD content.
 
WTF? You don't need a Phenom CPU or a CPU that supports HT 3.0 for HD content.
If you want to deinterlace it you do. Why is everyone so in the dark about the limitations of the video processing of HD content when using a processor without HT3.0?

You get an HD HQV score of about 25 if you use a CPU without HT3.0 with an 8200/8300. With a CPU that has HT3.0 you get a score of 90-100 (100 being a perfect score). A 9300/9400 with any CPU will get you a score of 90-100.

Reportedly the 780G has the same type of issue.
 
I would not grab the TX-SR606 for that speaker setup unless you are planning to run it as a pre/pro with an external amplifier. I would suggest looking for a leftover TX-SR805 from Onkyo for some more power for all your equipment. I have one running B&W 683s and other 600 series equipment and still sometimes find power lacking a bit.
 
What about getting a cheaper motherboard and CPU combo and adding a video card that does the decoding? It needs to support video and 7.1 audio over HDMI.

What about something like an ASUS Radeon HD 3450? How would that perform?

Or a Phenom 8650 triple core? Does that support the HT 3.0 Stereodude was talking about?
 
If you want to deinterlace it you do. Why is everyone so in the dark about the limitations of the video processing of HD content when using a processor without HT3.0?

It simply hasn't come up, AFAIK. In fact, I can't recall anyone actually trying to deinterlace it or having problems deinterlacing it without a CPU using HT 3.0.

@ Gen.Ben
The HD3450 does not support 7.1 audio through HDMI, AFAIK. However, the HD4350 and HD4550 are the cheapest cards I know of that can support 7.1 audio through HDMI.
 
If you want to deinterlace it you do. Why is everyone so in the dark about the limitations of the video processing of HD content when using a processor without HT3.0?

You get an HD HQV score <snip>

I'm having trouble finding anyone who can honestly say they have an IQ issue with the 8200 Nvidia video. The only reference to IQ being low is in regards to HD HQV, which returns a score of 0 nearly across the board. Don't you think it's odd that it doesn't even register a score? I have a feeling there's a driver or software issue. Admittedly, since the 8200 uses system RAM for VRAM, it's possible the higher bandwidth of HT3 is the reason, but it seems unconfirmed to me...

Have you seen anyone actually explore IQ that doesn't reference HD HQV and displays the same issues?
 
You don't need a Phenom to deinterlace... and 99% of the HD content you'll run into is progressive to begin with, so it doesn't matter.
 
You don't need a Phenom to deinterlace... and 99% of the HD content you'll run into is progressive to begin with, so it doesn't matter.
You're right. You need it for IVTC / reverse 3:2 pulldown and other video processing on HD content.

Last time I checked most broadcast HD is 1080i and sourced from film, so it needs to be properly deinterlaced via IVTC. If you're only watching Blu-Ray and HD-DVD, then I suppose it doesn't matter.

Take a look at this and you'll see both the 780G and 8200 don't light the world on fire when the CPU used lacks HT3.0.
it's possible the higher bandwidth of the HT3 is the reason, but it seems unconfirmed to me...
AMD said that exact thing about the 780G in the above review... "Addendum 26th October 2008: AMD has contacted us to inform bit-tech that the performance of the 780G is limited by using a HT 1.0 CPU. We were told that using HT 3.0 CPUs (AMD K10 architecture like a Phenom) will improve performance in HD HQV, so in our next HTPC motherboard test we plan to also test with one of these CPUs."

The same thing applies to the 8200/8300. It is limited by the lower bandwidth of a CPU that lacks HT3.0, which is why reviews that use a Phenom with them get a perfect or near-perfect HD HQV score, and they don't when used with an HT1.0 CPU.


Hey, you guys can do whatever you want, but if you want an IGP that matches a standalone Nvidia 8600 (or better), an Nvidia 9500 (or better), or an ATI 4550 (or better), i.e. one that gets nearly perfect HQV and HD HQV scores, you need an 8200/8300 IGP with a Phenom, or a 9300/9400 IGP.
 
Okay, so we get to go from a 45W to a 95W CPU if we want deinterlacing... Luckily I don't have any HD broadcasts to deal with; I'm still using analog tuners. :rolleyes:

Seems like the Phenom II X3 would be the safest route... but they're significantly more expensive. AMD really needs to puke out a 45W or 65W dual core with HT3.0 in a hurry.

Actually, it's too bad their platform designed to put the CPU and GPU on a single package never got anywhere; this would be the ideal use for such a beast.
 
You're right. You need it for IVTC / reverse 3:2 pulldown and other video processing on HD content.

Last time I checked most broadcast HD is 1080i and sourced from film, so it needs to be properly deinterlaced via IVTC. If you're only watching Blu-Ray and HD-DVD, then I suppose it doesn't matter.

All the broadcast I've watched deinterlaces perfectly in hardware on my Nvidia chips... no need for a Phenom.
 
Have you seen anyone actually explore IQ that doesn't reference HD HQV and displays the same issues?

Take a look at <link to HQV benchmark> and you'll see both the 780G and 8200 don't light the world on fire when the CPU used lacks HT3.0.

So... still waiting for a reference to something OTHER than HD HQV, which is considered by many to be somewhat arbitrary.

Nowhere else have I seen anyone making claims about visual degradation or other issues with the 8200 + 5050e except with a single benchmark... which makes me think there's something fishy going on.
 
All the broadcast I've watched does perfect hardware deinterlacing on my nvidia chips.. no need for a Phenom.
Deinterlacing 1080i is not the same as doing a proper IVTC on film-based 1080i to recover the original progressive 1080p frames. Just because you don't know what you're missing doesn't mean you're getting the best possible image quality.
 
So... still waiting for a reference to something OTHER than HD HQV, which is considered by many to be somewhat arbitrary.

Nowhere else have I seen anyone making claims about visual degradation or other issues with the 8200 + 5050e except with a single benchmark... which makes me think there's something fishy going on.
So you want to throw out the objective test for supported features and go with subjective testing done by people who don't know what they're looking for because they're used to testing for frame rates in Crysis. There's a recipe for success. :rolleyes:

Besides, if you were to dig deep into what's been posted by users on AVSforum you'd find first hand accounts detailing the same thing I'm telling you.

http://www.avsforum.com/avs-vb/showpost.php?p=15584751&postcount=2182
http://www.avsforum.com/avs-vb/showpost.php?p=15219346&postcount=2112
http://www.avsforum.com/avs-vb/showpost.php?p=14240949&postcount=1717
http://www.avsforum.com/avs-vb/showpost.php?p=14235861&postcount=1713
http://www.avsforum.com/avs-vb/showpost.php?p=14202596&postcount=1673
http://www.avsforum.com/avs-vb/showpost.php?p=14199175&postcount=1670
http://www.avsforum.com/avs-vb/showpost.php?p=14003777&postcount=1353


It's unfortunate nVidia doesn't keep this chart (PDF) updated anymore. Then you could see the features that are missing from certain chips.
 
Alright, so from what I've read, TELECINE is taking 24fps film and adding 2 interlaced frames every 5 to "step up" to 30fps video (29.97, but who cares). Interlaced video is actually 60 fields a second, but each field consists of either the odd or the even lines of video.

So IVTC, inverse telecine, is just reversing the process.

Deinterlacing is video processing every frame to smooth out jagged edges, whereas IVTC is just recombining 1 frame from 2 fields every 5 frames.

It seems to me IVTC should be much more easily accomplished than deinterlacing.

As an aside, I wonder why we haven't moved away from 30fps anyway and just accepted film's 24fps as the standard? It's not like we're tied to analog 30fps broadcasts anymore.
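The 3:2 cadence described above is easy to sketch in code. This is a toy illustration only, with frame labels standing in for real fields; actual IVTC filters also have to detect the cadence and cope with edits that break it:

```python
def telecine(frames):
    # 3:2 pulldown: each film frame is repeated as 3 fields, then 2,
    # alternating, so 4 film frames -> 10 fields = 5 interlaced video frames.
    fields = []
    for i, f in enumerate(frames):
        fields.extend([f] * (3 if i % 2 == 0 else 2))
    return fields

def ivtc(fields):
    # Inverse telecine: walk the known 3-2 cadence and keep
    # one copy of each original progressive frame.
    frames, i, run = [], 0, 3
    while i < len(fields):
        frames.append(fields[i])
        i += run
        run = 2 if run == 3 else 3
    return frames

film = ["A", "B", "C", "D"]
fields = telecine(film)
print(len(fields))     # 10 fields, i.e. 5 interlaced video frames
print(ivtc(fields))    # ['A', 'B', 'C', 'D'] - originals recovered
```

Walking the 3-2 pattern backwards recovers the original progressive frames exactly, which is why IVTC'd film content looks sharper than the same signal run through a generic deinterlacer.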
 
Alright, so from what I've read, TELECINE is taking 24fps film and adding 2 interlaced frames every 5 to "step up" to 30fps video (29.97, but who cares). Interlaced video is actually 60 fields a second, but each field consists of either the odd or the even lines of video.

So IVTC, inverse telecine, is just reversing the process.

Deinterlacing is video processing every frame to smooth out jagged edges, whereas IVTC is just recombining 1 frame from 2 fields every 5 frames.
More or less... IVTC on 1080i content will give you a sharper image because you effectively get back the original progressive frames. This is one of the features missing from the 8200/8300 if a CPU lacking HT3.0 is used. I watch a ton of recorded 1080i network TV (CBS and NBC), which is film-based, on my HTPC, so IVTC on 1080i content is a go / no-go feature for me.
It seems to me IVTC should be much more easily accomplished than deinterlacing.
Apparently it's not, due to cadence detection and other issues.
 
I created a new thread for this discussion; I think that'll be better than continuing to hijack this poor guy's thread. It's here.
 
Deinterlacing 1080i is not the same as doing a proper IVTC on film based 1080i to recover the original progressive 1080p frames. Just because you don't know what you're missing doesn't mean you're getting the the best possible image quality.
Honestly, you're not going to run into that in the HTPC arena. Even if you do, the difference is so minimal you're likely not to notice with hardware deinterlacing.
 
Okay guys,

So I still haven't decided on the best route to go for perfect HD playback yet.

My options are

5050e with 8200 board?
Phenom Triple Core with 8200 board?
Cheap processor and cheap board with good HDMI video card?
E1400/E1500? with 9300 board?


What would be the cheapest way to get the job done properly?

Thanks.
 
I'm starting to sway towards recommending the E5200 + Nvidia 9300 now, given what I know about HT3.0 requirements for post-processing on the AMD side. Hopefully in a few weeks or a month there'll be AMD CPUs with HT3.0 in a decent power envelope and the point will be moot, but for now it's an issue, if only for post-processing of 1080i source video.
 
I'm starting to sway towards recommending the E5200 + Nvidia 9300 now, given what I know about HT3.0 requirements for post-processing on the AMD side. Hopefully in a few weeks or a month there'll be AMD CPUs with HT3.0 in a decent power envelope and the point will be moot, but for now it's an issue, if only for post-processing of 1080i source video.
You don't need it... that's only for 30fps-->24fps interlaced content, which you won't run across unless you need to run 24Hz/120Hz, your HDTV supports that, and you happen to have 30fps interlaced content. Even so, I have my HTPC running that just fine with no ill effects.
 
No, Intel doesn't use HT. Intel's memory controller is in the northbridge, and so is the 9300 IGP, so the memory can be accessed from the 9300 without complications.

It gets more complicated with AMD because the memory controller is in the CPU, which is good for CPU-to-memory access but adds a step for the IGP: IGP>CPU>RAM>CPU>IGP.

This is a negligible effect if there's sufficient bandwidth, but apparently the HT1.0 (1GHz) of the 48/5050e CPUs isn't sufficient, whereas the HT3.0 (1.6-2.0GHz) of the Phenom/Puma is.
 
Dude, just completely forget the crap that Stereodude posted and the resulting posts after it. You don't need a Phenom in an HTPC; it's a total waste. The difference, which is only going to be noticeable in MAYBE 5% of OTA HD watching, is nearly nothing. He even posted screenshots of the "problem" and the two screenies are nearly 99% identical. But if you only watch things like CBS's logo, then go ahead and get the Phenom/5200+9400 combo platter and waste that extra money on something you're not even going to notice.

Seriously, it's like buying a 4870 X2 to game at 1024x768. It's that fucking ridiculous.
 
Dude, just completely forget the crap that Stereodude posted and the resulting posts after it. You don't need a Phenom in an HTPC; it's a total waste. The difference, which is only going to be noticeable in MAYBE 5% of OTA HD watching, is nearly nothing. He even posted screenshots of the "problem" and the two screenies are nearly 99% identical. But if you only watch things like CBS's logo, then go ahead and get the Phenom/5200+9400 combo platter and waste that extra money on something you're not even going to notice.
Just because you have low standards doesn't mean everyone else does. It is noticeable, and I proved it.

You, and people like you, crack me up. You spend thousands of dollars on a TV or projector and an audio system to go with it, and then try to save $50 (or less) on the HTPC instead of trying to get the best possible picture. I guess people should just buy 720p TVs and projectors too, right? I mean, they're cheaper and almost as good, and you might only see the difference 5% of the time. :rolleyes:
 
Dude, just completely forget the crap that Stereodude posted and the resulting posts after it. You don't need a Phenom in an HTPC; it's a total waste. The difference, which is only going to be noticeable in MAYBE 5% of OTA HD watching, is nearly nothing. He even posted screenshots of the "problem" and the two screenies are nearly 99% identical. But if you only watch things like CBS's logo, then go ahead and get the Phenom/5200+9400 combo platter and waste that extra money on something you're not even going to notice.

Seriously, it's like buying a 4870 X2 to game at 1024x768. It's that fucking ridiculous.

I may be a noob around here, but I've been a forum whore for a long, long time. People come here to discuss different setups and builds and highlight their pros and cons, regardless of how "insignificant". Ridiculing someone for pointing out a valid flaw (regardless of how small) in the commonly accepted wisdom only hurts the community as a whole. The whole point of forums is to present the info, discuss it, and then allow each of us to make decisions for ourselves. Trying to destroy and suppress someone's reputation and argument simply because you dispute its significance is, well... just wrong.

You could argue Stereodude's certitude is misleading, and perhaps it is, but having voiced your dissent, you can let others decide its value for themselves.

Besides, a 4870 X2 at 1024x768? More like getting a 4870 over a 4850 for that resolution ;)
 
I created a new thread for this discussion; I think that'll be better than continuing to hijack this poor guy's thread. It's here.

It's still there, and this thread is still being hijacked. <sigh> Move along, nothing to see here... :D
 
Sorry if I missed it, but what projector are you planning to use with your HT? That layout looks really nice for an HT.
 
Sorry for the late response, I have been busy lately.

The projector I will be using will be something like an

Optoma HD806 1080p
or an
Epson PowerLite Home Cinema 6100 1080p.

Comments?
 
I've got the Epson 1080p projector, and it is great. The colors are very good, and it's fairly bright.. just make sure you are watching in a dark room. It's a pretty quiet projector too. I know that matters if you have it directly above your head when you're watching a movie. I don't even notice mine, and it sits 5 feet over my head.
 
I've got the Epson 1080p projector, and it is great. The colors are very good, and it's fairly bright.. just make sure you are watching in a dark room. It's a pretty quiet projector too. I know that matters if you have it directly above your head when you're watching a movie. I don't even notice mine, and it sits 5 feet over my head.

Would this be Epson 1080ub or 6500ub?
 
All right, I'm learning little by little here, but I have one question: if one were to switch from a normal X2 to a Kuma or Phenom, would one have to change any settings in playback software to take advantage of HT 3.0?
 