Nazo
2[H]4U · Joined: Apr 2, 2002 · Messages: 3,672
I've been trying to work out just what would be needed for a DVD upsampling HTPC. My dad was thinking about doing something like that, as he's interested in setting up an HTPC, and DVD upsampling would be a great feature. Yes, there are standalone players that do this (though apparently few do it over component output on copyrighted discs). The trick is, the kind of thing I'm imagining here would take a fair bit of power. Firstly, the resolution alone is a lot: the TV is a component HDTV running 1080i, which means resizing to 1920x1080 to avoid any weird scaling artifacts. Obviously the DVD source would have to be deinterlaced first, but, as I understand it, an HDTV fed over component rather than DVI/HDMI gets an interlaced signal, so it'd then have to be reinterlaced, right?
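For anyone wondering what the "lanczos" option in ffdshow's resizer actually computes: it's just a windowed-sinc filter. Here's a minimal sketch of the kernel math (Lanczos3, i.e. a=3, which I believe is the usual default; this is the textbook formula, not anything pulled out of ffdshow's source):

```python
import math

def lanczos(x, a=3):
    """Lanczos kernel: sinc(x) * sinc(x/a) inside the window |x| < a, else 0.
    Using the normalized sinc, this works out to
    a * sin(pi*x) * sin(pi*x/a) / (pi*x)^2 for 0 < |x| < a."""
    if x == 0.0:
        return 1.0
    if abs(x) >= a:
        return 0.0
    px = math.pi * x
    return a * math.sin(px) * math.sin(px / a) / (px * px)

# Each output pixel is a weighted sum of nearby source pixels,
# with weights taken from this kernel at the fractional offsets.
print(lanczos(0.0))   # center tap
print(lanczos(0.5))   # halfway between source pixels
```

The point being: every single output pixel costs a handful of multiply-adds per axis, which is why cranking the output resolution up hurts so much.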
Right now my 2.3GHz Barton is running in the upper percentiles just resizing to 1440x1080 with no other filters (I was able to turn on other nice stuff when I had it at 2.5GHz, but it won't do that anymore, apparently). I know it's at least the equivalent of a 3200+ Barton. Thing is, I'm using ffdshow, and I've just recently run into Celtic_Druid's builds, which, if I understand correctly, might get more out of an Intel or A64 via SSE2 (the implication is they use SSE1, 2, and 3; no mention of 3DNow! or 3DNow!2). I've tried one out on my mother's laptop, which is a mere 1.6GHz Celeron (ugh, hate celery sticks), and it's doing 1024x768 (the screen's native resolution) Lanczos without any skipping using this month's build. I think it's cutting it a bit close, but still, not bad for such a low-end system.
I can't compare 1024 with 1440 exactly, obviously, but the implication is that a P4 combined with that build runs more efficiently than a Barton. From what I hear, SSE3 may not help much with this sort of thing, so maybe an A64 would benefit just as much. (I know the new Venice cores support SSE3, but that's getting a bit high in cost, so I assume SSE2 at most if he stayed with AMD.) Since I can't really compare the two cases well enough to tell anything, I just don't know. Besides, overclocking is out, so probably no Barton is powerful enough.
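To put rough numbers on why the laptop result doesn't translate directly, just counting output pixels per second already puts a floor on the comparison (back-of-the-envelope only; treating both targets as full progressive frames at NTSC DVD's ~29.97 fps for simplicity):

```python
# Rough output-pixel throughput comparison between the three targets.
fps = 29.97

laptop = 1024 * 768    # the Celeron laptop's native screen
htpc   = 1440 * 1080   # what my Barton is doing now
full   = 1920 * 1080   # what the 1080i TV really wants

print(f"1440x1080 is {htpc / laptop:.2f}x the pixels of 1024x768")
print(f"1920x1080 is {full / htpc:.2f}x the pixels of 1440x1080")
print(f"1920x1080 pixel rate: {full * fps / 1e6:.1f} Mpix/s")
```

So 1440x1080 is nearly double the laptop's workload, and the full 1920x1080 is another third on top of that, before deinterlacing or any post-processing even enters the picture.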
I've also glanced through some information about the new nVidia PureVideo. He'll be getting a GF6200 if he does build such a system, since it's the lowest-end video card he could find on Newegg with built-in component output from a company we'd heard of (XGI had some card using a chip from a company I've never heard of, and neither of us is willing to trust it). It looks like PureVideo is explicitly designed for DVD viewing on a PC, though. Firstly, it deinterlaces, and, as I understood it, 1080i is interlaced. Since you HAVE to go through WMP and PureVideo, there'd be no option to reinterlace afterwards via ffdshow or whatever, unless it's handled in the component output hardware or something. The only reason I looked seriously at this is that if the video card could do all that in hardware, it'd save him a LOT of money: he'd only need a processor powerful enough for the other HTPC stuff he'll want to do. I don't think PureVideo can do what he'd want, though, and he isn't fond of the idea of being locked into their software with nothing else working.
Right now I'm thinking it'd need to be a >=3.2GHz P4 or a 3200+ Athlon 64, assuming it actually gets those optimizations, which I admit I'm not 100% sure about since I don't have enough to compare here, unfortunately. I've had the impression that video resize filtering tends to scale more than linearly in CPU utilization as the pixel count goes up, so 1024x768 may just need so much less power that my guess about the SSE2 build being responsible is mistaken. Plus, I haven't had a chance to try the very latest build (I'm using the one from just before this month) on my PC at home yet. The resizing alone takes a huge chunk of power, but he'll definitely have to deinterlace and, I presume, reinterlace, plus he'd probably enjoy turning on post-processing filters such as denoise, so even if my CPU were an indicator, it's still not quite enough. It may just be too cost-ineffective, but I thought I'd see if anyone here has more experience playing around with this sort of thing and might know a little more precisely what it'd take to get such filtering running at such an insanely high resolution.
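For what it's worth, my "more exponential than linear" impression may be off: if the resize is done separably (horizontal pass, then vertical), the multiply-add count is roughly linear in output pixels times the tap count. A back-of-the-envelope estimate, assuming Lanczos3 upscaling (6 taps per axis) from a 720x480 DVD frame; this is my own rough model, not anything measured from ffdshow:

```python
def lanczos_macs(src_w, src_h, dst_w, dst_h, taps=6):
    """Estimated multiply-accumulates for a separable resize:
    the horizontal pass produces a dst_w x src_h intermediate,
    the vertical pass then produces the dst_w x dst_h output."""
    horiz = dst_w * src_h * taps
    vert = dst_w * dst_h * taps
    return horiz + vert

dvd = (720, 480)
for w, h in [(1024, 768), (1440, 1080), (1920, 1080)]:
    m = lanczos_macs(*dvd, w, h)
    print(f"{w}x{h}: ~{m / 1e6:.1f}M MACs per frame")
```

Per luma frame that's roughly 7.7M, 13.5M, and 18.0M multiply-adds respectively, so going from my 1440x1080 to the full 1920x1080 is about a 33% bump in resize work alone, not counting chroma, deinterlacing, or any post-processing.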
Oh, and overclocking is out. The system would need to be powerful enough to do it all at stock clocks. That's why I don't even consider something as cheap as my setup ($150 for CPU+MB): he'd need water cooling and the will to overclock (he isn't fond of the idea) to get it powerful enough. (It gets hot in there, and I suspect a Barton would need to run at at least 2.5GHz, which I know can get you 1440x1080 Lanczos with some basic post-processing.)