HD videos lag since Windows 10

pinoy
Limp Gawd · Joined Dec 8, 2010 · Messages: 447
Before my Windows 10 upgrade I could watch 1080p videos full screen, including YouTube and Netflix, without issues. After the upgrade I no longer can: videos play at 1 fps, if not slower. If I scale the video window down to uselessly small proportions, the frame rate goes back to normal. CPU usage is maxed out as well, which tells me the video card's acceleration isn't helping, if it's doing anything at all.

I have an AMD Phenom II X4 925 (2.8 GHz) and a GeForce 6200LE video card. Granted, the video card is weak, but isn't my quad-core processor fast enough to handle all the video decoding on its own? I am wondering whether Windows 10 is to blame and whether I should go back to Windows 7.
 
No, you still need DXVA from the video card. When my last card went bad, my current CPU could decode on its own, but it used far more CPU than I ever thought possible. Either way, you should be getting more than 1 fps, I would think. To me it sounds like you don't have an actual GPU driver installed, even if it is an ancient GPU. What does this tool say? http://bluesky23.yukishigure.com/en/DXVAChecker.html
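If you'd rather poke at it from code, the sketch below does roughly what a DXVA checker does: create a D3D9 device, ask DXVA2 for its video decoder service, and list the decoder profile GUIDs the driver exposes. This is only an illustration, not that tool's actual source; if DXVA2CreateVideoService fails, or zero GUIDs come back, the installed driver offers no hardware decode path at all.

```cpp
// List the DXVA2 decoder profiles the installed driver exposes.
// Build with MSVC: cl checkdxva.cpp d3d9.lib dxva2.lib ole32.lib
#include <windows.h>
#include <d3d9.h>
#include <dxva2api.h>
#include <stdio.h>

int main() {
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) { printf("Direct3D 9 unavailable\n"); return 1; }

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow = GetDesktopWindow();

    IDirect3DDevice9 *dev = NULL;
    if (FAILED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
               pp.hDeviceWindow, D3DCREATE_SOFTWARE_VERTEXPROCESSING,
               &pp, &dev))) {
        printf("CreateDevice failed -- no working D3D9 driver\n");
        return 1;
    }

    IDirectXVideoDecoderService *svc = NULL;
    if (FAILED(DXVA2CreateVideoService(dev,
               __uuidof(IDirectXVideoDecoderService), (void **)&svc))) {
        printf("No DXVA2 decoder service -- no hardware decode path\n");
        return 1;
    }

    UINT count = 0;
    GUID *guids = NULL;
    if (SUCCEEDED(svc->GetDecoderDeviceGuids(&count, &guids))) {
        printf("Driver exposes %u decoder profile(s):\n", count);
        for (UINT i = 0; i < count; ++i) {
            WCHAR s[64];
            StringFromGUID2(guids[i], s, 64);
            wprintf(L"  %s\n", s);
        }
        CoTaskMemFree(guids);  // the GUID array is CoTaskMem-allocated
    }
    svc->Release(); dev->Release(); d3d->Release();
    return 0;
}
```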
 
It would appear Windows 10 breaks video playback with this GeForce card whether the CPU or the GPU does the work. I tried the video card in Windows 7 and the video was playable there, with the CPU doing most of the decoding, if not all of it. I'm not sure whether PureVideo is even available on the GeForce 6200LE, but I would at least expect the CPU to be capable of decoding the video in Windows 10. It's not.
 
It should be. The fact that resizing the window, which has no impact on decoding speed, brings the frame rate back tells me there's something else at play as well: if decoding were the bottleneck, a smaller window wouldn't help, so the slowdown looks like a rendering problem rather than a decoding one.

But you're probably right about Windows 10 and your card. My guess is Windows 10 requires WDDM 1.1 or higher for DXVA, whereas your card/driver only supports 1.0.

Edit: Yep, your card isn't supported:
http://nvidia.custhelp.com/app/answ...2LzEvdGltZS8xNDQwMjEyMzA0L3NpZC9RZjNsQ3d1bQ==
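If you want to confirm which driver model you're actually running, gdi32.dll exports the D3DKMT kernel thunks, and the KMTQAITYPE_DRIVERVERSION query reports the WDDM version. Here's a rough sketch; the struct layouts and constants are retyped from the WDK's d3dkmthk.h rather than included, so treat the literal values as assumptions and double-check them against the real header:

```cpp
// Query the WDDM driver-model version of the primary display adapter.
#include <windows.h>
#include <stdio.h>

typedef LONG NTSTATUS;          // not pulled in by windows.h alone
typedef UINT D3DKMT_HANDLE;

// Trimmed copies of WDK declarations -- field order matters.
typedef struct {
    HDC           hDc;            // in: a DC on the adapter to open
    D3DKMT_HANDLE hAdapter;       // out
    LUID          AdapterLuid;    // out
    UINT          VidPnSourceId;  // out
} D3DKMT_OPENADAPTERFROMHDC;

typedef struct {
    D3DKMT_HANDLE hAdapter;
    UINT          Type;                 // 13 = KMTQAITYPE_DRIVERVERSION
    VOID         *pPrivateDriverData;
    UINT          PrivateDriverDataSize;
} D3DKMT_QUERYADAPTERINFO;

typedef NTSTATUS (APIENTRY *PFN_OpenAdapter)(D3DKMT_OPENADAPTERFROMHDC *);
typedef NTSTATUS (APIENTRY *PFN_QueryInfo)(const D3DKMT_QUERYADAPTERINFO *);

int main() {
    HMODULE gdi = LoadLibraryW(L"gdi32.dll");
    if (!gdi) return 1;
    PFN_OpenAdapter openAdapter =
        (PFN_OpenAdapter)GetProcAddress(gdi, "D3DKMTOpenAdapterFromHdc");
    PFN_QueryInfo queryInfo =
        (PFN_QueryInfo)GetProcAddress(gdi, "D3DKMTQueryAdapterInfo");
    if (!openAdapter || !queryInfo) {
        printf("D3DKMT entry points missing\n");
        return 1;
    }

    D3DKMT_OPENADAPTERFROMHDC oa = {};
    oa.hDc = GetDC(NULL);  // primary display
    if (openAdapter(&oa) < 0) { printf("open adapter failed\n"); return 1; }

    int version = 0;  // D3DKMT_DRIVERVERSION: 1000 = WDDM 1.0, 1105 = 1.1
    D3DKMT_QUERYADAPTERINFO qai = {};
    qai.hAdapter = oa.hAdapter;
    qai.Type = 13;  // KMTQAITYPE_DRIVERVERSION
    qai.pPrivateDriverData = &version;
    qai.PrivateDriverDataSize = sizeof(version);
    if (queryInfo(&qai) < 0) { printf("query failed\n"); return 1; }

    printf("WDDM driver model value: %d (%s)\n", version,
           version >= 1105 ? "WDDM 1.1 or newer" : "WDDM 1.0 only");
    return 0;
}
```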
 
I ran a Windows XP virtual machine inside the Windows 10 host. I was able to watch HD videos without a problem in XP, and the host CPU utilization was only 40%. I ran GPU-Z on the host side and it showed the GPU was not doing any acceleration at all, so the virtual machine was doing it all through the host CPU.
 
Unless you changed it, the default renderer in XP is Overlay, which would be all CPU.

Windows 10 does support WDDM 1.0, but you are basically stuck on the Vista graphics model, where GPU surfaces are also mirrored in system memory (the duplication WDDM 1.1 got rid of). You can try setting visual effects to minimal in System > Performance Options, but you'll do a lot better getting a supported card. I think it's been a year or two since NVIDIA last updated the Windows 7 drivers for that card.
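If you'd rather flip that setting from code than click through the dialog, the Performance Options page stores its choice in a per-user registry value. A minimal sketch, assuming the usual VisualFXSetting semantics (0 = let Windows decide, 1 = best appearance, 2 = best performance, 3 = custom); sign out and back in, or restart Explorer, before expecting it to take effect:

```cpp
// Set the Performance Options choice to "Adjust for best performance".
// Build with MSVC: cl visualfx.cpp advapi32.lib
#include <windows.h>
#include <stdio.h>

int main() {
    HKEY key;
    DWORD bestPerformance = 2;  // 2 = "Adjust for best performance"
    LONG rc = RegCreateKeyExW(HKEY_CURRENT_USER,
        L"Software\\Microsoft\\Windows\\CurrentVersion\\Explorer\\VisualEffects",
        0, NULL, 0, KEY_SET_VALUE, NULL, &key, NULL);
    if (rc != ERROR_SUCCESS) { printf("open failed: %ld\n", rc); return 1; }
    rc = RegSetValueExW(key, L"VisualFXSetting", 0, REG_DWORD,
                        (const BYTE *)&bestPerformance, sizeof(bestPerformance));
    printf(rc == ERROR_SUCCESS ? "VisualFXSetting set to 2\n"
                               : "write failed\n");
    RegCloseKey(key);
    return 0;
}
```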
 