Facebook Reality Labs (FRL) has announced DeepFocus, an AI rendering system that works in conjunction with its upcoming eye-tracking hardware, Half Dome. Half Dome has a varifocal design that recognizes when users focus on objects in the VR world. DeepFocus detects this and renders a blur over the surrounding areas to simulate the way the human eye works. This "rendered blur" effect is essential to creating a convincing VR image: the more realistic the blur, the more natural the scenes in VR appear to the user. When our eyes focus on an object, objects at other depths appear blurry, and the researchers at FRL are attempting to recreate that same effect with DeepFocus. The goal of the project is to create a more comfortable VR experience that can be used all day and "work for every single title in the Oculus Store, without asking developers to recompile." "This is about all-day immersion," says Douglas Lanman, FRL's Director of Display Systems Research. "Whether you're playing a video game for hours or looking at a boring spreadsheet, eye strain, visual fatigue and just having a beautiful image you're willing to spend your day with, all of that matters."
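The depth-of-field idea behind the rendered blur can be illustrated with a minimal sketch. This is not FRL's method (DeepFocus uses a neural network); it is a hypothetical toy model in which the blur radius of each pixel grows with its distance from the focal plane, a standard circle-of-confusion approximation. The function name, `aperture` scale factor, and depth values below are all illustrative assumptions.

```python
import numpy as np

def coc_radius(depth, focal_depth, aperture=2.0, max_radius=8.0):
    # Hypothetical circle-of-confusion model: pixels at the focal
    # depth stay sharp (radius 0); pixels farther from the focal
    # plane get a larger blur radius, capped at max_radius.
    r = aperture * np.abs(depth - focal_depth) / np.maximum(depth, 1e-6)
    return np.minimum(r, max_radius)

# Toy depth map: a near object (1 m) on the left, a far wall (10 m)
# on the right. With the eye focused at 1 m, the near pixels stay
# sharp and the far pixels receive a nonzero blur radius.
depth = np.concatenate([np.full(4, 1.0), np.full(4, 10.0)])
print(coc_radius(depth, focal_depth=1.0))
```

In a real renderer, the per-pixel radius would then drive a spatially varying blur filter; DeepFocus instead learns to produce the blurred image directly, which is what lets it work on existing titles without developer changes.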