I think designing games using memory and past experience is kind of a health hazard, because dreaming that hard isn't good for you health-wise. I can render fast in my dreams, but if I had to physically create it with pencil and paper it's too demanding, which is why I stopped drawing comics. I still render with markers on the easel, though paint is too slow and airbrushing is labor intensive. Rendering digitally is foreign to me.
From a purely game or (known) virtual world perspective, BCI of some sort is astonishing. As portrayed in sci-fi - from the cyberpunk classics of Gibson to Shadowrun, to The Matrix, to Sword Art Online - the ability to somehow "jack in" (by varying degrees of invasiveness, from implanted plugs to something as friendly as putting on a headset/glasses and lying down in bed) and then wake up in a virtual world with full bodily presence is the holy grail. Everything from the clunky headsets of the 90s to even the high-res VR/AR we have today is a primordial stepping stone toward that ideal. Of course, almost all of these sci-fi futures that feature such a wondrous world also talk about the potential dangers of BCI: being unable to tell meatspace from something programmed, and the control wielded by those who create the world being potentially harmful to those experiencing it (i.e. if you can affect someone's brain directly, you may be able to harm it, intentionally or otherwise), up to varying levels of control and power.
There are a ton of neurological, not to mention ethical, issues with such a thing, be it used for games or anything else. We don't want a cyberpunk world where people can be hacked into, say, thinking they've been voting when they've actually been standing in their living room - or never left their bed - because they did the BCI equivalent of clicking on a Facebook ad or virus the night before; or worse, a "buffer overflow" targeted to have negative physical consequences on certain demographics. That's not even getting into other scenarios, such as a government, police force, or other organization using BCI to alter their soldiers' vision so that "bad people" look like monsters or mutants, literally dehumanizing them, or presenting "training exercises" that are actually harm being done to others; fiction over the years has warned of such things. While it's not likely that any of this will be a real issue anytime soon, we should still keep in mind, as we build these technologies, how they can be abused by the unscrupulous, and try to mitigate those risks without undermining the promise of the good they can do.
However, as an aside, I have to take the time to say this is another reason why I continually find Valve (and Newell) worth supporting over almost any other game store/platform. They're willing to take on long-term, forward-looking projects like this (not likely to provide a quick ROI, unlike the rest of the industry's focus), but perhaps even more importantly, they support an open source (including Linux-friendly) path to such technology. As bad as things are now with proprietary software and the like, once we get into BCI the risk is multiplied thousands-fold by running unsigned, proprietary code, implants, BCI hardware, etc. It's bad enough that we have proprietary medical devices these days (something that needs to be made illegal ASAP), but BCI should be open from the hardware and firmware all the way up to the software, so that people can be certain of what they're plugging into their head. At least Valve is getting things moving in the right direction, but we're going to need a legal component ASAP to stop proprietary offerings from being developed faster by those with tons of venture capital betting on the "we own the way into everyone's head" premise.
Reminds me of that Black Mirror episode with the nursing home, where the residents choose between the shared virtual world (and staying in it permanently) or the real world. Like a balls-deep version of Second Life.
I think it's a pretty cool idea, tbh, and could really improve quality of life for quadriplegics, the elderly, isolated communities, etc.