Graphics 'Indistinguishable From Reality' in 10 years

Only if LG/Samsung get off their asses and make mainstream panels with decent PPI. Desktop PPI hasn't seriously risen since CRTs stopped being produced, and ~100 PPI is never gonna look "indistinguishable from reality".
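For reference, a back-of-the-envelope check on that figure, assuming a typical 24" 1920x1080 desktop panel (the numbers are just illustrative, in Python):

from math import hypot

width_px, height_px, diagonal_in = 1920, 1080, 24.0   # assumed common desktop panel
ppi = hypot(width_px, height_px) / diagonal_in        # sqrt(1920^2 + 1080^2) / 24
print(round(ppi, 1))                                  # -> 91.8, right around that ~100 PPI mark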
 
Probably would have happened years ago if there were more than two companies competing in the graphics arena.
 
Vague vague vague.

A piece of plastic, a ceramic cup, a concrete wall, heck even a teapot, might be indistinguishable from reality when rendered in 2022. A piece of Borneo or a serving of fries and a burger, completely distinguishable.

Marketing plug for 1313 mostly. But yes, it is going to be interesting to see pre-rendered content and real-time content on the same level. Possible? Likely. Will it be done? Not very likely. Too much work; not worth the effort (expense) for the game devs.
 
Yeah, I agree it's kinda vague. Although he has the credibility of working within the industry, anyone can pull a number out of their ass with a prediction. In 10 years, who will remember or even give a shit?

You could probably pass off a still shot of today's pre-rendered graphics as a real photo but as soon as you add animation, the veil begins to crumble.

If you were to take today's level of pre-rendered graphics & animation and say it will be possible in real time in 5-10 years, it still isn't what I would classify as "photo-realistic".

We shall see :)
 
Kinda funny. I remember back in, like, 2000, I was looking at a PC game magazine article about some NASCAR game when someone I knew came up to me and commented, "I didn't know you were into cars." It took him about a minute to understand that they were pictures from a game, and he took the magazine away from me and stared at the pictures, trying to wrap his mind around it.

For some people, that time has already arrived, it would seem.
 
I'm willing to bet we could generate something like Shrek 1 or Toy Story 1 or 2 (a CG movie from that time) in real time in a tech demo scenario today without too much trouble.
 
Not likely. The nextgen consoles will probably still be in service then. So there won't be much in terms of improvements. I'd say 20 years.
 
I think this has been "coming in the near future" since forever! :D

Your sig is no longer happy!!!! :eek:

Not likely. The nextgen consoles will probably still be in service then. So there won't be much in terms of improvements. I'd say 20 years.

Nonsense! Reality's polygon count will increase drastically in the next 20 years and far outpace the rate at which computers are becoming more powerful.
 
The reason we don't have the realistic graphics today that were promised ten years ago is that reality has been producing drastically better graphics each year. By 2022 I think people are going to look so amazing that cameras and their internal computers will explode when attempting to capture such magnificence.
 
I'd rather they lower the graphical fidelity of the real world to match current consoles. Consolified the WORLD
 
How do you know you're not playing a game right now, and our bodies are just thermo-batteries for the machines, man.
 
Bull. I predict they won't even be able to get hair right.

Well, there was that Nvidia Fermi hair demo about 2 years back. Problem was that a single head of not-that-realistic hair took a whole GTX 480 to render at not quite 60fps.
 
Yeah right, human body photorealism will be just as advanced in 10 years as audio is today for speech recognition and vocal synthesis, which is just as advanced as OCR was 10 years ago.

I think it's time to recognize the limitations of current CPU design when you try to compare it to equivalent human functions. We probably need to design a whole new CPU architecture closer to how the human brain works, because the current approach just isn't working. It took maybe 10 years to reach a 90% success rate; now it takes another 10 years for each extra 0.1% or so.

Plus, this technology is extremely costly. CG is fine when there is no real-life equivalent, like a spaceship, but for human characters, filming actors is much cheaper.
Not to mention the weird virtual-world feeling you'd get watching these lifelike avatars, like we've reached the stage where everything has started becoming virtual.
 
I could see the hardware being powerful enough and potentially immersive enough (VR). But I don't see games hitting that mark just based on development costs. Think of how much time and effort it would take to create the textures, animations, etc at that level of detail.
 
Still picture wise, maybe. Dynamics wise, doubtful.
 
Desktop PPI hasn't seriously risen since CRTs stopped being produced, and ~100 PPI is never gonna look "indistinguishable from reality".

As in Pixels Per Inch?
If so, why bother devoting development dollars to increasing PPI when that can be solved easily for free by just sitting farther away?
If they did anything at all they would just need to add more pixels (resolution) and inches to a display and this seems like a much cheaper and overall better alternative to increasing PPI.
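On the "sit farther away" point, the number the eye actually cares about is pixels per degree of visual angle, which trades PPI off against viewing distance. A rough sketch (the ~92 PPI panel and the distances are just assumed for illustration; ~60 px/degree is the commonly quoted 20/20-acuity ballpark):

from math import tan, radians

def pixels_per_degree(ppi, viewing_distance_in):
    # inches of screen subtended by one degree of visual angle at this
    # distance, multiplied by the pixel density
    return 2 * viewing_distance_in * tan(radians(0.5)) * ppi

print(round(pixels_per_degree(92, 24)))   # ~39 px/deg at about two feet
print(round(pixels_per_degree(92, 48)))   # ~77 px/deg from twice as far back

So doubling the distance buys the same angular density as doubling the PPI, as long as you're happy with the smaller apparent screen size.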


My vote is for higher fps source material and higher native refresh rates for displays.

Playing Quake 3 (or any other game) at 180fps on a CRT with an actual refresh rate of 180Hz does more for realism and immersion than any other technology GPU makers have added in the last 10 years. It's the fluidity of the movement that does more for realism here.

Real life doesn't flow by in noticeable still frames, so having a frame rate so high that it matches the fluidity of real life is a logical step in adding actual realism and more enjoyment for movies/tv/games etc.

As far as I am concerned, 60Hz is unacceptable when I can see the screen flashing. It's most easily noticeable with any image that uses a lot of white. For me the screen doesn't start to look like a "solid" image until at least 80Hz. I used to keep my old CRTs at a minimum of 85Hz. But even at that rate I could still notice the lack of frames in fast action scenes.

120Hz (not the gimmick used by LCD panels; I'm talking about the actual refresh rate of the display) should be the bare minimum.

Apparently plasmas can do up to 600Hz. If that is possible, then why don't they release plasmas that accept anything higher than a pathetic 60Hz? Such a waste.
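To put rough numbers on the fluidity argument (assuming an object panning across an assumed 1920-px-wide screen in one second), the jump between consecutive frames shrinks quickly as the real refresh rate goes up:

def step_px(screen_width_px, crossing_time_s, fps):
    # pixels the object jumps between consecutive frames during the pan
    return screen_width_px / (crossing_time_s * fps)

for fps in (60, 85, 120, 180):
    print(fps, round(step_px(1920, 1.0, fps), 1))
# 60 -> 32.0 px, 85 -> 22.6 px, 120 -> 16.0 px, 180 -> 10.7 px per frame

Which is part of why 180fps on a true 180Hz CRT feels so much smoother.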
 
Weren't we talking about this 10 years ago?

I was gonna say...

Though I have to admit we are getting closer every day. In a lot of still screenshots from games like BF3 it's hard to tell it's not a photo. However, in motion it still has quite a ways to go.
 
no way

10 years from now, the PS4 and Xbox 720 or whatever it's called will be in their "prime", and 99% of games will just be ports from those to PC.

lol ;(
 
In ten years, we might very well have CG-quality visuals, with some scenes that completely fool even the sharpest eye for reality. And people will still disable vertical sync and stare at a hundred tear lines every second.

It's an unwinnable, almost purposeless pursuit. There's still value to be had from it, but we're really chasing the wrong dogs here.
 
I think it depends on what is in the scene. There are certain shots in games that, for a split second, can look almost photoreal even when done today. It is stuff like people up close or hair etc. that blows the illusion, and while I think it will be better in 10 years, I don't think it will be fully realistic. Look at characters made just for movie CG in something like ZBrush. Those look great, but a lot of those are not photoreal, and they would require something WAY beyond what we will likely have within 10 years to render in stereo at 60fps and high res. A 3rd-person racing game or something, on a well-kept track though? I think something like that could be there by then.
 
Avatar did a good job at realistic CGI. Whole movie was done in a studio.
 
I like the fact he's talking about distinguishing between CG and "live action" in a completely fictional sci-fi universe.

But yeah, as far as games are concerned, not a chance.
 
Like the Matrix?

Yes, in the sense that you don't know how anything outside of the actual Matrix would look in real life, because it doesn't exist (the ships, the squid things, the pod towers etc). The difference is there's nothing in Star Wars which even pretends to represent the real world as we know it. And neither series nailed photoreal humans.
 
120Hz (not the gimmick used by LCD panels; I'm talking about the actual refresh rate of the display) should be the bare minimum.

Apparently plasmas can do up to 600Hz. If that is possible, then why don't they release plasmas that accept anything higher than a pathetic 60Hz? Such a waste.

TVs often use the gimmicky interpolated-frames horseshit, but 120Hz LCD monitors will accept and output proper 120Hz; mine does. Also, that 600Hz is once again marketing BS for interpolating frames.
 
Maybe in 100 years. In 10, I think we'll be lucky if we see anything significantly different from what we have now. Stuff is crazy expensive now; they'll be looking to make things cheaper/easier to produce if they want there to be anyone around to sell the technology and develop the software for the consumer.

Personally, I'd rather have graphics that can handle a tornado hitting a large city moderately realistically and look presentable throughout, than a tiny room that looks like an exact reproduction but where nothing can be done to it or it loses the illusion.

There are a lot of things beyond "graphics" that make things seem more real: audio, animation, physics, destruction physics, how objects react...

Just imagine picking up a glass and throwing it against a wall, then imagine throwing it against a mirror. What do you expect to happen? Has it ever happened in a video game so well that you thought "If the graphics were better, that would have convinced me."

I can think of some cool moments, but nothing where I was shocked at how convincing it was in its ability to handle a mundane action. Most games aren't about the mundane, and most movies aren't either.
 
This was a fun discussion to have back when we cared how many bits of color our game consoles had. There are much more relevant game related discussions to be having now.

However, while we're having it, photorealistic graphics in 2 console generations? I'm sorry, but I really don't see this happening. We're not nearly as close as we think we are, and there is simply no way affordable game consoles will be pushing photoreal processing power 2 generations from now. Then after that 2nd generation I'm not even sure gaming will look like it does now. The processing power may reside entirely in "the cloud" [1950s UFO sound effects here], at which point anything is possible.
 