ShadowVlican
reminds me of the KYRO graphics card... too bad that didn't keep going...
I'm sorry, Mr. Bennett, but you are wrong here.
First off, let me say that what I know of computer graphics has been learned from [H]... so I don't know a whole lot, lol.
If my understanding is correct, the way current engines would render a baseball is that they render the entirety of the object, even the parts of it that you cannot see. Euclideon's approach is to say: we will only render what can be seen. If I'm right on this assumption, then why the hell did it take so long for someone to think this up?
Isn't it curious that they have not shown any real-time demonstrations or released a demo of the engine... just YouTube videos?
OK then, I'd like to see him release a demo and have it not be over 100 GB, which I can promise you it is right now.
And I enjoy your usual articles, but you are supporting the wrong company here. They are never going to make anything worthwhile.
And this is where you stop posting:

"It's because a demo to download would take up probably a terabyte of space... even with such a small number of unique items in the world (not counting how many times they are copied)."
I am not trolling, Mr. Bennett. I appreciate your reporting on news such as this. I only disagree with your bias. It is my opinion that a reporter should present the news in a completely unbiased way.
Um... he seems pretty unbiased to me. Why are you blasting out crazy assumptions about everything?
Remember: if something is too good to be true, it probably is.
He is defending this company when clearly they are all smoke and mirrors.
Like the other poster said, they have had 8 years to put out a tech paper... just a simple document that would get rid of all doubt. If they cannot do that, there is no reason to believe this is viable.
Trust me I would love for this to actually work and for developers to pick this up and turn it into a game. But I am not going to blindly believe them.
Where does he defend the company? Do yourself a favor and re-read his posts... you have misunderstood him somewhere along the way.
ALSO, watch the freaking interview already. You cannot claim "smoke and mirrors" when they have some evidence that their technology is viable. Seriously just watch it man. And re-read some posts. Holy hell, haha.
Bravo, Kyle, for getting an interview on what I thought last year to be a complete pipe dream. I think seeing that real-time run-through was enough to give me a little glimmer of hope. It looks like water might be tricky for the engine, though, but I'm sure that will improve with time.
"OK then, I'd like to see him release a demo and have it not be over 100 GB, which I can promise you it is right now."

It's because the motion would take up much more space on the hard drive.
Oh he is... I sincerely hope you aren't PMing everyone who posts here.
I sincerely hope you aren't PMing everyone who posts here.
QFT. They won't even take people's offers to invest in their business because they don't want people on their ass. Taking on the interview, and him opening up about what has happened ever since he started, is a sign that they're trying to calm the "scam" talk down. He took Notch's blog and answered it point by point. This thread is getting too serious for technology we were only given a sneak peek at. *breaks out pom poms*
Seriously though, those saying "Give us the demo" and "They will never release anything" might want to stop, step the fuck back, and look at history. How many tech demos are out there? Is every one of them immediately released to the public to be dissected, reverse engineered, critiqued, etc.? Do games use everything from these tech demos? Every tech demo is there to demonstrate something. And a lot of them are not released until consumer hardware is prevalent enough, or are even released (think NVIDIA) before we get our hands on the hardware. But we still 'ooh' and 'ahh' at them.
Why?
Because it is familiar. Everyone is used to their circle, as it were. Show them a sphere and it's 'It cannot be done', 'I know more than you and your working tech demo', 'It would need too much XXX'.
Instead of naysaying things that I guarantee you have no clue about (unless you have all of Euclideon's data, algorithms, etc.), perhaps you should just let the tech evolve. PhysX was once upon a time like this.
Actually he answered it point by point by saying "no... not true.. nope... not at all".
The interview didn't really reveal much on the tech side that we hadn't already seen, except proving that it's running in real time. That's something, I guess.
Still no animation, still no talk about memory (when that came up he quickly jumped away from it).
Sure you could say "but that's just one or two issues". But it only takes 1 major, unavoidable issue to bring this down. If you don't believe that, just go ask Intel about Larrabee.
A certain company, who "didn't like the idea of abandoning polygons".
I think it's reasonable to assume he meant a GPU company; it is not reading too much into the sentence.
I'm sorry, Mr. Bennett, but you are wrong here. Not about this quoted text, but by supporting this "developer". The engine shown at 19 minutes, Atomontage, is exactly the same as this engine. The only difference is that this engine repeats the same area over and over again. They use the same piece of dirt for every piece of dirt... they use the same palm tree for every palm tree... it is like his video from 2.5 years ago... it is just this small number of objects repeated, but this time they disguised it better. There is a 0% chance a full game with thousands of objects can be made from this engine. Also, movement, while possible, takes up incredible amounts of storage space, so while you could see that demo with movement, there is zero chance you would see more than that. Isn't it curious that they have not shown any real-time demonstrations or released a demo of the engine... just YouTube videos?
It's because a demo to download would take up probably a terabyte of space... even with such a small number of unique items in the world (not counting how many times they are copied).
Did anyone else find it strange that the guy interviewing was an apologist for the Euclideon cause, and was repeatedly talking about how it was better than everything he'd ever seen, including the Unigine benchmark?
How is it possible to make that claim at 1024x768, without any high quality lighting, and with a massively copy-pasted, instanced world like that?
I don't believe their claim that the copy-pasting is purely because of lack of artists. I'd guess they do it to avoid using too much memory.
If that's true, one other thing to keep in mind is that currently, the majority of gaming PCs are actually just average PCs. They're not enthusiast, hardcore, high-end watercooled rigs. They're mainstream PCs that double as gaming rigs. If this engine is super memory intensive, even though system memory is relatively cheap, the average PC is not going to start shipping with many times more memory just because of this rendering technology. Currently the amount of memory works fine for Windows and running general apps, *and* happens to work well for games (more like games are forced to fit into that 2-4 GB of memory that makes sense for average users on Facebook and listening to music). The average use case (not gaming) is what decides how much memory HP sticks into a system. You might argue that you could always build a custom rig with tons of memory for this, but then you're basically reducing your potential install base to a size that guarantees there's no way to turn a profit on a game. So if this is a memory hog due to its design, it will not make sense economically for users/developers.
"Catch-22? GPUs won't support more GPGPU stuff until it is used. It IS happening today, but not sure how well it goes with CUDA, etc., but I can imagine, that's right, imagine, even dream, what the extra horses would do for tech like this speed-wise. Why? Because I prefer to look on the bright side of life when I do not have absolute damning proof otherwise."

I also find it weird that Bruce Dell has repeatedly tried to make the graphics vendors out to be "the man holding him down because they have a vested interest in polygons". The graphics vendors don't give a shit what primitive you use... as long as you make use of the GPU, they're happy to adapt to what developers want, and they actually already attempt to do that with every new architecture.
That said, even if this is feasible for creating immersive detailed geometry in worlds, if it comes at the expense of things that people have come to expect from games today (i.e. ability to shadow, animate, etc), then it's DOA. Sure developers want more of anything they can get... and if you can give a ton more of one thing (in this case geometric detail) without much sacrifice, then they'd love it. However if you say "hey all this geometry detail comes at the expense of high quality animation and shadowing" then it's unlikely to be attractive.
At the end of the day, you're arguing about potential technology with no idea of if/when it will be ready. This thread isn't going anywhere until another update is released.
EDIT: John, is there anything you're willing to share that was not included in the video? Anything else about the demonstration that you'd like to mention? You seemed to really like it, but being there and seeing it on video are two different things.
When I called John Gatt and asked him to do this interview for HardOCP, I never dreamed we would get this kind of access. Big kudos to John for doing a superb job. And obviously we are not professional video producers either, so you will have to give us a bit of a pass in that department. We did our best on what was a tight budget and a very short timeline with limited equipment. If anything, I hope it points somewhat to the interview being genuine.
Thanks, John. I'm sure we all wish we could've been there with you.

As I just said, that demo will fit on a DVD.
I have a copy of the map layout for the demo, and it lists 24 major models like the palm tree, cactus, etc., but what blew me away was that at 1024x768 you could not tell it was not running at 1920. It was amazing, man; it scales up like nothing I have ever seen. Apart from the 24 major items, we had more rocks, weeds, and other stuff I just didn't have time to show! At a wild guess, I found 80 to 150 other items that I noticed. I counted 20+ leaves alone. And it was all done in under 3 weeks.
The problem you don't understand is that point clouds don't need TEXTURES. Textures are purely a POLY thing. Point cloud data includes the texture as part of the point cloud, and also includes the COLOR.
You don't NEED terabytes of texture data. However, unlike a poly, which is basically three coords and a bunch of texture data, you instead have 10k-point objects.
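To make that concrete, here's a toy sketch of the difference (illustrative layouts and sizes only; these are my assumptions, not Euclideon's actual data format): each point sample carries its own color, while a textured triangle vertex carries UV coordinates that point into a separate texture, which must be counted on top.

```python
import struct

# One point sample: three float32 coordinates plus a packed 32-bit RGBA
# color. The "texture" is baked into the point itself.
point = struct.pack("<fffI", 1.0, 2.0, 3.0, 0xFF8040FF)

# One triangle vertex: three float32 coordinates plus two float32 UVs.
# The color lives in an external texture map, which adds storage on top.
tri_vertex = struct.pack("<fffff", 1.0, 2.0, 3.0, 0.25, 0.75)

print(len(point))       # 16 bytes per point, color included
print(len(tri_vertex))  # 20 bytes per vertex, before any texture data
```

So per element the sizes are comparable; the real trade-off is that a point-cloud object needs many thousands of points where a mesh needs a few vertices plus shared texture data.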
The key, AGAIN, is that he's not requiring the system to render the ENTIRE point cloud for an object and then bisect and trisect the result to get the 3D projection; he's culling the point cloud based on the pixel projection right from the start. The result is that the only data necessary to render the screen is the points which are visible within the pixel frame.
A similar process for poly-casting would be to imagine each pixel as an individual 3D projection screen, and ray cast against the polys based upon what's in front of the pixel. Your projection is built into the process.
Carmack mentions these processes as REAR projection as opposed to FRONT projection. Most current poly techniques use a FRONT projection technique, which requires a fully rendered item to then be projected from the object to the screen. Euclideon likely (we're speculating here based upon available information) is using a REAR projection technique, which limits the rendering to only those points of the point cloud falling into the search algorithm's result.
Not saying REAR projection doesn't have its problems. Shadows, smoke, and other environmental effects become more difficult, since they are procedural systems which aren't part of the normal point cloud and instead modify the point cloud data. It's not an intractable problem, just one that requires more thought than current poly techniques do.
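Here's a heavily simplified sketch of that per-pixel, front-to-back idea (my own toy orthographic version, not Euclideon's actual search algorithm): for each screen pixel, march through a sparse voxel set and stop at the first occupied point, so hidden points are never touched and the work scales with the number of pixels rather than the number of points in the scene.

```python
# Sparse "point cloud": voxel coordinate (x, y, z) -> RGB color.
cloud = {
    (1, 1, 3): (200, 150, 90),   # a near point
    (1, 1, 7): (10, 10, 10),     # a far point hidden directly behind it
    (2, 0, 5): (40, 120, 220),
}

WIDTH, HEIGHT, DEPTH = 4, 2, 8
BACKGROUND = (0, 0, 0)

def render(cloud):
    """Orthographic render: one front-to-back search per screen pixel."""
    image = {}
    for y in range(HEIGHT):
        for x in range(WIDTH):
            color = BACKGROUND
            for z in range(DEPTH):       # march from front (z=0) to back
                hit = cloud.get((x, y, z))
                if hit is not None:
                    color = hit          # first point wins...
                    break                # ...and occluded points are skipped
            image[(x, y)] = color
    return image

img = render(cloud)
print(img[(1, 1)])  # (200, 150, 90): the near point; the hidden one is never read
```

A real system would replace the linear z march with a hierarchical search (e.g. an octree descended front to back), but the occlusion logic is the same: one visible point per pixel.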
At the same time, as mentioned in the video, the physics problems become infinitely EASIER since your resolution is high enough to encapsulate some of your physics problems (bounding boxes, perimeters, and other poly techniques for finding edges). So, while making some things harder, it makes other things EASIER.
The biggest hurdle is going to be moving the mindset to a different series of techniques. Most of the graphics professionals in the medical industry are well versed in these systems, since they are actually used quite effectively in things like 3D MRI scanners and LIDAR-based rendering. We'll see some crossover as those techniques are applied to the game world...
Not at all. But saying that this is the most amazing thing ever, including vs. Unigine, is a bit far-fetched. Any demo running with 1/4 of the necessary systems to look "complete" will run at reasonable framerates. I have no doubt they can take advantage of the GPU eventually. But let's be honest here: fire up your best game in terms of graphics. Get a camera 4 feet from it. Now tell me the picture on the camera is the same as on screen. Big difference. JG was stressing this point. We have seen the demo on video. We know what it looks like from that perspective. JG saw it in real time. Saying the camera won't do it justice is apologist now?
Wait... so because they claim something, it is as they say now? That's what someone who is trying to get the world to notice them is going to do. They'll claim their product has zero flaws, minimize them, and talk about their strengths all day. Marketing 101, dude. Sorry, but I'm a natural-born skeptic and will need to see more than 2 instanced trees and rocks to be sold. Memory is memory, and there is only so much you can compress things before you have to face the music of data requirements. There's no magic that can change that.

Perhaps because it is more about detail, and it has more of it than the mentioned benchmark? And on memory, it is a guess, as you say. They have the tech and say another thing. So which is more accurate: your guess or their knowledge?
"Sorry, but your view is fairly warped, maybe because you're a more hardcore gamer by definition (after all, you're on a gaming-centric forum). Most PC gamers are using the family PC with a low-end GPU that came with it."

Kinda true, but not. Anyone who plays games more often than not buys the PC to fit those needs. Rarely does it end up the other way around. Also, when you look historically at memory's impact on gaming, even today 2-4 GB is more than enough, even in games that stress high-end hardware. Also, 2 years ago, did PCs ship with 2-4 GB of RAM? Or 512 MB-1 GB? So in 2 years' time, how much will they ship with on average?
"I'm not saying it's not a cool concept, but I choose not to live with my head in the clouds. This is the same reason that I don't believe in psychics, aliens, ghosts, or monsters under my bed."

Catch-22? GPUs won't support more GPGPU stuff until it is used. It IS happening today, but I'm not sure how well it goes with CUDA, etc., but I can imagine, that's right, imagine, even dream, what the extra horses would do for tech like this speed-wise. Why? Because I prefer to look on the bright side of life when I do not have absolute damning proof otherwise.
As I mentioned once before, look at the history of demos. Not all of them had all those things mentioned above, even when they were available. They were done to showcase a CERTAIN technology.
If I understand the technology properly, there will be no need for AA or AF. The only difference in quality is going to be resolution.
John, was the demo running on all 8 threads of the mobile i7, or could you not tell?