The game engine wars are now over, EA/DICE/Frostbite won

I'm going to address this particular point about UE4 because I see a lot of misinformation thrown around to disparage UE4 over vendor favoritism.

Firstly, do you know why UE4 is popular?

In today's videogames development, how many platforms do you think videogames developers develop for? If publishers are involved, any large budget game you make had better appear on at least 2 platforms: Xbox One and PS4. Windows PC is actually secondary to these 2. So if you are a developer who needs to develop for multiple platforms, do you want an engine that works on all 3 platforms? Of course!

Unreal Engine is a well-known engine that works on Xbox One, PS4 and Windows PC. However, that's not all Unreal Engine supports. Off the top of my head, it is also supported on iOS, Android, Mac OS and Linux. How about that? If we want ports on more platforms, Unreal Engine is a great starting point to ensure easier portability between platforms.
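The portability argument above can be sketched in a few lines. To be clear, this is a hypothetical illustration, not UE4's actual API: every class and method name here is mine, and a real engine's platform abstraction layer is vastly larger. The point is only that game code written once against an abstract backend runs on every platform the engine supports.

```python
from abc import ABC, abstractmethod

class RenderBackend(ABC):
    """Platform-specific rendering implementation (hypothetical)."""
    @abstractmethod
    def draw(self, mesh: str) -> str: ...

class D3DBackend(RenderBackend):
    """Stand-in for a Windows / Xbox One backend."""
    def draw(self, mesh: str) -> str:
        return f"D3D draw: {mesh}"

class GNMBackend(RenderBackend):
    """Stand-in for a PS4 backend."""
    def draw(self, mesh: str) -> str:
        return f"GNM draw: {mesh}"

class Game:
    """Game code written once against the abstract backend."""
    def __init__(self, backend: RenderBackend):
        self.backend = backend

    def render_frame(self) -> list[str]:
        # The game never touches a platform API directly.
        return [self.backend.draw(m) for m in ("terrain", "player")]

# The same Game class runs unchanged on each platform the engine targets.
print(Game(D3DBackend()).render_frame())
print(Game(GNMBackend()).render_frame())
```

This is why a multi-platform engine is so attractive to a developer shipping on console and PC at once: the per-platform work is concentrated in the engine's backends, not in the game.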

Can Frostbite do all that? I don't know. As far as I can tell, it's only Xbox One, PS4 and Windows PC, I can't tell you more. Frostbite is not available to me or my company.

And finally, to hammer home a point that has annoyed me for a while: Unreal Engine 4 is supported on PS4 and Xbox One. It is one of the most popular engines and it works fantastically on both. The PS4 and Xbox One run AMD GPUs, yes? So what is the excuse for AMD's poor showing on PC? Bottom line: AMD needs to step up. Apologists for AMD are just giving them an excuse not to.


Yes, I know why it's popular, and I know Frostbite is locked down. I'm just talking about technical superiority based on the visuals of released games. People sometimes trot out UE4 as the be-all and end-all of state-of-the-art real-time rendering of game environments, and I have yet to see a game with better-looking environments and visual effects than this new Battlefield.


As far as which engine is the best among those anyone can license, UE4 might be it; I don't know whether it's technically superior to something like CryEngine 5 or just has better support. People say Unity 5 is technically behind UE4, but it is perhaps even more forgiving in terms of support and ease of development.


As a consumer, if I had to play a game and got to choose which engine it was developed with, I'd prefer Frostbite 3, because not only does it look good, it performs well across more hardware. This is not a fair fight: part of the increased performance likely comes from heavyweight professionals who are hyper-experienced at game optimization. I don't think BioWare has that kind of technical bench when it comes to performance, but DICE does, and they get to leverage that experience to make the game run better and look good.

So again, as an end user and not a developer, I'd rather see games released on Frostbite 3. Even though, yes, I'm sure it makes a dev's life easier to use UE4 to get a good-looking game that at least runs passably well on most hardware, if not as well as on Frostbite 3.
 
Well, just to be controversial. :)

How "good" a game can look, by my reckoning, is 85% artists, 15% engine. Your game's producer/art director/chief designer/whoever-makes-the-call determines the artistic direction of the game first. After that, the artists deliver on it. You would be surprised at how good all the engines on the market are at making things look good.

The way I see it, making a game look good is less about the engine and more about the people making the game. Artistic direction and capable artists with a sufficient budget go much further than the choice of engine.
 
I reckon how good a game can look now depends less on the engine and more on managerial factors such as overall budget, polygon budget, target platform, and target FPS and resolution.
 
Well, just to be controversial. :)

How "good" a game can look, by my reckoning, is 85% artists, 15% engine. Your game's producer/art director/chief designer/whoever-makes-the-call determines the artistic direction of the game first. After that, the artists deliver on it. You would be surprised at how good all the engines on the market are at making things look good.

The way I see it, making a game look good is less about the engine and more about the people making the game. Artistic direction and capable artists with a sufficient budget go much further than the choice of engine.

Exactly. Basically all engines available now are capable of very high quality. The question of which is the better engine comes down to how easy it is to make a game look good in each one.
 
Bah. It's not a Madden title. So EA will fuck it up royally.
 
End consumers buy and experience games not game engines.

Game engine characteristics are something the designer of said game should be concerned with.

I have the same issue with people discussing this as I've previously had with people debating the merits of different manufacturing processes and memory technologies for graphics cards on here.
 
That's partly why I'm more excited, not because of Battlefield, but because of Mass Effect Andromeda. I have NEVER played a Mass Effect game I did not love. The ME series has a better-looking art style than any other BioWare game by far; even on a dated engine it looked good. The sci-fi space opera theme hits WAY more of my preference buttons than some mundane, realistic real-world sim. And if they made standard reality look as good as they did with that engine, imagine what could be shown in a world where they are not constrained by the real world.

Couldn't agree more!

I think ME Andromeda will massively benefit from an engine that does, imho, afford more realistic visuals.

I'm not saying Frostbite is necessarily technically superior, but I couldn't name a single game running UE that does a good job of just capturing reality. DICE's engines always aim squarely for that and generally seem to get it right on the hardware of the time.
 
OP knows you cannot use Frostbite outside EA... but still declares it "the" winner of an event... it cannot participate in.

/thread
 
As much as the tech and the trailer look good, you fail to understand why the industry does not seem to want to pick up Frostbite.

I'll try to list as many reasons as I can based on my experience.

1. Is the engine available for 3rd party studios to license? (I know mine did not even put Frostbite as a choice ever.)
2. Are the tools for the engine freely available?
3. Are the tools for the engine easy to use?
4. Are the resources and assets for Frostbite abundant and freely available?
5. Is the engine easy to use, understand and code for?
6. Is the engine optimized for large scale or small scale?
7. What is the object instancing limit for the engine?
8. Does the engine scale smoothly with increased resource and asset use?

As far as I'm concerned, Frostbite fails on #1 and therefore nobody even considers it as an engine. Feel free to quote any studio that uses Frostbite but I will stake my reputation and say it now: Nobody I know outside of EA uses the Frostbite engine. I do not think it is possible to license it at all.

Therefore, Frostbite cannot win anything, other than push their own EA sales.

The vast majority of people don't care about any of that. The OP was talking purely from a consumer standpoint, and from that standpoint Frostbite games generally tend to have a much better graphics-to-performance ratio than games on any other engine.

The only games I can think of with graphics and performance on par with SW: Battlefront/BF1 (besides other DICE Frostbite games) are Mad Max and Doom. Nothing else comes to mind.
 
Engines are a lot more than pretty graphics. Although the beta looked decent enough for how it performed, it wasn't groundbreaking, and certainly not nearly as good as what is shown in the trailers. I doubt the final game will look much different from the beta either.

This... engines are just tools. It comes down to the "artist". Hell, look at Earth's Special Forces. That shit is made on Half-Life 1. www.esforces.com
 
Frostbite is definitely the best game engine (visuals, performance, scalability, etc.). SW: Battlefront looked stunning: the textures on the rocks, the snow, the foliage. Best-looking game ever. Maybe Battlefield 1 will surpass it, but I doubt it.
 
Well, just to be controversial. :)

How "good" a game can look, by my reckoning, is 85% artists, 15% engine. Your game's producer/art director/chief designer/whoever-makes-the-call determines the artistic direction of the game first. After that, the artists deliver on it. You would be surprised at how good all the engines on the market are at making things look good.

The way I see it, making a game look good is less about the engine and more about the people making the game. Artistic direction and capable artists with a sufficient budget go much further than the choice of engine.

Exactly.

Also, as a developer I value documentation above all else when picking a component. It can be the best game engine in the world, but if it doesn't have sufficient documentation, I'd rather go with something that does a little less but is easier to work with.
 
I'm going to address this particular point about UE4 because I see a lot of misinformation thrown around to disparage UE4 over vendor favoritism.

Firstly, do you know why UE4 is popular?

In today's videogames development, how many platforms do you think videogames developers develop for? If publishers are involved, any large budget game you make had better appear on at least 2 platforms: Xbox One and PS4. Windows PC is actually secondary to these 2. So if you are a developer who needs to develop for multiple platforms, do you want an engine that works on all 3 platforms? Of course!

Unreal Engine is a well-known engine that works on Xbox One, PS4 and Windows PC. However, that's not all Unreal Engine supports. Off the top of my head, it is also supported on iOS, Android, Mac OS and Linux. How about that? If we want ports on more platforms, Unreal Engine is a great starting point to ensure easier portability between platforms.

Can Frostbite do all that? I don't know. As far as I can tell, it's only Xbox One, PS4 and Windows PC, I can't tell you more. Frostbite is not available to me or my company.

And finally, to hammer home a point that has annoyed me for a while: Unreal Engine 4 is supported on PS4 and Xbox One. It is one of the most popular engines and it works fantastically on both. The PS4 and Xbox One run AMD GPUs, yes? So what is the excuse for AMD's poor showing on PC? Bottom line: AMD needs to step up. Apologists for AMD are just giving them an excuse not to.

In general, what developers can and cannot use is a different matter from what I see in this game trailer. The point is that you produce games; EA/DICE just made their engine look really good.

I find your last comment worrying. On consoles you do all the hard work: they use low-level APIs, which means that as a developer you can, and need to, do the optimizations yourself.
AMD does not do this for you.
 
I find your last comment worrying. On consoles you do all the hard work: they use low-level APIs, which means that as a developer you can, and need to, do the optimizations yourself.
AMD does not do this for you.

I am not understanding your point here. PS4 and Xbox One have low level APIs. PC has DX11/12.

If developers can optimize for the APIs, they will. PS4 and Xbox One are still relatively new; most developers haven't mastered them yet to get the best performance out of them. However, they don't seem to be having significant issues with performance.

With PC, we try to optimize as best we can on DX11/12, but with DX11 in particular the abstraction is higher, though not vastly higher. Most developers are not capable of optimizing their code and modifying Unreal Engine enough to contribute significant optimizations to the public repository. That requires more in-depth knowledge of the hardware. Who has the best knowledge of the hardware to make such optimizations to the engine?

As I have probably emphasized a few times on this forum, developers are mostly concerned with getting their game done. A developer only decides to optimize the engine, or for particular hardware, when there is a significant performance challenge we cannot overcome. Otherwise, we get the game working right first; any of those optimizations are pure luxury.

The AMD GPU is the same (really? 100% identical?) across PS4, Xbox One and PC. The only differences are the APIs and drivers. If PS4 and Xbox One don't seem to be having significant performance issues, then the hardware isn't at fault. If it were the API, Nvidia should be seeing significant performance issues too, but they aren't. That leaves the drivers, and those are entirely AMD's call.

This is where AMD has to step up and optimize their drivers if they don't want to look bad. I'm not saying they must; I have no dog in that fight. I'm one of those jaded people who can no longer be bothered whether AMD looks good or bad.


Edit: I've probably gone further than I wanted with respect to taking sides with either Nvidia or AMD. I don't like either company, because both give me grief, albeit different types. This will (hopefully) be my last post where I show any bias for either company.
 
I know there is a lot of love for UE4 here, but nothing I have ever seen from any UE4 game looks as good as what was displayed there. I just hope GPUs can play that at 4K. Though seeing as the DICE guys know how to actually optimize their engine, it should have a better shot at both running smoothly and looking better.

IMHO you are mistaking asset quality for engine quality.

BF has a huge budget; the typical UE4 game is made by smaller studios with much less money at their disposal.


And UE4 isn't even the biggest contender for the best-engine title: the newest id Tech from Doom is a technical marvel and the best implementation of a lower-level API released so far. We will see if BF1 can change that.
 
I like the RED engine they used in Witcher 3. BF1 looks great. Can't wait to play it at the end of the month.
 
Nice, but have they bothered fixing the "oops, your saved games are corrupted, please start all over again and have fun" bug? :mad: That happened on ALL systems with Battlefield 4.
 
I am not understanding your point here. PS4 and Xbox One have low level APIs. PC has DX11/12.

If developers can optimize for the APIs, they will. PS4 and Xbox One are still relatively new; most developers haven't mastered them yet to get the best performance out of them. However, they don't seem to be having significant issues with performance.

With PC, we try to optimize as best we can on DX11/12, but with DX11 in particular the abstraction is higher, though not vastly higher. Most developers are not capable of optimizing their code and modifying Unreal Engine enough to contribute significant optimizations to the public repository. That requires more in-depth knowledge of the hardware. Who has the best knowledge of the hardware to make such optimizations to the engine?

As I have probably emphasized a few times on this forum, developers are mostly concerned with getting their game done. A developer only decides to optimize the engine, or for particular hardware, when there is a significant performance challenge we cannot overcome. Otherwise, we get the game working right first; any of those optimizations are pure luxury.

The AMD GPU is the same (really? 100% identical?) across PS4, Xbox One and PC. The only differences are the APIs and drivers. If PS4 and Xbox One don't seem to be having significant performance issues, then the hardware isn't at fault. If it were the API, Nvidia should be seeing significant performance issues too, but they aren't. That leaves the drivers, and those are entirely AMD's call.

This is where AMD has to step up and optimize their drivers if they don't want to look bad. I'm not saying they must; I have no dog in that fight. I'm one of those jaded people who can no longer be bothered whether AMD looks good or bad.


Edit: I've probably gone further than I wanted with respect to taking sides with either Nvidia or AMD. I don't like either company, because both give me grief, albeit different types. This will (hopefully) be my last post where I show any bias for either company.

AMD already said they would not do any major work on their DX11 drivers. And judging by the performance of the Frostbite engine, they don't have performance issues there either, so why does AMD deserve any blame?

The work on the DX12/Vulkan parts of an engine only has to be done once; it should not have to be repeated for every new game using the same engine.
 
AMD already said they would not do any major work on their DX11 drivers. And judging by the performance of the Frostbite engine, they don't have performance issues there either, so why does AMD deserve any blame?

The work on the DX12/Vulkan parts of an engine only has to be done once; it should not have to be repeated for every new game using the same engine.


Pieter, haven't you noticed? With the last iteration of their drivers, AMD's driver overhead in DX11 has dropped. It still has a way to go, but it's better than before. Look, AMD's DX11 drivers are their own fault; they screwed the pooch. People were talking about the problem two years ago. You would think AMD would have known about the CPU bottlenecks in their drivers well before the general populace, don't you think? It's not like AMD doesn't know what they are doing. Or is that what you are getting at: that AMD's driver team doesn't know its own shortcomings?

No, the work has to be redone on a per-game basis, because game developers use different shaders and different ratios. On top of that, game code (AI, physics) changes quite a bit from game to game even on the same engine, and yes, low-level APIs will affect that too.
 
Pieter, haven't you noticed? With the last iteration of their drivers, AMD's driver overhead in DX11 has dropped. It still has a way to go, but it's better than before. Look, AMD's DX11 drivers are their own fault; they screwed the pooch. People were talking about the problem two years ago. You would think AMD would have known about the CPU bottlenecks in their drivers well before the general populace, don't you think? It's not like AMD doesn't know what they are doing. Or is that what you are getting at: that AMD's driver team doesn't know its own shortcomings?

No, the work has to be redone on a per-game basis, because game developers use different shaders and different ratios. On top of that, game code (AI, physics) changes quite a bit from game to game even on the same engine, and yes, low-level APIs will affect that too.

I agree. I just switched from a 390X to a 1080. Can't wait to play BF1.
 
I played the BF1 beta on my Asus ROG 4K monitor and it performed very well for a beta. It never dropped below 50 fps and it looked amazing. I'm sure G-Sync helped as well. Though I really didn't see a huge difference when I changed from Ultra to High. Definitely a nice piece of eye candy.
 
Wow. Too bad this was obviously done on a console.

But damn, even on a console it looks amazing. It will look a helluva lot better on the PC.



And if you are a PC Master Race member, you will cringe at this video. Holy shit, thumbsticks are horrible.
 
This is launching the same weekend my wife is out of town for a retreat. She'll come home to me glued to this computer playing BF1 in my underwear surrounded by pizza and bourbon.

And it will be glorious!
 
This is launching the same weekend my wife is out of town for a retreat. She'll come home to me glued to this computer playing BF1 in my underwear surrounded by pizza and bourbon.

And it will be glorious!
I'm impressed you are even considering putting on underwear. I sure as hell wouldn't and I would force the pizza guy to deliver the pies through the dog door so he wouldn't see me.
 
This is launching the same weekend my wife is out of town for a retreat. She'll come home to me glued to this computer playing BF1 in my underwear surrounded by pizza and bourbon.

And it will be glorious!


I'm actually curious about this, as I'm not married. What happens when this kind of game releases and she is in town? Do you just lock yourself away and tell her the marriage is on hiatus for a few days or a week while you play, or do you married guys just never get to dive as deep when something you really want to play launches?
 
I'm actually curious about this, as I'm not married. What happens when this kind of game releases and she is in town? Do you just lock yourself away and tell her the marriage is on hiatus for a few days or a week while you play, or do you married guys just never get to dive as deep when something you really want to play launches?

Well, if you are married long enough, you don't have to worry about such matters.
 
This has less to do with the engine and more to do with the photogrammetry DICE is using for their latest games.

See this -



This is why these latest games from DICE look so good while also running so well. These highly realistic textures and models don't necessarily demand more compute, but because they are captured from real objects they look better than anything an artist would create manually.

It's a shame GPUs don't have more VRAM available, because IMO we've reached a point where we are less constrained by what shaders and compute require, and more constrained by the amount of memory a GPU has available. Imagine if GPUs had nearly 1TB of VRAM and could load not only high-res textures for even the smallest objects, but thousands of different high-res textures for different pieces of foliage, etc. With current rendering tech I'd venture to say you'd have photorealistic graphics with that much VRAM, assuming developers made use of it.

If you think I'm wrong, look at 39:21 in that video and see how few textures and models are used in that one absolutely stunning scene. Imagine the same scene with thousands of different models and textures instead of only about 16. The game would be photorealistic.
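The VRAM point above can be made concrete with some rough arithmetic. These are my own back-of-the-envelope assumptions, not numbers from DICE: uncompressed 4K RGBA textures (4 bytes per texel) with a full mip chain, which adds roughly a third on top of the base level.

```python
def texture_bytes(width, height, bytes_per_texel, with_mips=True):
    """Approximate VRAM cost of one texture.

    A full mip chain adds about 1/3 on top of the base mip level
    (the geometric series 1 + 1/4 + 1/16 + ... converges to 4/3).
    """
    base = width * height * bytes_per_texel
    return base * 4 // 3 if with_mips else base

GIB = 1024 ** 3

# One uncompressed 4K RGBA texture with mips: roughly 85 MiB.
per_tex = texture_bytes(4096, 4096, 4)

# A handful of unique textures vs. the "thousands" scenario:
for count in (16, 1_000, 10_000):
    total = count * per_tex
    print(f"{count:>6} textures -> {total / GIB:6.1f} GiB")
```

On these assumptions, 16 unique 4K textures fit in under 2 GiB, a thousand need over 80 GiB, and ten thousand approach a terabyte. That is why today's games lean so heavily on block compression and texture reuse, and why "thousands of unique high-res textures" is a VRAM problem rather than a shading problem.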
 
I'm actually curious about this, as I'm not married. What happens when this kind of game releases and she is in town? Do you just lock yourself away and tell her the marriage is on hiatus for a few days or a week while you play, or do you married guys just never get to dive as deep when something you really want to play launches?

Honestly, the last game I was this excited for was Call of Duty 2, and given how long ago that was... it hasn't really been an issue.
 
I'm actually curious about this, as I'm not married. What happens when this kind of game releases and she is in town? Do you just lock yourself away and tell her the marriage is on hiatus for a few days or a week while you play, or do you married guys just never get to dive as deep when something you really want to play launches?

Better yet is to have a spouse who loves to game and then you both can do marathon gaming sessions together. :)
 
Better yet is to have a spouse who loves to game and then you both can do marathon gaming sessions together. :)


And then no one in the family works, lol. J/k. It's always good to have your significant other interested in your activities.
 
CryEngine still looks the best to me. The only engine I truly hate is Unity; the world would be a better place if that thing just died.
 
The only engine I truly hate is Unity; the world would be a better place if that thing just died.

Whoa! Why do you feel this way? What is it about the Unity engine that you don't like?

CryEngine is very powerful; I have seen some of the most amazing effects and scaling done with it. But it has a reputation for being possibly the hardest engine to work with.
 
I want to see that IN-GAME, not in pre-rendered videos. Remember the Final Fantasy movie from back in 2002 or so? We still don't have that level of detail in real-time gaming...
 