Far Cry 3: E3 Demonstration vs. Retail PC

It seems like they had visual assets that were later removed (the underwater scene is especially obvious). Likely because the consoles this game was designed for couldn't run it faster than a slideshow. The game was repeatedly delayed (first reports were of a 2010 release, then early 2011, then late 2011, then early 2012, then fall 2012, and then it was delayed once more for a few weeks before its North American release in December 2012, citing unspecified "tweaking"). Perhaps in the time between the demo shown at E3 and the gold master, they just couldn't manage to optimize their code for playable frame rates with all the additional eye candy in there (even on PC)? Maybe they decided to nix some of the visuals rather than risk a reception like Crysis got ("$12,000 monster 16-core 5GHz gaming rig with quad-SLI and a 1TB RAM-disk!" "Yeah, but can it play Far Cry 3?").
 
I've sunk 80 hours into the bastard since I got it. I don't understand what revelation the video is supposed to be showing. The demo version from months ago, played on a console, is different from the version on a computer of unidentified specs with unidentified settings? Wow. Knock me ovah with a feather.

Umm, if you view this on YouTube, he has the specs of the retail PC listed!
 
Most games get chopped. Doom 3 and HL2 both had amazing stuff that ultimately got shafted for performance or other reasons. Quake 3 had some pretty cool maps that got cut, too.

It's not just about the empty promises, it's about all the stuff they already had running. Damn shame. Not that I would play this scripted QTE-fest piece of crap, but the graphics really did look better back then. Also, Ubi... I've never bought a game published by those guys. Everything they publish is BS.
 
My best guess is that when they did the E3 demo, they went all out on one specific hardware setup (probably an i7 or Xeon and an NVIDIA GTX 680) and made sure all the graphical effects were the best possible for the level, environment, and lighting situations presented.

As they started trying to port those effects to other hardware (AMD, dual-core i5s, lesser NVIDIA GPUs) and other level environments and lighting situations, they ran into issues: either things looked weird, or weren't stable, etc. So they started having to dumb things down to meet their launch window instead of taking the effort to work through the problems they encountered.

At least that's my best guess of what happened.


Lesson to be learned: don't give a shit about previews, pre-release videos, or other marketing hype. The quality of the shipping product is the only thing that should count toward a purchasing decision. I'm saddened that it looks like they could have done more if they had put in some more effort, but those are the breaks.
 
While the video was a bit melodramatic, I don't think the point was just certain gameplay elements and parts of the area that didn't make it. There is a pretty distinct difference in lighting, shadows, some of the textures, and other aspects of visual fidelity.

Take a look at the marketing-mode texture fiasco for SWTOR as an example. The game had a "high" texture-detail setting available in beta, and even after launch it was used in all of the marketing screenshots and videos (for things like upcoming content, advertisements, etc.), yet the retail version of the game had the high-resolution textures disabled. Your options in the menu were low -> medium -> high, but for a while it actually functioned as low -> medium -> medium, and then the devs removed the "high" option completely from the drop-down menu, with the excuse being that some players' PCs couldn't handle the high-resolution textures (isn't that the point of a user-selectable option?). Only months later did they finally add a "high" option back to the game, and even then it still behaved oddly (dunno if they ever fixed it completely).
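Just to illustrate how a menu option can lie, here's a minimal sketch in Python (hypothetical names and values, not SWTOR's actual code) of a "high" setting that silently behaves like medium:

Code:
# Hypothetical illustration of the texture-setting fallback described
# above; names and values are invented for the example.

TEXTURE_RESOLUTIONS = {"low": 512, "medium": 1024, "high": 2048}

def effective_texture_size(menu_choice: str, high_res_enabled: bool = False) -> int:
    """Return the texture size actually loaded for a menu choice.

    With high_res_enabled=False, the "high" option silently behaves
    like "medium" -- the low->medium->medium behavior players reported.
    """
    if menu_choice == "high" and not high_res_enabled:
        menu_choice = "medium"  # silent downgrade despite the menu label
    return TEXTURE_RESOLUTIONS[menu_choice]

print(effective_texture_size("high"))                         # 1024, not 2048
print(effective_texture_size("high", high_res_enabled=True))  # 2048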

A few other games over the years have had this issue as well. The guy who made the video is probably just someone a bit irritated by developers/publishers over-promising what the game could look like and then, for whatever reason, under-delivering.

"Hey, check out what our lighting and water effects look like!"
"Here's your version that will never look as good as it did in a demo from 6 months ago"

Maybe there is indeed some huge technical problem with having shadows, lighting, shaders, etc. look like they did in the demo. But at some point people are going to wonder if they were deceived. No one really likes false advertising.
Me, me here!

With an i7-3930K @ 4GHz, 32GB of DDR3-1600 RAM, a 128GB SSD, and a single HIS 6850, I certainly expected much more at 1440x900 on the highest/maximum/Ultra graphics settings. I'm kind of disappointed -- I keep asking myself if these are the same people that made Crysis or Far Cry 2, because some things don't seem up to par.

Nonetheless, I am enjoying my game. :)
 
It would have been nice to have that dense and diverse foliage. The foliage in-game is really repetitive. Underwater looks loads better in the E3 demo, and so do the reflections and particles. Too bad the game doesn't look better; it feels like we're playing console quality at a high resolution.

Even then, the graphics in the game are good enough not to complain about, but the voice-acting repetition, other bugs, and gameplay mechanics are worth complaining about.
Speaking of water and repetition, underwater is REALLY HORRIBLE. When swimming underwater and looking up, you can see like a million loops of the water texture animation. On top of that, the water doesn't move three-dimensionally like it does in Far Cry 1 and Crysis. (Can't remember if Far Cry 2 was the same or not.)
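For a sense of scale, a quick back-of-the-envelope sketch -- all numbers are assumptions, nothing measured from the game -- of how many copies of a tiled water texture you would see looking up at the surface:

Code:
# Back-of-the-envelope: how many times does an animated water texture
# tile across the visible surface? All numbers are assumed for
# illustration, not taken from Far Cry 3.

def visible_tile_count(view_distance_m: float, tile_size_m: float) -> float:
    """Tiles visible across a square patch of water surface."""
    tiles_per_side = (2 * view_distance_m) / tile_size_m
    return tiles_per_side ** 2

# Seeing ~100 m of surface in every direction with a 2 m repeating tile
# gives ~10,000 visible copies, all playing the same animation in sync.
print(f"{visible_tile_count(100.0, 2.0):,.0f} visible tiles")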
 
If someone was going to complain about things in demos not making the final game, then I would tell them to go back and watch some of those Half-Life 2 demo movies... half of that stuff didn't make the game either... especially those electric eel alien arms...

The behind-the-scenes book about HL2 actually talks about that. The alien-arms thing was finished and working, but in playtesting it turned out not to work well gameplay-wise, so they removed it.
 
Me, me here!

With an i7-3930K @ 4GHz, 32GB of DDR3-1600 RAM, and a single HIS 6850, I certainly expected much more at 1440x900 on Ultra settings. *snip*

How do you have a 6-core i7 SB-E and 32GB of RAM with a weak video card like a 6850?

And 1440x900? Is this a laptop?? WTF?

Am I missing something?
 
How do you have a 6-core i7 SB-E and 32GB of RAM with a weak video card like a 6850? And 1440x900? Is this a laptop?? WTF? *snip*

I can only assume that it isn't a gaming computer. Doesn't really explain the screen resolution though...
 
I don't understand the argument some are giving about dumbing down the graphics so lower-end PCs can run it. It doesn't make a lick of sense. Like, isn't that what... oh, I don't know... GRAPHICS SETTINGS ARE FOR?! If all PC games were made to run on lower-end settings, then we wouldn't have the likes of Crysis and Metro 2033. Common fucking sense, people!

I've also read people argue that it was an E3 demo, so they made the game look better... that still doesn't explain why they had to make the demo misleading and totally different from the end product. But ultimately, that's what they did.

If you play with the level creator for FC3, it is obvious that things like grass and trees are procedurally created. I'm sure some trees and whatnot were individually placed for certain effects, but most of the island is just randomized trees and such.
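As a rough idea of what that looks like, here's a minimal sketch of seeded procedural scatter -- the general technique, with invented parameters, not Ubisoft's actual placement code:

Code:
import random

# Minimal sketch of seeded procedural foliage scatter. Same seed, same
# island: only hand-placed "hero" trees need to live in the level data.

def scatter_foliage(seed: int, area_m: float, count: int) -> list:
    rng = random.Random(seed)
    kinds = ["palm", "fern", "grass_clump", "bush"]
    return [{
        "kind": rng.choice(kinds),
        "x": rng.uniform(0, area_m),
        "y": rng.uniform(0, area_m),
        "rotation_deg": rng.uniform(0, 360),
        "scale": rng.uniform(0.8, 1.2),  # slight variation hides repetition
    } for _ in range(count)]

print(scatter_foliage(seed=1337, area_m=500.0, count=3)[0])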

Either way, though, FC3 isn't a bad-looking game; but if it had come out with the level of fidelity shown in the E3 demo, it would have come closer to being a Crysis killer... something I've wanted for a long time now...
 
The guy posted the PC specs and the game settings:

"I'm running a 2600k @ 4.4ghz with 2 6870's, the in game settings were on the Ultra prefix, HDAO, 8xMSAA etc, the game was topped out. "
 
Come on.
Isn't this business as usual?
The demos they show at trade shows are fully fleshed out and push the game engine to its limits.
The PCs running them are top-shelf everything, so there will be no lag or hiccups in gameplay.
You KNOW this will not be the released version, because it would have unplayable frame rates on MOST of the PCs out there.
And here is something to consider: MANY PC gamers play on laptops with embedded graphics, not custom-built gaming rigs like [H] members run.

This guy gets it.


It's called a "trade show"; they have to show the game's maximum potential. It would be like taking all your prototype cars to a trade show caked in dirt, stained, and unwaxed.
 
If someone was going to complain about things in demos not making the final game, then I would tell them to go back and watch some of those Half-Life 2 demo movies... half of that stuff didn't make the game either... especially those electric eel alien arms...

Just be happy the game came out after all the hype and it's not Duke Nukem Forever...

It's not "things". The game isn't missing a few guns, or enemies from the E3 demo. It's how the game looks. It's clearly obvious that the game looks better in the E3 demo. It's also clearly obvious that they did this on purpose, to make it look better, to sell more games. It's not a big deal to me, but it is shady.
 
This guy gets it.

It's called a "trade show"; they have to show the game's maximum potential. It would be like taking all your prototype cars to a trade show caked in dirt, stained, and unwaxed.

Neither he nor you seems to understand the issue at hand.

See, there is this thing called "graphics settings." One uses these "graphics settings" to tune a game's performance to a level their computer can handle.

Dr. Righteous seems to think that because they used a top-shelf PC for the E3 demo, and because many PC gamers use PCs with embedded graphics, making a game with graphics beyond what an embedded GPU can handle is somehow pointless. If it is not pointless, then the sky's the limit! All of this, again, comes back to "graphics settings." Look at Crysis... it was a beast to run when it came out. However, thanks to its "graphics settings," it could be scaled from bringing a high-end PC to its knees all the way down to running perfectly fluidly on a low-end PC.

Now what YOU said, Godmachine, also makes no logical sense. Sure, they want the game to show its "maximum potential"; however, there is no reason WHY they couldn't LEAVE this potential IN THE GAME. I highly doubt they ran it on anything higher than a quad-core Intel i7 CPU and a GTX 680 GPU (as most demo PCs at E3 were)... a machine that is HARDLY omg-wtf-uber high-end. There is no reason at all why what was shown in the E3 demo COULDN'T be in the final product. And if running the game at the settings shown in the demo is too much for the vast majority of PCs out there, then we come full circle back to GRAPHICS SETTINGS. TURN THE SHIT DOWN!

I swear... I wonder if I'm even on [H]ard|Forum sometimes! So many {S}oft people here nowadays! WE ARE HARD CORE PC GAMERS! This should piss off ANYONE here... instead all I get are people making excuses and seemingly supporting Ubisoft, saying they did nothing wrong. If Ubisoft is going to show an IN-GAME LIVE DEMO, then I see no reason, especially considering all the differences are graphical, for the FINAL product not to have the intense graphics shown at E3.

It's as simple as that. If you disagree with me you are WRONG. Period.

Now go back to playing games on your console/tablet/laptop and leave the serious gaming to the big kids!
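To make the scalability point concrete, here's a minimal sketch of hypothetical quality presets -- tier names and values invented, not Far Cry 3's actual options -- showing how one build can span embedded GPUs to an E3-class rig:

Code:
# Hypothetical quality presets illustrating the point above: one build,
# scaled entirely by settings. Tier names and values are invented.

PRESETS = {
    "low":     {"shadow_res": 512,  "ssao": False, "god_rays": False, "msaa": 0},
    "high":    {"shadow_res": 2048, "ssao": True,  "god_rays": False, "msaa": 4},
    "e3_demo": {"shadow_res": 4096, "ssao": True,  "god_rays": True,  "msaa": 8},
}

def apply_preset(name: str) -> dict:
    """Return renderer options; lower tiers cost less, nothing is cut."""
    return PRESETS[name]

# An embedded GPU runs "low"; the demo rig runs "e3_demo". Shipping the
# top tier costs the low-end player nothing -- they just turn it down.
print(apply_preset("low"))
print(apply_preset("e3_demo"))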
 
Neither he nor you seems to understand the issue at hand. *snip* It's as simple as that. If you disagree with me you are WRONG. Period.

Oh burned... :rolleyes:
 
I'm away from my computer and reading this off my phone right now, so I'm not even going to try to watch that video. But has there been a console comparison between it and the E3 version? Did only the PC version get toned down?
 
Neither he nor you seems to understand the issue at hand. *snip* It's as simple as that. If you disagree with me you are WRONG. Period.

Aren't the so-called hard gamers their own worst enemy sometimes, though? I see many people complaining about paying full price, even for games the community agrees are cutting edge ... they brag about how they never buy any titles for more than $30, or $20, or $10 ... and although their fiscal responsibility appears admirable, you have people with $3000+ rigs trying to put $20 software in them (the equivalent of a Ferrari owner putting cheap low-test Costco gas in their fancy sports car) ;)

Trade shows are just that, "shows" ... unless they are demoing an actual product that is currently on the market, they are likely to optimize their demos for maximum buzz effect ... computer games have often had tech demos that never made it into the actual gaming market ... especially now, when a AAA title will typically derive a third or less of its sales from the PC ... it would be nice if we could have games optimized for PCs again, but except for the few developers who don't develop for consoles, that probably isn't practical

But who knows ... when the 4K monitors hit the market next year, that might open the door for some new competition on the graphics front ... unless the Hard Gamers can't man up and shell out the bucks to upgrade their graphics :D
 
How do you have a 6-core i7 SB-E and 32GB of RAM with a weak video card like a 6850? And 1440x900? *snip*
I can only assume that it isn't a gaming computer. Doesn't really explain the screen resolution though...
Mostly photography (Adobe Photoshop, Photomatix Pro, Autopano Giga, Silver Efex 2, Noise Ninja) and video rendering (MeGUI, AviSynth, Adobe After Effects, and other tools). I haven't found a justification to get a better video card. I hardly play computer games anymore. :( No time, and if I did have time, there's nothing interesting to play. The only other game on my wishlist is Mass Effect 3, but I won't play that until I get married (which may never happen either, which means eventually, hopefully, I will play it :D).
 
EDIT: I do have a 24" 1920x1200 ASUS monitor, but I can't run Far Cry 3 at native 1920x1200 without the framerate dropping too low for my tastes. My 6850 can't handle it.
 
Neither he nor you seems to understand the issue at hand. *snip* It's as simple as that. If you disagree with me you are WRONG. Period.

I understand your point. Yes, it is possible for them to release the final game with all the goodies you see in the demo at the trade show.

But let's take a look at a recent example where they misjudged performance on the average PC: RAGE. That game was an utter failure because they got this wrong. On far too many PCs, even ones that were well within the required spec, the lag and texture pop-in were unacceptable. Was the game overly ambitious? No. But the beta testing did not go deep enough on compatibility across such a huge range of system specs. Customers were screaming bloody murder. Let the patches begin. You can probably pick up a copy of RAGE for $5 now.
Believe me, other developers were watching this closely.
The challenge before them is to make something one-size-fits-all while still keeping a high level of content. And that is not easy.
The easiest way to ensure you don't hit this wall is, you guessed it... a console port.
Keep in mind that most gamers game on consoles. PC gamers, especially guys like us who build custom gaming rigs, are a minority.

I think there are some solutions here, but they will have to center on very comprehensive benchmarks. Yeah, benchmarks have been around for years, but there has never been a real SYSTEM by which one can judge whether a game will perform on a given machine.
Combined-score synthetic benches are pretty much worthless unless you are interested only in those scores and not actual game performance.
I think the different scores must remain separate; otherwise you are grading on a curve.

So say the game developer requires your system to score a 75 in all 5 criteria to play the "certified high-content version." If it doesn't, upgrade, or buy the "distribution" version, which will be equivalent to a console port.
This is a way to have your cake and eat it too, but it puts the responsibility on the gamer to get his house in order.
Just some thoughts, but something has to change here, or expect most games coming down the pike to basically be console ports.
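For what it's worth, a minimal sketch of the per-criterion check being proposed above -- the criterion names are invented, and only the "75 in all 5 criteria" rule comes from the post:

Code:
# Sketch of the per-criterion certification check proposed above.
# Criterion names are invented; "75 in all 5 criteria" is the poster's
# example. Note how a combined/average score would grade on a curve.

CRITERIA = ["cpu", "gpu", "memory", "storage", "driver_stability"]

def certified_high_content(scores: dict, threshold: int = 75) -> bool:
    """Pass only if EVERY criterion meets the threshold -- no averaging."""
    return all(scores.get(c, 0) >= threshold for c in CRITERIA)

rig = {"cpu": 90, "gpu": 60, "memory": 95, "storage": 80, "driver_stability": 85}
print(certified_high_content(rig))   # False: the GPU score drags it under 75
print(sum(rig.values()) / len(rig))  # an 82 average would have wrongly passed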
 
I took it seriously, and then lol'ed @ "Why have you forsaken us?"

pfffft

I bet most people in a blind test wouldn't know which was which
A blind test for something you have to see... interesting.
 
It is disappointing, but the same thing happens with nearly all E3 trailers. I remember when The Elder Scrolls IV: Oblivion came out and looked worse than its demo. People complained then too, rightfully so, but nothing ever gets done about it.
 
My question is: what level of completion was that E3 playable demo at? Was it a matter of they did the work and just dropped it, or did they figure out that that level of detail for everything would be too expensive? Yeah, it's a playable demo of one particular level, right after you get dropped off on the new island, but for how long? I doubt the entire game was playable with that level of detail. And like others have said, it's probably just the flash and bling they use for the trade show.
 
Neither he nor you seems to understand the issue at hand. *snip* It's as simple as that. If you disagree with me you are WRONG. Period.

Sure, they could have done like the original Crysis and kept those top-end visuals under an "Ultra" setting. But I remember the wailing and gnashing of teeth from hardcore gamers unable to play Crysis on Ultra even with the latest, crazy-expensive gear. It became a joke. Crysis became a widely used benchmark for stress-testing hardware to its limits, and after a while that was basically its sole purpose. The Crysis devs said they believed they made a mistake putting the Ultra settings in at that time, because people didn't WANT to turn down the eye candy, and they felt their shiny new 8800 GTX should have played it without issue. This got Crytek some bad press ("unoptimized code," "a pig of a graphics engine," "who cares about god rays if you can't even get 20 FPS consistently"), and Cevat Yerli actually blamed the performance issues at Ultra settings as the main reason he believed his game was pirated 10 times for every 1 legal sale. Regardless of the accuracy of Mr. Yerli's analysis, this had a major impact on Crytek, which went on to announce it would develop games for consoles first and then release ports to PC. (All under the notion that console gamers pay and are happy, while PC gamers don't pay yet bitch and moan endlessly. :rolleyes:)
 