Just Cause 3 to feature GameWorks!

cageymaru

WCCFTECH article on GameWorks being added to Just Cause 3.

Just Cause 3 is the latest GameWorks success. There isn't a list of GameWorks features yet; a month ago there wasn't even a mention of GameWorks on the graphics options screen. The fact that it has GameWorks now is a testament to how fast Nvidia can implement its technology, I suppose. It will be interesting to see what shows up closer to launch and how the game runs.

System Requirements:

I feel sorry for those with i3s and less. An R9 290 / GTX 780 for the recommended spec seems kinda steep. But hey, GameWorks happened, so that's to be expected to enable the blur effects that every gamer seems to love.

PC SPECS
NOVEMBER 23 - PETRAO

Hi guys,

These are the final Just Cause 3 PC specs:

Minimum Specifications
OS: Vista SP2, Win 7 SP1, Win 8.1 (64-bit Operating System Required)
CPU: Intel Core i5-2500K, 3.3 GHz | AMD Phenom II X6 1075T, 3.0 GHz
Memory: 6 GB RAM
Graphics: NVIDIA GeForce GTX 670 (2 GB) | AMD Radeon HD 7870 (2 GB)

Recommended Specifications
OS: Vista SP2, Win 7 SP1, Win 8.1 (64-bit Operating System Required)
CPU: Intel Core i7-3770, 3.4 GHz | AMD FX-8350, 4.0 GHz
Memory: 8 GB RAM
Graphics: NVIDIA GeForce GTX 780 (3 GB) | AMD R9 290 (4 GB)
 
If I were to guess, I'd say that PhysX on the CPU could be the reason for the abnormally high CPU requirements. It will be interesting to see the reasoning for it when the first performance reports come through the grapevine after launch.
 
What? I think you should look at that again. It doesn't have anything to do with PhysX; certain effects in Just Cause 3 WILL require more CPU power. Go watch some promo videos of it.
 
4K blur ftw!

Seriously, it would be nice for the developer to go on record and confirm the entire gameworks codepath can be disabled. Or not. Then everyone would know what to expect.
 
GameWorks for a destruction game might actually make sense.
 
The recommended specs seem in line with every other current game: an i7, 8 GB of RAM, and a GPU with 3 GB of VRAM or more.

I expect this game to run great, given Avalanche's superb port of the super fun and highly underrated JC2. Still, gonna wait and see before buying.
 
The fact that it has GameWorks now is a testament to how fast Nvidia can implement its technology, I suppose. It will be interesting to see what shows up closer to launch and how the game runs.

Or, more likely, the fact it now has GameWorks is a testament to how an exchange of cash can get some gimmicky features shoehorned in at the last minute. :)
 
Well, hopefully these releases will get better. As long as GW isn't changing, getting workable drivers out quicker may become a reality. AMD got FO4 drivers out in a week, where at the start of the year it was taking 2-4 weeks.
 
Why are people complaining now? I have to say it again: wasn't Fallout 4 a GimpWorks/CrapWorks/GameShits/ShitWorks game with a black box inside that cannot be optimized for AMD under any circumstance via drivers? Yet AMD improved performance 20% with only ONE, SINGLE driver release just 9 days after launch... damn.

Side note: cage, we know you hate i3s =D I caught the joke there :)

In my opinion the requirements look good, especially for the kind of game it will be. Just Cause 2 was also heavily CPU dependent due to the huge nature of the map and its long draw distance.
 
People are complaining without even knowing what they're complaining about lol. They don't know how this game runs, but they have to say something; guess who the usual suspects are. Yeah, I really should make a list of these people, and then make a spreadsheet of where they complain. I can tell ya it's going to be really funny!
 
People are complaining without even knowing what they're complaining about lol. They don't know how this game runs, but they have to say something; guess who the usual suspects are. Yeah, I really should make a list of these people, and then make a spreadsheet of where they complain. I can tell ya it's going to be really funny!

I couldn't agree with you more..
 
I feel sorry for those with i3s and less. An R9 290 / GTX 780 for the recommended spec seems kinda steep. But hey, GameWorks happened, so that's to be expected to enable the blur effects that every gamer seems to love.

I wouldn't worry too much about those advertised "minimum" CPU specs. Fallout 4 has already shown how overblown they are:

http://www.techspot.com/review/1089-fallout-4-benchmarks/page5.html

[Fallout 4 CPU benchmark chart from the TechSpot review linked above]


Summary:

1. Game is as playable on a Core i3 as it is on an FX-9590. Game is also somewhat playable on a Pentium, likely to be VERY playable if it's overclocked. Not exactly the CPU hog it was made out to be.

2. Bethesda worked some minor miracles to make their engine multi-threaded so it could work on 6 Jaguar cores (consoles), but there are still scaling issues. Based on the scaling between the AMD 4/6/8 cores, and the Intel Pentium versus the 2500K, the scaling was only about 60% of theoretical (see the rough sketch below), which is why the FX-6350 gets owned by the i3. But the game engine is clearly multithreaded, as evidenced by the higher minimum frame rates of chips with 4 threads.

So single-threaded performance still owns. The Core i3 is still the value sweet spot, and I severely doubt Just Cause 3 is going to change that, despite the recommended core count. Remember, the recommended minimum for Fallout 4 was a Core i5, but that's obviously overkill; a Core i5 should have been the recommended spec, not the minimum!
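To make that "60% of theoretical" figure concrete, here is a back-of-the-envelope sketch. All the numbers in it are hypothetical round values chosen for illustration, not TechSpot's measurements:

Code:
#include <stdio.h>

/* Hypothetical illustration of imperfect core scaling.
 * All figures are made-up round numbers, not benchmark results. */
int main(void) {
    double fps4 = 60.0;       /* assumed frame rate on 4 cores */
    double ratio = 6.0 / 4.0; /* stepping up from 4 to 6 cores */
    double eff = 0.60;        /* ~60% of theoretical scaling */

    double theoretical = fps4 * ratio;                     /* 90 fps if scaling were perfect */
    double observed = fps4 * (1.0 + (ratio - 1.0) * eff);  /* 78 fps at 60% efficiency */

    printf("theoretical %.0f fps vs observed %.0f fps\n", theoretical, observed);
    return 0;
}

At that efficiency, two extra cores buy you 18 fps instead of 30, which is how a fast dual-core with Hyper-Threading can stay ahead of a slower six-core part.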
 
4K blur ftw!

Seriously, it would be nice for the developer to go on record and confirm the entire gameworks codepath can be disabled. Or not. Then everyone would know what to expect.

Except there isn't any "entire gameworks codepath". GW = a collection of DirectX effects.
 
How to fanboy: Blame Gameworks for poor AMD performance until a driver is released.
 
Why are people complaining now? I have to say it again: wasn't Fallout 4 a GimpWorks/CrapWorks/GameShits/ShitWorks game with a black box inside that cannot be optimized for AMD under any circumstance via drivers? Yet AMD improved performance 20% with only ONE, SINGLE driver release just 9 days after launch... damn.

Side note: cage, we know you hate i3s =D I caught the joke there :)

In my opinion the requirements look good, especially for the kind of game it will be. Just Cause 2 was also heavily CPU dependent due to the huge nature of the map and its long draw distance.

OK, 2 things. No one really complained; I see 2 posters making short snippets but no ranting. Even I went so far as to say it probably won't be a big deal. And for the record, most complained about the likelihood of AMD making quick, decisive driver improvements when the source code/libraries/whatever were locked, i.e. a black box. They, well, the reasonable ones, never said it couldn't be done, just that it was not very efficient without access, which by all accounts is a given fact, not opinion.

Now 3 of you have come forth making the argument for GW when thus far no real dissent has been made.
 
OK, 2 things. No one really complained; I see 2 posters making short snippets but no ranting. Even I went so far as to say it probably won't be a big deal. And for the record, most complained about the likelihood of AMD making quick, decisive driver improvements when the source code/libraries/whatever were locked, i.e. a black box. They, well, the reasonable ones, never said it couldn't be done, just that it was not very efficient without access, which by all accounts is a given fact, not opinion.

Now 3 of you have come forth making the argument for GW when thus far no real dissent has been made.


Don't think it was geared towards you; mine definitely wasn't. I haven't heard you complain much about anything. Yeah, you have stated a few times that GW causes issues in other threads, but some of those complaints were justified, in that the developer has to be more careful when adding 3rd-party libraries to their software, and more testing and optimization from their end needs to be done.

As much as it is nV's responsibility to get GameWorks into a game, it's not their responsibility to fix a developer's product. Although it sure does make nV look bad when a game comes out with GW stamped all over it and runs like a dog, even if GW features are off.

Don't take this out of context, but AMD supporters should cheer when nV promotes a game with GW and it runs like ass lol.
 
Minimum specs vary wildly across the internet. Never seen that before ;)

So, as usual, waiting to eat the pudding rather than wildly speculating on how it's going to taste before it's prepared is recommended? And maybe not buying the game until it's out of unofficial beta?
 
Don't think it was geared towards you; mine definitely wasn't. I haven't heard you complain much about anything. Yeah, you have stated a few times that GW causes issues in other threads, but some of those complaints were justified, in that the developer has to be more careful when adding 3rd-party libraries to their software, and more testing and optimization from their end needs to be done.

As much as it is nV's responsibility to get GameWorks into a game, it's not their responsibility to fix a developer's product. Although it sure does make nV look bad when a game comes out with GW stamped all over it and runs like a dog, even if GW features are off.

Don't take this out of context, but AMD supporters should cheer when nV promotes a game with GW and it runs like ass lol.

You are correct for the most part. I think what GW brings, as in implementing new tech, better visuals and such, is a great effort and kind of what we as gamers want. Unfortunately, I think these devs are relying too heavily on it to add value and aren't putting enough effort into their own engines first. Most of us just have to accept the outcome of GW on our hardware, especially those who aren't using Maxwell. As I stated in my first post, it seems to be getting better: GW hasn't changed, and getting drivers out for at least that part is getting more timely. As far as the games themselves, GW can't save them from bad code/port/programming.
 
Mad Max was also supposed to be garbage, yet they did a great job with it... If it sucks, it's only $25 out of my wallet heh.
 
How to fanboy: Blame Gameworks for poor AMD performance until a driver is released.

Funny thing is, I myself am not a fan of GameWorks at all, but even I have to agree this whole "Blameworks" thing is getting real old real fast. Hell, for FO4 AMD themselves didn't say a word about GameWorks, and all I see is GameWorks this, GameWorks that. -_-
 
WCCFTECH article on GameWorks being added to Just Cause 3.

Just Cause 3 is the latest GameWorks success. There isn't a list of GameWorks features yet; a month ago there wasn't even a mention of GameWorks on the graphics options screen. The fact that it has GameWorks now is a testament to how fast Nvidia can implement its technology, I suppose. It will be interesting to see what shows up closer to launch and how the game runs.

System Requirements:

I feel sorry for those with i3s and less. An R9 290 / GTX 780 for the recommended spec seems kinda steep. But hey, GameWorks happened, so that's to be expected to enable the blur effects that every gamer seems to love.

PC SPECS
NOVEMBER 23 - PETRAO

Hi guys,

These are the final Just Cause 3 PC specs:

Minimum Specifications
OS: Vista SP2, Win 7 SP1, Win 8.1 (64-bit Operating System Required)
CPU: Intel Core i5-2500K, 3.3 GHz | AMD Phenom II X6 1075T, 3.0 GHz
Memory: 6 GB RAM
Graphics: NVIDIA GeForce GTX 670 (2 GB) | AMD Radeon HD 7870 (2 GB)

Recommended Specifications
OS: Vista SP2, Win 7 SP1, Win 8.1 (64-bit Operating System Required)
CPU: Intel Core i7-3770, 3.4 GHz | AMD FX-8350, 4.0 GHz
Memory: 8 GB RAM
Graphics: NVIDIA GeForce GTX 780 (3 GB) | AMD R9 290 (4 GB)

Why do mods allow false, deceptive FUD posts like this?

I bet the OP is clueless about coding and wouldn't know what

Code:
float InvSqrt (float x){         // Quake III's fast inverse square root
    float xhalf = 0.5f*x;
    int i = *(int*)&x;           // reinterpret the float's bits as an int
    i = 0x5f3759df - (i>>1);     // magic constant minus half the bits: first guess
    x = *(float*)&i;             // reinterpret the bits back to a float
    x = x*(1.5f - xhalf*x*x);    // one Newton-Raphson refinement step
    return x;
}

is even if it bit him in the ass...
 
Code:
float InvSqrt (float x){         // Quake III's fast inverse square root
    float xhalf = 0.5f*x;
    int i = *(int*)&x;           // reinterpret the float's bits as an int
    i = 0x5f3759df - (i>>1);     // magic constant minus half the bits: first guess
    x = *(float*)&i;             // reinterpret the bits back to a float
    x = x*(1.5f - xhalf*x*x);    // one Newton-Raphson refinement step
    return x;
}

is even if it bit him in the ass...

If you can explain 0x5f3759df I would be impressed.
 
If you can explain 0x5f3759df I would be impressed.


It's just a floating-point representation; you can do either negative or positive values, and without that an inverse root won't work.
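For the curious, here is a sketch of the usual derivation of that constant (well-documented background knowledge, not something established in this thread). The bit pattern of a positive IEEE-754 float is approximately a scaled, shifted log2 of its value, and the magic number packs the resulting bias together with an error-minimizing fudge term. The snippet below just checks numerically that the commonly cited fudge term reproduces it:

Code:
#include <stdio.h>
#include <stdint.h>

/* For positive IEEE-754 floats, bits(x) ~ 2^23 * (log2(x) + 127 - sigma),
 * where sigma is a small fudge term. Since log2(1/sqrt(x)) = -log2(x)/2,
 * the result's bits are ~ K - bits(x)/2, with K = (3/2) * 2^23 * (127 - sigma). */
int main(void) {
    double sigma = 0.0450466;  /* commonly cited value; treat as an assumption */
    uint32_t K = (uint32_t)(1.5 * 8388608.0 * (127.0 - sigma));  /* 2^23 = 8388608 */
    printf("K = 0x%08X\n", (unsigned)K); /* prints 0x5F3759DE, within 1 of the magic number */
    return 0;
}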
 
You know, if you don't like improved visuals in games, you can disable/lower the quality.

As a person who wants game graphics to be forward-looking and to continue to evolve, I welcome better-looking graphics in games.
 
> people don't like nvidia for unrelated but justified reasons
> see gameworks in games that run like shit because the devs are incompetent
> invoke the hasty generalization fallacy, despite evidence proving the % performance hit for gameworks features is the same if not worse on nvidia cards vs amd cards in most gameworks games, and ignoring the fact that you don't even have to fucking use gameworks features
> blameworks shitposts in every thread about every game featuring the technology, ad infinitum
 
You know, if you don't like improved visuals in games, you can disable/lower the quality.

As a person who wants game graphics to be forward-looking and to continue to evolve, I welcome better-looking graphics in games.

As am I; as most gamers want, things in the gaming world have to evolve.
 
JC2 is still one of my favorite games of all time, and I still spend a fair amount of time running around in there with 3D Surround. Can't wait for JC3. Something tells me Nvidia will fail yet again in the driver department though, particularly with SLI and Surround. Hope not, but I'd bet even money on it...
 
You know, if you don't like improved visuals in games, you can disable/lower the quality.

As a person who wants game graphics to be forward-looking and to continue to evolve, I welcome better-looking graphics in games.

I fear you expect too much of some posters...
 
If you can explain 0x5f3759df I would be impressed.

As a fun tidbit, not only is it possible to concoct a more accurate constant, but the fabled fast inverse square root is also generally slower on modern hardware.
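For reference, here is a minimal sketch of the modern route on x86, assuming SSE is available (the function name is just illustrative): the hardware rsqrtss estimate plus one Newton-Raphson step is typically both faster and more accurate than the bit trick, and plain 1.0f / sqrtf(x) is often fast enough on its own.

Code:
#include <xmmintrin.h>  /* SSE intrinsics, x86 only */

/* Hardware reciprocal-sqrt estimate (~12 bits) refined by one
 * Newton-Raphson step: y' = y * (1.5 - 0.5 * x * y * y). */
static float inv_sqrt_sse(float x) {
    float y = _mm_cvtss_f32(_mm_rsqrt_ss(_mm_set_ss(x)));
    return y * (1.5f - 0.5f * x * y * y);
}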
 
I welcome better-looking graphics that don't deliberately seek to penalize one vendor's cards over another's.
 
I welcome better-looking graphics that don't deliberately seek to penalize one vendor's cards over another's.


You can't blame nV for that; it's their software, and they aren't going to go out of their way to make it run better on another vendor's hardware. That would be doubly counterproductive on their end: they would have to spend the money optimizing for the other vendor's cards, and doing so would help sell the other vendor's cards. What company in its right mind would do this, unless adoption of its tech were too low to benefit otherwise?
 
You can't blame nV for that; it's their software, and they aren't going to go out of their way to make it run better on another vendor's hardware. That would be doubly counterproductive on their end: they would have to spend the money optimizing for the other vendor's cards, and doing so would help sell the other vendor's cards. What company in its right mind would do this, unless adoption of its tech were too low to benefit otherwise?

And not only that: the performance hit also occurs within their own segment of cards. Old and new alike suffer the big performance penalty, and people really forget that. It isn't like they are favoring their own cards without suffering any performance hit.
 