Nvidia's nerfed drivers for Kepler cards.

I see rough edges on the bottom of the sign with Ultra, and that same area is smooth under the High setting.

I have always believed Nvidia gimps the performance of older cards as a marketing tool to sell newer cutting-edge cards, and the numbers don't lie as to why Nvidia leads in market share: their user base always needs to upgrade, because the lifespan of older cards is not the same as AMD's, which keeps adding performance to older cards. That's why the HD 7950 / R9 280 can compete with the GTX 960, and that was what, a 2011-2012 card?

I always love how people just become blind over time :D.. Look at the performance of the GTX 960: a $200 card that performs between a GTX 670 and a GTX 680. Guess what, the same as the HD 7950 :D:p Even back then, in some games the 7950 was faster than the GTX 670 and much, much closer to the 680, so there was competition there. Now, AMD re-branded that card as the 280 with higher clocks on both the vRAM and the core; that of course added extra performance to be more competitive. But guess what again? An R9 280 is barely, JUST barely, competitive with a GTX 770.. so it's the same shit: the GTX 960 fits exactly between those performance tiers at just $200 and half the power consumption. How much did most people pay for their 7950? $450 at launch? :D $600 during the mining craze?.. Yes, that 2011-2012 card competes with the same cards that were available at its launch ;)... Nvidia doesn't GIMP the performance of older cards, they just exploit the max performance of the current generation of cards...
 
To the people that cannot see any difference: I suggest that they, and you, drop PC gaming and buy a console.

Problem solved.

I see the difference and it's quite large IMHO.

Ooh, you showed me.
 
Ooh, you showed me.

The difference is there...if you cannot see it, no point in you wasting time on gaming on a PC, when you will have the same experience on a console.

I can see the difference, Brent can see the difference, Kyle can see the difference, the video shows the difference.

All you have shown is that you cannot see an obvious difference...hence you should consider a console.

It's funny that you get mad over logic...but also sad.

I wonder if you were also denying microstutter before FCAT?

EDIT: I do see from your previous posts that you don't have a lot of knowledge of tessellation, as you post FUD about too much tessellation in Crysis 2 (based on wireframe mode, easy to debunk)...quite hilarious that you point your finger at others based on your own ignorance...and yet you still call others "fanboy"...why is that?
 
The difference is there...if you cannot see it, no point in you wasting time on gaming on a PC, when you will have the same experience on a console.

I can see the difference, Brent can see the difference, Kyle can see the difference, the video shows the difference.

All you have shown is that you cannot see an obvious difference...hence you should consider a console.

It's funny that you get mad over logic...but also sad.

I wonder if you were also denying microstutter before FCAT?

Anyone that hasn't played the game (as seems to be the case with Remon) can't notice the difference in god rays, because all they have to go on is some screenshots. Most of the time the difference can't be seen in pictures; the big difference is there in movement and in the way god rays interact with the environment.
 
Anyone that hasn't played the game (as seems to be the case with Remon) can't notice the difference in god rays, because all they have to go on is some screenshots. Most of the time the difference can't be seen in pictures; the big difference is there in movement and in the way god rays interact with the environment.

I know :)
This applies for many games: gameplay > video > picture ;)
 
Anyone that hasn't played the game (as seems to be the case with Remon)

You base that on what?

I like how you both always use cold hard facts...

EDIT: I do see from your previous posts that you don't have a lot of knowledge of tessellation, as you post FUD about too much tessellation in Crysis 2 (based on wireframe mode, easy to debunk)...quite hilarious that you point your finger at others based on your own ignorance...and yet you still call others "fanboy"...why is that?

You're literally the only one that thinks this...
 
You base that on what?

I like how you both always use cold hard facts...



You're literally the only one that thinks this...

My fact is quite simple: everyone who has already played it says there's a big difference in god rays between each of the presets. The main problem is that it isn't worth the performance hit; everyone I know can back that up. If you say you don't see the differences (which are quite obvious in movement and even indoors), well, no picture can really show you the differences in dynamic glow, etc. The way you attack others, claiming there's no difference, points to you not having played the game, as others said. Even Kyle and Brent noted the difference.

In regards to Ultra god rays, you are missing what it is ultimately doing. It is about how the light wraps around objects and how the shafts or rays of light fill in based on distance. The sign in that screenshot is well above the user's eye height; naturally it looks a bit blurry as you look into the distance. Think of it as a depth-of-field effect, which is what it is adding: closer objects are clearer than objects in the distance. That is the definition of depth of field. It also takes the intensity of the light into account; naturally, the brighter something is, the less visible the detail around it is. That's what it is doing.
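If it helps to see the idea in code form, here is a tiny illustrative sketch of that kind of weighting: distance softens the contribution, brightness washes it out. This is purely my own illustration of the behavior described above, not the game's actual shader; all names and constants are made up.

[CODE]
// Purely illustrative sketch (not Fallout 4's shader): how a god-ray
// contribution could be weighted by distance (softer far away, like depth
// of field) and by light intensity (bright light washes out nearby detail).
#include <algorithm>

float godRayWeight(float distanceToViewer, float lightIntensity,
                   float focusRange, float washout)
{
    // Contribution gets softer/blurrier the further away the object is.
    float softness = std::min(distanceToViewer / focusRange, 1.0f);
    // Brighter light overpowers (washes out) the detail around it.
    float intensityFade = 1.0f / (1.0f + washout * lightIntensity);
    return (1.0f - 0.5f * softness) * intensityFade;
}
[/CODE]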
 
My fact is quite simple: everyone who has already played it says there's a big difference in god rays between each of the presets. The main problem is that it isn't worth the performance hit; everyone I know can back that up. If you say you don't see the differences (which are quite obvious in movement and even indoors), well, no picture can really show you the differences in dynamic glow, etc. The way you attack others, claiming there's no difference, points to you not having played the game, as others said. Even Kyle and Brent noted the difference.

If it's not worth the performance hit, then there's not a big enough difference. Simple as that.
 
If it's not worth the performance hit, then there's not a big enough difference. Simple as that.

People with enough processing power will disagree with you, especially after the patch that allows SLI profile optimizations. Anyone with a 980 Ti or a Titan X is playing it maxed out just fine..
 
If it's not worth the performance hit, then there's not a big enough difference. Simple as that.

If that's the mentality we'd still be stuck on 8-bit graphics, just saying...

GPUs have always advanced because there are batshit-crazy effects developers want to do.
 
Sometimes a certain effect just takes that many cycles to compute. Graphical advancement is about slowly pushing towards realism; every effect that contributes to that counts. You can't pick and choose which effects to use based on the ratio between visual difference and performance impact alone, because often the visual effect itself is subjective, it depends on the eye that sees it. Therefore, everything has to move forward.

Of course, with that said, on PC you are free to turn off any effect you deem not to be worth its performance impact. Just turn it off. Let others have it. It's not something worth complaining about.
 


Do you know what god rays do and how they are created? That will answer why the lighting in Fallout 4 looks so good: softness from lighting, even in the atmosphere or around objects, happens naturally in the real world. Without god rays you end up with objects or textures that are sharp when they should be a bit hazy because of strong light; PBR materials help enhance that. Maybe you prefer the fake way developers used to make god rays, with planes and alpha textures? Yeah, those looked like shit.

The reason god rays are expensive is that they're a ray-tracing algorithm, and the future of gaming, and of real-time graphics in general, is ray tracing; more and more visual effects have been using ray tracing since programmable shaders were introduced, and nothing will change that. Ideally everything would be ray traced, but we don't have the computational power for that yet.

God rays aren't the holy grail, but they're part of the overall drive toward full ray tracing in real time.
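For anyone curious what that looks like in code, here's a minimal sketch of the classic screen-space variant of the technique (the GPU Gems 3 style radial march toward the light). To be clear, this is an illustration of the general idea only; occluderMask() is a stand-in for a real occlusion buffer lookup, and Fallout 4's NVIDIA implementation is a heavier, tessellation/ray-march based approach.

[CODE]
// Minimal sketch of screen-space "god rays": march from each pixel toward
// the light's projected position, accumulating light from samples that can
// still see it. Illustrative only.
#include <algorithm>

struct Vec2 { float x, y; };

// Hypothetical lookup: 1.0 where the light/sky is visible at this screen
// position, 0.0 where geometry blocks it. Stubbed out here.
float occluderMask(Vec2 /*uv*/) { return 1.0f; }

float godRayIntensity(Vec2 pixelUV, Vec2 lightUV,
                      int numSamples, float density, float decay)
{
    // Step vector from the pixel toward the light on screen.
    Vec2 step = { (pixelUV.x - lightUV.x) * density / numSamples,
                  (pixelUV.y - lightUV.y) * density / numSamples };
    Vec2 uv = pixelUV;
    float light = 0.0f;
    float weight = 1.0f;

    for (int i = 0; i < numSamples; ++i) {
        uv.x -= step.x;
        uv.y -= step.y;
        light += occluderMask(uv) * weight;  // unoccluded samples add scattering
        weight *= decay;                     // samples further along count less
    }
    return std::min(light / numSamples, 1.0f);
}
[/CODE]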
 
I'm guessing you're trying to talk about volumetric light and not god rays.

Anyway, the Ultra setting, according to the Nvidia Fallout 4 Graphics, Performance & Tweaking Guide, which was withdrawn, enables subsurface scattering, which could account for some of the stuff you're mentioning but has nothing to do with tessellation.

NVM, it's tied to the lighting quality menu, my mistake.

It doesn't change the fact that, technically, the game has many shortcomings more important than god rays.


Here's a cache of the guide, since it's been removed.

http://webcache.googleusercontent.c...-guide+&cd=6&hl=en&ct=clnk&gl=gr&client=opera

EDIT: what is the code for strikethrough?
 
I always love how people just become blind over time :D.. Look at the performance of the GTX 960: a $200 card that performs between a GTX 670 and a GTX 680. Guess what, the same as the HD 7950 :D:p Even back then, in some games the 7950 was faster than the GTX 670 and much, much closer to the 680, so there was competition there. Now, AMD re-branded that card as the 280 with higher clocks on both the vRAM and the core; that of course added extra performance to be more competitive. But guess what again? An R9 280 is barely, JUST barely, competitive with a GTX 770.. so it's the same shit: the GTX 960 fits exactly between those performance tiers at just $200 and half the power consumption. How much did most people pay for their 7950? $450 at launch? :D $600 during the mining craze?.. Yes, that 2011-2012 card competes with the same cards that were available at its launch ;)... Nvidia doesn't GIMP the performance of older cards, they just exploit the max performance of the current generation of cards...


I have a GTX 770 and an R9 280.. the 280 seems to have a lot of clocking headroom and it's more competitive than you think.. also, it does DX12.

770GTX Fire Strike

http://www.3dmark.com/fs/2844407

R9 -280

http://www.3dmark.com/fs/2752854
 
I see the problem with God rays now.

It seems that while ultra is more defined, it renders in lower resolution than the other settings to compensate.


This is god rays Ultra lighting the scene

WZNel6f.png


this is on Medium

zSnJKxu.png


and Low

b0jVW95.png
 
Let's look at this first point for a moment. Nvidia has a much faster cadence between major drivers; every month you keep seeing the GPUs gaining performance. AMD releases very few major drivers annually, so the increments per major driver are bigger for that same reason. So if we take as an example the time Nvidia takes per major driver with performance improvements and optimizations, let's say 6 a year, one major release every 2 months (a random number, don't take it too seriously), while AMD does what, 1 or 2 major releases annually? How much time do drivers take to fully mature in Nvidia's hands versus AMD's? One full year on the Nvidia side, 3 or more on the AMD side..

Second point: GPU architectures. Don't forget AMD hadn't changed since TeraScale 2 and 3 with the HD 5000 and HD 6000 series; GCN was introduced with the HD 7000 series and they are STILL using it. The codename? It doesn't matter; each different codename just changes that, the name, with a few improvements here and there, while the architecture stays the same. So take into consideration that since GCN 1.0's introduction at the end of 2011, AMD has had almost a full 4 years of architecture optimizations and maturity. On the other hand, Nvidia in that time has gone through Kepler and now Maxwell (and even Fermi, if we consider that AMD launched the HD 7000 in Fermi's time); any optimization they make for Maxwell will not work for Kepler and vice versa. Can you say that the HD 6000 series is still improving and receiving better performance? AMD also forgot their last gen once the HD 7000 arrived, don't forget that tiny (big) difference, and at this point the HD 7000 and the majority of the R9 200 series (leaving out the 290 and 290X) are starting to show their age; they're reaching the ceiling of the performance they can offer..

Maxwell was kind of an anomaly, as they skipped the 800 series entirely and already had somewhat mature drivers thanks to all the time the little GTX 750 (Ti) had been on the market.. If we go back to Kepler, they had the 600 series and the 700 series, all rebranded except big Kepler GK110, and since it was all Kepler, every improvement they made also worked for the GTX 600 series, especially for the mid-to-low-end segments between the GTX 650 Ti and GTX 660 Ti...

And on the other hand we have to consider the hardware itself. It's well known that Nvidia is more restrained in terms of raw power; they know exactly how to design their cards to achieve a certain performance at a certain power target. In my opinion AMD has a much bigger and more powerful architecture, which makes it somewhat future-proof: they've always had wider buses, more vRAM, more ROPs, TMUs, etc., which again allows the hardware to hold up better over time. I'm one of those who say AMD has amazing hardware, but sadly that hardware is paired with the wrong driver team. To put it in real-life terms, imagine those small Japanese cars with small engines, great maneuverability and efficiency, versus a typical heavyweight American muscle car with a big engine and a lot of horsepower. Everyone can easily think, hey, that Ford Mustang will dust off that little Camry V6, and then that Camry V6 just destroys the Mustang's big V8.. So yes, AMD has a lot of muscle to exploit over time versus Nvidia's measured power and efficiency... but sadly, again, all that uber hardware is in the wrong hands.

Don't want to take this out of context, but it also looks a fair bit like the GCN architecture is poorly aligned with the demands/program workflow that have been placed upon it. That's (probably) more than a heroic set of drivers can fix. The big push towards the finer granularity of DX12/Vulkan/Mantle/what-have-you seems to alleviate that a bit, which is why GCN chips (seem) to be benefiting more than Maxwell/Kepler.
 
I have a GTX 770 and an R9 280.. the 280 seems to have a lot of clocking headroom and it's more competitive than you think.. also, it does DX12.

770GTX Fire Strike

http://www.3dmark.com/fs/2844407

R9 -280

http://www.3dmark.com/fs/2752854

Fire Strike means nothing when you are comparing cross-platform, and 3DMark always tends to favor AMD, the same way Unigine favors Nvidia more; that never translates into real-world performance. I have a 280X and a GTX 770 4GB, and once both are highly clocked they trade blows in a lot of games; some games favor the 280X and some the GTX 770. And it was that way all along, just look at some old [H] reviews of the GTX 680 and HD 7970. They always traded blows; in fact, when AMD launched the HD 7970 GHz Edition it was stronger than the GTX 680 in almost every game on the market...

So why would it be strange that they can still trade blows today? Still, some games favor AMD and others favor Nvidia. I can find games where the 280X crushes a GTX 770 and others where it's the opposite; that's how those cards have always performed, trading blows. But I could also find a GTX 960, which is generally slower than a GTX 770, crushing a GTX 770 or being crushed by a GTX 680... So yes, the HD 7950 / R9 280 sits comfortably when compared to the GTX 960, because that's its intended performance tier. The same applies within the AMD camp: the R9 285 was intended as a replacement for the R9 280, but it turned out to be slower than the R9 280 until some newer games where the architectural improvements let the R9 285 edge out the old R9 280. Still, on average the R9 280 is faster. How so? It's fine when you compare it with the competition, but what happens when it's compared to its direct successor?... I still think, and always say, the HD 7900 series are probably the best AMD cards ever made; the longevity those cards have had proves it. But don't let that blind you into forgetting where they were placed originally and where they're placed now.
 
I'm guessing you're trying to talk about volumetric light and not god rays.

Anyway, the Ultra setting, according to the Nvidia Fallout 4 Graphics, Performance & Tweaking Guide, which was withdrawn, enables subsurface scattering, which could account for some of the stuff you're mentioning but has nothing to do with tessellation.

NVM, it's tied to the lighting quality menu, my mistake.

It doesn't change the fact that, technically, the game has many shortcomings more important than god rays.


Here's a cache of the guide, since it's been removed.

http://webcache.googleusercontent.c...-guide+&cd=6&hl=en&ct=clnk&gl=gr&client=opera

EDIT: what is the code for strikethrough?

Volumetric lighting and god rays are two different things, you are correct on that, but god rays are an addition to volumetric lighting; god rays are actually the volumetric scattering of light. To create god rays you need a formula that calculates from the point of origin, to where the light ray is intersected, to its final destination. Getting a good approximation in real time is a very complex lighting scenario, and it's very costly.

Oops, that's only the first part of the algorithm; after each point that intersects an object, rays have to be calculated from those points too.
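To make the cost described above a bit more concrete, here is a rough sketch of just the first half (the camera-to-surface march with a visibility test at every step). This is my own illustration under those assumptions, with lightVisible() standing in for a shadow-map tap or secondary ray; it is not anyone's actual implementation.

[CODE]
// Rough sketch of single-scattering cost: march from the camera to the
// surface hit, and at EVERY step test visibility toward the light.
// Illustrative only -- lightVisible() stands in for a shadow-map lookup
// or a secondary ray, which is exactly where the expense comes from.
#include <cmath>

struct Vec3 { float x, y, z; };

static float dist(const Vec3& a, const Vec3& b) {
    float dx = b.x - a.x, dy = b.y - a.y, dz = b.z - a.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Hypothetical per-sample visibility test toward the light.
bool lightVisible(const Vec3& /*p*/) { return true; }

float inScattering(const Vec3& camPos, const Vec3& surfacePos,
                   int steps, float scatterCoeff)
{
    float total   = dist(camPos, surfacePos);
    float segment = total / steps;
    float accum   = 0.0f;

    for (int i = 0; i < steps; ++i) {
        float t = (i + 0.5f) / steps;
        Vec3 p = { camPos.x + (surfacePos.x - camPos.x) * t,
                   camPos.y + (surfacePos.y - camPos.y) * t,
                   camPos.z + (surfacePos.z - camPos.z) * t };
        if (lightVisible(p)) {
            // Light scattered toward the camera, attenuated by the fog/air
            // it still has to travel through on the way back.
            accum += std::exp(-scatterCoeff * t * total);
        }
    }
    // steps x (visibility test + exp) per pixel -- and the "second part"
    // (rays from each intersection point) multiplies this again.
    return accum * scatterCoeff * segment;
}
[/CODE]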
 
All I have to say is that after playing the game maxed out for the last two days, I think god rays are great looking, but I agree that they really dropped the ball in other important places, so much so that the game still looks like shit even with such great lighting.
 
All I have to say is that after playing the game maxed out for the last two days, I think god rays are great looking, but I agree that they really dropped the ball in other important places, so much so that the game still looks like shit even with such great lighting.

It's ok, the modders will fix everything, like all of Bethesda's games. Of course, the game will run even crappier than it does now. So, it'll have modern graphics in 2-3 years, maybe.
 
If it's not worth the performance hit, then there's not a big enough difference. Simple as that.

So your claim about no difference was shot down, and we are now left with moving the goalposts from your side.

You know there is no valid point when fallacies, not arguments, are being used.

(He is now complaining about the performance cost of a feature whose visual IQ he cannot see for himself, instead of just disabling the feature he cannot see and probably will not use.)

I will now go back to gaming (with god rays on ultra) instead of wasting other people's time with fallacies...*hint-hint*
 
So your claim about no difference was shot down, and we are now left with moving the goalposts from your side.

You know there is no valid point when fallacies, not arguments, are being used.

(He is now complaining about the performance cost of a feature whose visual IQ he cannot see for himself, instead of just disabling the feature he cannot see and probably will not use.)

I will now go back to gaming (with god rays on ultra) instead of wasting other people's time with fallacies...*hint-hint*

Try disabling God Rays from the options...

You can't.

Wasting people's time... by talking about the game in a discussion about the game... omg, you showed me again...
 
I missed most of this thread, but if you're talking about FO4 you can most certainly disable God Rays in the options. I wouldn't, though. It ruins the lighting for the entire game.
 
I missed most of this thread, but if you're talking about FO4 you can most certainly disable God Rays in the options. I wouldn't, though. It ruins the lighting for the entire game.

No, not from the options, it defaults to low when the game starts.

Scratch that, it reverts to low the moment you close the Advanced options window, it doesn't even wait for the game to load.
 
Try disabling God Rays from the options...

You can't.

Wasting people's time... by talking about the game in a discussion about the game... omg, you showed me again...

This isn't about me, despite your fallacious attempt at making it so.

Here:
https://www.reddit.com/r/fo4/comments/3s9dqu/how_to_turn_off_god_rays_pc/cwwc2nx

You should really buy a console, PC gaming seems too complicated for you.

Your head must explode when thinking of modding Fallout 4.

I think you should buy a console and stop posting fallacies...win-win.
 
This isn't about me, despite your fallacious attempt at making it so.

Here:
https://www.reddit.com/r/fo4/comments/3s9dqu/how_to_turn_off_god_rays_pc/cwwc2nx

You should really buy a console, PC gaming seems too complicated for you.

Your head must explode when thinking of modding Fallout 4.

I think you should buy a console and stop posting fallacies...win-win.

Says the man defending a console port.

Also, I'm talking about the options menu, not changing an .ini file or using a console command (it's gr off btw). I'm guessing your reading skills aren't good enough to read through a two-line post?

And was fallacy on your word of the day calendar yesterday?

Anyway, the games I'm playing can't come out on consoles.
 
Says the man defending a console port.

Also, I'm talking about the options menu, not changing an .ini file or using a console command (it's gr off btw). I'm guessing your reading skills aren't good enough to read through a two-line post?

And was fallacy on your word of the day calendar yesterday?

Anyway, the games I'm playing can't come out on consoles.

Let's recap your "participation" in this thread.

1) Claims NO visual difference between ultra and low god rays. Claim gets debunked.
2) First claim debunked, moves to the fallacy of "moving the goalposts", claims it's not worth the performance hit. Claim gets debunked.
3) Claims it is impossible to disable. Claim gets debunked.
4) Gets mad, uses ad hominem, moves to the fallacy of "crap console port".

Why are you in this thread again?
To make false claims and have them debunked?
To attack other posters?

Because you whine about settings you say you cannot tell apart in what you call a crappy console port.
Doesn't sound useful in any manner, does it?

Now excuse me while I go back to gaming FO4 at max settings.
 
Let's recap your "participation" in this thread.

1) Claims NO visual difference between ultra and low god rays. Claim gets debunked.
2) First claim debunked, moves to the fallacy of "moving the goalposts", claims it's not worth the performance hit. Claim gets debunked.
3) Claims it is impossible to disable. Claim gets debunked.
4) Gets mad, uses ad hominem, moves to the fallacy of "crap console port".

You're just using the "putting words in my mouth" fallacy:

1+2) I never said NO "visual" difference. I've said that the ultra setting was a blurry mess compared to the low and high ones, and I've proved it. Really, here's the post with screenshots where I prove that it's blurrier.
3) I only said that you can't disable them from the options, you know, where every other PC game normally does that sort of thing.
4) The first person that got mad and used the word consoles was you... as in "you should move to consoles".

Hopefully you don't argue for a living.
 
Nvidia has a history of not bringing their previous-gen cards to full potential in the drivers. For example, Fallout 4 has been released and Kepler SLI users can enjoy full Ultra framerates simply by setting SLI to AFR2. Yet the SLI profile that comes with the driver is broken. Why? Simply because Maxwell cards do not have SLI figured out yet. Since it does not look good for Kepler cards to be outperforming their Maxwell cards, they nerfed the drivers for Kepler users.
I notice that this practice has been going on for some time. For example, the GTX 590 initially outperformed the GTX 680, but after a year or two the GTX 680 outperformed the 590 in every game.

It's normal that initially a GTX 590 is faster than a GTX 680; the GTX 590 has more horsepower than a GTX 680.

It's normal that after a year or two the GTX 680 outperformed the GTX 590; the GTX 680 has more VRAM than a GTX 590.

At the start, horsepower is what makes the difference, but in the long run VRAM is what usually caps performance.

GTX 590 and GTX 580 performance is great, but 1.5GB of VRAM is really too little for most modern titles.
 
You're just using the "putting words in my mouth" fallacy:

1+2) I never said NO "visual" difference. I've said that the ultra setting was a blurry mess compared to the low and high ones, and I've proved it. Really, here's the post with screenshots where I prove that it's blurrier.
3) I only said that you can't disable them from the options, you know, where every other PC game normally does that sort of thing.
4) The first person that got mad and used the word consoles was you... as in "you should move to consoles".

Hopefully you don't argue for a living.

You are cute with pictures in a thread that contains videos...oh well, I enjoy the game.
Have fun with your console...off to ignore-land you :)
 
You are cute with pictures in a thread that contains videos...oh well, I enjoy the game.
Have fun with your console...off to ignore-land you :)

Again with the facts, dude... It's like you're a conversation wizard.
 
Again with the facts, dude... It's like you're a conversation wizard.

He won't be giving any facts, just skewing anything you post to convey that you meant something else. It would be best to ignore him. He rarely brings anything tangible to the discussion.
 
He won't be giving any facts, just skewing anything you post to convey that you meant something else. It would be best to ignore him. He rarely brings anything tangible to the discussion.

Just a little comparison since I put you on ignore too :)

Post by me:

LOL

This is hilarious, but bear with me

All the ocean simulation is run on a small square and after that tiled out.
So that mesh you are seeing is not a true indication of the load or of how the game runs.
After that, the game engine does a z-buffer writeback for something that is called a "GPU Occlusion Query".
It's not rendered.
It's occluded, ergo it is NOT drawn.

I would almost call that article dishonest.
Why?
Because they:
- Isolate the mesh
- Render the mesh at maximum tessellation
- Disable occlusion culling
- Disable dynamic tessellation
- Disable adaptive tessellation.

And then present it as "This is how the game works!"

But this is not what the game does.
The claim smells like another PR FUD thing.

It's all documented here:
http://docs.cryengine.com/display/SDKDOC4/Culling+Explained

People should read that...and if they do not understand it...they should stop posting about the topic.
Simple as that.
If you post about stuff you don't know anything about, in a negative manner...you become a FUD'ster.

(Besides, I doubt NVIDIA had anything to do with that "ocean"...it seems more like something coded by the devs themselves, to be frank.)

So let's see if I can tally the list so far:

"GameWorks is a black box and AMD can do nothing!!!" = FUD and false PR
"Planned obsolescence for Kepler" = FUD and false PR
"Crysis 2 is using too much tessellation because of evil NVIDIA" = FUD and false PR.

Besides AMD PR, you know what else is a common factor in all 3?

Ignorance about the topic.
But that doesn't stop people from screaming all over forums and, IMHO, making themselves look like uninformed, rabid PR FUDsters.

But it's quite funny that the arguments used to declare NVIDIA "evil" all seem to come from AMD PR...and get spread around forums like gospel.

I think it would be a good idea for a "Technical GPU" sub-forum, where people with technical insight could debate these things, without all the false claims.

IT is complicated, I know; I have been working in IT for 20+ years (coding, hardware, networking)...and there are a LOT of topics I will not enter, because I have insufficient knowledge about them.
But I will read with joy, ask questions and learn.

I will not run around like a braindead parrot, posting crap I don't understand...but I guess people have different standards...


Technical, to the point, dealing with facts.


Very well then, let's try this. Do you believe that the results given in the benchmarks are SOLELY indicative of architecture and in no way, not even slightly, possibly due to a less than credible or moral attempt, a serious lack of effort, or even just plain incompetence?

Honestly on any given day I don't care about the benchmarks or reviews when it comes to performance. They rarely speak directly to me as far as the performance I can expect. I hate blur, never use it, that usually adds a bit of performance there alone. And then take into account that I am willing and able to tweak the game in a number of ways to get the performance level I need/require for a pleasant experience. But every so often a game release shows numbers like this and then I feel concerned. Not so much for me but for gamers as a whole. I can generally tweak enough of the crap out of a game to get desirable results, but this doesn't mean I want to every time.

Maybe it would help if everyone, both sides and neutrals, stopped looking at this as an "AMD isn't winning so it must suck" debate. I feel that, for the most part, what the rebuttal side is saying is that the discrepancy between the models is, especially in this case, far greater than in all other occurrences. This doesn't have to be a debate between AMD and Nvidia. It isn't really. It should be about gamers versus devs.

We already see how heavily tessellation affects the position of the AMD user. Hence why I've mentioned more than once here how an Nvidia user would feel if async were used to the same degree, with similar results, in this game. According to a few posters, this should mean these Nvidia users should just go buy an AMD GPU and suck it up, because that is the feature being used and AMD is just better at it. Sorry, but that is a terrible answer to the subject matter at hand.


Yours is looking for "evil doing"...with no facts, only FUD...dressed up to look like "real concern".

Bye.
 
If we are talking about Crysis 3 and tessellated water rendering: any article that took screenshots from wireframe mode and assumed that was what was happening during real gameplay was wrong. I have developed on CryEngine 1 and 2, and 3 is CryEngine 2 for the most part, with some additions for consoles and a few other shaders, not too many core changes. And this was discussed with a CryEngine renderer developer at B3D.

The water is not rendered when playing the game. It shows up in wireframe mode because occlusion is done at a per-pixel level, so when wireframe mode is active there is nothing to occlude the water mesh, and it gets rendered and tessellated. So the whole conspiracy about nV sabotaging the game/patch with excessive tessellation is not true.

It is true that sometimes developers don't take precautions when adding in third-party libraries and don't pay attention to QA after adding them, because sometimes it's a rush job. Crytek is not one of those developers; they don't add third-party libraries into their engine because doing so would also affect the engine licensing, which they will never do. As we can also see, Unreal Engine doesn't do that either, because they want developers to have full control over the product they are creating, so they need access to the source code. Of course, developers can add GameWorks libs as they wish, since nV and Epic maintain a separate branch for that for easy integration.
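For readers who haven't run into them, this is roughly what a GPU occlusion query looks like in practice. The sketch below is generic OpenGL, used only to illustrate the principle (CryEngine's per-pixel culling is its own system, so treat the helper names as placeholders). The point is that the expensive tessellated mesh only gets drawn when the query reports visible samples, and a wireframe/debug view that skips this test will happily show the whole thing.

[CODE]
// Generic OpenGL-style occlusion query sketch (illustrative; assumes a GL
// context exists). drawWaterProxy() and drawTessellatedWater() are
// placeholders for cheap proxy geometry and the expensive mesh.
#include <GL/glew.h>

void drawWaterProxy()       { /* cheap bounding geometry, defined elsewhere */ }
void drawTessellatedWater() { /* expensive tessellated mesh, defined elsewhere */ }

void drawWaterIfVisible()
{
    GLuint query = 0;
    glGenQueries(1, &query);

    // Pass 1: render only the cheap proxy with color/depth writes off,
    // counting how many samples pass the depth test.
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_FALSE);
    glBeginQuery(GL_SAMPLES_PASSED, query);
    drawWaterProxy();
    glEndQuery(GL_SAMPLES_PASSED);
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glDepthMask(GL_TRUE);

    // Pass 2: only pay for the heavy mesh if anything was visible.
    // Fully occluded water never gets drawn (or tessellated) at all.
    GLuint samplesPassed = 0;
    glGetQueryObjectuiv(query, GL_QUERY_RESULT, &samplesPassed);
    if (samplesPassed > 0)
        drawTessellatedWater();

    glDeleteQueries(1, &query);
}
[/CODE]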
 
Just a little comparison since I put you on ignore too :)

Post by me:




Technical, to the point, dealing with facts.





Yours is looking for "evil doing"...with no facts, only FUD...dressed up to look like "real concern".

Bye.

First you have no post in this thread that even mirrors a vain attempt at reason or facts. Not sure which thread you pulled that from but it isn't this and therefore doesn't contradict what I stated.

Second, as far as this case and many others this year are concerned, there are no facts yet one way or another. You can't prove it is on the level any more than I can prove it isn't. Despite this, it shouldn't mean we can't discuss our thoughts or concerns. Nor does it allow you or anyone else to go through and just blanket-claim that those with opposing views are liars.

At no point have I stated that there is DEFINITELY illegal or immoral activity here, just that the results can be reasonably attributed to such with some plausibility.
 