Radeon™ RX 6000 Partner Showcase Ep. 2: Godfall & Counterplay Games

FrgMstr

Just Plain Mean
Staff member
Joined: May 18, 1997
Messages: 55,634

Radeon™ RX 6000 Partner Showcase Ep. 2: Godfall & Counterplay Games

In this series we look at new technologies our partner studios have implemented in several upcoming next-generation titles. Graphics and performance are at the core of the PC gaming experience. With AMD Radeon™ RX 6000 series graphics cards & Ryzen™ 5000 series CPUs, gamers can expect a new standard for high-performance gaming.

Developed by Counterplay Games, Godfall™ is a new IP set in a bright fantasy universe. You play as one of the last members of the Valorian Knights, god-like warriors able to equip legendary armor plates known as Valorplates. Tear through foes as you climb through the elemental realms and challenge the mad god Macros, who awaits you at the top. In collaboration with AMD, the studio has been able to introduce several features such as DirectX® Raytraced shadows, FidelityFX™ Contrast Adaptive Sharpening, and more.

Pre-purchase Godfall™ on the Epic Games Store: https://www.epicgames.com/store/en-US/product/godfall/home

AMD Radeon™ RX graphics paired with AMD Ryzen™ processors enable ultimate gaming experiences from 1080p to 4K in today’s top titles. Enjoy high fidelity, immersive gaming on all Radeon RX 6000 series products with AMD Radeon™ Software Adrenalin 2020 Edition. Complete the experience by adding a FreeSync™ technology-enabled monitor for ultra-smooth, low-latency gaming. Play it all on Radeon™ graphics and Ryzen™ processors. Discover AMD Radeon graphics: https://www.amd.com/en/graphics/amd-radeon-rx-6000-series
 
Doesn't look bad at all. But the draw distance on grass and such seems pretty limited. I noticed some snap-in on ground clutter and grass in that video; I wonder if that was specific to that area/scene or what. Otherwise it looked very good. No real reflections done, but nice shadows, I suppose.

Still want a 6800xt ;)
 
The light through the Aztec-style wind chimes does look pretty cool. It'll be interesting to see some benches of older games with RT added vs. a few of these new titles intended to launch alongside the consoles.

Godfall releases Nov 12... hopefully in time for some reviewers to get it into their benchmark decks.
 
Can't wait to get my grubby hands on a 6800xt

Maybe 2 of them for each of my rigs
 
Developer casually mentions using 12GB of VRAM at 4K resolution and 4K x 4K texture sizes at 1:15.

Remember, "using" and "requiring" aren't the same. But yes at 4K in upcoming games I can see 12GB being a problem.
 
That's exactly what I was afraid of...

I still want a 3080. I just hope they have a 20GB model.
Sounds like the initially rumored (with very strong evidence... ordering SKUs were created) 20GB models were canceled.

I think NV folks will be waiting till March or April for some form of Super version... which will probably be 7nm TSMC as well. Probably going to be a lot of pissed-off 3080 users. I do believe Nvidia could probably put out a 7nm Ampere that will best the 6000s, but they're going to have to wait in line for fab space... or do something Nvidia has never done: pay market value for their fab space. lol

Pretty good to see games that look half decent that may actually push VRAM past 8GB though.
 
This reminds me of when the 980 launched with just 4GB. That was a good card that got hamstrung. 980ti is viable today partially due to the 6GB frame buffer, and I’ve been able to keep my old 770s in use far longer than expected because they’re the 4GB models.

I think Nvidia’s RAM production went sideways, and they underestimated the impact of consoles with 16GB on developers. The consoles bring the new VRAM baseline up to 10GB(ish)... and that’s before adding the benefits of their texture compression systems and storage systems.

I’m hoping the 3080ti will be 20GB. I don’t even need more shaders. Just give me the VRAM.
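The 10GB(ish) guess is just back-of-envelope, something like this (the figures are public ballpark numbers, not exact vendor specs):

Code:
# Rough arithmetic behind the "~10GB console VRAM baseline" guess.
# All figures are approximate assumptions.
total_ram_gb = 16.0     # unified GDDR6 in the new consoles
os_reserved_gb = 2.5    # approximate OS/system reservation
game_budget_gb = total_ram_gb - os_reserved_gb      # ~13.5 GB left for the game
cpu_side_est_gb = 3.5   # guess: game code, logic, audio, streaming buffers
gpu_side_est_gb = game_budget_gb - cpu_side_est_gb  # ~10 GB "VRAM-like" budget
print(f"Estimated GPU-side budget: {gpu_side_est_gb:.1f} GB")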
 
Sounds like the initially rumored (with very strong evidence... ordering SKUs were created) 20GB models were canceled.

I think NV folks will be waiting till March or April for some form of Super version... which will probably be 7nm TSMC as well. Probably going to be a lot of pissed-off 3080 users. I do believe Nvidia could probably put out a 7nm Ampere that will best the 6000s, but they're going to have to wait in line for fab space... or do something Nvidia has never done: pay market value for their fab space. lol

Pretty good to see games that look half decent that may actually push VRAM past 8GB though.

It looks like NVIDIA will release a 20 GB version of the 3090. Most probably the price will be competitive with the 6900 XT at $999.

https://www.thefpsreview.com/2020/1...-geforce-rtx-3090-10496-20-gb-of-gddr6x-vram/

According to kopite7kimi, the GeForce RTX 3080 Ti will boast a CUDA Core count of 10,496, which happens to be the same impressive amount as the GeForce RTX 3090.

It falls short of the BFGPU in the VRAM department, however: instead of 24 GB of GDDR6X, the GeForce RTX 3080 Ti will only have 20 GB to work with.

kopite7kimi also claims that the GeForce RTX 3080 Ti will mirror the memory speed (19 Gbps) and TGP (320 W) of the standard GeForce RTX 3080, as well as its lack of an NVLink port for SLI setups.
 
Developer casually mentions using 12GB of VRAM at 4K resolution and 4K x 4K texture sizes at 1:15.
I read his actual statement; he only said it "requires tremendous memory bandwidth to run smoothly" and that 4K textures are using 12GB. Whether that's allocated or actually in use isn't mentioned, and I'd wait for independent analysis.

Of course the game appears to look and play like garbage, and being an Epic exclusive makes it all the more forgettable, but that's beside the point.
 
I read his actual statement; he only said it "requires tremendous memory bandwidth to run smoothly" and that 4K textures are using 12GB. Whether that's allocated or actually in use isn't mentioned, and I'd wait for independent analysis.

Of course the game appears to look and play like garbage, and being an Epic exclusive makes it all the more forgettable, but that's beside the point.
When a dev says "4K textures are using," I would think that means... "using," not pre-allocating. Obviously we can only guess, but the term he used typically has a specific meaning; just because you want it to mean something else doesn't make it so. We won't know for sure until he clarifies that he didn't misspeak, or we get independent benches that show slowdowns with less VRAM. Either way, it's good to see AMD have some partnered games coming out, as for so long it seemed everything was an Nvidia-partnered game.
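If anyone wants to watch the reported number themselves, a quick script polling nvidia-smi while the game runs does it. Big caveat: this shows allocation/residency as the driver reports it, not what the game actually touches each frame, so it's the "using" number a dev might quote rather than proof a smaller card would stutter:

Code:
# Minimal sketch: sample reported VRAM usage while a game runs (NVIDIA only).
# nvidia-smi reports allocated/resident memory, not the per-frame working set.
import subprocess
import time

peak = 0
try:
    while True:
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"], text=True)
        used_mib = int(out.split()[0])  # first GPU only
        peak = max(peak, used_mib)
        print(f"used: {used_mib} MiB   peak: {peak} MiB", end="\r")
        time.sleep(1)
except KeyboardInterrupt:
    print(f"\npeak reported usage: {peak} MiB")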
 
I remember when AMD's Fury came out, people bitched, and rightly so, over the low amount of memory. Now the shoe is on the other foot, but we are supposed to put up with "allocated is not in use". It's ridiculous.
Who is this imaginary "we" and what does Fury have to do with anything? Its VRAM was the least of its problems, and I'd say the shoe is only on the other foot if someone is already suffering under the delusion of some tit-for-tat, us-versus-them, imaginary fanboy brand war.

Verifiable facts should be the goal. And I don't think anyone rational would argue it's not great that we have choice in VRAM sizes, now across multiple brands with RX 6000 looking strong. But it's silly to imply that an outlier, partnered game means 10GB GPUs are obsolete or underleveled. Most AAA new releases will make sure they support the 3080. For the same money, I'd rather have 20% more raster performance at the cost of 20% VRAM (hell, I'd take 10% more raster and give up 30% VRAM), since the VRAM will tend to matter less across a broad swath of games. Someone else might have different needs and run a few games that are mod-heavy and eat tons of VRAM. Choice is cool.
 
Who is this imaginary "we" and what does Fury have to do with anything? Its VRAM was the least of its problems, and I'd say the shoe is only on the other foot if someone is already suffering under the delusion of some tit-for-tat, us-versus-them, imaginary fanboy brand war.

Verifiable facts should be the goal. And I don't think anyone rational would argue it's not great to have choice in VRAM sizes, now across multiple brands. But it's absurd to imply that an outlier, partnered game means 10GB GPUs aren't enough. Most AAAs will make sure they support the 3080. For the same money, I'd rather have 20% more raster performance than 20% more VRAM, since the VRAM will tend to matter less across a broad swath of games. Someone else might mostly run a particular game that's mod-heavy and eats VRAM. Choice is cool.

I don't know. It's kind of a fair comparison, as both cards are/were perceived to have launched with "not enough VRAM." The Fury performed well enough (at too high a price) even with the VRAM limitation, and I'd expect that the 3080 will as well, for the reasons you mentioned.
 
So the 3080 I just bought can’t maximize settings but the cheaper AMD card can? Ouch.
 
So the 3080 I just bought can’t maximize settings but the cheaper AMD card can? Ouch.
That is why you never jump on something as soon as it comes out; wait a little to work out the kinks and see what it lacks.
 
Can only speak for myself, but I've purchased two 3080 cards, one for $699 and the other for $809, to upgrade my wife's 2080 Ti (she wanted the 3090) and my 2080. I have loved every minute of having these cards, even finding out about the shit power curves that need tweaking. It's been so long since I've gotten enjoyment on this level from PC tech that I think it's the early 2000s again.

The best thing is I'm not finished: I still have Zen 3 to get, and two 6800 XTs to buy and play with for my nephews' PCs. These launches have been the shining light of 2020, and I am so hyped to get more toys to play with or talk about with my nephews. I've gotten so many calls and texts just going over the stuff they are reading; as soon as they finish online schoolwork they ask anyone who will listen for a ride over to play on or with their systems.

I feel sorry for the people that are dead set on one brand but can't find one; these are the times where you need to rethink your ridiculous blind support for one brand. If AMD had launched first I would have two of those instead, grabbing the two Nvidias later.
 
I remember when AMD's Fury came out, people bitched, and rightly so, over the low amount of memory. Now the shoe is on the other foot, but we are supposed to put up with "allocated is not in use". It's ridiculous.
Was it rightly so?

Looking back in 2020 at, say, the 1060 6GB vs. the 1060 3GB:
https://www.gpucheck.com/compare/nv...6700k-4-00ghz-vs-intel-core-i7-6700k-4-00ghz/

Considering the 3GB model also has 10% fewer cores, the difference in RAM possibly made no difference in 2019 games (three years after the cards' release), even at 4K resolution. That makes 4 GB of VRAM sound like possibly the perfect amount for a 2015 card, and suggests I overpaid for my 1060 with 6 GB; the extra capacity never proved useful (i.e., the card is not powerful enough anyway to play games at settings that would use all of it).
 
Was it rightly so?

Looking back in 2020 at, say, the 1060 6GB vs. the 1060 3GB:
https://www.gpucheck.com/compare/nv...6700k-4-00ghz-vs-intel-core-i7-6700k-4-00ghz/

Considering the 3GB model also has 10% fewer cores, the difference in RAM possibly made no difference in 2019 games (three years after the cards' release), even at 4K resolution. That makes 4 GB of VRAM sound like possibly the perfect amount for a 2015 card, and suggests I overpaid for my 1060 with 6 GB; the extra capacity never proved useful (i.e., the card is not powerful enough anyway to play games at settings that would use all of it).
That doesn't show tests done for low-VRAM situations at all. I ran into a VRAM issue on my GTX 780 playing Alice: Madness Returns, and it was VERY obvious I was running into VRAM problems. A 780 is normally faster than a 470, but it wasn't in this game, precisely because of low VRAM. The game required 4GB with everything maxed, and there were stutters. I had to put in an RX 470 with 8GB to play the game smoothly, precisely because of VRAM. Turned down the PhysX setting on the RX 470 and there was no more stuttering. So yes, rightly so. Again, there have been plenty of benches now that show problems with low VRAM. Pretending like it never happens is just straight-up dishonest about why VRAM is there at all.
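That kind of stutter is easy to spot in a frame-time capture, which is what those benches do. A quick sketch, assuming a CSV from a tool like OCAT/FrameView with a per-frame MsBetweenPresents column (adjust the column name for your capture tool):

Code:
# Hedged sketch: spot VRAM-style stutter in a frame-time capture.
# Assumes a CSV with a per-frame "MsBetweenPresents" column (tool-dependent).
import csv
import statistics

def stutter_report(path, column="MsBetweenPresents"):
    with open(path, newline="") as f:
        frametimes = [float(row[column]) for row in csv.DictReader(f)]
    med = statistics.median(frametimes)
    worst_1pct = sorted(frametimes)[int(len(frametimes) * 0.99)]
    spikes = sum(1 for ft in frametimes if ft > 2 * med)  # crude stutter heuristic
    print(f"avg fps:           {1000 * len(frametimes) / sum(frametimes):.1f}")
    print(f"1% low fps:        {1000 / worst_1pct:.1f}")
    print(f"spikes >2x median: {spikes} of {len(frametimes)} frames")

# stutter_report("capture.csv")

A run of big spikes while VRAM sits pegged at the card's limit is the classic signature.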
 
Who is this imaginary "we" and what does Fury have to do with anything? Its VRAM was the least of its problems, and I'd say the shoe is only on the other foot if someone is already suffering under the delusion of some tit-for-tat, us-versus-them, imaginary fanboy brand war.

Verifiable facts should be the goal. And I don't think anyone rational would argue it's not great that we have choice in VRAM sizes, now across multiple brands with RX 6000 looking strong. But it's silly to imply that an outlier, partnered game means 10GB GPUs are obsolete or underleveled. Most AAA new releases will make sure they support the 3080. For the same money, I'd rather have 20% more raster performance at the cost of 20% VRAM (hell, I'd take 10% more raster and give up 30% VRAM), since the VRAM will tend to matter less across a broad swath of games. Someone else might have different needs and run a few games that are mod-heavy and eat tons of VRAM. Choice is cool.
You like Nvidia. Got it. Personally I just follow best perf/$. But to each their own.

I don't think anyone is saying 10GB is obsolete. But it's not future-proofed as well as, say, the 3090 is. If Nvidia does release a Ti with more memory, it too would be better.
 