I have to give AMD some credit here...

zamardii12

So, maybe this isn't AMD so much as it is the optimization of the game, but here is what I did over the weekend...

So I have always held onto my last gaming rig and kept it in storage since I upgraded a while ago, but I decided to re-purpose it over the weekend. At my work we were retiring a bunch of Dell 790s... 4th-gen Core i5s (still quad-core), and I added the memory from my old gaming rig to push it up to 8 GB of RAM. I put in my PSU, bought a 128 GB SSD, and added a 2 TB spinning drive; the idea was just to put together a simple media-center PC. I have a wireless keyboard with a built-in touchpad so the wife and I could watch movies, Netflix, Sling, and the TV shows I have stored on the 2 TB drive.

But I got to thinking over the weekend about whether the video card could handle anything reasonably recent, so I downloaded the latest driver I could find for the card, which was 15.7.1 from 2015 (I'm actually surprised it was supported for so long).

As you can see from my sig, I am using an Nvidia card in my main rig now, but this is the card I held onto from my old gaming computer...

[attached image: H697QT2G2M.jpg]


It has a HUGE cooler on it. I had to take a pair of metal-cutting scissors and cut out the hard drive bays just to get it to fit inside that business-class Dell 790 mid-tower case.

Anyway, my first test was Resident Evil 7. Our TV is 1080p, so I loaded it up @ 1080p with pretty much all the settings on High, and I have had virtually no issues playing the game with this setup. I get the occasional FPS dip, but the game looks and runs fantastic 99% of the time. I am quite shocked at how well it has been running. I am going to try some other newer games to see how things go, but I really have to hand it to AMD... If I didn't have that itch to always have the latest and greatest, I don't see why the HD 6970 couldn't still run a decent number of today's games with ease, and I probably would still be okay if I hadn't upgraded. This probably isn't all due to AMD... Capcom likely optimized RE7 quite well, but it's still impressive regardless.
 
Lol, sorry, but until you post proof of it I don't believe that. There's no way on this earth that game will run with everything on High at 1080p and still hold a decent FPS. At best you are running with everything on Medium with 1080p interlaced rendering at a lower resolution scale than native, and no further than the guest house.

I agree the game is pretty well optimized (not counting the poor vRAM management and the way they programmed the texture pool and shadow rendering performance), and the first area of the game can be deceiving. Once the game starts to swap assets and change areas, I'm pretty sure everything will be barely playable with that GPU and only 2 GB of vRAM. I have seen much more powerful GPUs than that suffer this fate with this game (e.g. the HD 7870), fooling a lot of guys in the first 30-60 minutes of the game.
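If anyone wants to check whether the 2 GB frame buffer really is the limit, one option is to log VRAM usage with GPU-Z or MSI Afterburner while playing and then summarize the log afterwards. Below is a minimal Python sketch along those lines; the log file name and the "Memory Used (Dedicated) [MB]" column header are assumptions and will differ depending on the tool and version.

Code:
# vram_peak.py -- report peak VRAM usage from a GPU sensor log (sketch).
# Assumes a CSV exported by GPU-Z / Afterburner with a dedicated-memory column;
# the file name and column header below are assumptions, not fixed names.
import csv

LOG_FILE = "gpu_sensor_log.csv"                   # hypothetical log file name
VRAM_COLUMN = "Memory Used (Dedicated) [MB]"      # assumed column header

def peak_vram_mb(path, column):
    peak = 0.0
    with open(path, newline="", encoding="utf-8", errors="ignore") as f:
        for row in csv.DictReader(f):
            value = (row.get(column) or "").strip()
            try:
                peak = max(peak, float(value))
            except ValueError:
                continue  # skip blank or malformed samples
    return peak

if __name__ == "__main__":
    peak = peak_vram_mb(LOG_FILE, VRAM_COLUMN)
    print("Peak VRAM usage: %.0f MB" % peak)
    if peak >= 2048:
        print("A 2 GB card would be spilling past its frame buffer here.")

If the peak sits well above 2048 MB in the later areas, that would support the asset-swapping explanation; if it stays under, the HD 6970 result is less surprising.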
 
That's the problem... everyone has that "itch" to upgrade and have the latest and greatest, and in return, that is what drives the industry.

Another point: what is considered a "decent framerate" these days, in 2018 and 2019?
 
It's very believable, depending on exactly which settings and what frame rate he considers acceptable. The measly 1060 3GB averages 81 FPS at 1080p on maximum settings with Shadow Cache off. Lower a few more settings and an average of 30+ FPS from this console port seems very doable.
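For anyone who wants to put a number on "doable", capturing frame times (with PresentMon or a similar tool) and computing the average alongside the 1% lows says more than a single average FPS figure, since it catches the dips the OP mentions. A rough Python sketch, assuming a CSV capture with one frame-time-in-milliseconds column; the file name and the "MsBetweenPresents" column name are placeholders that depend on the capture tool:

Code:
# fps_stats.py -- average FPS and 1% lows from a frame-time capture (sketch).
# Assumes a CSV with one per-frame milliseconds column; both names below are
# placeholders and depend on the capture tool.
import csv

LOG_FILE = "frametimes.csv"
FRAMETIME_COLUMN = "MsBetweenPresents"

def load_frametimes_ms(path, column):
    times = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                times.append(float(row[column]))
            except (KeyError, ValueError):
                continue
    return times

def summarize(times_ms):
    avg_fps = 1000.0 / (sum(times_ms) / len(times_ms))
    # "1% low": average FPS over the slowest 1% of frames (the dips).
    slowest = sorted(times_ms, reverse=True)[: max(1, len(times_ms) // 100)]
    low_1pct_fps = 1000.0 / (sum(slowest) / len(slowest))
    return avg_fps, low_1pct_fps

if __name__ == "__main__":
    times = load_frametimes_ms(LOG_FILE, FRAMETIME_COLUMN)
    if times:
        avg, low = summarize(times)
        print("Average: %.1f FPS, 1%% low: %.1f FPS" % (avg, low))
    else:
        print("No frame times found in", LOG_FILE)

If the 1% lows stay above roughly 30 FPS through the later areas, the "runs fine 99% of the time" claim becomes something you can verify rather than argue about.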
 

When I go home tonight I will verify. I know for sure I am playing at 1080p and have some things set to High, but maybe not everything. But again, subjectively, what I have set up ran it surprisingly well and looks pretty damn good; since I didn't pay much attention when initially configuring the settings, I will double-check them when I get home.
 
I was extremely impressed with how well RE7 plays on old hardware. I played it on my old GTX 670 2GB @ 1200 with the rig in my sig, at max settings at first (1080p). I was able to sustain 60 fps until I hit the basement areas; that's when it became a slideshow. I had to lower shadows and turn the shadow cache off, and after that it was smooth sailing all the way to the end. Thinking about doing a replay on my GTX 960 4GB to see if the extra VRAM makes a difference.
 
Try the 16.101.X Pre-GCN driver re-pack for your HD 6970. I run an antiquated HD 6750 (BIOS-overclocked to 825/1250) in my daily driver, as I mostly play older games, but I do play some newer stuff. I have found this driver pack works very well and solved compatibility problems I was having with the 15.7 and 15.11 beta drivers in Far Cry 3/4/5. I have a 24" Samsung 1366x768 monitor because I'm blind and like the lower resolution for larger text, and I can even play Far Cry 5 and Doom 2016 on my relic of a video card at low settings just fine (30 FPS avg).

https://forums.guru3d.com/threads/16-101-x-pre-gcn-driver-pack.409002/
 
As for what counts as a "decent framerate" these days: 60 fps will always be the standard at first.
120 and up isn't going to be possible in games that push the envelope hard, which is why 60 remains the standard.
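One way to frame the 60-versus-120 debate is in terms of the per-frame render budget: each doubling of the target frame rate halves the time available to render a frame. A trivial arithmetic sketch, with no assumptions about any particular game or hardware:

Code:
# frame_budget.py -- per-frame render budget for common FPS targets.
for target_fps in (30, 60, 120, 144):
    budget_ms = 1000.0 / target_fps
    print("%3d FPS target -> %5.2f ms per frame" % (target_fps, budget_ms))
# 120 FPS leaves half the per-frame time of 60 FPS, which is why games that
# push visuals hard tend to settle on 60 as the practical target.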
 
Depends on the type of game.

RPGs at 30 fps are generally fine for me; slow-paced shooters would probably fit that too.
This I should have mentioned, thank you for that.
It's very much how I tend to look at my games as well. Typically, for single player I can deal with 40 fps or higher.
 