Starfield

Multiple reviewers are saying to finish the main story before going after side stuff. That applies both before and after NG+.
 
From "it's kind of obvious there is a contract / agreement, but we don't know the specifics" to "guesses" in just a few posts... and people wonder how all this trash gets started and repeated?
In this day and age, if you keep screaming about something, even with no proof or evidence, that apparently means it must be true! People like to pile on, when usually 90% of the time the people doing the screaming haven't even touched the product; they just read about it on the internet.

Gotta love the times.
 
The new AMD driver claims up to 16% more performance in Starfield at 4K for the 7900 XT/XTX. That should be enough to give the XTX a locked 60 fps on Ultra, with no FSR.


This is what I am interested in.

I know my Threadripper 3960X is aging, but I'm hoping it will get me through this title with 60 fps minimums.

I wonder if it uses lots of threads? :D
Turn off SMT. You will get at least a small performance boost.
"By all accounts," you mean by rumors you heard on line? Guess you missed Frank Azor directly addressing this exact rumor?

Edit: "AMD claims there’s nothing stopping Starfield from adding Nvidia DLSS" https://www.theverge.com/2023/8/25/22372077/amd-starfield-dlss-fsr-exclusive-frank-azor
If AMD was so bullish, the game would have launched with FSR 3! This is their highest-profile partnership in over a year.

I think the most likely explanation is that it's a huge game and Bethesda didn't have time to test other stuff. FSR 2 works on everything, so that's the smart move for now.
 
In this day and age, if you keep screaming about something, even with no proof or evidence, that apparently means it must be true! People like to pile on, when usually 90% of the time the people doing the screaming haven't even touched the product; they just read about it on the internet.

Gotta love the times.
When that "story" came out, I was shocked that AMD would do that. I called a contact at an AMD competitor whom I have known for 20 years. Long story short: "It is all bullshit, AMD does not do that." I called a couple of others and got the same answer. That story was a plant by NV, IMO, to make AMD the bad guy, and now it all makes sense.

Resources at NV are going away from gaming, IMO, and it looks pretty shitty to let Starfield RTX owners "suffer" because of that, if that is the story NV lets stand. Instead we get "AMD locked us out of Starfield." And I was just told that perf is looking suspect in the NV vs AMD comparison as well. I bet we get another "It's AMD's fault" if perf comparisons end up being "bad" for NV. Guess we will see soon enough.

 
[Frank Azor, AMD gaming marketing head] He admits that — in general — when AMD pays publishers to bundle their games with a new graphics card, AMD does expect them to prioritize AMD features in return. “Money absolutely exchanges hands,” he says. “When we do bundles, we ask them: ‘Are you willing to prioritize FSR?’”
Starfield is bundled with AMD GPUs/CPUs, so it sounds like a contractual agreement between Bethesda and AMD.
But without the full details we don't know why DLSS hasn't been implemented; it's all just guesses.

I tend to think of it this way: both FSR and DLSS take developer time to implement.

Most titles are rushing toward a deadline to hit the market.

I interpret AMD's line about prioritizing FSR (though I could be wrong) to mean that when there is a time crunch at the end, developers will do FSR first and let Nvidia and Intel get their vendor-specific technologies implemented "when there is time".

And even if you aren't an AMD fan, this approach kind of makes sense: if the developer goes for FSR first, the game launches with scaling that, while it may not be optimized for Nvidia hardware, works on all GPUs, whereas if they prioritized DLSS instead, scaling would not work on non-Nvidia GPUs.
 
I tend to think of it this way: both FSR and DLSS take developer time to implement.

Most titles are rushing toward a deadline to hit the market.

I interpret AMD's line about prioritizing FSR (though I could be wrong) to mean that when there is a time crunch at the end, developers will do FSR first and let Nvidia and Intel get their vendor-specific technologies implemented "when there is time".

And even if you aren't an AMD fan, this approach kind of makes sense: if the developer goes for FSR first, the game launches with scaling that, while it may not be optimized for Nvidia hardware, works on all GPUs, whereas if they prioritized DLSS instead, scaling would not work on non-Nvidia GPUs.
Exactly. And to put a fine point on it, AMD effed this up too, by not addressing it through the end of the development phase. AMD left the door open for NV to run with that story, IMO.
 
Exactly. And to put a fine point on it, AMD effed this up too, by not addressing it through the end of the development phase. AMD left the door open for NV to run with that story, IMO.
Agreed. I also think AMD dropped the ball by not including FSR 3 with Starfield. I think that would have been a good PR move.
 
Resources at NV are going away from gaming, IMO, and it looks pretty shitty to let Starfield RTX owners "suffer"
To think Nvidia could fix everything by scooping up a few of these lucrative partnerships and giving "80+%" of PC gamers the "correct" experience.

It would be so little money compared to their often-stated massive income over the competition.

But eh, I guess they just don't care. It would suck if I had bought a GPU based solely on Nvidia marketing....
 
I tend to think of it this way: both FSR and DLSS take developer time to implement.

Most titles are rushing toward a deadline to hit the market.

I interpret AMD's line about prioritizing FSR (though I could be wrong) to mean that when there is a time crunch at the end, developers will do FSR first and let Nvidia and Intel get their vendor-specific technologies implemented "when there is time".

And even if you aren't an AMD fan, this approach kind of makes sense: if the developer goes for FSR first, the game launches with scaling that, while it may not be optimized for Nvidia hardware, works on all GPUs, whereas if they prioritized DLSS instead, scaling would not work on non-Nvidia GPUs.
Yup, probably working on FSR 3 right now, or will be after fixing release bugs. It's new tech, and they have multiple platforms to juggle.

In other news, 8 GB of VRAM isn't an issue per Digital Trends' performance article. Too bad the medium preset takes a significant image-quality hit. Oh well.
Will pick up a 6800 XT if priced well, or a 7800 XT on Sept 6.
 
When that "story" came out, I was shocked that AMD would do that. I called a contact at an AMD competitor whom I have known for 20 years. Long story short: "It is all bullshit, AMD does not do that." I called a couple of others and got the same answer. That story was a plant by NV, IMO, to make AMD the bad guy, and now it all makes sense.

Resources at NV are going away from gaming, IMO, and it looks pretty shitty to let Starfield RTX owners "suffer" because of that, if that is the story NV lets stand. Instead we get "AMD locked us out of Starfield." And I was just told that perf is looking suspect in the NV vs AMD comparison as well. I bet we get another "It's AMD's fault" if perf comparisons end up being "bad" for NV. Guess we will see soon enough.

[Attachment 595203: chart of Nvidia's quarterly financial results]

It seems almost unbelievable that, at the outrageous prices they have been charging lately, they predict a loss from gaming (if I am reading that chart right).

But yeah, the money is in AI and machine learning for sure. Those buyers will spend thousands or even tens of thousands on a GPU without batting an eye.
 
It seems almost unbelievable that, at the outrageous prices they have been charging lately, they predict a loss from gaming (if I am reading that chart right).

But yeah, the money is in AI and machine learning for sure. Those buyers will spend thousands or even tens of thousands on a GPU without batting an eye.
I see what you are saying about a loss from gaming, but that figure is cost of revenue for the entire pipeline, not just gaming.
Gaming is up 11% Q/Q.
Edit: they have gains across all departments.
 
And I was just told that perf is looking suspect in the NV vs AMD comparison as well. I bet we get another "It's AMD's fault" if perf comparisons end up being "bad" for NV. Guess we will see soon enough.
IMO, it's Nvidia's own fault. Their drivers have notably more CPU overhead than AMD's, and in some of these new games that only use a couple of CPU threads, or games that are CPU-intensive in general, Nvidia cards can get choked out in CPU-limited scenarios. High-end cards like the 4080 and 4090 don't see their full potential even on the best CPUs, and mid-range parts can also suffer if the player is using a CPU that is a generation or two old.
 
Hey now, I'm left-handed too, but maybe early-age PC use made me somewhat ambidextrous when it comes to using a mouse/keyboard.

I've been using computers since I was a kid (41 now) but I'm too retarded to use the mouse with my right hand. I am not ambidextrous at all.
 
I've been using computers since I was a kid (41 now) but I'm too retarded to use the mouse with my right hand. I am not ambidextrous at all.
Not gonna lie, when I see those left-handed mice, they look like alien devices to me :D

Hey, whatever works.
 
IMO, it's Nvidia's own fault. Their drivers have notably more CPU overhead than AMD's, and in some of these new games that only use a couple of CPU threads, or games that are CPU-intensive in general, Nvidia cards can get choked out in CPU-limited scenarios. High-end cards like the 4080 and 4090 don't see their full potential even on the best CPUs, and mid-range parts can also suffer if the player is using a CPU that is a generation or two old.

Wouldn't this be something that DX12 splitting the GPU-related CPU tasks over all cores would help with?

Still, there's probably going to be one thread that is pinned, one thread at 80%, two more threads at ~12%, and the remaining 44 at 1-2%.
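
For reference, the DX12 pattern being described looks roughly like this (a minimal sketch, assuming the device, queue, per-thread allocators, and command lists were created elsewhere; RecordScenePart and the scene split are made up for illustration):

```cpp
// Minimal sketch: recording D3D12 command lists on worker threads.
// Assumes device, queue, per-thread allocators, and command lists
// were created elsewhere; error handling omitted.
#include <d3d12.h>
#include <thread>
#include <vector>

void RecordScenePart(ID3D12GraphicsCommandList* cmd, int part)
{
    // Each worker records the draw calls for its slice of the scene:
    // SetPipelineState, resource bindings, Draw*() calls, etc.
    cmd->Close(); // finish recording on this thread
}

void SubmitFrame(ID3D12CommandQueue* queue,
                 std::vector<ID3D12GraphicsCommandList*>& lists)
{
    // Record every slice in parallel...
    std::vector<std::thread> workers;
    for (int i = 0; i < (int)lists.size(); ++i)
        workers.emplace_back(RecordScenePart, lists[i], i);
    for (auto& t : workers)
        t.join();

    // ...but submission is still one call on one thread/queue.
    std::vector<ID3D12CommandList*> raw(lists.begin(), lists.end());
    queue->ExecuteCommandLists((UINT)raw.size(), raw.data());
}
```

Even with this, submission still funnels through one queue on one thread, which fits the "one pinned thread" pattern above.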
 
But with your average gamer having at least 6c/12t, I would think at least 4 threads wouldn't be too much to ask for?

Especially since we've had dual cores since 2005 and quads since 2006.
I think they have had more than enough time to do it. Speaking of which, there are more than a few games that can take advantage of at least 8 threads.
Depends; it just may not be possible in some kinds of engines. There are tasks you just cannot split out in parallel. As I said, it's easy to say "they should just do it because we have the CPUs," but that doesn't mean it's actually easy to do.

I don't disagree that they should do it if they can, but this idea that developers could just make a game use more cores if they really wanted to is not necessarily true. For that matter, it probably does use more cores; it just can't load them down. You can spin things off onto other cores, but if they aren't that intense, they won't hit those cores very hard.

As a not-precisely-analogous situation: I've had cases in DAWs where I've pegged out a core but had plenty of CPU available overall. Audio processing is easy to split up: each effect can run as a separate thread, and you can process all the FX on all the channels in parallel. However, that doesn't mean you can do EVERYTHING in parallel. The situation I had that pegged a core was script processing in a plugin. The instrument I was using ran a rather intense script to determine what it did, and a lot of playback was happening at once. Since it is a script, it isn't something you can just toss off to more threads, and as such that one plugin was pegging the core it was on, even though the CPU as a whole had plenty of power left. It wasn't that the DAW wasn't processing things in parallel; it was that one thread was WAAAAY more intense than the others.
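
A toy version of that situation in code, just to illustrate (plain C++ threads, hypothetical workloads, nothing DAW-specific):

```cpp
// Toy illustration: eight light parallel tasks plus one heavy serial
// one. The heavy thread pegs its core and sets the pace, even though
// the CPU as a whole has plenty of headroom. Workloads are made up.
#include <atomic>
#include <chrono>
#include <thread>
#include <vector>

std::atomic<bool> running{true};

void LightFxTask()              // like one effect on one channel
{
    while (running)
        std::this_thread::yield();   // a little work, mostly idle
}

void HeavyScriptTask()          // like the scripted instrument plugin
{
    volatile double x = 1.0;
    while (running)
        x = x * 1.0000001 + 1.0;     // serial chain: each step needs
                                     // the last one, never yields
}

int main()
{
    std::vector<std::thread> fx;
    for (int i = 0; i < 8; ++i)
        fx.emplace_back(LightFxTask);
    std::thread script(HeavyScriptTask); // the one pegged core

    std::this_thread::sleep_for(std::chrono::seconds(5));
    running = false;
    for (auto& t : fx) t.join();
    script.join();
}
```

Overall CPU use looks low, but the one heavy thread is saturated, which is exactly the "plenty of headroom, still bottlenecked" picture above.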
 
I also think AMD dropped the ball by not including FSR 3 with Starfield.

Yup, probably working on FSR 3 right now, or will be after fixing release bugs. It's new tech, and they have multiple platforms to juggle.

Let's face it: if AMD could have done it, they would have, as this is the best marketing platform they will have for a long, long time. Putting out a "half-assed" FSR 3 with Starfield would not have been a good move.
 
But with your average gamer having at least 6c/12t, I would think at least 4 threads wouldn't be too much to ask for?

Especially since we've had dual cores since 2005 and quads since 2006.
I think they have had more than enough time to do it. Speaking of which, there are more than a few games that can take advantage of at least 8 threads.

Unfortunately, it doesn't really matter how much you want something.

As any computer science major from a reputable school will tell you, true multi-threading is hard, and in many cases it is simply not possible no matter how many resources you throw at the problem.

A lot of the calculations we do on computers, particularly the game logic in first- and third-person shooters, are time dependent. When you are time dependent, the input for one calculation is often the output of the previous one, so even if you put the next calculation on a different CPU core, you still have to wait for the first one to finish before you can start the second, unless you design a time-traveling CPU.
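
A tiny sketch of that dependency chain (the WorldState and Simulate names are hypothetical, just to illustrate):

```cpp
// Why time-dependent game logic resists threading: step i+1's input
// is step i's output, so the chain is inherently serial.
struct WorldState { /* positions, velocities, ... */ };

// Hypothetical step function: next state from previous state.
WorldState Simulate(const WorldState& prev, double dt);

void RunFrames(WorldState state, int frames, double dt)
{
    for (int i = 0; i < frames; ++i)
        state = Simulate(state, dt); // can't start step i+1 until
                                     // step i has finished
}
```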

Strategy games have more opportunities for multi-threading (like Civilization's between-turn calculations), but even there it is difficult. If you don't do it just right, you wind up with deadlocks (thread 2 waiting for the output of thread 1, which is waiting for the output of thread 2, causing an infinite hang) or with code that is technically multi-threaded but actually runs slower than the single-threaded solution due to cache thrashing and the like.

What we have in most modern titles is not true fine-grained multi-threading (where small tasks are split off and run on different threads). Instead, different portions of the game are split onto different threads that communicate with each other: the main game logic on one thread (usually the one that is pinned), audio on another, physics on yet another, and so on. Rather than true multi-threading, you get a bunch of single-threaded tasks that send status updates to each other.
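
In sketch form, that coarse "one subsystem per thread" layout looks something like this (a simplified mock-up, not any particular engine; the Event and EventQueue types are invented):

```cpp
// Coarse threading: whole subsystems on their own threads, passing
// status updates through queues, rather than one task fanned out
// across every core.
#include <mutex>
#include <queue>
#include <thread>

struct Event { int type; /* payload */ };

class EventQueue {                        // thread-safe status updates
    std::mutex m;
    std::queue<Event> q;
public:
    void push(const Event& e) { std::lock_guard<std::mutex> lk(m); q.push(e); }
    bool pop(Event& e) {
        std::lock_guard<std::mutex> lk(m);
        if (q.empty()) return false;
        e = q.front(); q.pop();
        return true;
    }
};

EventQueue toAudio, toPhysics;

void GameLogicThread()                    // usually the pinned one
{
    for (;;) {
        // ... run a tick of game logic, then notify the others ...
        toAudio.push({1});
        toPhysics.push({2});
    }
}

void AudioThread()   { Event e; for (;;) if (toAudio.pop(e))   { /* mix */ } }
void PhysicsThread() { Event e; for (;;) if (toPhysics.pop(e)) { /* step */ } }

int main()
{
    std::thread logic(GameLogicThread);
    std::thread audio(AudioThread);
    std::thread physics(PhysicsThread);
    logic.join(); audio.join(); physics.join();
}
```

Note how the logic thread can saturate its core while the audio and physics threads sit mostly idle waiting for updates.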

While some tasks (like rendering and encoding) lend themselves very well to multi-threading, the sad truth is that the majority of code just can't be multi-threaded, no matter how many talented programmers you hire or how much time you let them spend on it.

My educated guess is that in big, resource-heavy programs (like games), most things that can be multi-threaded already have been, and have been for some years now, and that we shouldn't expect this to get much better with time, as additional effort will largely be wasted. Many of these workloads simply cannot be multi-threaded, regardless of how much we desire it, how much time passes, or how many new research findings are discovered.
 
Hmm, should I even try to run Starfield on a 4790K?

Well, they list the minimum spec as an i7-6800K or a Ryzen 5 2600X.

A 4790K isn't THAT far off from either of those, but it does have fewer threads, which may be a problem.

I honestly couldn't tell you.

It's going to depend on how hard the game hits threads beyond the first 4, and on what your expectations of "playable" are.

If it's a game you're interested in, I'd give it a try and just use Steam's refund policy if the performance disappoints you.
 