5800X3D Desktop Replacement

Did you even read the article you just referenced?

Higher Framerate != Lower Latency (in DLSS 3 FG)
You seem to find it hard to understand that DLSS 3 with Reflex on delivers a sizable uplift in FPS while still having better input lag than native with Reflex off.

Keep thinking whatever you want, but DLSS 3 is shaping up to be a pretty good add-on to Nvidia's arsenal no matter how you slice it.
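
For what it's worth, here's a toy sketch of the latency arithmetic behind that claim. Every number in it (frame times, the queue depth, the one-frame hold for interpolation) is an illustrative assumption, not a measurement:

[CODE=python]
# Toy latency-budget model. All numbers are illustrative assumptions, not measurements.

def pipeline_latency_ms(render_fps, queued_frames, extra_ms=0.0):
    """Rough end-to-end latency: frames waiting in the render queue,
    plus one frame of render time, plus any extra pipeline cost."""
    frame_time = 1000.0 / render_fps
    return (queued_frames + 1) * frame_time + extra_ms

# Native, Reflex off: GPU-bound, driver queue holding ~2 frames (assumed).
native_no_reflex = pipeline_latency_ms(render_fps=60, queued_frames=2)

# DLSS 3 FG + Reflex: Reflex keeps the queue empty, frame generation holds back
# roughly one rendered frame for interpolation, and the displayed framerate
# doubles (60 rendered -> ~120 shown).
fg_with_reflex = pipeline_latency_ms(render_fps=60, queued_frames=0,
                                     extra_ms=1000.0 / 60)

print(f"Native, Reflex off: ~{native_no_reflex:.0f} ms")  # ~50 ms
print(f"FG + Reflex:        ~{fg_with_reflex:.0f} ms")    # ~33 ms, at twice the shown FPS
[/CODE]

Under those assumptions the frame-generated path ends up both higher-FPS and lower-latency than native with Reflex off, which is the comparison being argued about; compare it against Reflex on without FG instead and the picture flips.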
 
A lot of it is on the developers, too. The settings in today's games let you turn the image quality down far enough that the game will hit 60 fps on a smartwatch. They do this because they want to capture the widest possible audience; globally, 90% of hardware is always three generations behind at any given moment, and that's a lot of potential customers to cut out. On top of this, they've also been unwilling to write engines where the underlying mechanics aren't fundamentally keyed to the frame rate.
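
To illustrate what decoupling mechanics from frame rate looks like, here's a minimal fixed-timestep loop sketch (illustrative Python, not any particular engine's API):

[CODE=python]
import time

# Minimal fixed-timestep loop: mechanics advance in constant TICK-sized steps no
# matter how fast frames render, so physics and timers behave the same at 30 fps
# or 300 fps. Illustrative only; real engines also interpolate between the last
# two simulation states when rendering.

TICK = 1.0 / 60.0              # simulation advances 60 times per second
accumulator = 0.0
previous = time.perf_counter()
position, velocity = 0.0, 5.0  # units and units-per-second

def render(pos):
    pass                       # drawing runs as fast as the GPU allows

while position < 25.0:         # stand-in for "while the game is running"
    now = time.perf_counter()
    accumulator += now - previous
    previous = now

    # Consume elapsed real time in fixed chunks, so gameplay never depends on framerate.
    while accumulator >= TICK:
        position += velocity * TICK
        accumulator -= TICK

    render(position)
[/CODE]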

At the same time, the gamers are an incredibly whiny group. There's less crying in a typical daycare. You'd think the developers had murdered entire families of gamers just because they set minimum requirements above a 386. On top of that, gamers are convinced that they're basically all global esports champions who make their living by winning tournaments and streaming on twitch. "An extra 500ns of lag is easily noticeable and makes the game totally unplayable" typed the gamer on his BT keyboard. Meanwhile, LTT has had legitimate pro gamers be completely unable to tell the difference between 90fps and 144fps in blind testing.
Actually, averaged out, LTT's test crews have consistently done better overall on high-framerate monitors. As you mention, the testers have sometimes even included notable pro gamers.
 
Linus Tech Tips did it, and for regular people in a sniper experiment they saw different results; there are a bunch of studies on this.



At that timestamp, they show the difference between the tests at 144 FPS / 144 Hz vs. 240 FPS / 240 Hz, and it is giant for some and small but seemingly significant for others: a higher average hit rate, with much less standard deviation.


If those experiments were well done (blind, enough trials, etc.), it seems that for competitive shooters it would not just be voodoo: https://kr4m.com/high-fps-better-gamer/

It also shows that past 144 Hz / 144 FPS, the returns diminish quite a bit, even for good players it seems.
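
As a rough sketch of what "higher average hit rate, much less std" would mean statistically, here's a toy calculation with invented numbers (not LTT's data):

[CODE=python]
# Toy check of whether a hit-rate gap between refresh rates is more than noise.
# The per-trial hit percentages are invented for illustration; they are not LTT's data.
from statistics import mean, stdev
from math import sqrt

hits_144hz = [61, 58, 64, 60, 57, 63, 59, 62]   # hit % per trial (made up)
hits_240hz = [68, 66, 70, 67, 69, 65, 68, 71]   # hit % per trial (made up)

for name, xs in (("144 Hz", hits_144hz), ("240 Hz", hits_240hz)):
    print(f"{name}: mean {mean(xs):.1f}%, std {stdev(xs):.1f}")

# Welch's t statistic: with this many trials, a value well above ~2 suggests the
# gap is unlikely to be chance alone.
m1, m2 = mean(hits_144hz), mean(hits_240hz)
v1, v2 = stdev(hits_144hz) ** 2, stdev(hits_240hz) ** 2
t = (m2 - m1) / sqrt(v1 / len(hits_144hz) + v2 / len(hits_240hz))
print(f"Welch t ~ {t:.1f}")
[/CODE]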

I see a flaw in their argument: it is not just the FPS and Hz of the monitor, but the pixel response time of the monitor. Yes, an LCD's pixel response time improves with more FPS, but it still does not compare to OLED pixel response times. In short, having the most recent image data to react to gives you a potential advantage; OLED rules for motion clarity and speed.

A 240 Hz LCD may seem smoother than, let's say, a 120 Hz OLED; is that more due to the blur of LCD displays while the pixels update, or due to the higher FPS? Probably both, while OLED's sharp changes from frame to frame are discernible. Of course, adding motion blur would smooth that out perception-wise while smudging up the visual cues you use to shoot and aim.

The bottom line is that I would like to see OLED panels compared as well.
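
As a back-of-the-envelope way to frame it, "image freshness" is roughly the frame interval plus the panel's pixel response time. The response-time figures below are generic ballpark assumptions, not measurements of any specific panel:

[CODE=python]
# Back-of-the-envelope "image freshness": frame interval plus pixel response time.
# Response-time figures are generic ballpark assumptions, not measurements of any panel.

panels = {
    "240 Hz LCD  (GtG ~4 ms)": (240, 4.0),
    "120 Hz OLED (~0.1 ms)":   (120, 0.1),
    "240 Hz OLED (~0.1 ms)":   (240, 0.1),
}

for name, (hz, response_ms) in panels.items():
    frame_interval = 1000.0 / hz
    total = frame_interval + response_ms
    print(f"{name}: {frame_interval:.1f} ms/frame + {response_ms} ms response = ~{total:.1f} ms")
[/CODE]

With those assumed numbers, a 240 Hz LCD and a 120 Hz OLED land in roughly the same ballpark, while a 240 Hz OLED clearly pulls ahead, which is exactly the comparison I'd want to see tested.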
 
That's interesting that you chose around the ATI 9600 gen as the staying-power example, because I use around that date range as an example of when computing was exploding. From like 1990 to 2007, CPUs were doubling in speed every two years. I remember trying to use a Pentium II when the Athlon Palomino was already out; it was painful, and that was like a four-year difference. I had similar feelings using Palomino when Core 2 Duo was out. These days I can use a 10-year-old Sandy Bridge laptop and get most of what I need done. It's obviously not great, but it's usable. Four-year-old CPUs don't even feel that old now. With GPUs the story is a little different.
Yeah, it was a 2.4 GHz Northwood with the ATI, and I remember using it for a good long while. I think it became something of a novelty at some point, but it ran well until I bricked it with the "latest" BIOS; I'm not even sure what I was trying to do with it anymore.

I've been on an upgrade kick ever since. Had my moments where I leveled off. I think Intel hit their stride with Sandy Bridge and then just rode that wave for like five generations of CPUs. Started shifting gears around 8th gen due to AMD actually coming out swinging. I had figured AMD was near death: they would either release something of relevance or just become the budget console processor for life and eventually fade into obscurity. I was a big fan of the 8th and 9th gen Intel parts, good, cheap, and fast i5s for gaming. Then I switched to AMD and, I have to say, I was somewhat disappointed until the 5000 series.

Now I've hit this comfortable point where most of my stuff is 5000-series AMD parts, and I probably won't upgrade for a while. They hit or exceeded parity with Intel's pre-12th-gen parts, and that's fast enough for me. Unless I see the 13th-gen parts and have disposable income, though I doubt it.

The 5800X3D is kind of a gaming flagship that came out of left field. AMD thought they had the tech ready for the previous generation but only realized it in the 5000 series, and now they're having issues with it again in the 7000 series due to die complexity and the machines not being up to the task at that node size.

I feel that this laptop would probably be a pretty sweet desktop replacement for anyone, for a couple of generations. Though the cheapo HP 5700U I picked up is pretty damn good too. It might suck at gaming and it might have a terrible audio codec (it does, audio regularly chops and cuts out ...), but for general usage it will be perfect for a long time. All I do is publishing work on it. It's pretty close to my 5900X in performance, though I do notice the actual difference in real-world use; there's this snappiness that feels a good 25% faster on my desktop. Still, that "feeling" is not what it was 10 years ago, when it was a must-have kind of performance. Now one is just seconds faster than the other, not minutes.
 
I mean, you know what the kids are doing these days though right?

The YouTubers are telling them that framerate is everything, and that you have to minimize all quality settings (and even use config file and command hacks) to disable as much as possible to make sure they get 165hz or whatever their monitor supports constantly.

And these fools are buying it up. They are essentially ruining their own gaming experience in an obsession to have their framerates flatlined at the max their monitor can handle.

No sacrifice is too great. My stepson was running his fancy FreeSync 2 1440p monitor at some low-ass blocky resolution with everything minimized in games, and he doesn't even need to, because he has a pretty decent GPU, and he won't change it back no matter how much I try to educate him, because the fools on YouTube and Twitch tell him otherwise, and shit, what do I know compared to them? I've only been doing this for 30 years...

It's stupid out there.
No joke, man. I still play Battlefield, Forza, and a lil' bit of COD online, and I fully max out my settings at 4K, even use ray tracing... lol. I'm no ultra-sweat lord playing in competitive matches, but I'd say about 90% of the time I am in the top 3, if not first place, every round on public servers. Works for me; as old as I am getting, I am happy with that... lol. Plus I enjoy playing with friends and having a good time.

The main reason I started PC gaming (back in the NES/SNES days), and still play PC games today, is the graphics, and the reason I build beast machines is to run max graphics at good frames. Honestly, 120~144 FPS feels perfect to me, and when I do get dips into the upper 90s it still does not bother me or affect my gameplay. Most of these MP games have shit tick rates well below your framerate anyway, so congrats, you get 260+ FPS on peasant graphics, but the server will still only poll "X" number of times during that same time slice, so enjoy your poop graphics, you plebs, while I still down you with pretty little rays of light and foliage... lol.
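
To put rough numbers on that tick-rate point (the tick rates below are generic ballpark figures, not a claim about any specific game):

[CODE=python]
# Rough comparison of client frame interval vs. server tick interval.
# The tick rates are generic ballpark figures, not a claim about any specific game.

def interval_ms(rate_hz):
    return 1000.0 / rate_hz

for fps in (120, 260):
    print(f"client at {fps} fps renders a frame every {interval_ms(fps):.1f} ms")

for tick in (30, 60, 128):
    print(f"server at {tick} Hz tick processes input every {interval_ms(tick):.1f} ms")

# At 260 fps you draw a frame every ~3.8 ms, but a 60 Hz server only acts on your
# input every ~16.7 ms, so most of those extra frames never change a hit check.
[/CODE]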
 
I mean, you know what the kids are doing these days though right?

The YouTubers are telling them that framerate is everything, and that you have to minimize all quality settings (and even use config file and command hacks) to disable as much as possible to make sure they get 165hz or whatever their monitor supports constantly.

And these fools are buying it up. They are essentially ruining their own gaming experience in an obsession to have their framerates flatlined at the max their monitor can handle.

No sacrifice is too great. My stepson was running his fancy FreeSync 2 1440p monitor at some low-ass blocky resolution with everything minimized in games, and he doesn't even need to, because he has a pretty decent GPU, and he won't change it back no matter how much I try to educate him, because the fools on YouTube and Twitch tell him otherwise, and shit, what do I know compared to them? I've only been doing this for 30 years...

It's stupid out there.
My 17-year-old son is almost that bad. I walked in on him playing Metro Exodus today and made him get up. I turned on HDR, maxed his graphics, and set his monitor to not be full brightness (the see-in-the-dark setting).

He was like, wow, that looks good, but it's laggy. He was getting 70-90 fps in a single-player game, ffs.
 
I see a flaw in their argument: it is not just the FPS and Hz of the monitor, but the pixel response time of the monitor. Yes, an LCD's pixel response time improves with more FPS, but it still does not compare to OLED pixel response times. In short, having the most recent image data to react to gives you a potential advantage; OLED rules for motion clarity and speed.

A 240 Hz LCD may seem smoother than, let's say, a 120 Hz OLED; is that more due to the blur of LCD displays while the pixels update, or due to the higher FPS? Probably both, while OLED's sharp changes from frame to frame are discernible. Of course, adding motion blur would smooth that out perception-wise while smudging up the visual cues you use to shoot and aim.

The bottom line is that I would like to see OLED panels compared as well.
I would be curious about this as well.
 