
Gamers Nexus: NVIDIA's Dirty Manipulation of Reviews

Maybe your balls, but my balls haven't turned green. AMD and Intel are both pretty good alternatives at this point, especially with how bad Nvidia's drivers have gotten. None of Nvidia's proprietary tech is something you must have. AMD's FSR4 is a pretty good alternative to DLSS4, and the same can be said about XeSS. Frame generation is bad no matter which manufacturer you go with. Maybe Nvidia's noise cancellation, but Discord's Krisp works pretty well.

If I am going to be completely honest, while I would LOVE to be able to buy an AMD or Intel GPU that works well for gaming for me, they simply don't make such a product.

Heck, even Nvidia doesn't make such a product.

I want 4K resolution, highest quality settings (including RT) with some form of decent AA, and the ability to hit 0.1% lows above 60fps, without any upscaling or frame gen.

Not even the 5090 can do this right now, but it gets closer than anything else on the market, and as such, if I were GPU shopping right now, it would be the only GPU I would even remotely consider. At least for gaming.

I have a Radeon 7600 (non XT) in my workstation, and it does fine there, but that system will never run a game. Almost any GPU could handle the 2d desktop workloads that machine sees.

AMD or Intel don't have to beat Nvidia at the high end in order for me to consider them, but they at least need to get close, and as it stands they have nothing even within a country mile.

And the truth is this. If I can't get 4k with all settings maxed to be playable (without AI nonsense) then I just don't play. I haven't played a game - any game - in over 6 months. Last game I played was a random Civilization map. Before that, the last game I truly played was the System Shock remaster which I played through a little over a year ago. Before that it was Starfield in late 2023. Before that Dying Light 2. Before that the original release of Cyberpunk, and before that Far Cry 6.

I intended to play the Cyberpunk expansion, but I never did. I hate how they integrated it into the story. If it were a new story in the same setting I would totally have done it, but the fact that they imported it into the same story line ruined it for me. I had already finished the game, and I wasn't about to start a new playthrough.

I guess my reason for mentioning all of this is that I don't just play games for the sake of playing games. I am looking for the perfect experience for one perfect playthrough, and if I can't get it, I just don't play.

For instance, I have been looking forward to Stalker 2 for 15 years, but I still haven't started playing it, because my existing CPU was not up to my performance expectations in that title. I now have a 9950X3D waiting to be built, and I was hoping to get a 5090 so I could avoid enabling scaling, but that seems like a pipe dream considering how they are completely unobtainable unless you feed the scalpers, which I refuse to do.

If I can't meet my performance expectations, I just don't play.

For me to consider buying AMD or Intel GPUs for anything beyond basic 2D desktop use, they need to release something that either meets my performance expectations, gets closer to them than anyone else does, or at least comes within low single-digit percent of whoever is closest.
 
I want 4K resolution, highest quality settings (including RT) with some form of decent AA, and the ability to hit 0.1% lows above 60fps, without any upscaling or frame gen.
Spoiler alert, that is never going to happen. Games will continue to add more graphics features, game studios will continue to cut corners when coding games and let various "upscaling" software make up the performance gap, and when your terms are actually met at 4K a decade from now you will make the same demands but for 8K resolution and still be boycotting gaming.
 
 
Nvidia's best non-existent vaporware which virtually no one could buy, and I say that as someone who literally had every intention of buying an RTX 5090 at release but didn't manage to get to the checkout within 30 seconds of orders going live. I agree with Steve from GN: Nvidia should just fuck off out of the market if it has no real intention of properly servicing ordinary consumers.
Until people stop buying their cards, they won't; it's free money for them...

People who keep buying exclusively nVidia cards will never see anyone else make top-end cards.

This. The brand loyalty some people have drives me nuts. NONE of these companies give a flying crap about any of us; they are in it for the money. Buy the best $/performance product out there, but if a company gets super shady, vote with your wallet.

It is like people who pre-buy AAA game titles and then get mad on release day when it runs like total crap... yeah, they already got your money, they don't care, and then they do it again with the next release... /facepalm

If more people actually voted with their wallets, we might see change, but the general population are mindless twits who can't think for themselves or do basic research and just jump on bandwagons..
 
And the truth is this. If I can't get 4k with all settings maxed to be playable (without AI nonsense) then I just don't play.
Good lord quit being such a princess and enjoy the games. Just because you can't have every setting on max at 4k is a lame reason not to play great games.
 
Nvidia has always been a garbage company. They never invented anything, just bought a bunch of other video card brands several decades ago, and the only thing they have done over the years is increase the number of cores and the amount of memory. Though we have to give the CEO credit for sticking his head inside an emulator-fridge; maybe that's why he keeps repeating himself so much.

I don't know what is with that exclusivity deal, but it's very similar to what they did with PhysX, which made games like Mirror's Edge unplayable on graphics cards other than Nvidia's. That is pure selfishness, and no consideration at all for the industry. Why are they still using the term GPU (graphics processing unit) when it's mostly processing "AI" garbage and cryptocurrency garbage? They have no respect for the environment either, producing these cards with no video outputs. Maybe that's up to the individual manufacturers, but I am not sure about that.

Another little-known thing is that for at least 15 years Nvidia chips have had the ability to disable video memory channels, so many cards that develop artifacts can continue working by disabling a memory channel, losing some performance but at least not becoming electronic waste. I have done this myself; all it takes is changing three bytes or so in the video card firmware, as discovered by one particular Russian video card repair guy.

The Nvidia CEO is a dangerous person. It's not impossible that he might become some evil AI mastermind in the future. And he is not the only such guy: Bill Gates, Elon Musk, Gabe Newell, Mark Zuckerberg, all these guys have too much money and too much power and are doing damage to our society. Bill Gates stole and glued together a bunch of crap to make Windows several decades ago, and today it has barely changed since Windows 95. Elon Musk, what is there to say... electric cars existed 100 years ago. Gabe made a cool game or two, decades ago, but Steam is just a circlejerk. Zuckerberg, same thing; the fucking UI hasn't changed since the very beginning.

The important thing is not to follow these trends, both for the user and the developer. You can make a fun and beautiful game for a video card from 20 years ago, let alone today's top Intel video card.
Electric cars from more than 100 years ago at this point are mentionable only on a technicality and are farcical to include with your luddite rant. The newer electric cars are a lot more competitive with the fuel swilling models. The first electric cars were quickly outmoded by the fuel swillers.
 
"Swilling" is an interesting choice of word for something that can push over 3000 pounds 12 miles at 65 mph with about a half-liter of fuel.
 
I want 4K resolution, highest quality settings (including RT) with some form of decent AA, and the ability to hit 0.1% lows above 60fps, without any upscaling or frame gen.
I just buy the fastest gaming CPU and GPU at any given time. When I can't max out games with all the eye candy, then it's time to upgrade. If nothing exists because I'm waiting on new CPUs and GPUs, then I simply play the game as it is, tuning its options to whatever looks the best with "acceptable" frame rates.

Honestly, DLSS4 is pretty good. A lot of the image quality issues with earlier DLSS versions are gone. Even then, while you are playing the game (and moving around in the game world) you probably aren't going to see any minor image quality issues that might still be present. The bitching about Framegen is also kind of weird to me. The only real issue with it is increased latency. This may or may not be noticeable depending on your hardware, whether or not you can use NVIDIA Reflex and the in-game implementation. The difference is hard to even perceive on Doom the Dark Ages, even using 4x Framegen.
 
Spoiler alert, that is never going to happen. Games will continue to add more graphics features, game studios will continue to cut corners when coding games and let various "upscaling" software make up the performance gap, and when your terms are actually met at 4K a decade from now you will make the same demands but for 8K resolution and still be boycotting gaming.

Nah.

4k is my peak. I see no combination of viewing distance and screen size where the entirety of the picture fits within my field of view that makes anything above 4k worthwhile. Maybe if you love pixel-peeping at fonts, but that's about it.

If a 4k screen fits within my field of view (which a 4k screen at 42-43" does perfectly at arm's length), that is my end-stage screen/resolution.

Fonts don't currently look perfect on my 42" LG C3, but that is more an artifact of the WOLED subpixel configuration not being optimized for it than it is the resolution itself.

From here on out, I'd totally upgrade my screen for one that has a subpixel configuration better suited to desktop/text use, but I don't think I'd ever need or want more resolution. This may even result in getting an 8k screen some day, as maybe only those will be available with a combination of subpixel layout and size that renders fonts smoothly, and if I do, I may even run the screen at 8k with scaling on the desktop just for good measure because I can, but I really won't need anything above a 4k source, and I won't have any reason to run my games above 4k.

I mean, at that resolution, where you really can't see the artifacting anyway, maybe upscaling even makes sense. It doesn't at my ~105 PPI now, but if I wind up with an 8k screen (what is that, 7680x4320?) I'd be at ~210 PPI, and probably wouldn't be able to see any artifacting from upscaling anyway, so maybe then upscaling won't be terrible. I'd probably switch back and forth between upscaling and stretched native 4k and decide which I liked better. Who knows, at that point, upscaling to native in the GPU may even be lower latency than letting the screen do it. I'd still insist on rendering internally at 4k though.
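Side note for anyone who wants to sanity-check those pixel density figures: PPI is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch in Python (my own throwaway helper, the function name is made up for illustration):

    import math

    def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
        # Pixels per inch: diagonal resolution in pixels over diagonal size in inches.
        return math.hypot(width_px, height_px) / diagonal_in

    print(round(ppi(3840, 2160, 42)))  # ~105 PPI, the 4k-at-42" figure above
    print(round(ppi(7680, 4320, 42)))  # ~210 PPI, the hypothetical 8k-at-42" case

Those come out to roughly 105 and 210 PPI, matching the numbers in the paragraph above.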

I've been an early adopter at the forefront of new, higher resolutions for so long, first with 1920x1200 in 2005, then with 2560x1600 in 2009, then again with 4k in 2015. I never learned my lesson, and have been constantly fighting a losing battle trying to get acceptable framerates at higher and higher resolutions for 20 years now. As soon as I got a GPU that was fast enough for the resolution I was using, I pulled a stupid and got a new, larger, higher-resolution screen, and each time, once I experienced gaming at that higher resolution on a larger screen, there was no going back.

I took a break from games from ~2004 to ~2009. In the interim, while I was just using my machine for desktop stuff, I got a 24" 1920x1200 Dell screen (2405FPW? Can't remember). Then in 2009, I started getting back into some basic games. My old GeForce 6800GT was instantly no longer cutting it even for light gaming. (Civ 4 was what sparked it all and got me back into games.)

This sparked a series of rapid GPU upgrades.

GeForce 9200GT (I just need something basic) --> Radeon HD 5750 (OK, maybe I like some basic gaming) --> GeForce GTX 470 (Ok, I guess we are back).

And then, having caught up with acceptable 1920x1200 performance, I bought the 30" Dell U3011 2560x1600 screen, and it started all over again...

GeForce GTX 580 (briefly, but had to sell it, as my custom Shuttle SFF case PSU at the time didn't have sufficient power, and there was no way to upgrade it, I briefly tried with a 5.25" bay PSU, but it was REALLY bad so I sold the GPU) --> Radeon 6970 in new "wait for bulldozer Phenom II build" --> Dual Radeon 6970 (Crossfire, bulldozer sucks I'm getting a Core i7-3930k) --> Single Radeon 7970 (turns out crossfire sucked) --> GeForce GTX 680 --> GeForce GTX Titan (6gb) Finally caught up with something playable....

and then, I got tempted by a 2015 Samsung JS9000 4k TV as a monitor....

Single GeForce GTX 980 Ti (was never going to be enough; it was temporary) --> Dual GeForce GTX 980 Tis (SLI) --> Pascal Titan X (SLI sucks too), and I was fine for about a year, but then things started getting slower again. I had to play games using various tricks: setting the TV to European broadcast 50hz mode, V-syncing at that refresh rate (no G/FreeSync), and playing in custom letterboxed 21:9 ultrawide resolutions (3840x1646) just to eke out a bit more framerate to make it work...

...and then the 2080ti had space invaders issues, so I skipped it. And then the 30-series was scalped to high heavens, so I skipped that. Finally I couldn't wait any longer for the GPU market to return to "normal" so I got an overpriced Radeon 6900xt (though admittedly a badass one from XFX with custom power stage, highly binned XTX chip, and an integrated EK fullcover block). It helped, but it was not enough.

Finally I swapped that out for a 4090. It was fine for a brief moment, but then it too started not being sufficient.


I guess what I'm saying is, I'm tired boss. It's time for a break. I'm happy to sit back at 4k now and let the GPU's catch up, and I'm not convinced I'll ever seek to go to 8k. There may come a time when in order to get a screen that I want, 8k is the only option, but I don't feel the need for that resolution.

Keeping the screen size the same and at the same distance, with each doubling of resolution there is less and less of a visual benefit due to diminishing returns. Now, I haven't exactly kept the screen size the same, going from 24" at 1920x1200, to 30" at 2560x1600, and then to 48" at 4k, before realizing it was too large and stepping back down to the sweet spot at 42". But at this point I have no desire to go beyond 42". I simply can't fit more within my field of view at typical desktop viewing distances, and since the screen will stop growing, there will also be less need for more resolution.

So yeah, I am ready to sit at 4k now and wait for GPU's to catch up. I simply don't see much if any benefit from going higher.

While I may be OK with scaling if the PPI is high enough that artifacting is pretty much not visible anyway, I am emphatic that there will never be a time when frame gen will be OK. If your frame rate is low enough that you need the frame generation, your input lag will be dreadful. If your frame rate is high enough that you don't need the frame gen, it has no benefit, and only drawbacks from artifacting.
 
I just buy the fastest gaming CPU and GPU at any given time. When I can't max out games with all the eye candy, then it's time to upgrade. If nothing exists because I'm waiting on new CPUs and GPUs, then I simply play the game as it is, tuning its options to whatever looks the best with "acceptable" frame rates.

Honestly, DLSS4 is pretty good. A lot of the image quality issues with earlier DLSS versions are gone. Even then, while you are playing the game (and moving around in the game world) you probably aren't going to see any minor image quality issues that might still be present.

I'll acknowledge that upscaling is much improved over where it was back when we were all making fun of consoles for doing it. I haven't really played with the new transformer model yet, but with the old CNN model, I could reluctantly tolerate "Quality" mode. Like, side by side I could tell the difference between it and native, but there wasn't always a clear winner. Anything lower than quality mode was a no-go for me though.

I'm told the transformer model improves this to the point where lower quality modes look similar to the old CNN "Quality" mode while running faster, but for me the jury is still out on that one.

What I had read is that going from "CNN Quality" to "Transformer Quality" costs a slight performance hit but brings a slight quality improvement, and that the new "Transformer Performance" mode is similar in quality to the old "CNN Quality" while performing significantly better, but then I watched some reviews that brought that into question. Maybe I'd be OK with "Transformer Balanced"? I don't know.

But honestly, I'd rather just use the new Nvidia AI render pipeline at native resolution and take advantage of the DLAA anti-aliasing benefits on top of native resolution, but if forced to, I guess I could live with a mild upscale.

The bitching about Framegen is also kind of weird to me. The only real issue with it is increased latency. This may or may not be noticeable depending on your hardware, whether or not you can use NVIDIA Reflex and the in-game implementation. The difference is hard to even perceive on Doom the Dark Ages, even using 4x Framegen.
I'd argue the bitching about framegen is totally legitimate.

It's not just "only" input lag. Input lag is freaking everything. Reduction in input lag is the biggest reason why 30fps is unplayable and 60fps is playable.

I have seen some reviews point out artifacting with frame gen, but when I tested it (once, in Dying Light 2), I didn't notice it.

Here is the problem with frame gen to me.

- If you really need more framerate (rendering below 60fps*) and you enable frame gen, you still have a shitty experience, because you are still getting input lag consistent with below 60fps rendering.

- If you already have acceptable framerate (above 60fps*) you really don't need frame gen, so why bother? It will even incur a slight input lag penalty, so the experience might be worse.

And add to that, let's say you, like me, have a monitor that tops out at 120hz. If you run 2x framegen on that, framegen doesn't dynamically turn off and switch to native when you hit your screen's max refresh; it just keeps running, so you are essentially capping yourself at an input lag consistent with 60fps (before frame gen penalties).

It is even worse if you use 4x frame gen. Now you are capping yourself at an input lag consistent with 30fps. That is awful.

So, the TLDR version for me with frame gen is that it sucks when you need it, and when you don't need it, sure it isn't terrible, but it also isn't helping you, and potentially introduces artifacting. So there is really no circumstance where it helps. The best it ever does is not hurt, and then at the potential risk of artifacts.
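To put rough numbers on that capping effect: once the display is saturated, the rate of real (input-sampling) frames can't exceed the refresh rate divided by the frame gen multiplier. A back-of-the-envelope sketch in Python (illustrative only, the function names are mine, and it ignores frame gen's own overhead and Reflex):

    def real_fps_cap(display_hz: float, framegen_multiplier: int) -> float:
        # Highest rate of genuinely rendered frames once frame gen fills the refresh rate.
        return display_hz / framegen_multiplier

    def real_frame_time_ms(display_hz: float, framegen_multiplier: int) -> float:
        # Best-case time between real frames, i.e. the floor on input-sampling cadence.
        return 1000.0 / real_fps_cap(display_hz, framegen_multiplier)

    # 120 Hz panel: 2x frame gen caps real frames at 60 fps (~16.7 ms between them),
    # 4x frame gen caps them at 30 fps (~33.3 ms) -- the cases described above.
    for mult in (2, 4):
        print(mult, real_fps_cap(120, mult), round(real_frame_time_ms(120, mult), 1))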



*keep in mind when I am talking about 60fps here, I am talking about 0.1% lows of 60fps, not 60fps averages. The reason we all thought of 60fps as an acceptable framerate back then was that it was the v-sync era. I ran my games such that the frame rate would be a flat line at 60fps, never dropping below it, but also never exceeding it due to v-sync.

My average may have been 60fps, but that is simply because every single frame would render at 60fps (1/60th of a second).

A modern-day 60fps average is a MUCH worse experience than a flat 60fps experience. With a modern 60fps average, your frame times will spike at your 1% and 0.1% lows, leading to a shitty experience. With the old flat-line 60fps experience, every frame comes out in 16.67ms without any frame time spikes.

The modern equivalent is to target 0.1% framerates that always exceed 60fps. That gives you an average in most titles of about 90fps, which is very playable.

Now, some frame rate freaks will insist they need more than that, and that is all good and well if that is what they prefer, but I just had to clarify this before I got rage replies because I considered 60fps acceptable.

Again, 0.1% lows above 60fps, NOT 60fps average. These are not the same thing.
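For anyone unfamiliar with how those "lows" figures are derived, here is a rough sketch in Python (my own simplified convention for illustration: the X% low is taken as the frame rate implied by the slowest X% of frame times; real capture tools differ in the details):

    def low_fps(frame_times_ms: list[float], percent: float = 0.1) -> float:
        # Average frame rate of the slowest `percent`% of frames.
        worst = sorted(frame_times_ms, reverse=True)
        n = max(1, int(len(worst) * percent / 100))
        return 1000.0 / (sum(worst[:n]) / n)

    def average_fps(frame_times_ms: list[float]) -> float:
        return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

    # A flat 60fps run (every frame 16.67 ms) has identical average and 0.1% lows,
    # while a spiky run can average well above 60fps with far worse lows.
    flat = [1000.0 / 60] * 1000
    spiky = [1000.0 / 70] * 990 + [50.0] * 10
    print(round(average_fps(flat)), round(low_fps(flat)))    # 60 60
    print(round(average_fps(spiky)), round(low_fps(spiky)))  # 68 20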
 
Waiting for LTT's response. Oh never mind, Linus doesn't feel it's his responsibility to discuss these kinds of issues with the community.

Linus still exists? I never saw a video of his in my feed again after the fallout from the GN exposé a couple of years ago.

Granted, I was never a subscriber (I have never subscribed to any channel on YouTube other than Kyle's lol, I ain't no "follower" :p ), but prior to that he randomly showed up in my feed all the time, and since then I haven't heard a peep from him.
 
Good lord quit being such a princess and enjoy the games. Just because you can't have every setting on max at 4k is a lame reason not to play great games.

I live for the full eye candy immersive experience. It's pretty much the only reason I play games at all. So if I can't get the best, I don't want it at all.

I don't play games for the sake of playing games. It needs to be a truly special experience on all levels: story, gameplay, graphics, and sound, or I am not interested.

It is not a "social" thing for me. I don't even have a discord account. Never have. I also never had a Twitch account.

I don't play games with friends.
I don't want to ever play games in the living room.
I don't ever want to play games on mobile.
I haven't played a multiplayer game in close to a decade.

I don't have dead time where I am bored and just want to kill time with games. I haven't been bored in over 20 years. I am way too busy for boredom. If I am going to spend my valuable time on a game, it had better be fucking special.

I play games one way and one way only, sequestered alone in my office, isolated from the rest of the world. And when I do I hope to be transported to a different world with an as immersive experience as possible.

I'll play through a game, hoping to get completely sucked in and get 80+ hours out of it if it is good, and then when I am done, I don't play another game again, often for months, sometimes years, until the next immersive experience that can suck me in comes along.

I used to enjoy multiplayer games, but the fourth-wall-breaking bullshit from all the stupid wannabe professional streamers just fucking around for likes or hoping to go viral, or goofing around with their friends, ruined it for me. I demand an experience that is 100% conformant with the story, with no breaking of fourth walls. Absolutely everything has to be believable (at least to the extent possible within the universe of the title). No goofy hats, or stupid theme-breaking skins, or anything like that. Absolutely everything has to be thematically correct.

I want the experience where every last player is 100% on board with trying hard, and living into the game as much as possible. No jokesters trying to be funny. No streamers trying to impress followers. No friends goofing around with each other, and no fucking trolls or cheaters. That was the experience I had with my regular group on my Red Orchestra 2 server in realism mode. You weren't goofing around with your friends playing a game about the eastern front in WWII. It was more of a mil-sim experience. For a couple of hours at a time you WERE on the eastern front, living into it as much as possible, or at least as much as possible as a game would allow, with every last player playing the objectives as if they were the actual character they were playing, and it mattered to them as much as it did to the character.

I want the experience you get from reading an excellent book, that utterly transfixes you and sucks you in. For a brief moment in time I want to be completely transported through time and space, and if I can't get that, I'll just do something else with my time. It isn't worth wasting my time on half measures. I don't have enough time for that.

If I can't squeeze every ounce out of a title to make it as special as possible, I'll just wait until I can. I don't want the experience to be lacking in any way.
 
I live for the full eye candy immersive experience. It's pretty much the only reason I play games at all. So if I can't get the best, I don't want it at all.
I'm sorry to hear that. My advice is to get a new hobby.
 
I want the experience where every last player is 100% on board with trying hard, and living into the game as much as possible.

Reading this brings Splinter Cell back to my mind. When playing with a friend, sneaking around, if you spoke too loudly over the mic to each other, NPCs would hear it and start looking in that direction for you, which I loved because it made perfect sense. You would ask "what's your situation?" and hear nothing back because he didn't want to give away his location, so then you'd panic and try to find him. I miss that kind of detail and suspense in games. Yes, it could be bypassed using direct chat, but why take away the fun?

I don't need as much graphical immersion as I do game play immersion, if that makes sense from what I just said. I am getting older, my eyes are easier to trick haha.
 
I don't need as much graphical immersion as I do game play immersion, if that makes sense from what I just said. I am getting older, my eyes are easier to trick haha.
I guess my take is that both are important.

A good immersive title pulls out all the stops to accomplish that, gameplay, story, graphics and sound.

I want to have that book-like experience where you are transfixed, and spend hours reading, finish it, and then feel kind of empty and hollow inside because it is over.

Can this be done without the best possible graphics, sure. I mean, some of the games I played years ago are dated, but they were still fun to play back then. I don't usually replay titles, as when I know the ending, it takes all of the fun out of them for me. I can no longer live into them. Occasionally years later I will replay a title because I can't remember it though. If it has been long enough, the backwards leap in graphics compared to the previous title I played is really difficult to get used to at first, but if I am able to force myself through it long enough, eventually I adapt. But it is much easier to not have to.

I'd argue that all of the elements add together to create as good of an experience as possible. Suffer too much on any one, and it breaks the spell.
 
Jensen thanks you for your contribution to his leather jacket fund. Well, these days, it's probably a relatively small contribution. :)
I want the fastest GPU available. I don't really give a shit who makes it. I'd be much more inclined to go with Intel or AMD if they even got within 10% of the 5090's performance with their top offerings provided the price was right. With the kind of performance disparity we have now, Intel and AMD simply can't earn my business on high end GPU's. I'm not a charity.
 
I want the fastest GPU available. I don't really give a shit who makes it. I'd be much more inclined to go with Intel or AMD if they even got within 10% of the 5090's performance with their top offerings provided the price was right. With the kind of performance disparity we have now, Intel and AMD simply can't earn my business on high end GPU's. I'm not a charity.

Same.

The GPU I buy doesn't necessarily have to be the fastest of them all, but it has to at the very least be competitive to within a few percent. Not 45% behind like the 9070 xt is, or 75% behind, like Intel's fastest currently is.

They have to at least get close.
 
nVidia have been using dishonest or even illegal marketing tactics since the 2000s - probably even earlier, but I didn't become aware of it until around that time. So it's not at all surprising that they do this, but it's heartening to see it getting some attention. Maybe it will be enough to convince a few people to stop funding a company that actively works against them.
I missed that one but color me not surprised.
 
Remember folks, AMD CHOOSES not to compete, because Nvidia's pricing is so ludicrous that even being a distant 2nd is more profitable than actually trying to compete.

For those of you who say Nvidia's performance is stagnating

For those of you who say they are overcharging for little to no improvement

For those of you who see their scummy tactics

... AMD still chooses not to compete and prices their products in line with Nvidia, with similar VRAM and performance.

So if you're disappointed with the 5060's 8GB VRAM and 25% uplift over a 3060 at the same price as 4 years ago...

Just wait till you see AMD's 8GB VRAM and 25% uplift over the 3060 at the same price as 4 years ago...


If you think Nvidia is holding back (and they are) but AMD isn't? You're truly lost.
 
nVidia have been using dishonest or even illegal marketing tactics since the 2000s - probably even earlier, but I didn't become aware of it until around that time. So it's not at all surprising that they do this, but it's heartening to see it getting some attention. Maybe it will be enough to convince a few people to stop funding a company that actively works against them.
I may be misremembering but wasn't there a poster here that got outed but kept denying it and just kept on like the whole forum wasn't on to him?
 
Well, not directly about this, but he did call Nvidia out on their bullshit a couple of weeks ago, which I thought was actually a very good video. Credit where credit is due.
Hopefully he does make a video about this. He's really the only tech tuber that has a big enough audience to make a dent in Nvidia. Even then, I doubt they would reply. They are not the same company they were even 2 years ago.
 
Hopefully he does make a video about this. He's really the only tech tuber that has a big enough audience to make a dent in Nvidia. Even then, I doubt they would reply. They are not the same company they were even 2 years ago.

Is he still though?

I guess I thought his viewership took a huge hit ~2 years ago after all of that controversy.

I just looked it up, and I guess he has 16.3M subscribers to Gamers Nexus's 2.44M. I just feel like I never hear from or about Linus anymore. I assumed he was a has-been after all of that. People used to always share his videos in all the PC groups on social media; now I never see them anymore. They also never pop up in my YouTube feed anymore.

At the same time I keep seeing more and more of GN and HUB's content, both presented to me by YouTube algorithms, and shared by others in groups and on forums.

Odd.

I wonder who all of those who still subscribe to and watch his content are, if they are not in any of the communities I frequent.
 
Remember folks, AMD CHOOSES not to compete, because Nvidia's pricing is so ludicrous that even being a distant 2nd is more profitable than actually trying to compete. [...] If you think Nvidia is holding back (and they are) but AMD isn't? You're truly lost.
I've said this same thing before. If all these techtubers held all parties equally accountable, we would be in a much better place today. Let's also not forget these techtubers absolutely have a vested interest in getting supplied GPUs early and free of charge. Steve loves to harp that their merch sales allow them to be impartial, but that's horseshit. He knows full well that if he purchased products after release like any consumer would (which would absolutely make them more impartial), his channel would be dead within a year or two. The monetary incentive to be biased is unquestionable.

Could you imagine if Steve actually held AMD to the same standards and raked them over the coals as thoroughly as he has Nvidia when AMD straight-up lied about MSRP? He's barely even mentioned it. I think he mentioned it once in a single video. And how many "Nvidia bad!" videos has he released in the same time period now? In the last 3 months he has made over a dozen videos rightfully calling out Nvidia on their bullshit, but has only released one single video regarding AMD's fake MSRP.

I will say this though, he knows how to get people to watch his childish digs at Nvidia, and by extension get that sweet, sweet engagement. His thumbnails have reached moronic levels equal to those of "professional content creators".

 
Nvidia has entered their "Too Big to Fail" phase, where they are basically doing whatever they want and if you don't like it their honest response is "Well what are you gonna do about it?"

How much of that is Jensen and how much of it is Marketing Directors getting high on the smell of their own farts is anybody's guess, but yeah... This isn't the fun timeline.
 
Nvidia has been a shitbag company for years, even I know that and I run their hardware.
 
I've said this same thing before. If all these techtubers held all parties equally accountable, we would be in a much better place today.

We might have been in a slightly better place if they had held them accountable, but ultimately since we are in a time where consumer hardware just isn't where the money is, in the grand scheme of things none of the players really care.

I mean, AMD and (shockingly) Intel are doing a better job of covering up that they don't give a rat's ass than Nvidia, who are just giving everyone the finger by feeding them shit sandwiches and lying about it. But in the grand scheme of things, for the last several years, enterprise, compute, crypto, and now AI make them WAY more money than consumer does per square millimeter of silicon wafer, and since production capacity for manufactured silicon wafers is much lower than demand (or at least was, at traditional pricing), all of them, every last one of them, lose money (comparatively speaking) by manufacturing consumer GPUs.

There is more than enough demand in much higher profit margin categories (like AI) to eat up every single square millimeter of late gen silicon wafer capacity they can contractually get their hands on through TSMC and/or Samsung.

As Tech Jesus pointed out (and I have said this before) one wonders why they even bother.

If they could redirect everything from consumer over to Enterprise/Compute/Crypto/AI and in the process make way more money, why aren't they? Why even keep up this charade of pretending to want consumer business?

That - to me - is the root cause of this problem. It's not that they are tricking gamers, and no one is holding their feet to the fire (though, of course, that doesn't help). It's simply that gamers no longer are important to their profits, and in fact are more of a drag on their profits, so why should they care?

The only reason I can think of is that they don't want to put all of their eggs in one basket just in case they ever need us again. If they abandon consumer/gaming technologies and markets it will both be very difficult to catch back up technology wise, and to regain consumer trust.

So they instead give us the minimal amount of lip service they can so that if the goose that laid the AI golden egg ever disappears, they can try to save their businesses by refocusing back to the gamer.

This is true for all of them. Nvidia, Intel and AMD. Nvidia is just doing a much worse job of keeping a straight face. They are letting it shine through loud and clear that they "DGAF" as the kids would say.
 
As much as I hate what they're doing, Nvidia seems to be the only company pushing shit forward. They might have ulterior or at least monopolistic goals in mind, but still. When is the last time AMD rolled out major new tech that wasn't just a poor-man's response to something Nvidia just launched?
 
I may be misremembering but wasn't there a poster here that got outed but kept denying it and just kept on like the whole forum wasn't on to him?
I don't remember. I've seen a couple accidentally reveal themselves on forums over the years, by doing things like replying to their own posts as if they were logged into another account. Mostly they're very good at what they do though, and so don't get caught, as demonstrated by accounts here and elsewhere on the internet that date back to 2005 and are still posting (probably operated by different people now though). You can get some idea of which ones they are by the way they upvote each other quite consistently.
 
For the record I've bought Nvidia before. But I hope I don't have to again for quite some time. I'm 44 years old and lived through the era of rampant anti-consumer behavior that Intel did back in the day. Now it's been Nvidia for while.

After seeing these reports I'm VERY EFFING HAPPY that I got my 7900XTX when I did. It's plenty for what I need at 1440p and I just don't want to support that black leather jacketed midget who doesn't give a damn about me or you.
 
As much as I hate what they're doing, Nvidia seems to be the only company pushing shit forward. They might have ulterior or at least monopolistic goals in mind, but still. When is the last time AMD rolled out major new tech that wasn't just a poor-man's response to something Nvidia just launched?

I'd argue most of the "shit" Nvidia has pushed forward is just copium.

1.) They wanted to lock AMD and others out of the market, so they came up with RT, which was more about making developers lives easier than it was improving things for gamers, so they knew developers would immediately embrace it, and as a bonus, the competition didn't have the ability to keep up, so they were going to be locked out of latest top tier GPU capability. (Similar to what they did with Hairworks/Gameworks years ago)

RT wasn't necessary. It didn't significantly move graphics forward. There were lots of rasterization tricks that could be (and were) used to create similar effects. Raster titles looked better before RT took off than titles look now with RT disabled, running in raster mode. And this was very much by design. Push something upstream of the customers (the game developers) that you know they will love, and then corner the market downstream as users won't feel they have a choice.

2.) They can't (or don't want to) make faster GPU's (or sufficiently faster GPU's) so instead they are sprinkling AI assisted copium in the form of scaling and frame generation on top of everything like it is the second coming of Jesus Christ.

I'd argue that the last new thing Nvidia did that actually benefited gamers was G-Sync. But even then, they did it in a way that required a needlessly expensive module in the monitor they could license, and that locked buyers into only using Nvidia GPUs or they would lose the capability of their fancy new monitor. They knew people buy GPUs way more often than they do monitors, so if they could lock them in by having a monitor that wouldn't work to its full potential with the competition, buyers would feel forced to buy Nvidia GPUs.

Time and time again, Nvidia has the opportunity to compete the right way, by collaborating with Microsoft and others across the industry to create common standards, and then competing to produce the absolute best product they can that complies with those standards. And time and time again they instead decide to use sleazy business tactics that manipulate users into buying their products not because they really want to, but because they feel they don't have a choice.

You know like:
- I'd try AMD this gen, but if I do, Hairworks/Gameworks titles will look and run like shit, and who knows when a title important to me will use Hairworks/Gameworks?
- I'd try AMD this gen, but if I do, my fancy expensive monitor's G-Sync won't work.
- I'd try AMD this gen, but if RT becomes a hard requirement (or even a soft one, in that new games look like shit without it), then Nvidia is the only choice.


Nvidia operates by attempting to divide the PC market and remove user choice. In a way, that's the old Intel approach of the late '90s through early 2000s (but with fewer lawsuits).

AMD may not be a boy scout (let's face it, no tech corporation is your friend; they are all just in it for the money), but at least their approach is to unify the market behind open standards and then champion those open standards and do their best to compete within them. It is because of AMD that FreeSync, and by extension VRR, exists. They gave users more choice, not less. And eventually even Nvidia reluctantly adopted VRR (by calling it "G-Sync Compatible").

What makes the PC industry great and successful is the fact that it is built on open standards that no one corporation controls, and it allows people to customize systems to their needs using common interfaces. Both IBM and Intel tried to control it, and both failed. I hope the same happens with Nvidia.


If it were up to Nvidia, they would have their own proprietary PCI Express standard, and if your motherboard (which had a $200 Nvidia license fee added to it) didn't support the custom interface, you couldn't use their GPUs. In Nvidia's world, there would be 12 different proprietary USB-like standards, none of which worked with each other, and all of which were leveraged to the absolute maximum to try to corner markets and twist customers' arms into making purchasing decisions that were not in their best interest.

Nvidia is a fucking cancer on the PC market, just like Intel used to be a fucking cancer on the PC market.

They only innovate in order to divide, destroy, and lock out and/or in, and I fucking hate how successful they have been at doing it.

If there were any justice on this planet, they should have been broken up by the DOJ 15 years ago.

And yes. I still buy their products, because often I don't feel like I have a choice.
 
I'd argue most of the "shit" Nvidia has pushed forward is just copium.

1.) They wanted to lock AMD and others out of the market, so they came up with RT, which was more about making developers lives easier than it was improving things for gamers, so they knew developers would immediately embrace it, and as a bonus, the competition didn't have the ability to keep up, so they were going to be locked out of latest top tier GPU capability. (Similar to what they did with Hairworks/Gameworks years ago)

RT wasnt necessary. It didn't sdignificantly move graphics forward. There were lots of rasterization tricks that could (and were) used to create similar effects. Raster titles looked better before RT took off than titles look now with RT disabled, running in raster mode. And this was very much by design. Push something upstream of the customers (the game developers) that you know they will love, and then corner the market downstream as users won't feel they have a choice.

2.) They can't (or don't want to) make faster GPU's (or sufficiently faster GPU's) so instead they are sprinkling AI assisted copium in the form of scaling and frame generation on top of everything like it is the second coming of Jesus Christ.

I'd argue that the last new thing Nvidia did that actually benefited gamers was G-Sync. But even then they did it in a way that it required a needless expensive module in the monitor they could license, and that locked buyers to only using Nvidia GPU's or they would lose the capability of their fancy new monitor. They knew people buy GPU's way more often than they do monitors, so if they could lock them in by having a monitor that wouldn't work to its full potential with the competition, buyers would feel forced to buy Nvidia GPU's.

Time and time again, Nvidia has the opportunity to compete the right way, by collaborating with Microsoft and others across the industry to create common standards, and then competing to produce the absolute best product they can that complies with those standards. And time and time again they instead decide to use sleazy business tactics that manipulate users into buying their products not because they really want to, but because they feel they don't have a choice.

You know like:
- I'd try AMD this gen, but if I do hairworks/gameworks titles will look like and run like shit, and who knows when an important title to me will use Hairworks/Gameworks?
- I'd try AMD this gen, but if I do my fancy expsnive monitor's G-Sync won't work.
- I'd try AMD this gen, but if I do if RT becomes a hard requirement (or even a soft one, in that new games look like shit without it) then Nvidia is the only choice.


Nvidia operates by attempting to divide the PC market and remove user choice. In a waym, that's the old Intel approach of the late 90's through earl 2000's (but with fewer lawsuits)

AMD may not be a boyscout (lets face it, no tech corporation is your friend, they are all just in it for the money) but at least their approach is to unify the market behind open standards and then champion those open standards and do their best to compete in them. It is because of AMD that Free-Sync and by extension VRR exists. They gave users more choice, not less. And eventually even Nvidia reluctantly adopted VRR (by by calling it "G-Sync Compatible").

What makes the PC industry great and successful is the fact that it is built on open standards that no one coporation controls, and it allows people to customize systems to their needs using common interfaces. Both IBM and Intel tried to control it, and both failed. I hope the same happens with Nvidia.


If it were up to Nvidia they would have their own proprietary PCI Express standard, and if your motherboard (which had a $200 Nvidia license fee added to it) didn't support the custom interface, you coul;dn't use their GPU's. In Nvidia's world, there wouyld be 12 different proprietary USB-like standards, none of which worked with eachother, and all which were leveraged to the absolute maximum to try to corner markets and twist customers arms into making puirchasing decisions that were not in their best interest.

Nividia is a fucking cancer on the PC market, just like how Intel used to be a fucking cancer on the PC market.

The only innovate in order to divide, destroy and lock out and/or in, and I fucking hate how they have been so successful doing it.

If there were any justice on this planet, they should have been broken up by the DOJ 15 years ago.

And yes. I still buy their products, because often I don't feel like I have a choice.
100% agree...

I'm not sure on some of the timelines, though.

Like, did Nvidia develop the RT stuff and then sell it as a solution to developers, or were publishers asking for a solution and Nvidia developed it based on a perceived market demand? Same goes for the upscaling, etc.
Because when I was working more on the hardware integration side of things, I remember frequent meetings about how, if certain technology were more refined, it could dramatically cut costs, and that was a good 4+ years before Nvidia and Microsoft announced their partnership for the RT extensions in DX12.

I am genuinely curious how much of the current GPU trends are directly related to Nvidia and how much is caused by Nvidia working within the confines of TSMC, and I can only speculate as to what will change when Intel gets their 18A online.

GSync is weird, because that fundamental technology existed for a long time; it was a central part of how CRTs worked, and when LCDs took over it was another function that was just lost to time. Forums were dedicated to trying to find which monitors could work with which GPUs for specific games to optimize for it, but it was fickle at best.
I would guess that Nvidia put out GSync because somebody working there at the time was neck-deep in one of those forums trying to make it work and simply asked somebody there, "Can't we find a way to do this better? Because this sucks..."
Most of what made GSync work was simply building an input controller powerful enough to handle all the optional parts of the existing specifications that were mostly left off as cost-cutting measures. GSync's proprietary part is the signaling they use to synchronize everything, not the process itself; that was always available to everybody, it just cost more.

The bulk of what made GSync work as a brand was the extras above and beyond the input controller: they specified, and independently validated, many aspects of the displays that vendors were more than happy to fib about. So GSync was less about VRR, which is still super significant, and more about ensuring that GSync as a brand was never attached to a substandard unit. Even the first generation of GSync displays holds up very well today, whereas the overwhelming bulk of FreeSync displays released are slightly better than garbage, which is why they had to add multiple tiers, where the upper tiers are also certified and likewise charge extra for the validation.

Nvidia does have its own proprietary PCI Express standards... SXM and NVLink are essentially just that...

I see this less as a DOJ split-up problem and more as a tech patent issue. The patent system fundamentally rewards this sort of behavior. When things were mechanical and changed slowly over decades or more, this worked perfectly fine, but with the tech industry and how fast it changes, it completely kills any chance of meaningful competition.

I don't exactly know what changes need to be made to the patent system, but it does need a significant overhaul to better deal with the reality of the current world for sure.
 
I'm not a charity.
Nobody said you had to be, but if everyone thought like you[1], there would never be any chance of anyone else replacing nVidia.

[1] not calling you out as a bad person or anything like that! It's sort of an, I dunno, reverse tragedy of the commons situation.
 