What's the TLDR of this tuber drama?
Maybe your balls, but my balls haven't turned green. AMD and Intel are both pretty good alternatives at this point, especially with how bad Nvidia's drivers have gotten. None of Nvidia's proprietary tech is something you must have. AMD's FSR4 is a pretty good alternative to DLSS4, and the same can be said about XeSS. Frame generation is bad no matter which manufacturer you go with. Maybe Nvidia's noise cancellation, but Discord's Krisp works pretty well.
Until people stop buying their cards, they won't; it is free money for them...

Nvidia's best is non-existent vaporware which virtually no one could buy, and I say that as someone who literally had every intention of buying a 5090 RTX at release but didn't manage to get to the checkout within 30 seconds of orders going live. I agree with Steve from GN: Nvidia should just fuck off out of the market if it has no real intention of properly servicing ordinary consumers.
People who keep buying exclusively nVidia cards will never see anyone else make top-end cards.
Good lord, quit being such a princess and enjoy the games. Just because you can't have every setting on max at 4K is a lame reason not to play great games.

If I am going to be completely honest, while I would LOVE to be able to buy an AMD or Intel GPU that works well for gaming for me, they simply don't make such a product.
Heck, even Nvidia doesn't make such a product.
I want 4K resolution, highest quality settings (including RT) with some form of decent AA, with the ability of hitting 0.1% lows above 60fps, without any upscaling or frame gen.
Not even the 5090 can do this right now, but it gets closer than anything else on the market, and as such, if I were GPU shopping right now, it would be the only GPU I would even remotely consider. At least for gaming.
I have a Radeon 7600 (non-XT) in my workstation, and it does fine there, but that system will never run a game. Almost any GPU could handle the 2D desktop workloads that machine sees.
AMD or Intel don't have to beat Nvidia at the high end in order for me to consider them, but they at least need to get close, and as it stands they have nothing even within a country mile.
And the truth is this: if I can't get 4K with all settings maxed to be playable (without AI nonsense), then I just don't play. I haven't played a game - any game - in over 6 months. Last game I played was a random Civilization map. Before that, the last game I truly played was the System Shock remaster, which I played through a little over a year ago. Before that it was Starfield in late 2023. Before that Dying Light 2. Before that the original release of Cyberpunk, and before that Far Cry 6.
I intended to play the Cyberpunk expansion, but I never did. I hate how they integrated it into the story. If it were a new story in the same setting I would totally have done it, but the fact that they imported it into the same story line ruined it for me. I had already finished the game, and I wasn't about to start a new playthrough.
I guess my reason for mentioning all of this is that I don't just play games for the sake of playing games. I am looking for the perfect experience for one perfect playthrough, and if I can't get it, I just don't play.
For instance, I have been looking forward to Stalker 2 for 15 years, but I still haven't started playing it, because my existing CPU was not up to my expectations performance-wise in that title. I now have a 9950X3D waiting to be built, and I was hoping to get a 5090 so I could avoid enabling scaling, but that seems like a pipe dream considering how they are completely unobtainable unless you feed the scalpers, which I refuse to do.
If I can't meet my performance expectations, I just don't play.
For me to consider buying AMD or Intel GPUs for anything beyond basic 2D desktop use, they need to release something that either meets my performance expectations, gets closer to them than anyone else does, or at least comes within a low single-digit percentage of whoever is closest.
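For reference, "0.1% lows" are usually derived from a frame-time log: take the slowest 0.1% of frames, average them, and convert back to FPS. Some tools report the single percentile frame time instead, so figures vary between reviewers. A minimal Python sketch with made-up sample data:

```python
# Minimal sketch of how "1% / 0.1% low" FPS figures are computed.
# frame_times_ms is made-up sample data: mostly 60 fps with a few spikes.
frame_times_ms = [16.7] * 9990 + [35.0] * 10

def low_fps(times_ms, percent):
    """Average FPS over the slowest `percent`% of frames."""
    worst = sorted(times_ms, reverse=True)       # longest frames first
    n = max(1, int(len(worst) * percent / 100))  # size of the slow tail
    return 1000.0 / (sum(worst[:n]) / n)

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
print(f"average FPS: {avg_fps:.1f}")                         # ~59.8
print(f"1% lows:     {low_fps(frame_times_ms, 1.0):.1f}")    # ~54.0
print(f"0.1% lows:   {low_fps(frame_times_ms, 0.1):.1f}")    # ~28.6
```

The point of the requirement above is that even a handful of slow frames drags the 0.1% figure far below the average, and that tail is exactly the stutter you feel.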
Electric cars from more than 100 years ago are at this point mentionable only on a technicality and are farcical to include in your luddite rant. The newer electric cars are a lot more competitive with the fuel-swilling models. The first electric cars were quickly outmoded by the fuel swillers.

Nvidia has always been a garbage company. They never invented anything, just bought a bunch of other video card brands several decades ago, and the only thing they have done over the years is increase the number of cores and the amount of memory. Though we have to give credit to the CEO for sticking his head inside an emulator-fridge; maybe that's why he keeps repeating himself so much.

I don't know what is with that exclusivity deal, but it's very similar to what they did with PhysX, where it made games like Mirror's Edge unplayable on graphics cards other than Nvidia's. That is pure selfishness, and no consideration at all for the industry.

Why are they still using the term GPU (graphics processing unit) when it's mostly processing "AI" garbage and cryptocurrency garbage? They have no respect for the environment either, producing these cards with no video outputs. Maybe it's up to the individual manufacturers, but I am not sure about that.

Another little-known thing is that for at least 15 years, Nvidia chips have had the ability to disable video memory channels, so many cards that develop artifacts can continue working by disabling a video memory channel, losing some performance but at least not becoming electronic waste. I have done this myself; all it takes is changing three bytes or so in the video card firmware, as discovered by one particular Russian video card repair guy.

The Nvidia CEO is a dangerous person. It's not impossible that he might become some evil AI mastermind in the future. And he is not the only such guy: Bill Gates, Elon Musk, Gabe Newell, Mark Zuckerberg. All these guys have too much money and too much power and are doing damage to our society. Bill Gates stole and glued together a bunch of crap to make Windows several decades ago, and today it has barely changed since Windows 95. Elon Musk, what is there to say... electric cars existed 100 years ago. Gabe made a cool game or two, decades ago, but Steam is just a circlejerk. Zuckerberg, same thing; the fucking UI hasn't changed since the very beginning.

The important thing is not to follow these trends, both for the user and the developer. You can make a fun and beautiful game for a video card from 20 years ago, let alone today's top Intel video card.
Spoiler alert, that is never going to happen. Games will continue to add more graphics features, game studios will continue to cut corners when coding games and let various "upscaling" software make up the performance gap, and when your terms are actually met at 4K a decade from now you will make the same demands but for 8K resolution and still be boycotting gaming.
EDIT: fixed typo
I just buy the fastest gaming CPU and GPU at any given time. When I can't max out games with all the eye candy, then it's time to upgrade. If nothing exists because I'm waiting on new CPUs and GPUs, then I simply play the game as it is, tuning its options to whatever looks best with "acceptable" frame rates.
Honestly, DLSS4 is pretty good. A lot of the image quality issues with earlier DLSS versions are gone. Even then, while you are playing the game (and moving around in the game world) you probably aren't going to see any minor image quality issues that might still be present.
I'd argue the bitching about framegen is totally legitimate.

The bitching about framegen is also kind of weird to me. The only real issue with it is increased latency. This may or may not be noticeable depending on your hardware, whether or not you can use Nvidia Reflex, and the in-game implementation. The difference is hard to even perceive in Doom: The Dark Ages, even using 4x framegen.
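To put rough numbers on the latency point: interpolation-style frame generation has to hold back the newest real frame until the generated in-between frames are shown, so as a first approximation you pay about one extra real-frame interval regardless of the 2x/4x multiplier. A back-of-the-envelope sketch, using a simplified model with assumed numbers, not any vendor's actual pipeline:

```python
# Back-of-the-envelope added display latency for interpolation-style
# frame generation. Simplified model: one real frame is held back while
# the generated frames between it and the previous frame are displayed.
base_fps = 60                          # assumed base render rate
real_frame_ms = 1000 / base_fps        # ~16.7 ms per real frame

for multiplier in (2, 4):
    added_ms = real_frame_ms           # roughly one held-back real frame
    shown_fps = base_fps * multiplier  # what the FPS counter reports
    print(f"{multiplier}x at {base_fps} fps base: "
          f"~{added_ms:.1f} ms extra latency, counter shows {shown_fps} fps")
```

Under this model the penalty scales with the base frame time, which is why framegen feels far worse from a 30 fps base than a 120 fps base, and why Reflex and the specific game implementation matter.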
Waiting for LTT's response. Oh, never mind, it isn't Linus's responsibility to discuss these kinds of issues with the community.
I'm sorry to hear that. My advice is to get a new hobby.

I live for the full eye candy immersive experience. It's pretty much the only reason I play games at all. So if I can't get the best, I don't want it at all.
I don't play games for the sake of playing games. It needs to be a truly special experience on all levels: story, gameplay, graphics, and sound, or I am not interested.
It is not a "social" thing for me. I don't even have a discord account. Never have. I also never had a Twitch account.
I don't play games with friends.
I don't want to ever play games in the living room.
I don't ever want to play games on mobile.
I haven't played a multiplayer game in close to a decade.
I don't have dead time where I am bored and just want to kill time with games. I haven't been bored in over 20 years. I am way too busy for boredom. If I am going to spend my valuable time on a game, it had better be fucking special.
I play games one way and one way only: sequestered alone in my office, isolated from the rest of the world, hoping to be transported to a different world with as immersive an experience as possible.
I'll play through a game, hoping to get completely sucked in and get 80+ hours out of it if it is good, and then when I am done, I often don't play another game again for months, sometimes years, until the next immersive experience that can suck me in comes along.
I used to enjoy multiplayer games, but the fourth-wall-breaking bullshit from all the stupid wannabe streamers just fucking around for likes ruined it for me. I demand an experience that is 100% conformant with the story, with no breaking of the fourth wall. Absolutely everything has to be believable (at least to the extent possible within the universe of the title). No goofy hats, or skins, or anything like that.
If I can't squeeze every ounce out of a title to make it as special as possible, I'll just wait until I can. I don't want the experience to be lacking in any way.
I get what he's saying a little bit. I want the best experience possible. That's why I spend so much on hardware. I refuse to turn settings down.
Jensen thanks you for your contribution to his leather jacket fund. Well, these days it's probably relatively very small.
I used to enjoy multiplayer games, but the fourth-wall-breaking bullshit from all the stupid wannabe professional streamers just fucking around for likes or hoping to go viral, or goofing around with their friends, ruined it for me. I demand an experience that is 100% conformant with the story, with no breaking of the fourth wall. Absolutely everything has to be believable (at least to the extent possible within the universe of the title). No goofy hats, or stupid theme-breaking skins, or anything like that. Absolutely everything has to be thematically correct.
I want the experience where every last player is 100% on board with trying hard and living into the game as much as possible. No jokesters trying to be funny. No streamers trying to impress followers. No friends goofing around with each other, and no fucking trolls or cheaters. That was the experience I had with my regular group on my Red Orchestra 2 server in realism mode. You weren't goofing around with your friends playing a game about the Eastern Front in WWII. It was more of a mil-sim experience. For a couple of hours at a time you WERE on the Eastern Front, living into it as much as possible, or at least as much as a game would allow, with every last player playing the objectives as if they were the actual character they were playing, and it mattered to them as much as it did to the character.
I want the experience you get from reading an excellent book, that utterly transfixes you and sucks you in, and if I can't get that, I'll just do something else with my time. It isn't worth wasting my time on half measures. I don't have enough time for that.
If I can't squeeze every ounce out of a title to make it as special as possible, I'll just wait until I can. I don't want the experience to be lacking in any way.
I guess my take is that both are important.

I don't need as much graphical immersion as I do gameplay immersion, if that makes sense from what I just said. I am getting older, my eyes are easier to trick, haha.
I want the fastest GPU available. I don't really give a shit who makes it. I'd be much more inclined to go with Intel or AMD if they even got within 10% of the 5090's performance with their top offerings, provided the price was right. With the kind of performance disparity we have now, Intel and AMD simply can't earn my business on high-end GPUs. I'm not a charity.
I missed that one, but color me not surprised.

nVidia have been using dishonest or even illegal marketing tactics since the 2000's - probably even earlier, but I didn't become aware of it until around that time. So it's not at all surprising that they do this, but it's heartening to see it getting some attention. Maybe it will be enough to convince a few people to stop funding a company that actively works against them.
I may be misremembering, but wasn't there a poster here that got outed but kept denying it and just kept on like the whole forum wasn't on to him?
Hopefully he does make a video about this. He's really the only tech tuber with a big enough audience to make a dent in Nvidia. Even then, I doubt they would reply. They are not the same company they were even 2 years ago.

Well, not directly about this, but he did call Nvidia out on their bullshit a couple of weeks ago, which I thought was actually a very good video. Credit where credit is due.
Remember folks, AMD CHOOSES not to compete, because Nvidia's pricing is so ludicrous that even being a distant 2nd is more profitable than actually trying to compete.
For those of you who say Nvidia's performance is stagnating
For those of you who say they are overcharging for little to no improvement
For those of you who see their scummy tactics
... AMD still chooses not to compete and prices their products in line with Nvidia, with similar VRAM and performance.
So if you're disappointed with the 5060's 8GB VRAM and 25% uplift over a 3060 at the same price as 4 years ago...
Just wait till you see AMD's 8GB VRAM and 25% uplift over the 3060 at the same price as 4 years ago...
If you think Nvidia is holding back (and they are) but AMD isn't? You're truly lost.
I've said this same thing before. If all these techtubers held all parties equally accountable, we would be in a much better place today. Let's also not forget these techtubers absolutely have a vested interest in getting supplied GPUs early and free of charge. Steve loves to harp that their merch sales allow them to be impartial, but that's horseshit. He knows full well that if he purchased products after release like any consumer would, which would absolutely make them more impartial, his channel would be dead within a year or two. The monetary incentive to be biased is unquestionable.

Could you imagine if Steve actually held AMD to the same standards and raked them over the coals as thoroughly as he has Nvidia when AMD straight-up lied about MSRP? He's barely even mentioned it. I think he mentioned it once in a single video. And how many "Nvidia bad!" videos has he released in the same time period now? In the last 3 months he has made over a dozen videos rightfully calling out Nvidia on their bullshit, but has only released one single video regarding AMD's fake MSRP.

I will say this though: he knows how to get people to watch his childish digs at Nvidia, and by extension to get that sweet, sweet engagement. His thumbnails have reached moronic levels equal to "professional content creators".
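As a quick sanity check on the "25% uplift at the same price over 4 years" figures quoted a couple of posts up: compounded annually, that is a strikingly small perf-per-dollar gain. A rough sketch; the 1.25x and 4-year inputs come from the post above, everything else is just arithmetic:

```python
# Compound the quoted generational uplift into a per-year figure.
# Inputs from the post above: ~1.25x performance, same price, 4 years.
total_uplift = 1.25
years = 4
annual = total_uplift ** (1 / years) - 1
print(f"~{annual * 100:.1f}% perf-per-dollar gain per year")  # ~5.7%
```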
I don't remember. I've seen a couple accidentally reveal themselves on forums over the years, by doing things like replying to their own posts as if they were logged into another account. Mostly they're very good at what they do, though, and so don't get caught, as demonstrated by accounts here and elsewhere on the internet that date back to 2005 and are still posting (probably operated by different people now, though). You can get some idea of which ones they are by the way they upvote each other quite consistently.
As much as I hate what they're doing, Nvidia seems to be the only company pushing shit forward. They might have ulterior or at least monopolistic goals in mind, but still. When is the last time AMD rolled out major new tech that wasn't just a poor man's response to something Nvidia had just launched?
Wow dude. That man certainly lives rent-free in so many heads. You know, the worst thing isn't unbridled hate, it's apathy.

"that black leather jacketed midget who doesn't give a damn about me or you"
100% agree...

I'd argue most of the "shit" Nvidia has pushed forward is just copium.
1.) They wanted to lock AMD and others out of the market, so they came up with RT, which was more about making developers' lives easier than it was about improving things for gamers. They knew developers would immediately embrace it, and as a bonus the competition didn't have the ability to keep up, so they were going to be locked out of top-tier GPU capability. (Similar to what they did with Hairworks/Gameworks years ago.)
RT wasn't necessary. It didn't significantly move graphics forward. There were lots of rasterization tricks that could be (and were) used to create similar effects. Raster titles looked better before RT took off than titles look now with RT disabled, running in raster mode. And this was very much by design: push something upstream of the customers (the game developers) that you know they will love, and then corner the market downstream as users won't feel they have a choice.
2.) They can't (or don't want to) make faster GPUs (or sufficiently faster GPUs), so instead they are sprinkling AI-assisted copium in the form of scaling and frame generation on top of everything like it is the second coming of Jesus Christ.
I'd argue that the last new thing Nvidia did that actually benefited gamers was G-Sync. But even then, they did it in a way that required a needlessly expensive module in the monitor that they could license, and that locked buyers into only using Nvidia GPUs or they would lose the capability of their fancy new monitor. They knew people buy GPUs way more often than they do monitors, so if they could lock them in with a monitor that wouldn't work to its full potential with the competition, buyers would feel forced to buy Nvidia GPUs.
Time and time again, Nvidia has the opportunity to compete the right way, by collaborating with Microsoft and others across the industry to create common standards, and then competing to produce the absolute best product they can that complies with those standards. And time and time again they instead decide to use sleazy business tactics that manipulate users into buying their products not because they really want to, but because they feel they don't have a choice.
You know like:
- I'd try AMD this gen, but if I do, Hairworks/Gameworks titles will look and run like shit, and who knows when a title important to me will use Hairworks/Gameworks?
- I'd try AMD this gen, but if I do, my fancy expensive monitor's G-Sync won't work.
- I'd try AMD this gen, but if RT becomes a hard requirement (or even a soft one, in that new games look like shit without it), then Nvidia is the only choice.
Nvidia operates by attempting to divide the PC market and remove user choice. In a way, that's the old Intel approach of the late 90's through early 2000's (but with fewer lawsuits).
AMD may not be a boy scout (let's face it, no tech corporation is your friend; they are all just in it for the money), but at least their approach is to unify the market behind open standards, champion those open standards, and do their best to compete within them. It is because of AMD that FreeSync, and by extension VRR, exists. They gave users more choice, not less. And eventually even Nvidia reluctantly adopted VRR (by calling it "G-Sync Compatible").
What makes the PC industry great and successful is the fact that it is built on open standards that no one corporation controls, and it allows people to customize systems to their needs using common interfaces. Both IBM and Intel tried to control it, and both failed. I hope the same happens with Nvidia.
If it were up to Nvidia, they would have their own proprietary PCI Express standard, and if your motherboard (which had a $200 Nvidia license fee added to it) didn't support the custom interface, you couldn't use their GPUs. In Nvidia's world, there would be 12 different proprietary USB-like standards, none of which worked with each other, and all of which were leveraged to the absolute maximum to try to corner markets and twist customers' arms into making purchasing decisions that were not in their best interest.
Nvidia is a fucking cancer on the PC market, just like how Intel used to be a fucking cancer on the PC market.
They only innovate in order to divide, destroy, and lock out and/or lock in, and I fucking hate how successful they have been at doing it.
If there were any justice on this planet, they should have been broken up by the DOJ 15 years ago.
And yes. I still buy their products, because often I don't feel like I have a choice.
Well, if it isn't 8K, it will be something else that will not be up to your standards.

Nah.
Nobody said you had to be, but if everyone thought like you, there would never be any chance of anyone else replacing nVidia.

"I'm not a charity."