Jabroni31169
My Future Son-in-Law
- Joined
- Apr 19, 2000
- Messages
- 10,636
I'm still rocking my 3080ti, depending upon reviews the 5080 seems like a decent upgrade for me...maybe wait on the ti models unless the 5090's are easy to get...LOL
This is a great take on things, I've felt the same way since around the 3xxx series, where nvidia started to plot the new course on gaming. I don't know why so many people hold onto the "old tech" of raster, etc. We are where we are in terms of gaming now FROM past innovations, otherwise we would all still be playing DOS-like games! Quite frankly, IDGAF how my game is rendered, as long as it looks great and plays great. nVidia has actually been taking a risk over time going in this direction, and while so many people seem to hate on it now because it is not "perfect" yet, it will be where we all end up someday thanks to the risks and development being taken now. This is why I love always getting into new technology and buying the best I can in PC parts. I find it exciting to be a part of, honestly!

Then again, even in the past when "AA" became a thing, I was enabling it when people were crying about its performance hit not being worth it. Different strokes for different folks.
Bigger die size with almost a third more cores, a 4-ish percent lower official boost clock, but a significantly wider memory bus and faster memory for a total of nearly 80% more memory bandwidth. Granted, most of that bandwidth is probably there to supply fake frames for the not-really-AI tensor cores, but I am still interested.
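Back-of-the-envelope on that bandwidth figure, assuming the commonly reported specs (384-bit bus with 21 Gbps GDDR6X on the 4090 vs. 512-bit with 28 Gbps GDDR7 on the 5090 — treat these as assumptions until reviews confirm them):

```python
# Rough peak-bandwidth comparison; the bus widths and data rates are assumed specs.
def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s = bus width in bytes * per-pin data rate."""
    return (bus_width_bits / 8) * data_rate_gbps

gb_4090 = bandwidth_gbs(384, 21.0)   # ~1008 GB/s
gb_5090 = bandwidth_gbs(512, 28.0)   # ~1792 GB/s
print(f"4090: {gb_4090:.0f} GB/s, 5090: {gb_5090:.0f} GB/s, "
      f"uplift: {100 * (gb_5090 / gb_4090 - 1):.0f}%")   # ~78% more
```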
I'll pass thanks.
I might even pass until iGPU machines kill it at 1440p.
Me neither. I'm enthusiastic that this generation is more a return to a reasonably sized FE lineup. Same for the pricing. It's just about the best we could have hoped for, given nVIDIA has no real competition.

Oh jeeze, it's two slots... now I want one. Not enough to spend two grand, but I love that they're getting smaller again. I bet the AIB cards will all be monstrosities, I can't stand all of these oversized coolers.
I'm still rocking my 3080ti, depending upon reviews the 5080 seems like a decent upgrade for me...maybe wait on the ti models unless the 5090's are easy to get...LOL
They mentioned memory efficiency and CPU overhead improvements for frame generation for both the 4000 and 5000 series.

For easy reference:
View attachment 702252
As for the multi frame DLSS, how hard will this hit VRAM? Because frame gen is already a VRAM hog, and one has to question how useful it will be in practice on a 5070 with 12GB of VRAM. Will GDDR7 make a difference here?
The 5070 seems a bit underwhelming. Its CUDA core count is not much higher than the 4070's, and lower than the 4070 Super's. Yes, I know the two aren't directly comparable, but still, it seems like a small increase.
Almost feels like they want to push 5070ti & 5090 sales over 5070 and 5080.
User error.

View attachment 702259
View attachment 702260
This concerns me; wish they'd do what Battlemage did, but with 4x full-sized 8-pinners.
View attachment 702261
11% energy efficiency and 6% density improvements over N5.

How different is the node for the 5090 from the 4090? If it's purely architectural gains for the most part, it's kind of impressive. Still though, if I owned a 4090 I wouldn't be in a hurry to plunk down 2 grand.
For Far Cry 6, which is a shitty example to use since it's one of only two games I know of that perform extremely poorly on NVIDIA cards.

In raw performance, AKA comparing identical settings and using no frame-gen, the 5090 is about 20-30% faster than the 4090, by Nvidia's own graphs.
It's the architecture, not the node. The big mistake people are making is thinking that there is a direct correlation between the core count on Lovelace and Blackwell, coming to the conclusion that it's "literally impossible" for the 5070 to be as fast as the 4090.

Not much process improvement this time, while the 40xx gen got a big boost going from Samsung 8 to TSMC 4.
5070 Ti is probably the best deal. Cut-down 5080 die with the same 16GB.
You only need a 1000W power supply for a 5090.

Will a Seasonic PRIME PX-2200 80Plus Platinum ATX 3.1 PCIe 5.1 2024 2200W be fine for a 5090 Gigabyte Gaming OC?
https://seasonic.com/pl/atx3-prime-px-2200/
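Rough power-budget math on why 1000W is enough, with every wattage below an illustrative assumption (the ~575 W figure is the commonly cited board power for the 5090, not something I've measured):

```python
# Crude PSU headroom check; all wattages are assumptions for illustration only.
components_w = {
    "RTX 5090 at full board power (assumed ~575 W)": 575,
    "CPU under a typical gaming load (assumed)": 150,
    "Motherboard, RAM, drives, fans (assumed)": 75,
}
total = sum(components_w.values())
for psu_w in (1000, 2200):
    print(f"{psu_w} W PSU: estimated load ~{total} W, "
          f"~{100 * total / psu_w:.0f}% utilization, "
          f"~{psu_w - total} W headroom")
```

Either way, the 2200W Seasonic is massive overkill for a single 5090.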
Yes.

Let's be honest though, is Ray Tracing really that important? The visual fidelity increase (small) versus the performance hit (large) has always been a poor value.
Pure rasterization will be a thing of the past in 5 years.

Aside from the cool technical aspects, all this AI shit and fake frame generation garbage is not moving the needle forward. Its raster performance isn't really massively improving. Maybe these cards are 25-30% faster than the previous generation, and that's probably generous given that the stack below the 5090 is massively cut down.
The 5080 at a grand is interesting, but I'm circumspect. Being half of the 5090 is a bit rough; it's not even close to the top-tier card.
We will see how all this stuff washes out in benchmarks later. This new frame generation stuff is an artificial way to get FPS, but you're not even looking at natively rendered frames anymore. AI FPS faking isn't performance in the normal sense.
I suspect what we are getting here are AI datacenter cards that have been reworked to act as gaming GPUs: a shitload of AI frame generation to give you FPS, some better RT, and it locks you completely into the Nvidia ecosystem.
His statement about all of us on our $10,000 Command Center PCs was all I needed to hear to know how tone-deaf this CEO and company are to the denizens of the world.
I had other ideas but I am getting swarmed at the helpdesk...
It's cool that you're cool with whatever Nvidia gives you. You're getting a datacenter AI card that's not meant for gaming, that's been repurposed to run as a gaming GPU and deliver these stellar FPS gains because it's making frames that only contain 25% of the original content and using AI to render the rest of it. That doesn't sit well with me. I don't want to struggle with artifacts, weird latency penalties and anything that can't actually display a game in all its glory without creating fake shit to deliver FPS that won't mean shit to most people anyway... at the end of the day. 360 FPS upscaled... 30 FPS without upscaling... hm, something is wrong there...

This is a great take on things, I've felt the same way since around the 3xxx series, where nvidia started to plot the new course on gaming. I don't know why so many people hold onto the "old tech" of raster, etc. We are where we are in terms of gaming now FROM past innovations, otherwise we would all still be playing DOS-like games! Quite frankly, IDGAF how my game is rendered, as long as it looks great and plays great. nVidia has actually been taking a risk over time going in this direction, and while so many people seem to hate on it now because it is not "perfect" yet, it will be where we all end up someday thanks to the risks and development being taken now. This is why I love always getting into new technology and buying the best I can in PC parts. I find it exciting to be a part of, honestly!

Then again, even in the past when "AA" became a thing, I was enabling it when people were crying about its performance hit not being worth it. Different strokes for different folks.
It's the architecture, not the node. The big mistake people are making is thinking that there is a direct correlation between the core count on Lovelace and Blackwell, coming to the conclusion that it's "literally impossible" for the 5070 to be as fast as the 4090.
It is very noisy.

Let's be honest though, is Ray Tracing really that important? The visual fidelity increase (small) versus the performance hit (large) has always been a poor value.
Well, I guess I will hang onto it for the next five years then. I will probably never turn on the frame generation in my lifetime. I might just be getting too old for this shit.

They mentioned memory efficiency and CPU overhead improvements for frame generation for both the 4000 and 5000 series.
User error.
11% energy efficiency and 6% density improvements over N5.
For Far Cry 6, which is a shitty example to use since it's one of only two games I know of that perform extremely poorly on NVIDIA cards.
It's the architecture, not the node. The big mistake people are making is thinking that there is a direct correlation between the core count on Lovelace and Blackwell, coming to the conclusion that it's "literally impossible" for the 5070 to be as fast as the 4090.
You only need a 1000W power supply for a 5090.
Yes.
Pure rasterization will be a thing of the past in 5 years.
The $10k PC comment was a joke.
This is a great take on things, I've felt the same way since around the 3xxx series, where nvidia started to plot the new course on gaming. I don't know why so many people hold onto the "old tech" of raster, etc. We are where we are in terms of gaming now FROM past innovations, otherwise we would all still be playing DOS-like games! Quite frankly, IDGAF how my game is rendered, as long as it looks great and plays great. nVidia has actually been taking a risk over time going in this direction, and while so many people seem to hate on it now because it is not "perfect" yet, it will be where we all end up someday thanks to the risks and development being taken now. This is why I love always getting into new technology and buying the best I can in PC parts. I find it exciting to be a part of, honestly!

Then again, even in the past when "AA" became a thing, I was enabling it when people were crying about its performance hit not being worth it. Different strokes for different folks.
A shooting war with the United States would be stupid. We're their biggest customer.

I have mixed feelings. On one hand, you're correct. Even raster itself employs many tricks to get closer to "life". Shadows, rays, etc. The question has always been about "what's the real deal?" Is raster the complete removal of any "fakery"? No, not really. Raster is a lot of fakery, too (at least afaik). Is "real life" the actual source of truth? Then neither tech is inherently wrong. AI is simply approaching it from a different angle.
Here's what I predict happens (or is going to happen) when they train DLSS and whatnot: huge sets of GPUs actually perform the hard calculations for various scenarios at various angles, and then DLSS and whatnot is actually directly trained on the outcome. Then, these results are used to more quickly approximate the actual outcome. Instead of running ray tracing and all this stuff in real time, results are effectively being "cached" and used for interpolation and approximation. But I could be totally off base here.
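If that guess is anywhere near right, conceptually it's just supervised learning against expensive offline renders. Here's a toy sketch of the idea, purely illustrative and definitely not NVIDIA's actual pipeline (the model, data, and shapes are all made up):

```python
# Toy illustration of "train a cheap approximator on expensive ground truth".
# Not NVIDIA's pipeline; random tensors stand in for real frame data.
import torch
import torch.nn as nn

cheap_net = nn.Sequential(                 # small model intended to run in real time
    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 3, 3, padding=1),
)
optimizer = torch.optim.Adam(cheap_net.parameters(), lr=1e-3)

for _ in range(100):
    fast_frame = torch.rand(8, 3, 64, 64)     # stand-in for a cheap raster / low-sample frame
    ground_truth = torch.rand(8, 3, 64, 64)   # stand-in for an offline, fully path-traced frame
    loss = nn.functional.l1_loss(cheap_net(fast_frame), ground_truth)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# At runtime only cheap_net executes, approximating the result of the expensive render.
```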
Anyway on the other hand, socioeconomically I don't think we're ready for this. That's the elephant in the room. I personally feel like a tech dystopia is just on the horizon (well more like "has been on the horizon", from my viewpoint; I talked about this a long time ago already) unless someone curbs it. That, and maybe WW3 between us and China because I have no idea how Jensen is going to curb these things enough that China won't just smuggle them in anyway.
So, I have been using Frame Generation in Stalker 2, with DLAA, at 4K, max settings. There are absolutely no artifacts or anything I can see with my eyes playing the game (I have tried). Does that mean it is not there? No, of course not, but unless I am missing something, this feels like a mountain-out-of-a-molehill type issue. In terms of "feel", I dunno, it feels great to me and extremely smooth. I do not notice any input lag whatsoever in that game. It has either improved, or it was implemented well on this particular game. I can admit CP2077 did have SOME input lag it felt like, but the game was still very enjoyable with it.

It's cool that you're cool with whatever Nvidia gives you. You're getting a datacenter AI card that's not meant for gaming, that's been repurposed to run as a gaming GPU and deliver these stellar FPS gains because it's making frames that only contain 25% of the original content and using AI to render the rest of it. That doesn't sit well with me. I don't want to struggle with artifacts, weird latency penalties and anything that can't actually display a game in all its glory without creating fake shit to deliver FPS that won't mean shit to most people anyway... at the end of the day. 360 FPS upscaled... 30 FPS without upscaling... hm, something is wrong there...
It's the architecture, not the node. The big mistake people are making is thinking that there is a direct correlation between the core count on Lovelace and Blackwell, coming to the conclusion that it's "literally impossible" for the 5070 to be as fast as the 4090.
Pure rasterization will be a thing of the past in 5 years.
The $10k PC comment was a joke.
Also to further cement the AI thing, here's a text message I got from Microcenter just now.
View attachment 702298
I'm not sure if this actually has any official link to what Nvidia themselves want to market, but the AI capabilities seem to be what these things are riding on.
I'm not arguing with you. You like what you like. I'm an oddball. I can perceive some weird shit. They say your auditory and visual acuity essentially ends around 20; I'm 51, and as deaf as I am for some things, and as blind as I am for reading fine text, I can perceive the most minute details in sound systems and frequencies when I'm tuning multipositional audio (which I miss terribly on the PC), and I can, literally, isolate artifacts (I always see them) on the screen. I can always tell when there is a pixel issue, and it's made me something of a display savant even though I don't have the latest and greatest shit.

So, I have been using Frame Generation in Stalker 2, with DLAA, at 4K, max settings. There are absolutely no artifacts or anything I can see with my eyes playing the game (I have tried). Does that mean it is not there? No, of course not, but unless I am missing something, this feels like a mountain-out-of-a-molehill type issue. In terms of "feel", I dunno, it feels great to me and extremely smooth. I do not notice any input lag whatsoever in that game. It has either improved, or it was implemented well on this particular game. I can admit CP2077 did have SOME input lag it felt like, but the game was still very enjoyable with it.
I still don't understand the "fake frames" argument either... it's all fake because it's all PC-generated. You have to start somewhere in order to advance how games are rendered; people can't expect rasterization to be the end-all-be-all until the end of time, can they?
Breaking down everything for my PC, including decade-old hardware that it inherited, comes out to $6,605 total when accounting for what I actually paid. I am including controllers and external hard drives that are not exclusively used on just this PC. So I guess it's not too far off.

Well, I guess I will hang onto it for the next five years then. I will probably never turn on the frame generation in my lifetime. I might just be getting too old for this shit.
Maybe he was joking, but it's really not a joke anymore... His statement isn't too far off the mark. Anyone dropping in a $2,000 graphics card likely already has 2-3 grand invested in the rest of their rig, at least. That's monitors, peripherals, etc. It didn't sound like a joke; it sounded like he was referring to his own rig...
Raytracing is the future, there is no question about that.
But the frame generation lies are what really get me, advertising those framerates as real. FG is basically an advanced version of the motion interpolation found in every TV, which all gamers should turn off because of the lag.
Sure, GPU FG is much better and more advanced, it has more data to work with and it can do it quicker, but it still has to buffer two frames so it can inject those "guesstimated" frames in between. We are taking a step back as far as input lag is concerned. And the downsides of frame generation get worse the lower your starting framerate is, which is exactly the situation where it is needed the most. So far it works best when you are already running the game fast, like 90fps, and want to take it up to your monitor's 120 or 144fps. But at that point it is just icing on an already good cake, and no matter how good the icing is, if the cake is shit then it cannot save it.
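To put rough numbers on why it gets worse at low framerates, here's a simplified model where the interpolator has to hold the newest real frame back by one frame interval before it can present anything (real pipelines differ, so treat this as ballpark only):

```python
# Rough added-latency estimate for frame interpolation.
# Simplified assumption: roughly one real-frame interval of extra buffering.
for base_fps in (30, 60, 90, 120):
    frame_time_ms = 1000 / base_fps
    print(f"{base_fps:3d} fps base: frame time {frame_time_ms:5.1f} ms, "
          f"~{frame_time_ms:5.1f} ms extra latency from buffering")
```

At a 30fps base you are adding something like 33 ms on top of everything else, while at 120fps it is only ~8 ms, which is why it only feels acceptable when the base framerate is already high.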
Also, EU prices are out. Almost 1000€ for a 5070 Ti!? Are you fucking kidding me!? With these kinds of prices, PC gaming is going to die. As a teen I could buy myself a nice gaming PC with what I earned working during summer breaks. Now, as an adult, I have to really ponder whether I will keep up with my hobby or switch to console gaming instead.
I have mixed feelings. On one hand, you're correct. Even raster itself employs many tricks to get closer to "life". Shadows, rays, etc. The question has always been about "what's the real deal?" Is raster the complete removal of any "fakery"? No, not really. Raster is a lot of fakery, too (at least afaik). Is "real life" the actual source of truth? Then neither tech is inherently wrong. AI is simply approaching it from a different angle.
Here's what I predict happens (or is going to happen) when they train DLSS and whatnot: huge sets of GPUs actually perform the hard calculations for various scenarios at various angles, and then DLSS and whatnot is actually directly trained on the outcome. Then, these results are used to more quickly approximate the actual outcome. Instead of running ray tracing and all this stuff in real time, results are effectively being "cached" and used for interpolation and approximation. But I could be totally off base here.
Anyway on the other hand, socioeconomically I don't think we're ready for this. That's the elephant in the room. I personally feel like a tech dystopia is just on the horizon (well more like "has been on the horizon", from my viewpoint; I talked about this a long time ago already) unless someone curbs it. That, and maybe WW3 between us and China because I have no idea how Jensen is going to curb these things enough that China won't just smuggle them in anyway.
D4 isn't using 2D sprites.

If 2-D sprites were still around, I think you would see a better crop of games on the market. For about the past 5-7 years, 2-D games using 3-D sprites like Diablo 4 have been making a dent in the gaming industry. The major problem with a lot of 3-D games is that they all look similar. Once in a while, like with Stalker 2 that just came out, you get a decent 3-D game that actually stands out from the rest.
It is very noisy
And it seems you can replicate the effects and experience with rasterization:
View: https://youtu.be/K3ZHzJ_bhaI?si=U5X3aGyFMUpqBXU2
I would argue that Raytracing is the Now.
While I have only heard of one title that requires it to always be on, it is in pretty much every title at this point, and ever since the introduction of RT, running raster-only looks worse than it did before, meaning unless you use RT now, you get a gimped product.
Agreed.
Absolutely agreed. Frame gen makes the screen motion look smoother, until you grab the mouse and realize that your input lag is shit.
I would argue that unless you can get 60+fps without frame gen, enabling frame gen only results in a smoother terrible experience with bad input lag. And there is nothing any AI technology can do to change that.
I am going to go out on a limb and guess that the EU prices are quoted with VAT/Sales tax? The U.S. prices are always quoted without.
That explains a large chunk of the difference, but not all of it.
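Rough math, assuming the $749 US MSRP for the 5070 Ti, a ~0.96 USD→EUR exchange rate, and ~21% VAT (all three of those are assumptions, and VAT varies between roughly 19% and 25% by country):

```python
# Rough US-MSRP -> expected-EU-price comparison; every input here is an assumption.
usd_msrp = 749          # 5070 Ti US MSRP, quoted pre-tax
usd_to_eur = 0.96       # assumed exchange rate around launch
vat = 0.21              # assumed VAT rate; varies ~19-25% across the EU
expected_eur = usd_msrp * usd_to_eur * (1 + vat)
print(f"~€{expected_eur:.0f} expected vs ~€1000 quoted "
      f"(gap ~€{1000 - expected_eur:.0f})")
```

So VAT plus the exchange rate gets you to roughly €870, leaving €100+ unexplained.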
Yeah, if you include the PC I have attached to my home theater and short-throw projector... I'm in that 10K+ investment. But that took me over a decade to assemble... My main rig is more like $2500 ($1,000 of it is the 7900XTX).

Breaking down everything for my PC, including decade-old hardware that it inherited, comes out to $6,605 total when accounting for what I actually paid. I am including controllers and external hard drives that are not exclusively used on just this PC. So I guess it's not too far off.
Yea, that's a non-zero possibility. Do like me: sell exactly 31-35 days before launch/announcement and use a back-up card for a bit. Sold my 4090 on Dec 3 for $1700.

I am worried the guy that bought my 4090 last week for $2049 plus tax on eBay will have buyer's remorse and claim the card is broken...
I might even pass until iGPU machines kill it at 1440p.
Yeah. I think the problem with the iGPU thing is also that if you want that "Strix Halo" like performance, you're gonna be looking at a gaming system, and by the time it comes out it might be adequate for older titles but it won't be anything special for new ones. Now, it might allow a lot more power efficiency on a laptop, and that's what I'm interested in. A couple hours of cordless gaming (on a laptop, not a tiny-ass gaming handheld) is a myth right now.

Good luck with that.
I'm not convinced that day will come for a very very long time.
iGPU's keep getting faster, for sure, but so do the demands of new titles.
In the last few years we've even seen titles increase their demands faster than the GPU technology is advancing, leading to some titles being rather difficult to drive.
The only way the iGPU is going to get to the "adequacy at 1440p" level is likely if you only want to play older titles.
Good luck with that.
I'm not convinced that day will come for a very very long time.
iGPU's keep getting faster, for sure, but so do the demands of new titles.
In the last few years we've even seen titles increase their demands faster than the GPU technology is advancing, leading to some titles being rather difficult to drive.
The only way the iGPU is going to get to the "adequacy at 1440p" level is likely if you only want to play older titles.
Interesting that the card is not exhausting heat out of the case.
First look at DLSS4. Most of these features are also compatible with the RTX 4000 cards and earlier!
View: https://www.youtube.com/watch?v=xpzufsxtZpA
Trillions of operations per second, measured in INT8.

WTF is an AI TOP?
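For context on how a number like that gets assembled, it's just peak low-precision throughput: ops per clock per unit × number of units × clock speed. A generic calculation with made-up inputs, just to show the shape of the math (these are not actual Blackwell specs):

```python
# Generic "TOPS" arithmetic with hypothetical numbers; not real GPU specs.
ops_per_clock_per_core = 512     # assumed low-precision ops per tensor core per clock
tensor_cores = 400               # assumed number of tensor cores
clock_hz = 2.5e9                 # assumed boost clock (2.5 GHz)
tops = ops_per_clock_per_core * tensor_cores * clock_hz / 1e12
print(f"~{tops:.0f} TOPS peak (hypothetical example)")
```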
I love my eye candy too, but frames/smoothness has its own unique benefit to the quality of the image. Like I'm not going to go from 144 FPS to 65 FPS just for slightly better reflections in limited areas.