RTX 5090 - $2000 - 2 Slot Design - available on Jan. 30 - 12VHPWR still

This is a great take on things, I've felt the same way since around the 3xxx series, where nvidia started to plot the new course on gaming. I don't know why so many people hold onto the "old tech" of raster, etc. We are where we are in terms of gaming now FROM past innovations, otherwise we would all still be playing DOS like games! Quite frankly, IDGAF how my game is rendered, as long as it looks great and plays great. nVidia has actually been taking a risk over time going in this direction, and while so many people seem to hate on it now because it is not "perfect" yet, it will be where we all end up someday thanks to the risks and development being taken now. This is why I love always getting into new technology and buying the best I can in PC parts. I find it exciting to be a part of honestly! :) Then again, even in the past when "AA" became a thing, I was enabling it when people were crying about its performance hit not being worth it. Different strokes for different folks.

No one is stopping you from blowing over 2 grand on the card. But we can agree to disagree on fake frames being a boon for gamers.
 
Bigger die size with almost a third more cores, though with a 4-ish percent lower official boost clock, but with a significantly wider memory bus and faster memory for a total of nearly 80% more memory bandwidth. Granted most of that bandwidth is probably to supply fake frames for the not-really-AI tensor cores, but I am still interested.
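For anyone who wants to check that bandwidth math, here's a quick back-of-envelope sketch assuming the published specs (384-bit GDDR6X at 21 Gbps on the 4090 vs 512-bit GDDR7 at 28 Gbps on the 5090):

```python
# Peak memory bandwidth from bus width and per-pin data rate.
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8  # GB/s

bw_4090 = bandwidth_gbs(384, 21)  # ~1008 GB/s
bw_5090 = bandwidth_gbs(512, 28)  # ~1792 GB/s
print(f"4090: {bw_4090:.0f} GB/s, 5090: {bw_5090:.0f} GB/s, "
      f"uplift: {bw_5090 / bw_4090 - 1:.0%}")  # roughly +78%
```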

I'm going to try to keep an open mind until it launches.

While I hate all of this DLSS crap, if it is true, and they actually demonstrate a real performance increase over the 4090 without DLSS I will consider it, provided it is a significant improvement. (I'm not going to drop 2 grand for a paltry few percent)

The thing is, I'd buy a high end GPU from someone else, but.... AMD dropped out of that race, and Intel - while they claim high end enthusiast GPUs are coming - isn't there yet.

The truth is, the 4090 isn't fast enough (unless you resort to AI fakery, and even then only barely), and I need more power.
 
I'll pass thanks.

I might even pass until iGPU machines kill it at 1440p.

I was thinking this was the year I would upgrade. I'm open to NV if the reviews/price is right. I have to admit I felt the same way last night... I thought you know I could probably just get some sort of mini desktop with a Strix Halo and reclaim some desk space. lol
 
Oh jeez, it's two slots... now I want one. Not enough to spend two grand, but I love that they're getting smaller again. I bet the AIB cards will all be monstrosities; I can't stand all of these oversized coolers.
Me neither. I'm enthusiastic that this generation is more of a return to a reasonably sized FE lineup. Same for the pricing. It's just about the best we could have hoped for, given nVIDIA has no real competition.
 
I'm still rocking my 3080ti, depending upon reviews the 5080 seems like a decent upgrade for me...maybe wait on the ti models unless the 5090's are easy to get...LOL

Depends how much you want fake frames in the, what did they say, 60 or 70 games that will support them. It really sounds like the actual raster upgrade is very slight. At least wait for the eventual Super version.
 
For easy reference: [attached image]

As for the multi frame DLSS, how hard will this hit VRAM? Because frame gen is already a VRAM hog, and one has to question how useful it will be in practice on a 5070 with 12GB of VRAM. Will GDDR7 make a difference here?
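Nobody outside Nvidia knows the real answer yet, but a crude floor on the buffer cost looks something like this (assuming one RGBA16F color buffer per generated frame; real frame-gen overhead is reported to be a good deal higher once model state, flow data and history buffers are counted):

```python
# Rough floor on VRAM for buffered/generated frames, not a real estimate.
def frame_buffer_mb(width, height, bytes_per_pixel=8):  # RGBA16F assumption
    return width * height * bytes_per_pixel / 1e6

for w, h in ((1920, 1080), (2560, 1440), (3840, 2160)):
    per_frame = frame_buffer_mb(w, h)
    print(f"{w}x{h}: ~{per_frame:.0f} MB per frame, "
          f"~{3 * per_frame:.0f} MB for three generated frames")
```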

The 5070 seems a bit underwhelming. CUDA core count is not much higher than the 4070's, and less than the 4070 Super's. Yes, I know the two aren't directly comparable, but still, it seems like a low increase.
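For concrete numbers, going by the core counts on Nvidia's spec pages (5888 for the 4070, 7168 for the 4070 Super, 6144 for the 5070):

```python
cores = {"4070": 5888, "4070 Super": 7168, "5070": 6144}
for name in ("4070", "4070 Super"):
    print(f"5070 vs {name}: {cores['5070'] / cores[name] - 1:+.1%}")
# roughly +4% vs the 4070 and about -14% vs the 4070 Super
```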

Almost feels like they want to push 5070ti & 5090 sales over 5070 and 5080.
They mentioned memory efficiency and CPU overhead improvements for frame generation for both the 4000 and 5000 series.

This concerns me; I wish they'd do what Battlemage did, but with 4x full-sized 8-pin connectors.
User error.
How different is the node for the 5090 from the 4090? If it's purely architectural gains for the most part, it's kind of impressive. Still though, if I owned a 4090 I wouldn't be in a hurry to plunk down 2 grand.
11% energy efficiency and 6% density improvements over N5.
In raw performance, AKA comparing identical settings and using no frame gen, the 5090 is about 20-30% faster than the 4090, by Nvidia's own graphs.
For Far Cry 6, which is a shitty example to use since it's one of only two games I know of that perform extremely poorly on NVIDIA cards.
Not much process improvement this time while 40xx gen got a big boost going from Samsung 8 to TSMC 4.

5070 Ti is probably the best deal. Cut-down 5080 die with the same 16GB.
It's the architecture, not the node. The big mistake people are making is thinking that there is a direct correlation between the core count on Lovelace and Blackwell, coming to the conclusion that it's "literally impossible" for the 5070 to be as fast as the 4090.
Will a Seasonic PRIME PX-2200 (80Plus Platinum, ATX 3.1, PCIe 5.1, 2024, 2200W) be fine for a 5090 Gigabyte Gaming OC?

https://seasonic.com/pl/atx3-prime-px-2200/
You only need a 1000W power supply for a 5090.
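Rough math behind that, assuming Nvidia's 575W board power figure for the 5090 and a ballpark 250-300W for a high-end CPU plus the rest of the system (that part obviously varies by build):

```python
gpu_w = 575             # 5090 total board power per Nvidia's spec sheet
rest_of_system_w = 300  # CPU, motherboard, drives, fans (rough assumption)
print(f"~{gpu_w + rest_of_system_w}W steady state")
# ~875W, so a quality ATX 3.x 1000W unit is in the right range;
# a 2200W PRIME PX-2200 is far more than the card needs.
```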
Let's be honest though, is Ray Tracing really that important? The ratio of visual fidelity increase (small) to performance hit (large) has always been poor value.
Yes.
Aside from the cool technical aspects, all this AI shit and fake frame generation garbage is not moving the needle forward. Its raster performance isn't really massively improving. Maybe these cards are 25-30% faster than the previous generation, and that's probably generous given that the stack below the 5090 is massively cut down.

The 5080 at a grand is interesting but suspect. Half of the 5090 is a bit rough; it's not even close to the top tier card.

We will see how all this stuff washes out in benchmarks later. This new frame generation stuff is an artificial way to get FPS, but you're not even looking at the natively rendered game anymore. AI FPS faking isn't performance in the normal sense.

I suspect what we are getting here are AI datacenter cards that have been reworked to act as gaming GPUs: a shitload of AI frame generation to give you FPS, some better RT, and something that locks you completely into the Nvidia ecosystem.

All I needed to know was his statement about all of us on our $10,000 Command Center PCs to know how tone deaf this CEO and company is to the denizens of the world.

I had other ideas but I am getting swarmed at the helpdesk..........
Pure rasterization will be a thing of the past in 5 years.

The $10k PC comment was a joke.
 
This is a great take on things, I've felt the same way since around the 3xxx series, where nvidia started to plot the new course on gaming. I don't know why so many people hold onto the "old tech" of raster, etc. We are where we are in terms of gaming now FROM past innovations, otherwise we would all still be playing DOS like games! Quite frankly, IDGAF how my game is rendered, as long as it looks great and plays great. nVidia has actually been taking a risk over time going in this direction, and while so many people seem to hate on it now because it is not "perfect" yet, it will be where we all end up someday thanks to the risks and development being taken now. This is why I love always getting into new technology and buying the best I can in PC parts. I find it exciting to be a part of honestly! :) Then again, even in the past when "AA" became a thing, I was enabling it when people were crying about its performance hit not being worth it. Different strokes for different folks.
It's cool that you're cool with whatever Nvidia gives you. You're getting a datacenter AI card that's not meant for gaming, that's been repurposed to run as a gaming GPU and deliver these stellar FPS gains because it's making frames that only contain 25% of the original content and using AI to render the rest of it. That doesn't sit well with me. I don't want to struggle with artifacts, weird latency penalties and anything that can't actually display a game in all its glory without creating fake shit to deliver FPS that won't mean shit to most people anyway... at the end of the day. 360 FPS, upscaled... 30 FPS without upscaling... hm, something is wrong there...
 
It's the architecture, not the node. The big mistake people are making is thinking that there is a direct correlation between the core count on Lovelace and Blackwell, coming to the conclusion that it's "literally impossible" for the 5070 to be as fast as the 4090.

The graph they released shows that the 5070 is not anywhere close to a 4090. They admit they are counting 3 extra fake frames in their 5000 graphs. Yes, if you compare a 5070 displaying 4 frames every time it renders one, it can "match" a 4090 at native.
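The arithmetic being objected to is basically this (the native frame rates below are made-up placeholders, not benchmark numbers):

```python
def displayed_fps(rendered_fps, frame_multiplier):
    # With 4x multi frame generation, 1 rendered frame becomes 4 displayed frames.
    return rendered_fps * frame_multiplier

native_5070 = 30   # hypothetical native frame rate
native_4090 = 110  # hypothetical native frame rate
print(displayed_fps(native_5070, 4), "displayed fps on the 5070")  # 120
print(native_4090, "fps on the 4090, all natively rendered")       # 110
# Similar headline numbers, very different amounts of actual rendering (and input latency).
```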
 
They mentioned memory efficiency and CPU overhead improvements for frame generation for both the 4000 and 5000 series.

User error.

11% energy efficiency and 6% density improvements over N5.

For Far Cry 6, which is a shitty example to use since it's one of only two games I know of that perform extremely poorly on NVIDIA cards.

It's the architecture, not the node. The big mistake people are making is thinking that there is a direct correlation between the core count on Lovelace and Blackwell, coming to the conclusion that it's "literally impossible" for the 5070 to be as fast as the 4090.

You only need a 1000W power supply for a 5090.

Yes.

Pure rasterization will be a thing of the past in 5 years.

The $10k PC comment was a joke.
Well, I guess I will hang onto it for the next five years then. I will probably never turn on the frame generation in my lifetime. I might just be getting too old for this shit ;)

Maybe he was joking but it's really not a joke anymore... His statement isn't too far off the mark. Anyone dropping in a 2,000 dollar graphics card likely already has almost 2-3 grand invested in the rest of their rig, at least. That's monitors, peripherals, etc. It didn't sound like a joke, it sounded like he was referring to his own rig...
 
This is a great take on things, I've felt the same way since around the 3xxx series, where nvidia started to plot the new course on gaming. I don't know why so many people hold onto the "old tech" of raster, etc. We are where we are in terms of gaming now FROM past innovations, otherwise we would all still be playing DOS like games! Quite frankly, IDGAF how my game is rendered, as long as it looks great and plays great. nVidia has actually been taking a risk over time going in this direction, and while so many people seem to hate on it now because it is not "perfect" yet, it will be where we all end up someday thanks to the risks and development being taken now. This is why I love always getting into new technology and buying the best I can in PC parts. I find it exciting to be a part of honestly! :) Then again, even in the past when "AA" became a thing, I was enabling it when people were crying about its performance hit not being worth it. Different strokes for different folks.

I have mixed feelings. On one hand, you're correct. Even raster itself employs many tricks to get closer to "life". Shadows, rays, etc. The question has always been about "what's the real deal?" Is raster the complete removal of any "fakery"? No, not really. Raster is a lot of fakery, too (at least afaik). Is "real life" the actual source of truth? Then neither tech is inherently wrong. AI is simply approaching it from a different angle.

Here's what I predict happens (or is going to happen) when they train DLSS and whatnot: huge sets of GPUs actually perform the hard calculations for various scenarios at various angles, and then DLSS and whatnot is actually directly trained on the outcome. Then, these results are used to more quickly approximate the actual outcome. Instead of running ray tracing and all this stuff in real time, results are effectively being "cached" and used for interpolation and approximation. But I could be totally off base here.
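If that guess is roughly right, the training side would look like garden-variety supervised learning: cheap low-res frames in, expensive offline "ground truth" renders as targets. A minimal PyTorch-flavoured sketch of that idea, with everything here (model, data, names) hypothetical:

```python
import torch
import torch.nn as nn

# Toy 2x upscaler trained to imitate expensive offline renders. Purely illustrative;
# this is not Nvidia's actual DLSS architecture or training pipeline.
class TinyUpscaler(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * 4, 3, padding=1),  # predict a 2x2 block per input pixel
            nn.PixelShuffle(2),                  # rearrange into a 2x upscaled image
        )

    def forward(self, x):
        return self.net(x)

model = TinyUpscaler()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Pretend data: low-res inputs and matching high-quality offline renders.
low_res = torch.rand(4, 3, 270, 480)
ground_truth = torch.rand(4, 3, 540, 960)

loss = nn.functional.l1_loss(model(low_res), ground_truth)
loss.backward()
opt.step()
```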

Anyway on the other hand, socioeconomically I don't think we're ready for this. That's the elephant in the room. I personally feel like a tech dystopia is just on the horizon (well more like "has been on the horizon", from my viewpoint; I talked about this a long time ago already) unless someone curbs it. That, and maybe WW3 between us and China because I have no idea how Jensen is going to curb these things enough that China won't just smuggle them in anyway.
 
I have mixed feelings. On one hand, you're correct. Even raster itself employs many tricks to get closer to "life". Shadows, rays, etc. The question has always been about "what's the real deal?" Is raster the complete removal of any "fakery"? No, not really. Raster is a lot of fakery, too (at least afaik). Is "real life" the actual source of truth? Then neither tech is inherently wrong. AI is simply approaching it from a different angle.

Here's what I predict happens (or is going to happen) when they train DLSS and whatnot: huge sets of GPUs actually perform the hard calculations for various scenarios at various angles, and then DLSS and whatnot is actually directly trained on the outcome. Then, these results are used to more quickly approximate the actual outcome. Instead of running ray tracing and all this stuff in real time, results are effectively being "cached" and used for interpolation and approximation. But I could be totally off base here.

Anyway on the other hand, socioeconomically I don't think we're ready for this. That's the elephant in the room. I personally feel like a tech dystopia is just on the horizon (well more like "has been on the horizon", from my viewpoint; I talked about this a long time ago already) unless someone curbs it. That, and maybe WW3 between us and China because I have no idea how Jensen is going to curb these things enough that China won't just smuggle them in anyway.
A shooting war with the United States would be stupid. We're their biggest customer.

I think, unless they invade Taiwan before the 20th of January... They're gonna STFU and have a pretty quiet 4 years of kissing Trump's ass to remain relevant. Otherwise, their country will economically collapse under the weight of tariffs and sanctions.
 
It's cool that you're cool with whatever Nvidia gives you. You're getting a datacenter AI card that's not meant for gaming, that's been repurposed to run as a gaming GPU and deliver these stellar FPS gains because it's making frames that only contain 25% of the original content and using AI to render the rest of it. That doesn't sit well with me. I don't want to struggle with artifacts, weird latency penalties and anything that can't actually display a game in all its glory without creating fake shit to deliver FPS that won't mean shit to most people anyway... at the end of the day. 360 FPS, upscaled... 30 FPS without upscaling... hm, something is wrong there...
So, I have been using Frame Generation in Stalker 2, with DLAA, at 4K, max settings. There are absolutely no artifacts or anything I can see with my eyes playing the game (I have tried). Does that mean it is not there? No, of course not, but unless I am missing something, it feels like a mountain out of a molehill type issue here. In terms of "feel", I dunno, it feels great to me and extremely smooth. I do not notice any input lag whatsoever in that game. It has either improved, or it was implemented well in this particular game. I can admit CP2077 did have SOME input lag, it felt like, but the game was still very enjoyable with it.

I still don't understand the argument of "fake frames" either... it's all fake because it's all PC generated. You have to start somewhere in order to advance how games are rendered; people can't expect rasterization to be the end-all be-all until the end of time, can they?
 
Also to further cement the AI thing, here's a text message I got from Microcenter just now.

I'm not sure if this actually has any official link to what Nvidia themselves want to market, but the AI capabilities seem to be what these things are riding on.
 
It's the architecture, not the node. The big mistake people are making is thinking that there is a direct correlation between the core count on Lovelace and Blackwell, coming to the conclusion that it's "literally impossible" for the 5070 to be as fast as the 4090.

I can't speak for everyone else, but that is not what I am doing.

I'm looking at perf/watt, and I don't care what they have done with the architecture, there is no way you get that kind of perf/watt increase going from TSMC 4N to TSMC 4NP.

Which means, AI trickery is the only thing that explains their bullshit claims.
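For what it's worth, the perf/watt arithmetic isn't hard to run, using the published board power figures (450W and 575W) and two readings of the uplift: the ~30% raw figure people are pulling off the charts versus the "2x" headline.

```python
tbp_4090, tbp_5090 = 450, 575  # published total board power

for label, perf_uplift in (("~30% raw", 1.30), ("2x headline", 2.00)):
    gain = perf_uplift / (tbp_5090 / tbp_4090) - 1
    print(f"{label}: {gain:+.0%} perf/watt vs the 4090")
# ~+2% for the raw case, ~+57% for the headline case; the 4N -> 4NP step
# (the ~11% efficiency figure quoted earlier) can't cover the latter.
```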

Pure rasterization will be a thing of the past in 5 years.

It feels like that ship has mostly sailed.

It pretty much already is, unless you want a gimped experience.

Like it or not, RT is here to stay.

I just don't want this AI bullshit in the render pipeline.

Leather jacket man can deride it all he wants calling it "brute force" rendering, but the truth remains that unless you render every pixel and every frame natively, you are getting an inferior product. Absolutely nothing will ever change this.

The $10k PC comment was a joke.

What about a desk?
 
Raytracing is the future, there is no question about that. But the frame generation lies are what really get me, advertising those framerates as real. FG is basically an advanced version of the Motion Interpolation found in every TV, which all gamers should turn off because of lag. Sure, GPU FG is much better and more advanced, it has more data to work with and it can do it quicker, but it still has to buffer two frames so it can inject those "guesstimated" frames in between. We are taking a step back as far as input lag is concerned. And the downsides of frame generation get worse the lower your starting framerate is, the situation where it is needed the most. So far it works best when you are already running the game fast, like 90fps, and want to take it up to your monitor's 120 or 144fps. But at that point it is just icing on an already good cake. But no matter how good the icing is, if the cake is shit then it cannot save it.
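The latency side of that is easy to put numbers on: holding a rendered frame back so an interpolated one can be inserted costs roughly one base frame time (a simplification, real pipelines add other overheads too).

```python
for base_fps in (30, 60, 90, 120):
    print(f"{base_fps} fps base: ~{1000 / base_fps:.1f} ms of extra buffering latency")
# ~33 ms at 30 fps vs ~8 ms at 120 fps: the penalty is biggest exactly
# where frame generation would be "needed" the most.
```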

Also, EU prices are out. Almost 1000€ for a 5070 Ti!? Are you fucking kidding me!? With these kinds of prices PC gaming is going to die. As a teen I could buy myself a nice gaming PC with what I earned working during summer breaks. Now as an adult I have to really ponder whether I will keep up with my hobby or go to console gaming instead.
 
So, I have been using Frame Generation in Stalker 2, with DLAA, at 4K, max settings. There are absolutely no artifacts or anything I can see with my eyes playing the game (I have tried). Does that mean it is not there? No, of course not, but unless I am missing something, it feels like a mountain out of a molehill type issue here. In terms of "feel", I dunno, it feels great to me and extremely smooth. I do not notice any input lag whatsoever in that game. It has either improved, or it was implemented well in this particular game. I can admit CP2077 did have SOME input lag, it felt like, but the game was still very enjoyable with it.

I still don't understand the argument of "fake frames" either... it's all fake because it's all PC generated. You have to start somewhere in order to advance how games are rendered; people can't expect rasterization to be the end-all be-all until the end of time, can they?
I'm not arguing with you. You like what you like. I'm an oddball. I can perceive some weird shit. They say your auditory and visual acuity essentially peaks around 20. I'm 51, and as deaf as I am for some things and as blind as I am for reading fine text, I can perceive the most minute details in sound systems and frequencies when I'm tuning multipositional audio (which I miss terribly on the PC), and I can, literally, isolate artifacts (I always see them) on the screen. I can always tell when there is a pixel issue, and it's made me something of a display savant even though I don't have the latest and greatest shit.

I prefer all my frames to be real, because I always see the fake shit for what it is.

I guess, technically, I do use upscaling. I use it for older movies I don't want to repurchase remastered for the 57th time (seriously, the ancient boxed Battlestar Galactica set on my Panasonic DVD player is nearly as good as the upscaled version, minus the lack of widescreen). I use it with my shitty TVs as a monitor to upscale regular pictures and such, natively. Some amazing visual enhancements are available for the average user. Aside from that, I believe that my $1,000+ GPU investment should be able to render / raster natively without fuckery. God knows, we're paying enough for this shit that tends to be outdated in 2-3 years...
 
Well, I guess I will hang onto it for the next five years then. I will probably never turn on the frame generation in my lifetime. I might just be getting too old for this shit ;)

Maybe he was joking but it's really not a joke anymore... His statement isn't too far off the mark. Anyone dropping in a 2,000 dollar graphics card likely already has almost 2-3 grand invested in the rest of their rig, at least. That's monitors, peripherals, etc. It didn't sound like a joke, it sounded like he was referring to his own rig...
Breaking down everything for my PC including decade-old hardware that it inherited comes out to $6,605 total when accounting for what I actually paid. I am including controllers and external hard drives that are not exclusively used on just this PC. So I guess it's not too far off 😅.
 
If 2-D sprites were still around I think you would see a better crop of games on the market. For about the past 5-7 years, 2-D games that are using 3-D sprites, like Diablo 4, have been making a dent in the gaming industry. The major problem with a lot of 3-D games is they all look similar. Once in a while, like with Stalker 2 that just came out, you get a decent 3-D game that actually stands out from the rest.
 
Looks like a nice upgrade from my 3080 Ti FTW! Extreme Hydro copper or whatever. Going to be weird having a non-EVGA card in my PC for the first time since Crossfired XFX 7970 GHz Edition & XFX 280X.
 
Raytracing is the future, there is no question about that.

I would argue that Raytracing is the Now.

While I have only heard of one title that requires it to always be on, it is in pretty much every title at this point, and ever since the introduction of RT, running raster only looks worse than it did before, meaning unless you use RT now, you get a gimped product.

But the frame generation lies are what really get me, advertising those framerates as real. FG is basically an advanced version of the Motion Interpolation found in every TV, which all gamers should turn off because of lag.

Agreed.

Sure, GPU FG is much better and more advanced, it has more data to work with and it can do it quicker, but it still has to buffer two frames so it can inject those "guesstimated" frames in between. We are taking a step back as far as input lag is concerned. And the downsides of frame generation get worse the lower your starting framerate is, the situation where it is needed the most. So far it works best when you are already running the game fast, like 90fps, and want to take it up to your monitor's 120 or 144fps. But at that point it is just icing on an already good cake. But no matter how good the icing is, if the cake is shit then it cannot save it.

Absolutely agreed. Frame gen makes the screen motion look smoother, until you grab the mouse and realize that your input lag is shit.

I would argue that unless you can get 60+fps without frame gen, enabling frame gen only results in a smoother terrible experience with bad input lag. And there is nothing any AI technology can do to change that.

Also, EU prices are out. Almost 1000€ for a 5070 Ti!? Are you fucking kidding me!? With these kinds of prices PC gaming is going to die. As a teen I could buy myself a nice gaming PC with what I earned working during summer breaks. Now as an adult I have to really ponder whether I will keep up with my hobby or go to console gaming instead.

I am going to go out on a limb and guess that the EU prices are quoted with VAT/Sales tax? The U.S. prices are always quoted without.

That explains a large chunk of the difference, but not all of it.
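Back-of-envelope with assumed numbers ($749 US MSRP for the 5070 Ti, ~21% VAT, and an exchange rate near parity; all three vary by country and by the day):

```python
usd_msrp = 749      # US list price, quoted pre-tax
usd_to_eur = 0.96   # rough exchange rate assumption
vat = 0.21          # typical EU VAT rate assumption

print(f"~{usd_msrp * usd_to_eur * (1 + vat):.0f} EUR")
# ~870 EUR: VAT plus exchange rate covers most of the gap to ~1000 EUR, but not all of it.
```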
 
I have mixed feelings. On one hand, you're correct. Even raster itself employs many tricks to get closer to "life". Shadows, rays, etc. The question has always been about "what's the real deal?" Is raster the complete removal of any "fakery"? No, not really. Raster is a lot of fakery, too (at least afaik). Is "real life" the actual source of truth? Then neither tech is inherently wrong. AI is simply approaching it from a different angle.

Here's what I predict happens (or is going to happen) when they train DLSS and whatnot: huge sets of GPUs actually perform the hard calculations for various scenarios at various angles, and then DLSS and whatnot is actually directly trained on the outcome. Then, these results are used to more quickly approximate the actual outcome. Instead of running ray tracing and all this stuff in real time, results are effectively being "cached" and used for interpolation and approximation. But I could be totally off base here.

Anyway on the other hand, socioeconomically I don't think we're ready for this. That's the elephant in the room. I personally feel like a tech dystopia is just on the horizon (well more like "has been on the horizon", from my viewpoint; I talked about this a long time ago already) unless someone curbs it. That, and maybe WW3 between us and China because I have no idea how Jensen is going to curb these things enough that China won't just smuggle them in anyway.

The thing about AI rendering... is that it further removes the artist from the ART of making video games.

The game publishing giants are bad enough. They have for years been letting business encroach on the medium. We are decades past a few programmers and a few visionary artists making classics like Doom. We are also long removed from some of the early studios that hired talented art departments, with not just random art school grads but legit artists to create new and interesting game art. (Now you find the odd indie title, but the big "I need to see this on a 5090" class games have been stripped of all that.) In the last year or two those publishers have even been removing the random art school grads pumping out rote derivative work (like the high-end Creed artists that had been copy-and-pasting Chinese architecture into a Japanese Creed game) and replacing them with AI, which does even more derivative crap because that is all it is capable of doing.

Now on top of that, Nvidia wants us to be okay with your own machine iterating and fake-washing the game one more time at the GPU itself. It is cool that they can fake the look of cloth and other things, but I don't know, call me crazy, I want to see what the game dev wanted me to see. I want to see the art hand-created by the people working at that studio. Not some faux version based off a compressed 720p screen image made up of further compressed and faked textures and models... all lit with RT under a blanket of noise which is then smeared away by yet another layer of machine guessing.

Nothing is left of any actual artistic human expression after the multiple layers of AI are done with the output. I think AI has touched the image first created at high res on an artist's screen at the big studios about as many times as Jensen said the new "agentic" buzzword last night.

Raster... may not be sexy, and it may not get investors all hot. However, for the most part, what you're seeing is what the game devs and their art department intended.
 
If 2-D sprites were still around I think you would see a better crop of games on the market. For about the past 5-7 years, 2-D games that are using 3-D sprites, like Diablo 4, have been making a dent in the gaming industry. The major problem with a lot of 3-D games is they all look similar. Once in a while, like with Stalker 2 that just came out, you get a decent 3-D game that actually stands out from the rest.
D4 isn't using 2D sprites.

Octopath Traveler is using 2D sprites.
 
I would argue that Raytracing is the Now.

While I have only heard of one title that requires it to always be on, it is in pretty much every title at this point, and ever since the introduction of RT, running raster only looks worse than it did before, meaning unless you use RT now, you get a gimped product.



Agreed.



Absolutely agreed. Frame gen makes the screen motion look smoother, until you grab the mouse and realize that your input lag is shit.

I would argue that unless you can get 60+fps without frame gen, enabling frame gen only results in a smoother terrible experience with bad input lag. And there is nothing any AI technology can do to change that.



I am going to go out on a limb and guess that the EU prices are quoted with VAT/Sales tax? The U.S. prices are always quoted without.

That explains a large chunk of the difference, but not all of it.

Yes, taxes are included. Prices shown here have to include taxes, unless you are a company buying stuff for your company, but that is a different story with a different set of rules.
 
Breaking down everything for my PC including decade-old hardware that it inherited comes out to $6,605 total when accounting for what I actually paid. I am including controllers and external hard drives that are not exclusively used on just this PC. So I guess it's not too far off 😅.
Yeah, if you include the PC I have attached to my Home theater and short throw projector... I'm in that 10K+ investment. But that took me over a decade to assemble... My main rig is more like $2500 (1,000 of it is the 7900XTX).
 
I might even pass until iGPU machines kill it at 1440p.

Good luck with that.

I'm not convinced that day will come for a very very long time.

iGPU's keep getting faster, for sure, but so do the demands of new titles.

In the last few years we've even seen titles increase their demands faster than the GPU technology is advancing, leading to some titles being rather difficult to drive.

The only way the iGPU is going to get to the "adequacy at 1440p" level is likely if you only want to play older titles.
 
"5070 is as powerful as a 4090"
*shows ray tracing performance increase over 4070 of maybe 20%*

Riiiiiiiight.
 
Good luck with that.

I'm not convinced that day will come for a very very long time.

iGPU's keep getting faster, for sure, but so do the demands of new titles.

In the last few years we've even seen titles increase their demands faster than the GPU technology is advancing, leading to some titles being rather difficult to drive.

The only way the iGPU is going to get to the "adequacy at 1440p" level is likely if you only want to play older titles.
Yeah. I think the problem with the iGPU thing is also that if you want that "Strix Halo" like performance.. You're gonna be looking at a gaming system and by the time it comes out it might be adequate for older titles but it won't be anything special for new ones. Now, it might allow a lot more power efficiency on a laptop and that's what I'm interested in. A couple hours of cordless gaming (on a laptop, not a tiny ass gaming handheld) is a myth right now.

Ryzen AI Max+ 395 comes packing 16 CPU cores and 32 threads paired with 40 RDNA 3.5 .... That's not going to be a cheap laptop. There will be no all day gaming here. Damn thing just draws way too much power for that.

For the same price as the laptop (or less) you might as well just buy a 5090... lol
 
Good luck with that.

I'm not convinced that day will come for a very very long time.

iGPU's keep getting faster, for sure, but so do the demands of new titles.

In the last few years we've even seen titles increase their demands faster than the GPU technology is advancing, leading to some titles being rather difficult to drive.

The only way the iGPU is going to get to the "adequacy at 1440p" level is likely if you only want to play older titles.

Nonsense! One day an iGPU will run 4K at 240Hz. It won't mean much on a 32K display though...

Yeah, it's a long way off.

Stop stomping on my dreams though!
 
Interesting that the card is not exhausting heat out of the case.

It does when you put a water block on it :D

Honestly, I'm always surprised when I hear of people using 600+w GPU's with stock coolers.

I think it's time we start thinking of the stock coolers on high end GPU's similarly to how we think of the box coolers that (used to) come with CPU's.
 
Everyone is starting to wake up to the fact that the RTX 5090 probably isn't as powerful as everyone wants it to be. It's definitely an iterative performance increase, which is sad considering I've had my RTX 4090 since launch day over 2 years ago.
 
I love my eye candy too, but frames/smoothness has its own unique benefit to the quality of the image. Like I'm not going to go from 144 FPS to 65 FPS just for slightly better reflections in limited areas.

Yes, we come full circle to why I'm very interested in the RT features of the 5090 lol
 