$2500 5090 I’m out

Can't wait for official, unbiased reviews.

But I saw something over the weekend, forgot which website, but it was a respectable site, and they reported early rumors of raw performance with the gimmicks like DLSS 4 and AI frame generation disabled: in straight-up benchmarks, the 5090 is on average about 20% faster than the 4090 in several games at 4K.


https://www.notebookcheck.net/RTX-5...ngside-notable-ray-tracing-gain.943815.0.html

RTX 5090 vs RTX 4090: Questionable RTX 5090 rasterization performance surfaces alongside notable ray tracing gain​


So, it is safe to say that the RTX 5090 could bring a close to 20% ray tracing improvement vs the RTX 4090. All in all, if the pure rasterization performance of the RTX 5090 is less than 10% faster than the RTX 4090 as discussed above, it will undoubtedly put the $400 price increase in question. This small rasterization improvement would also explain why Nvidia leaned so heavily into DLSS 4 during the RTX 50 unveiling.
 
" 5090 will be 27% better performer than the 4090 is spot on. Gamer Nexus also says a less than 30% increase. "

This is based on raw raster performance only, not accounting for DLSS 4 or the AI regen stuff.

Rumors are that performance is power-limited: with 720 watts going directly to the video card, they say the 5090 matches the 60% to 70% generational upgrade we saw from the 3090 to the 4090, but under 600 W it's only a 30% or smaller upgrade over the 4090. They say a 5090 Ti is for sure coming.
 
My bigger worry after watching that video is how long I will need to wait for a 5090 Ti or a full-power Blackwell in my PC. If it is really 700 watts or some such, then I assume it is not coming to PC and will remain a server card?

2 Gs is a lot of money that I am willing to spend, but not if it will be replaced soon by a Ti.
 
Are there titles where the 4090's raster performance is not strong enough, games where it struggles with RT off?

That 25-30% could well be true, but I'm not sure it will be an issue; the scenarios where you need more grunt that come to mind all seem to involve some form of RT.
 
Getting out my tinfoil hat.

Edit: Okay, I finished it. I hope there isn't an 800-watt consumer card in the future. What are we doing, we have 800-watt cards now? 575 W will be bad enough come springtime.
I guess that's why Nvidia limited the 5090 to 575 watts. I love as much performance as possible, but 700+ watts on one tiny device sounds kinda dangerous?

Part of the reason for the poor power-to-performance ratio is that the 5090 is still on the same node as the 4090. When they can shrink the die again, that's when we should see the huge leap in performance that was expected.

I'm likely to get a 5090 so I'm hoping that the speculations in the video are wrong but it kinda makes sense considering even Nvidia's charts show a lower than expected leap in non-FG titles.
 
Are there titles where the 4090's raster performance is not strong enough, games where it struggles with RT off?

That 25-30% could well be true, but I'm not sure it will be an issue; the scenarios where you need more grunt that come to mind all seem to involve some form of RT.
Mostly VR, which is higher than 4K plus high refresh, and latency-sensitive, but that's a niche.
 
I guess that's why Nvidia limited the 5090 to 575 watts. I love as much performance as possible, but 700+ watts on one tiny device sounds kinda dangerous?

Part of the reason for the poor power-to-performance ratio is that the 5090 is still on the same node as the 4090. When they can shrink the die again, that's when we should see the huge leap in performance that was expected.

I'm likely to get a 5090 so I'm hoping that the speculations in the video are wrong but it kinda makes sense considering even Nvidia's charts show a lower than expected leap in non-FG titles.
Agreed, 700 watts can be hazardous in many home-environment scenarios. Not everyone has home insurance, especially in parts of the world where it is not mandatory.
 
" 5090 will be 27% better performer than the 4090 is spot on. Gamer Nexus also says a less than 30% increase. "

That can't be right. Then the 5080, with half of everything, would be like 50% slower, making it slower than the 4080.
But all these performance leaks don't mention resolution. At 1080p a 27% increase is believable; at 4K I doubt it's less than 50%, in raster performance.
Nvidia lifting the review embargo one week before the official release date shows they are confident in the 5090's performance, while 5080 reviews are coming out a day before its release, meaning it probably isn't as impressive as the 5090 (and on paper it's not).
 
That can't be right. Then the 5080, with half of everything, would be like 50% slower, making it slower than the 4080.
But all these performance leaks don't mention resolution. At 1080p a 27% increase is believable; at 4K I doubt it's less than 50%, in raster performance.
Nvidia lifting the review embargo one week before the official release date shows they are confident in the 5090's performance, while 5080 reviews are coming out a day before its release, meaning it probably isn't as impressive as the 5090 (and on paper it's not).
Not exactly half of the performance, because of:
1. Lower clocks
2. Lower power limit compared to number of cores
3. CPU bottlenecks
I guess with some of the higher-end AIB cards it should be possible to OC the 5090 to the same clocks the 5080 has, at exactly twice the power, which is a whopping 720 W.
Still, you won't get 100% higher frame rates unless you use very high resolutions like 8K. Raw throughput in that case should, however, be twice what the 5080 has.

As for 4090 vs 5080, I really doubt the latter will be faster. So far it is supposed to have much less shader/raster performance, slightly lower theoretical RT performance, and less memory bandwidth. The only thing it will really be faster at is AI, and not even in the normal FP8/16/32/64 formats used up through the 40 series: it introduces a new FP4 format that is twice as fast as FP8. This is a repeat of Ampere vs Ada Lovelace, where the newer architecture introduced the FP8 format but FP16 did not get any performance boost.

With that, the only way 50-series cards will get any performance benefit from "faster AI" is if the DLSS models used for the 50 series are different from those for the 40 series. The same model made for FP8 could be executed in FP4 after conversion, but you lose a lot of precision (16 possible encodings for FP4 vs 256 for FP8!), which would lead to worse image quality (higher error vs the reference). So at most, the 50 series would need different models made specifically for it, which would most likely need more parameters to offset the reduced numerical precision; but then we would no longer be talking about twice the execution performance, and the image would still look somewhat different. Most likely the 50-series cards will only run the MFG models at FP4 precision.
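The precision loss described above can be sketched numerically. This is a toy illustration, not the real FP4/FP8 encodings (those are non-uniform floating-point grids with special values): it just rounds a weight onto a 16-level vs a 256-level uniform grid to show how coarse 4-bit quantization is.

```python
def quantize(x, levels):
    """Round x in [-1, 1] to the nearest of `levels` evenly spaced values
    (a uniform stand-in for a real floating-point grid)."""
    step = 2.0 / (levels - 1)
    return round(x / step) * step

w = 0.337                        # an arbitrary example weight
err4 = abs(w - quantize(w, 16))   # 4-bit-like grid: 2**4 = 16 levels
err8 = abs(w - quantize(w, 256))  # 8-bit-like grid: 2**8 = 256 levels
print(err4 > err8)  # True: the coarser grid gives a larger rounding error
```

The gap per weight is small, but across millions of parameters it is why a model quantized straight from FP8 to FP4 tends to drift from the reference output.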

At this moment it would be good to actually check whether, e.g., the 30 series uses the same DLSS Super Resolution models as the 40 series. I would expect Nvidia not to change their strategy on these things. Then again, who knows... maybe they do plan to make the 50 series use different super resolution models even if they didn't do that for the 40 series.

Either way, I doubt Nvidia will come forward and explain these nuances for us. As far as they are concerned, the 50 series has 2x the AI performance, and they really want everyone, including investors, to think so.
Heck, from what we heard, Jensen wants people to treat fake generated frames the same as real frames, and the 5070 to be a card with 4090 levels of "performance", which is just a SCAM.
 
That can't be right. Then 5080 with half of everything would be like 50% slower. Making it slower than 4080.
This assumes perfect scaling with hardware growth. The performance gap between the 4090D and the 4090 was smaller than their core counts would indicate, and the 5080 will not run games at half the power.

The 4090 had 70% more cores and 40% more bandwidth than the 4080, yet was only about 27-30% faster.

The bigger "can't be right" part would be Gamers Nexus not respecting the embargo? That sounds dubious; is that simply him commenting on the same bar on the same graph everyone saw?
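That sublinear-scaling point can be put in rough numbers. The core and speedup figures below are the post's approximations, not measured specs:

```python
# Napkin math: how much of the extra hardware shows up as frame rate.
cores_ratio = 1.70       # 4090 vs 4080 core count ratio (~70% more, per the post)
observed_speedup = 1.28  # ~27-30% faster in games, per the post
# Fraction of the added cores that translated into added performance:
efficiency = (observed_speedup - 1) / (cores_ratio - 1)
print(round(efficiency, 2))  # 0.4 -> only ~40% of the extra cores became fps
```

By the same logic, halving everything from a 5090 to a 5080 should cost noticeably less than half the frame rate, especially at lower resolutions.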
 
So a 5090 over a 4080, not a 4090, is anywhere from, what, 45% to 60% faster in just raw performance?
 
Agreed, 700 watts can be hazardous in many home-environment scenarios. Not everyone has home insurance, especially in parts of the world where it is not mandatory.
I don’t follow. Why is 700 watts hazardous but not 575?
 
The same model made for FP8 could be executed in FP4 after conversion, but you lose a lot of precision (16 possible encodings for FP4 vs 256 for FP8!), which would lead to worse image quality (higher error vs the reference)
I am not sure we can predict the impact of lower-precision AI weights (versus how many more parameters you can have in exchange as they get smaller) on the final result like that; look at Flux, for example, 16-bit vs 4-bit model output:
https://blackforestlabs.ai/flux-nvidia-blackwell/

Not super obvious. And here especially it is a completely different way to do upscaling versus the past, a strategy with the big advantage of the last five explosive years of transformer ML/AI advancement (that's what LLMs like ChatGPT use), versus the 2019 DLSS 1 and 2022 DLSS 2 era.

Maybe all cards will use the same new transformer-based DLSS model (they will all use a transformer model, at least); older cards can always run an FP4 model as well, they are just less optimized for it. The way Nvidia phrases it:

DLSS Transformer Model​

5th Gen Tensor Cores: DLSS 4 employs 5 AI models, including upscaling, ray reconstruction, and multi frame generation between every two rendered frames. The RTX 50 family leverages the 5th Gen Tensor cores which offer 2.5x more throughput than their predecessors. This is achieved by switching to a lower precision FP4 metric.
DLSS 4 replaces the widely used CNN models with the transformer model. This vision transformer features “self-attention” operations that evaluate the relative importance of different pixels across a frame and over multiple frames. Using twice as much data as CNN allows transformer models to improve temporal stability, reduce ghosting, and increase the detail perceived in motion.


Makes it sound as if upscaling, RR, and FG are on FP4 now, but it's not 100% clear; the two sentences just follow each other in a way that implies it. That could create a significant gap in DLSS performance between Blackwell and the previous generations, especially between, say, a 2060/3060 and a 5060-class card.
 
So a 5090 over a 4080, not a 4090, is anywhere from, what, 45% to 60% faster in just raw performance?
Hard to say anything before the reviews are released January 24... but if a 5090 is about 30% faster than a 4090, and a 4090 is about 30% faster than a 4080, then 1.3 * 1.3 = 1.69, or ~69% faster; if it is 28% and 28%, then 1.28 * 1.28 = 1.6384, about 64% faster.
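The compounding arithmetic in that estimate, as a quick sketch (the 30%/28% uplift figures are the post's guesses, not benchmarks):

```python
def compound(*uplifts):
    """Chain relative speedups: 1.3 x 1.3 -> 1.69, i.e. +69% overall."""
    total = 1.0
    for u in uplifts:
        total *= u
    return total

print(round(compound(1.3, 1.3), 2))    # 1.69  -> ~69% faster than a 4080
print(round(compound(1.28, 1.28), 4))  # 1.6384 -> ~64% faster than a 4080
```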
 
Per their social media pages it looks like Microcenter is going the route of first-come-first-served with no online orders for pickup. In other words, I hope you have a comfy folding chair and some warm clothing. They haven't answered anything about their online store so I guess that could be an option. They haven't touched any questions about quantities, either.
Their Santa Clara store still has no ETA so I'll try my luck sniping one on BB or NE, if I want one.
 
Their Santa Clara store still has no ETA so I'll try my luck sniping one on BB or NE, if I want one.

Their nationwide social accounts are claiming stores will have 'em that morning. The staff might just be in the dark. Back before "video card madness" hit I bought an 8000-series card that the staff had 0 clue they even had in stock on release morning.
 
" 5090 will be 27% better performer than the 4090 is spot on. Gamer Nexus also says a less than 30% increase. "

This is based on raw raster performance only, not accounting for DLSS 4 or the AI regen stuff.

Rumors are that performance is power-limited: with 720 watts going directly to the video card, they say the 5090 matches the 60% to 70% generational upgrade we saw from the 3090 to the 4090, but under 600 W it's only a 30% or smaller upgrade over the 4090. They say a 5090 Ti is for sure coming.
The 4090 doesn't fully use its TDP; I would expect the same for the 5080. Maybe you could just OC/UV the 5090 to the same clocks and increase the power limit, but I doubt that would provide over a 30% performance uplift. A 5090 Ti is not for sure coming; most of those full-fledged dies are being used for datacenters. Maybe if enough of them don't meet datacenter standards, but Nvidia still doesn't want to cut them down all the way to a 5090, they might decide to market another, faster card. I even have doubts about a 5080 Ti slotting between the 5080 and 5090. The same gap could have been filled by the 4080 Super, but that ended up just being a price cut. The reason people are anticipating a 5080 Ti is that the 24 GB version isn't available.
 
I am not sure we can predict the impact of lower-precision AI weights (versus how many more parameters you can have in exchange as they get smaller) on the final result like that; look at Flux, for example, 16-bit vs 4-bit model output:
https://blackforestlabs.ai/flux-nvidia-blackwell/

Not super obvious. And here especially it is a completely different way to do upscaling versus the past, a strategy with the big advantage of the last five explosive years of transformer ML/AI advancement (that's what LLMs like ChatGPT use), versus the 2019 DLSS 1 and 2022 DLSS 2 era.

Maybe all cards will use the same new transformer-based DLSS model (they will all use a transformer model, at least); older cards can always run an FP4 model as well, they are just less optimized for it. The way Nvidia phrases it:

DLSS Transformer Model​

5th Gen Tensor Cores: DLSS 4 employs 5 AI models, including upscaling, ray reconstruction, and multi frame generation between every two rendered frames. The RTX 50 family leverages the 5th Gen Tensor cores which offer 2.5x more throughput than their predecessors. This is achieved by switching to a lower precision FP4 metric.
DLSS 4 replaces the widely used CNN models with the transformer model. This vision transformer features “self-attention” operations that evaluate the relative importance of different pixels across a frame and over multiple frames. Using twice as much data as CNN allows transformer models to improve temporal stability, reduce ghosting, and increase the detail perceived in motion.


Makes it sound as if upscaling, RR, and FG are on FP4 now, but it's not 100% clear; the two sentences just follow each other in a way that implies it. That could create a significant gap in DLSS performance between Blackwell and the previous generations, especially between, say, a 2060/3060 and a 5060-class card.
I guess you could convert a model made for FP4 to FP8 and load it on the 40 series. Not sure about actually doing calculations at FP4 precision to get identical results, but I guess running such a model in FP8 should give comparable results that are not worse... a convoluted way to not say 'better' 🙃

As for FP4 support: with AI it does seem you don't need high precision for inference, and the increased throughput allows more parameters at the same memory footprint and performance, so it is an improvement. More parameters at lower precision seems to be better for AI. Not sure about training; I would guess training is best done at higher precision, with the model later downscaled and optimized for lower precision, but I am no AI expert.

One thing is sure though: at FP8 the new GPUs won't perform much better at all. It is not like Nvidia doubled the transistors allocated to AI cores; they improved the AI cores to support lower precision at higher throughput. For executing FP8 models, the effective AI performance in TOPS needs to be cut in half, so the FP8 performance of the 5090 would really be 1676*, which versus the 4090's 1321 doesn't sound as impressive as 3352 vs 1321 does.
Anything designed for older GPUs, like older versions of DLSS in older games, won't take advantage of the improvements. Not a big issue if Nvidia allows hot-swapping the DLSS model in use without needing to fiddle with DLLs, but still.

I guess we will need to wait for Transformer DLSS to come out to older RTX card to be able to do any actual comparisons.

*) This is a rough napkin-style calculation, not 'real' theoretical performance.
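That napkin math, spelled out (all TOPS figures are the poster's estimates, not official specs):

```python
# If a headline TOPS number is quoted at FP4, halve it to compare against
# a card rated at FP8 (each FP8 op replaces two FP4 ops' worth of throughput).
fp4_tops_5090 = 3352           # poster's FP4 figure for the 5090
fp8_equiv = fp4_tops_5090 / 2  # effective FP8 throughput under this assumption
fp8_tops_4090 = 1321           # poster's FP8 figure for the 4090

print(fp8_equiv)                            # 1676.0
print(round(fp8_equiv / fp8_tops_4090, 2))  # 1.27 -> ~27% more at FP8, not 2.5x
```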
 
Not sure about actually doing calculations at FP4 precision to get identical results, but I guess running such a model in FP8
Not sure 'calculation' is the right word when it comes to AI inference. A 7B-parameter FP8 model will give better results than a 7B FP4 model, but that's not necessarily the choice: a 12B FP4 model can be smaller and faster than the 7B FP8 one. Maybe it gets more complicated than that, and I imagine it depends on what you do with them. But here Nvidia has the advantage of going from CNN to FP4 transformer, not from an FP16/FP8 transformer to an FP4 one, and the transformer approach has a giant advantage in how much context (from previous frames and the whole current frame) it uses.

So even if FP4 needs more parameters (but not so many that they erase the performance/VRAM gains), and FP4 looks worse than FP8 for transformer DLSS, we will not see that comparison; FP8 will not be its competition.

I guess you could convert a model made for FP4 to FP8 and load it on the 40 series.
You can run an FP4 model on a 40-series GPU, just don't expect it to be twice as fast, but it should still take less VRAM.
 
Their nationwide social accounts are claiming stores will have 'em that morning. The staff might just be in the dark. Back before "video card madness" hit I bought an 8000-series card that the staff had 0 clue they even had in stock on release morning.
I meant that the store itself has no ETA on when it will open lol
 
Are there titles where the 4090's raster performance is not strong enough, games where it struggles with RT off?

That 25-30% could well be true, but I'm not sure it will be an issue; the scenarios where you need more grunt that come to mind all seem to involve some form of RT.
If you have a 4K 240 Hz monitor, there are many titles that still have a hard time hitting 4K 120+ fps without using DLSS. Some of those titles still don't support DLSS, like Callisto Protocol, for example.

4K DLSS Quality, which upscales from 1440p, also won't maintain 200-240 fps in most new AAA titles.

So more raster performance would definitely be welcome even for a 4090 owner.
 
I meant that the store itself has no ETA on when it will open lol
I have a microcenter 30 mins away. No way I'll be camping overnight tho.

But if I showed up two hours before it opens, and end up around 50th spot, what are the chances of grabbing a 5090?

Do they have good security? It's pretty scary holding a $2200 item that can be easily snatched away.
 
You can run an FP4 model on a 40-series GPU, just don't expect it to be twice as fast, but it should still take less VRAM.
Not really. Ada Lovelace whitepaper does not mention FP4 even once: https://images.nvidia.com/aem-dam/S...idia-ada-gpu-architecture-whitepaper-v2.1.pdf
These things are wired into the hardware, so at most you could run an FP4 model by converting it to FP8; in that case it would use more RAM but could possibly be more precise... though whether that is good, bad, or ugly is not really clear.
Since the same situation with newly supported formats happened with the 40 series, it could be used to compare image quality between 40-series cards and 10/20-series cards using reference images/animations that give reproducible results. If any differences are detected between chip generations (between the 40 series and older RTX cards), it might point to higher precision being used on the older cards.

Not sure 'calculation' is the right word here. A 7B-parameter FP8 model will give better results than a 7B FP4 model, but that's not necessarily the choice: a 12B FP4 model can be smaller and faster than the 7B FP8 one. Maybe it gets more complicated than that, and I imagine it depends on what you do with them. But here Nvidia has the advantage of going from CNN to FP4 transformer, not from an FP16/FP8 transformer to an FP4 one, and the transformer approach has a giant advantage in how much context (from previous frames and the whole current frame) it uses.
This is obvious: a bigger model can be more complex and handle more complex... 'stuff'? If you can process a larger model in the same time and use less memory while doing so, it is a win-win situation.
Still, it is not an apples-to-apples comparison if we use older models. That is why it will matter whether Blackwell GPUs use FP4 or FP8 for DLSS SR.

So even if FP4 needs more parameters (but not so many that they erase the performance/VRAM gains), and FP4 looks worse than FP8 for transformer DLSS, we will not see that comparison; FP8 will not be its competition.
If FP4 is worse, or needs significantly more parameters and that reduces the performance gains, then it makes things simpler for Nvidia: use higher precision for it.
We have three different 'fastest formats' on the 20/30, 40, and 50 series GPUs, making things more complicated.

And as for comparisons...
...XeSS is a great example that running different models under the same technology name can be an issue, even detrimental to the technology. XeSS on Intel GPUs uses a different, more complete model with XMX acceleration, and in that case XeSS has much better image quality with far fewer issues. When we see comparisons between DLSS and XeSS, a single GPU is usually used, putting XeSS at a disadvantage. Of course, if you do have an RTX card, the comparison of DLSS to XeSS using DP4a is all that matters, but it can give the wrong impression of what XeSS is and what to expect from this tech on Intel cards.

As usual, all the nuance that makes things complicated fast, and which can be skipped, will get skipped.
If DLSS Transformer looks slightly different across generations of RTX cards, but newer doesn't necessarily mean better, then things get complicated really fast. If it were higher precision on newer cards vs lower precision on older cards, things would be simple: get a new GPU for better image quality. But what if an older GPU has better DLSS SR image quality? Yeah...

If I had an older RTX card on hand I would do the comparison, but I don't have enough money to keep multiple RTX cards around. I will in the future, as I usually end up with a bunch of older cards, but only once they become almost worthless 🙃
 
If you have a 4K 240 Hz monitor, there are many titles that still have a hard time hitting 4K 120+ fps without using DLSS. Some of those titles still don't support DLSS, like Callisto Protocol, for example.

4K DLSS Quality, which upscales from 1440p, also won't maintain 200-240 fps in most new AAA titles.

So more raster performance would definitely be welcome even for a 4090 owner.
More raster performance is all that really matters, unless FG tech improves to the point that it is actually good. For now it is more of an experimental feature that can improve gameplay in some cases, but overall it is best left disabled.
DLSS SR and the like don't take much time to execute by themselves, so differences in execution time, while they affect performance, are not the biggest contributing factor.

For overall performance, the 4090 will be the 2nd-best GPU, just behind the RTX 5090, just as it is now the best GPU bar none (except maybe the Quadro RTX 6000 Ada...)
 
like Callisto Protocol,
That's a ray-tracing game too, no?

Even then, with the game patched (it is true that it was under 120 fps at launch, and people who buy that type of card probably tend to play games at launch quite a bit):
https://www.techspot.com/article/2910-ray-tracing-performance-test/
[Benchmark chart: TCP-p.jpg]


With RT off, it seems to be fast enough by now (though maybe it depends on the game section).
 
Callisto is a total mess: an AMD-sponsored title with the worst RT ever, which no one should use. It eats frames and stutters without looking any different. And there's no DLSS option.

The game does look impressive in raster, though.
 
I have a microcenter 30 mins away. No way I'll be camping overnight tho.

But if I showed up two hours before it opens, and end up around 50th spot, what are the chances of grabbing a 5090?

Do they have good security? It's pretty scary holding a $2200 item that can be easily snatched away.

I think it's going to depend on demand. Based on what I was told, the 3090s were basically all accounted for well before dawn. In fact, the store apparently went a little rogue and told people to show up at closing time the night before to get a ticket. Even then, they only got 40. The 4090s were available to people in line, though. They opened at the normal time both days.
Security-wise, it probably depends on your area. I can't say I've ever felt threatened buying or carrying pricey electronics, but I've also never been carrying any in a truly dangerous area. You have to think that someone that brazen would probably steal from literally anyone shopping at a high-end computer store, though.
 
I think it's going to depend on demand. Based on what I was told, the 3090s were basically all accounted for well before dawn. In fact, the store apparently went a little rogue and told people to show up at closing time the night before to get a ticket. Even then, they only got 40. The 4090s were available to people in line, though. They opened at the normal time both days.
Security-wise, it probably depends on your area. I can't say I've ever felt threatened buying or carrying pricey electronics, but I've also never been carrying any in a truly dangerous area. You have to think that someone that brazen would probably steal from literally anyone shopping at a high-end computer store, though.
That's crazy. I probably should call the store the night before launch just to see if I should even bother going the next morning.

I feel like demand should be a little lower, because the AIB cards will probably start from $2,100, which is damn high even for scalpers. Also, most people are pretty happy with a 4090, compared to the 3090, which was already lacking in 4K and many RT games.
 
More raster performance is all that really matters, unless FG tech improves to the point that it is actually good. For now it is more of an experimental feature that can improve gameplay in some cases, but overall it is best left disabled.
DLSS SR and the like don't take much time to execute by themselves, so differences in execution time, while they affect performance, are not the biggest contributing factor.

For overall performance 4090 will be 2nd best GPU just after RTX 5090 just like it is now the best GPU bar none (except maybe Quadro RTX 6000 Ada...)

It'd be nice to have 240 fps of "real" frames, but I wouldn't say that FG is useless. For example, fighting games, which are typically locked to 60 fps for competitive reasons, look much smoother with Lossless Scaling's 4X frame gen, with no perceptible loss in playability. I was still able to pull off moves without feeling a delayed response.

A slower paced title like AW also looks much better with frame-gen on. Without frame-gen, there are certain segments in that game which almost look like a slide show even on a 4090.

Also, if you're running VRR on an OLED, it's best to keep FPS as high as possible in order to reduce VRR flickering. That's one benefit that people rarely talk about but it's a huge difference in some games.
 
I have the money and there are absolutely some games that an additional 20-30% will make a difference in my gameplay experience. If "gimmicks" don't look any different from "real" frames...I don't really care? I'm not going camping, but I will be F5'ing like a mofo that Thursday night.

Why do some people stay in nice hotel suites and others sleep at Motel 6? Why does the Capital Grill exist if we could all just be eating Del Taco?
 
What a world we live in, where people will camp out to get a $2,500.00 video card. Wow.

Especially if these 5090 buyers already have a 4090 that's insane IMO.

I just don't get it, what's it magically going to do getting it right away Jan. 31st? Give you 171 fps over the 147fps you currently get with a high-end card. Ok ...

Plus there's zero new games pushing hardware right now. Maybe Doom Dark Ages or Battlefield 6 or some other upcoming games.

I say if you want a 5090 just walk in casually to a Microcenter or Best Buy and see if they have in stock, just keep doing that weekly and eventually you'll get one in a few weeks or couple months.

Plus, if the rumors are true and the 5090 is only 30% faster than a 4090, then no way would I fork over that much ca$h to nGreedia; just on principle they shouldn't get my money for such an underwhelming release at such an outrageous price. Now if the 5090 were like 75% faster than a 4090 in raw raster performance, without the gimmick crap like DLSS4 or AI stuff, then that's an understandable price.

But hey, that's just me. Different strokes for different folks I guess.

Plus, it's the year 2025; why do buyers need to wait in line in the winter to buy a product? Why isn't this available for online pre-order, shipped to you like a new smartphone? Going to an actual store just to see if it's in stock seems very outdated.

Can you go to the nVidia website and actually place an order for a RTX-5090 FE, bill me now, and ship to me when in stock? I would be tempted to do so AFTER I see real reviews next week.

But isn't the only way to get the FE version through Best Buy? Which is nearly impossible, they sell out in seconds, and restock you have less than one minute to click the purchase button, otherwise it's gone immediately.
 
A vast share of Apple phone sales goes through carriers, and I feel that makes scalping those extremely niche: it would be hard to scalp the data-plan/carrier/trade-in promo models, and that competition calms down how much people are willing to pay for unlocked phones. Also, said unlocked phones are already priced to the moon.

Apple has its store-chain infrastructure and everything it needs year-round, not just in the weeks around a launch, which also helps versus Sony/Nintendo/Nvidia, and they only sell to people with an Apple ID, which makes it more natural and easier to detect scalpers than with Nvidia accounts.
 
Callisto is a total mess: an AMD-sponsored title with the worst RT ever, which no one should use. It eats frames and stutters without looking any different. And there's no DLSS option.

The game does look impressive in raster, though.
Recently played through it, and with RT off it performs really well now. Tested the DLC with RT on, and it wasn't a stuttering mess like it was at release, but it still wasn't 100% smooth.
 
I did a playthrough with RT off a month ago on a 4090 and 7800X3D, RT felt horrible, but with it off it was fine.

I even tried lossless scaling frame gen on it but that didn't cut it. And since it doesn't change the visuals... whatever, RT off it is.
 
What a world we live in, where people will camp out to get a $2,500.00 video card. Wow.

Especially if these 5090 buyers already have a 4090 that's insane IMO.

I just don't get it, what's it magically going to do getting it right away Jan. 31st? Give you 171 fps over the 147fps you currently get with a high-end card. Ok ...

Plus there's zero new games pushing hardware right now. Maybe Doom Dark Ages or Battlefield 6 or some other upcoming games.

I say if you want a 5090 just walk in casually to a Microcenter or Best Buy and see if they have in stock, just keep doing that weekly and eventually you'll get one in a few weeks or couple months.

Plus, if the rumors are true and the 5090 is only 30% faster than a 4090, then no way would I fork over that much ca$h to nGreedia; just on principle they shouldn't get my money for such an underwhelming release at such an outrageous price. Now if the 5090 were like 75% faster than a 4090 in raw raster performance, without the gimmick crap like DLSS4 or AI stuff, then that's an understandable price.

But hey, that's just me. Different strokes for different folks I guess.

Plus, it's the year 2025; why do buyers need to wait in line in the winter to buy a product? Why isn't this available for online pre-order, shipped to you like a new smartphone? Going to an actual store just to see if it's in stock seems very outdated.

Can you go to the nVidia website and actually place an order for a RTX-5090 FE, bill me now, and ship to me when in stock? I would be tempted to do so AFTER I see real reviews next week.

But isn't the only way to get the FE version through Best Buy? Which is nearly impossible, they sell out in seconds, and restock you have less than one minute to click the purchase button, otherwise it's gone immediately.
Nvidia has never been a consumer-friendly company; part of the strategy is creating an unobtainium aura around their video cards. Technically they could sell FE cards, among other products, locally in an Apple-like store near their HQ.
 
Do people think an Nvidia store near their HQ would change anything? Is that the Apple magic, a single physical store near an HQ? What would it change versus the closest Best Buy that sells them?
 
Do people think an Nvidia store near their HQ would change anything? Is that the Apple magic, a single physical store near an HQ? What would it change versus the closest Best Buy that sells them?

What? I don't understand
 