Which RTX 4090 card are you planning or considering getting?

Halo is halo. That doesn't make the rest of the stack's pricing okay, in my opinion. I don't think I've seen many argue your point.
No one until this gen has argued that the halo card is a “value” - NVIDIA’s marketing works!

They cleared the 3xxx stock. Sold out the halo card, and even sold out the 2nd best - all while beating AMD and “unlaunching” a card. It’s incredible.
 
Where did you find one? I want a 4090FE due to it being the only one that will fit my case (except for AIO) but I haven’t seen any in stock since the release date. I am thinking about just giving in and trying to snatch a 7900xtx tomorrow morning because I have a feeling I will never see a 4090FE.
I’m in Europe, where RTX 4090s are actually in stock here and there. You can’t always be picky with brands, though.

Currently komplett.dk, which covers Denmark, Sweden and Norway, has them in stock.
 
I’m in Europe, where RTX 4090s are actually in stock here and there. You can’t always be picky with brands, though.

Currently komplett.dk, which covers Denmark, Sweden and Norway, has them in stock.
Gotta say that's one of the better sites out there, if the info is accurate. Pickup only :)

 
No one until this gen has argued that the halo card is a “value” - NVIDIA’s marketing works!

They cleared the 3xxx stock. Sold out the halo card, and even sold out the 2nd best - all while beating AMD and “unlaunching” a card. It’s incredible.
My calling the 4090 Halo pricing the best “value” from Nvidia is intended to directly insult Nvidia.

At least with the halo pricing, you get halo performance. None of the rest of the stack offers halo performance, but the prices of the entire stack are now based off the halo price.
 
Last edited:
My calling the 4090 Halo pricing the best “value” from Nvidia is intended to directly insult Nvidia.

At least with the halo pricing, you get halo performance. None of the rest of the stack offers halo performance, but the prices of the entire stack are now based off the halo price.
That's what halo means, though...only one product is halo. :)
 
Is it really that bad, though? You can game, 3D render, create content, and do AI work for, let's call it, $800 a year or less, because you can sell the card for a bit of cash and stay current. I mean, Jesus, car notes, house notes, phone bills, and insurance are more than that, and you don't get any of that money back. Does everyone on here have a dead-end job or make $30k a year? Retired and no job? I do see a lot of bastards on here, lol. But I do agree prices should be lower; I can just live with these prices. I say go pick up a second job for like 2 months doing shit security work where all you have to do is sit in a chair and check TWIC cards, because I think it's going to get way worse than it is now.

You have:

1. People paying the asking price
2. AMD not going for the high end
3. TSMC raising prices because the research for these products is insane
4. The world just printing money and jobs not paying more

And if mining ever comes back, holy shit, count on rolling blackouts.

Edit: yes, everything under a 4090 is garbage.
 
Last edited:
You are going to be waiting a long-ass time for a 4090 Ti. Is it really worth it? I guess if you are going to keep it for a long time, but if you are going to jump on the 5090, I would buy it when you can at the start of the year.
I was already prepared to skip this generation anyway, so waiting is not an issue.
 
FYI, I screwed up and deleted/warned a bunch of posts. I am a dunderhead and realized I was not viewing the right thread topic. All posts have been undeleted, and warnings deleted. My apologies. Carry on.
 
Edit: yes, everything under a 4090 is garbage.
I could get behind a 7900XTX at $800, or $900 for an OC version. If Nvidia considers dropping the 4080 to $999 (which they should), AMD could retaliate and suddenly the market corrects.

Otherwise, what do we get, a 4070 for $899? Is that the new world order?
 
I got the 4090FE, it's the best computer purchase I've made since 3DFX Voodoo 2 SLI. After owning the space heater 3090, this thing is just a breath of fresh cool air. It's so good.
How is a 450-watt 4090 any less of a space heater than a 350-watt 3090?
 
How is a 450-watt 4090 any less of a space heater than a 350-watt 3090?

I don't mine. A 3090 is hot as hell; the 4090 is not. Temps in the 73-80 range are going to warm a room far more than temps in the 55-60 range. The 450 watts doesn't seem to matter; it's handled well, either by the design or by a big hunk of metal. I think it's both. It's just a cooler-running card.



With the 3090 my gaming room got warm as hell. With the 4090 it does not. It's so much cooler than the 3090 that it's literally one of the best things about the damn thing. I can't tell if you were genuinely asking, but I have to wonder if you've ever even gamed on a 3090. That card was a toaster. It was really annoying, actually. Nvidia just threw power at a board, lol, and users were left to deal with that heat. The 4090's fan doesn't even run during parts of the Heaven benchmark. It's literally cool. Not trying to be snarky; I just don't think people really appreciate how well the 4090 is made and runs. It's next level compared to the 30 series, IMO, and that's not even about its performance.
 
Last edited:
I don't mine. A 3090 is hot as hell; the 4090 is not. Temps in the 73-80 range are going to warm a room far more than temps in the 55-60 range. The 450 watts doesn't seem to matter; it's handled well, either by the design or by a big hunk of metal. I think it's both. It's just a cooler-running card.



With the 3090 my gaming room got warm as hell. With the 4090 it does not. It's so much cooler than the 3090 that it's literally one of the best things about the damn thing. I can't tell if you were genuinely asking, but I have to wonder if you've ever even gamed on a 3090. That card was a toaster. It was really annoying, actually. Nvidia just threw power at a board, lol, and users were left to deal with that heat. The 4090's fan doesn't even run during parts of the Heaven benchmark. It's literally cool. Not trying to be snarky; I just don't think people really appreciate how well the 4090 is made and runs. It's next level compared to the 30 series, IMO, and that's not even about its performance.
The cooler on the 4090 is more efficient, meaning it moves that heat off the card and into the room faster. The card runs cooler, but that doesn't mean the room is any cooler.
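For anyone curious, the underlying point is just conservation of energy. Here's a back-of-envelope sketch (the 350 W / 450 W figures are the board powers people quoted in this thread; the watts-to-BTU/h conversion is standard):

```python
# Back-of-envelope: at steady state, essentially all the electrical power a
# GPU draws ends up as heat in the room. The cooler only moves that heat off
# the die; it doesn't make it disappear, no matter how cool the card runs.

def heat_into_room_watts(gpu_power_w: float) -> float:
    """Steady-state heat dumped into the room equals the power drawn."""
    return gpu_power_w

def btu_per_hour(watts: float) -> float:
    """Convert watts to BTU/h (1 W ~= 3.412 BTU/h)."""
    return watts * 3.412

# Full-load figures from the thread: 350 W 3090 vs. 450 W 4090.
print(round(btu_per_hour(heat_into_room_watts(350))))  # 3090: ~1194 BTU/h
print(round(btu_per_hour(heat_into_room_watts(450))))  # 4090: ~1535 BTU/h
```

So at equal power draw, a cooler-running card heats the room just as much; the room only gets cooler if the card actually draws fewer watts.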
 
The cooler on the 4090 is more efficient, meaning it moves that heat off the card and into the room faster. The card runs cooler, but that doesn't mean the room is any cooler.
Yeah, I had a feeling he just didn't understand how things work and was only looking at the temperature of the card.
 
It's not just the cooler. My room is much cooler, the card is mildly warm to the touch, and the air coming out of the back is much cooler compared to the 3090, which was hot as hell to the touch. Cooler card = cooler room. Source: me, in said room, which I used to have to open a window in and now no longer do.
 
It's not just the cooler. My room is much cooler, the card is mildly warm to the touch, and the air coming out of the back is much cooler compared to the 3090, which was hot as hell to the touch. Cooler card = cooler room. Source: me, in said room, which I used to have to open a window in and now no longer do.
So instead of realizing that you didn't understand exactly how things work you're going to double down? The only way the 4090 will be putting less heat in your room is if it's not even close to being fully utilized and using less power than the 3090 was.
 
So instead of realizing that you didn't understand exactly how things work you're going to double down? The only way the 4090 will be putting less heat in your room is if it's not even close to being fully utilized and using less power than the 3090 was.


Yeah, I am. My room is cooler after putting in a new GPU. The biggest change in heat output I can feel is from my GPU: whereas before it was cranking hot air out, now it isn't. So again, the GPU running cooler, regardless of how, is resulting in a cooler gaming room. I haven't checked too much into utilization, but it's running at the correct clock speeds and my games are running very fast.
 
So instead of realizing that you didn't understand exactly how things work you're going to double down? The only way the 4090 will be putting less heat in your room is if it's not even close to being fully utilized and using less power than the 3090 was.
It's like the "AMD has better picture quality" thread. People have made up their minds; it just works.
 
The top-end cards have never been close to cheap since the GTX 580, if I remember correctly. Did I like paying $1799 for my GB Gaming OC? Nope. I wanted the fastest GPU I could buy, and that was the price. Sucks to be me.

GTX 580 was $500.
The GTX 680 was also $500. Although it wasn't the halo card, it was still 50% faster than the 580.

Things got a bit weird after Titan at $1000. Titan replaced the halo card, but at least it was its own brand. Either you got a Titan for $1200, or you got the Ti for $700. And that's how it was for a while - what I believed was the happy middle ground: the high premium for the halo card, or a much more budget-friendly, but still pretty much halo, card.

Then came Titan V at $3000, and it still sold...
So then came along the 2080Ti... at $1200 with Titan RTX at $2500 (wasn't as good as the V in compute).

Prices doubled in a generation. From $700(Ti)/$1200(Titan) to $1200(Ti)/$2500(Titan).

Now we have x90 instead of x80Ti so that's better right? So let's add another $300 there:
$1500 for the 3090.
$1600 for the 4090.

All for what, just 4 years ago right before the 2080Ti release, was the $700 price bracket (1080Ti).

So how many people who bought the 4080/4090 were ripping on Titan owners just a few years ago?
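Quick sanity check on the jumps above, using the MSRPs as quoted in this post (the tier labels are just my shorthand):

```python
# MSRPs as quoted in the post above (not independently verified).
ti_tier = {"1080 Ti": 700, "2080 Ti": 1200, "3090": 1500, "4090": 1600}
titan_tier = {"Titan": 1200, "Titan RTX": 2500}

def jumps(prices: dict) -> dict:
    """Price multiplier between each consecutive pair of cards."""
    names = list(prices)
    return {f"{a} -> {b}": round(prices[b] / prices[a], 2)
            for a, b in zip(names, names[1:])}

print(jumps(ti_tier))    # {'1080 Ti -> 2080 Ti': 1.71, '2080 Ti -> 3090': 1.25, '3090 -> 4090': 1.07}
print(jumps(titan_tier)) # {'Titan -> Titan RTX': 2.08}
```

The big one-time jump really was the Turing generation (~1.7-2x); everything since has been smaller increments on top of that new baseline.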
 
GTX 580 was $500.
The GTX 680 was also $500. Although it wasn't the halo card, it was still 50% faster than the 580.

Things got a bit weird after Titan at $1000. Titan replaced the halo card, but at least it was its own brand. Either you got a Titan for $1200, or you got the Ti for for $700. And that's how it was for a while - what I believed was the happy middle ground. The high premium for the halo card, or a much more budget friendly, but still pretty much halo card.

Then came Titan V at $3000, and it still sold...
So then came along the 2080Ti... at $1200 with Titan RTX at $2500 (wasn't as good as the V in compute).

Prices doubled in a generation. From $700(Ti)/$1200(Titan) to $1200(Ti)/$2500(Titan).

Now we have x90 instead of x80Ti so that's better right? So let's add another $300 there:
$1500 for the 3090.
$1600 for the 4090.

For what was just 4 years ago right prior to the 2080Ti release: the $700 price bracket (1080Ti).

So how many people who bought the 4080/4090 were ripping on Titan owners just a few years ago?
The GTX 680 was most certainly not 50% faster than the 580. It was just 30% faster, according to the TechPowerUp review, at the highest tested resolution.

https://www.techpowerup.com/review/nvidia-geforce-gtx-680/27.html
 
I grabbed a low-end 4090, mainly because it was the one I could grab. I used to run SLI for many generations, up through the 980Tis. The 1080Ti was the first card where I felt I could get away with a single card for the performance I wanted. SLI didn't work in everything, so I was paying double for the performance for a while. The 20 series, meh. With the 3090 I felt I was back in business. And the 4090, even the entry-level model, is another good bump in performance per price, in my opinion.

Having said that, I wish there was a good ~$250-300 card available. Price is why we saw the 1060 top the Steam charts for years, only now being supplanted by the 1650.
 
So instead of realizing that you didn't understand exactly how things work you're going to double down? The only way the 4090 will be putting less heat in your room is if it's not even close to being fully utilized and using less power than the 3090 was.
This is exactly what is happening in my case. My undervolted 4090 is so fast that GPU utilization is lower once frame rates hit the cap at my 3440x1440 monitor's 120Hz refresh rate, which is pretty much all the time. In that situation the card's power consumption is under 300W, and DLSS drives it down even lower. Only in the most demanding games do I see it reach the low-400W range, and even then I've never really seen it hit 450W.

Compare this with my previous card, an undervolted 3080 Ti: that card was always running at full GPU utilization at 300-350W with framerates below 100fps on average, and it really did make the room hotter, by 1-2C on average. The difference is also felt when I put my hand next to the case's exhaust fan.

I do wonder if the size of the cooler makes any difference in how fast heat is dissipated into the room. With the 3080Ti, fan speeds were around 1700-2000RPM; with the 4090, the fans only spin at around 1000-1500RPM.

HUB did a total system power test with the framerates locked at 90fps. The results seem to align with my findings.
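A crude way to model why the frame cap cuts power so much (the numbers here are made up for illustration, not measurements): average power scales roughly with the fraction of the card's uncapped throughput you're actually using.

```python
# Crude model (hypothetical figures): a frame-capped GPU only works in
# proportion to the frames it actually renders, so average power scales
# roughly with fps_cap / fps_uncapped. This ignores idle power and fixed
# overheads, so treat it as a ballpark, not a measurement.

def avg_power_w(full_load_w: float, fps_uncapped: float, fps_cap: float) -> float:
    """Estimated average draw when capped below the card's uncapped framerate."""
    utilization = min(1.0, fps_cap / fps_uncapped)
    return full_load_w * utilization

# A 450 W card that could push 200 fps uncapped, capped at 120 fps:
print(avg_power_w(450, 200, 120))  # 270.0 W, in the ballpark of the sub-300 W readings
```

The flip side: once a game's uncapped framerate drops to or below the cap, utilization pins at 100% and the card draws full power again.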
 
Last edited:
So instead of realizing that you didn't understand exactly how things work you're going to double down? The only way the 4090 will be putting less heat in your room is if it's not even close to being fully utilized and using less power than the 3090 was.
That would be quite common for people with a 144Hz-or-lower monitor, no? Lots of games will be around 250 watts at 120fps.
 
Yeah, I game at 1440p 165Hz. So I don't think I am stressing the card at all compared to the 3090? I'm really not looking to argue or prove anyone wrong, but all I can say is my room is cooler since upgrading from a 10th-gen CPU to a 13th-gen and from a 3090 to a 4090. The air coming out of the back is cooler, and the card itself is only mildly warm to the touch, whereas the 3090 was hot as shit. I don't know why, and I'm really done explaining. It's silly at this point. "No, the room YOU are sitting in is in fact not cooler at all, you are wrong" - person not anywhere near me. Sorry if I'm wrong on how it all works, but the bottom line is I am gaming in a cooler room where I no longer have to open the freaking window, which I did ALL THE TIME before this. So let's end it there. I'll be wrong, my room is actually hotter, and I'm just confused. Even though I've had a 2080ti, 3070, 3080, 3090 AND 4090 in this very room, my concept of ambient temps is just out of whack. ;) Sorry again for being so dumb.
 
Yeah, I game at 1440p 165Hz. So I don't think I am stressing the card at all compared to the 3090? I'm really not looking to argue or prove anyone wrong, but all I can say is my room is cooler since upgrading from a 10th-gen CPU to a 13th-gen and from a 3090 to a 4090. The air coming out of the back is cooler, and the card itself is only mildly warm to the touch, whereas the 3090 was hot as shit. I don't know why, and I'm really done explaining. It's silly at this point. "No, the room YOU are sitting in is in fact not cooler at all, you are wrong" - person not anywhere near me. Sorry if I'm wrong on how it all works, but the bottom line is I am gaming in a cooler room where I no longer have to open the freaking window, which I did ALL THE TIME before this. So let's end it there. I'll be wrong, my room is actually hotter, and I'm just confused. Even though I've had a 2080ti, 3070, 3080, 3090 AND 4090 in this very room, my concept of ambient temps is just out of whack. ;) Sorry again for being so dumb.
I mean, it's pretty simple. For 1440p @ 165Hz, a 4090 is serious overkill; you're not utilizing the card at 100% like you often were with the 3090.
 
I mean, it's pretty simple. For 1440p @ 165Hz, a 4090 is serious overkill; you're not utilizing the card at 100% like you often were with the 3090.
The classic “overkill” line, lol. Without context or knowledge of what games he plays. I can’t even sustain 165 FPS in Fortnite with a 7950X / 4090.
 
It's beside the point. It's one game. It's like saying the 4090 is not overkill for 1080p because I can't maintain 165fps in Star Citizen.

Individual use cases differ between users; what you subjectively consider "overkill" is subjectively perfect for another user. Assuming your definition suits all users is a flawed assumption.

I would like more computational power than the 4090 provides, but alas, that is not an option yet. Different goals make blanket statements an exercise in futility.
 
It's beside the point. It's one game. It's like saying the 4090 is not overkill for 1080p because I can't maintain 165fps in Star Citizen.
The point is - he bought a 4090 to play games. If the only game he plays is ____ - mission accomplished. lol. I've been doing overkill for many years. Heck, that's what [H] is all about!!! No one prescribes what card is best for me.
 