Buying a 4090 card

Psycrow

I have a Maximus XIII Extreme ROG board with a 3080 Ti GPU.
I installed M.2 drives in all of the motherboard's sockets, which makes my current 3080 Ti run at only PCIe 4.0 x8 instead of x16.
I looked up the difference and compared x8 and x16, and only a little difference showed up in the results. Strangely enough, some games even ran better at x8 than at x16 in the videos I found.

I could of course remove two of my M.2 drives and install them in a DIMM.2 module, but then they would lose some speed.

But my real question is about the 4090 card I want to buy. Will I miss out on a lot of performance running a 4090 at PCIe 4.0 x8 rather than x16?
 
TechPowerUp recently did an investigation into this: here.
Long story short, a decrease of about 2% is what you're looking at in this scenario. So post #2 is correct :p
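
For a rough sense of the bandwidth numbers behind that result, here's a back-of-envelope sketch (the per-lane rate is the standard PCIe 4.0 figure after encoding overhead; the ~2% gaming impact comes from the TechPowerUp testing, not from this math):

```python
# Back-of-envelope PCIe link bandwidth (illustrative only).
# PCIe 4.0 runs at 16 GT/s per lane with 128b/130b encoding, giving roughly
# 1.97 GB/s of usable bandwidth per lane per direction.
GBPS_PER_LANE_GEN4 = 16 * (128 / 130) / 8  # ~1.969 GB/s

for lanes in (8, 16):
    print(f"PCIe 4.0 x{lanes}: ~{lanes * GBPS_PER_LANE_GEN4:.1f} GB/s per direction")

# PCIe 4.0 x8 : ~15.8 GB/s
# PCIe 4.0 x16: ~31.5 GB/s
# Games rarely saturate even the x8 link for long, which is why the measured
# hit stays in the low single digits.
```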

Enjoy the 4090!
I forgot to mention I have an i9-11900K CPU.
ASUS once told me something about a max of 20 PCI Express lanes.
So I will lose performance buying a 4090 card.

Here is what they said:

The CPU you have in the computer can only run a max of 20 PCI lanes, so it will underperform. https://ark.intel.com/content/www/u...1900k-processor-16m-cache-up-to-5-30-ghz.html

And here are stats from another site for a 2K / 4K monitor:

Intel Core i9-11900K is too weak for NVIDIA GeForce RTX 4090 at 3840 × 2160 screen resolution for General Tasks.
This configuration has a 39.0% processor bottleneck.

Intel Core i9-11900K is too weak for NVIDIA GeForce RTX 4090 at 2560 × 1440 screen resolution for General Tasks.
This configuration has a 61.3% processor bottleneck.

Why does the percentage go down each time I pick a better monitor, like 2K or 4K or 6K and so on?
So it's better to have at least a 4K rather than a 2K, and the best option seems to be 6K?

Intel Core i9-11900K and NVIDIA GeForce RTX 4090 will work great together at 6016 × 3384 screen resolution for General Tasks.
This configuration has a 3.7% processor bottleneck.

How does this apply? I don't get it.

For now I only have a 1080p monitor, but I am planning on buying a 2K or 4K one, preferably one of these new OLEDs.
That's why I haven't upgraded my monitor in a long time.
 
Good grief. Get a 4K monitor or there's absolutely no point in you having a 4090, especially with that CPU. And even at 4K, you are going to lose quite a bit of performance in the more CPU-demanding games. At just 1440p even a 4080 would be held back, never mind a 4090.
 
I forgot to mention I have an i9-11900K CPU.
ASUS once told me something about a max of 20 PCI Express lanes.
So I will lose performance buying a 4090 card.

Here is what they said:

The CPU you have in the computer can only run a max of 20 PCI lanes, so it will underperform. https://ark.intel.com/content/www/u...1900k-processor-16m-cache-up-to-5-30-ghz.html

And here are stats from another site for a 2K / 4K monitor:

Intel Core i9-11900K is too weak for NVIDIA GeForce RTX 4090 at 3840 × 2160 screen resolution for General Tasks.
This configuration has a 39.0% processor bottleneck.

Intel Core i9-11900K is too weak for NVIDIA GeForce RTX 4090 at 2560 × 1440 screen resolution for General Tasks.
This configuration has a 61.3% processor bottleneck.

Why does the percentage go down each time I pick a better monitor, like 2K or 4K or 6K and so on?
So it's better to have at least a 4K rather than a 2K, and the best option seems to be 6K?

Intel Core i9-11900K and NVIDIA GeForce RTX 4090 will work great together at 6016 × 3384 screen resolution for General Tasks.
This configuration has a 3.7% processor bottleneck.

How does this apply? I don't get it.

For now I only have a 1080p monitor, but I am planning on buying a 2K or 4K one, preferably one of these new OLEDs.
That's why I haven't upgraded my monitor in a long time.
The percentage of a CPU bottleneck shrinks as resolution increases because the GPU is doing more work. Essentially the performance metric flips from the GPU waiting on the CPU at a lower resolution to the CPU waiting on the GPU at a higher resolution. Available PCI-E lanes have nothing to do with it. All consumer-level Intel processors "only" come with 20 lanes.

I wouldn't sweat those numbers. Video cards are so fast these days that 2560x1440 has become a CPU-dependent resolution. Simply speaking, I wouldn't buy a 4090 if you're going to be gaming at less than 8,294,400 pixels (3840x2160 or equivalent resolution). A 3060 Ti is an appropriate performance class for 1920x1080. Add ray tracing and you may want to step it up to a 3070 Ti. A 4090 will still be fastest at that resolution, but the performance delta to a lower card is going to be much smaller than at 4K. So you're not "losing" performance, but the amount of money spent becomes quite questionable compared to cards in a lower performance tier.
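
To make the resolution point concrete, here's a minimal toy model; the FPS figures are made up purely for illustration, only the min() relationship matters:

```python
# Toy model: the CPU prepares frames at a roughly fixed rate regardless of
# resolution, while the GPU's rate falls as pixel count rises.
# Whichever side is slower sets the frame rate you actually see.
cpu_fps = 120  # hypothetical CPU-limited frame rate (resolution independent)

# hypothetical GPU-limited frame rates for a very fast GPU
gpu_fps = {"1920x1080": 400, "2560x1440": 260, "3840x2160": 115}

for res, g in gpu_fps.items():
    actual = min(cpu_fps, g)
    limiter = "CPU" if cpu_fps < g else "GPU"
    print(f"{res}: ~{actual} FPS, limited by the {limiter}")

# At 1080p and 1440p the GPU could run far faster than the CPU can feed it
# (a big "CPU bottleneck"); at 4K the GPU itself becomes the limit, so the
# reported CPU-bottleneck percentage shrinks even though nothing about the
# CPU changed.
```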
 
So I should buy a 4K monitor with a 4090 card and still have a bottleneck, but it will be better than my 3080 Ti card at 4K?

I have been looking at this monitor; that should be a 2K, right?
But it seems like I'm going to need a 4K.
https://www.asus.com/dk/displays-desktops/monitors/tuf-gaming/tuf-gaming-vg28uql1a/

There just aren't any 27" 4K OLED monitors out yet that I know of, unless I wait forever...
Or should I just get this:

https://rog.asus.com/articles/gaming-monitors/rog-swift-oled-pg27aqdm-intro/
 
If you can't find a 4K monitor you like and you really want one of those 27-inch OLEDs, then go ahead and get that and just use DSR to run 4K or 5K. Plenty of games have a resolution scale or image scaling option which essentially does the same thing as running a higher resolution. For instance, in the Resident Evil 4 remake just turn the image scaling up and you're effectively running a higher resolution than 1440p. The only downside is that some games don't have this and some games don't work with DSR, but those are very rare. God of War is an example; you would have to set your desktop resolution to 4K before entering the game to select 4K within the game.
 
So I should buy a 4K monitor with a 4090 card and still have a bottleneck, but it will be better than my 3080 Ti card at 4K?
I would buy a 4K monitor or TV for a 4090, yes. Yes, it will be better than your 3080 Ti.
I have been looking at this monitor; that should be a 2K, right?
But it seems like I'm going to need a 4K.
https://www.asus.com/dk/displays-desktops/monitors/tuf-gaming/tuf-gaming-vg28uql1a/
No, this is a UHD 4K monitor. Looks like a great monitor by all accounts aside from the HDR spec, which isn't going to be better on an LCD without spending a lot more money on one with FALD (full-array local dimming).

There just aren't any 27" 4K OLED monitors out yet that I know of, unless I wait forever...
Or should I just get this:

https://rog.asus.com/articles/gaming-monitors/rog-swift-oled-pg27aqdm-intro/
Yes, ASUS is known for announcing future monitors with no prospective release date, often years before they actually get released. I wouldn't get excited about an ASUS monitor until it is actually available for purchase.

There are no 27" 4K OLED monitors that I know of on the market or in production; they're all 2560x1440. LG has one for sale right now in the US. I'm not sure when or if it's coming to Denmark or Europe at large.

https://www.lg.com/us/monitors/lg-27gr95qe-b
https://www.rtings.com/monitor/reviews/lg/27gr95qe-b
If you can't find a 4K monitor you like and you really want one of those 27-inch OLEDs, then go ahead and get that and just use DSR to run 4K or 5K. Plenty of games have a resolution scale or image scaling option which essentially does the same thing as running a higher resolution. For instance, in the Resident Evil 4 remake just turn the image scaling up and you're effectively running a higher resolution than 1440p. The only downside is that some games don't have this and some games don't work with DSR, but those are very rare. God of War is an example; you would have to set your desktop resolution to 4K before entering the game to select 4K within the game.
DSR is certainly an option. 2560x1440 scales nicely to 5120x2880 (5K) since it's an integer multiple.
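
A quick sketch of the integer-scaling arithmetic, using NVIDIA's DSR factor convention (factors refer to total pixel count):

```python
# NVIDIA's DSR factors refer to total pixel count, not per-axis scale.
native_w, native_h = 2560, 1440

# 4.00x DSR doubles width and height -> 5120x2880 ("5K"):
w4, h4 = native_w * 2, native_h * 2
# every output pixel covers an exact 2x2 block of rendered pixels,
# so the downscale back to 1440p is clean.

# 2.25x DSR scales each axis by 1.5 -> 3840x2160 (4K):
w225, h225 = int(native_w * 1.5), int(native_h * 1.5)
# 1.5 per axis is not an integer, so rendered pixels land on fractional
# positions and the downscale needs extra smoothing; that fractional case
# is where DLDSR's AI filtering is meant to help.

print((w4, h4), (w225, h225))  # (5120, 2880) (3840, 2160)
```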
 
I would buy a 4K monitor or TV for a 4090, yes. Yes, it will be better than your 3080 Ti.

No, this is a UHD 4K monitor. Looks like a great monitor by all accounts aside from the HDR spec, which isn't going to be better on an LCD without spending a lot more money on one with FALD (full-array local dimming).

Yes, ASUS is known for announcing future monitors with no prospective release date, often years before they actually get released. I wouldn't get excited about an ASUS monitor until it is actually available for purchase.

There are no 27" 4K OLED monitors that I know of on the market or in production; they're all 2560x1440. LG has one for sale right now in the US. I'm not sure when or if it's coming to Denmark or Europe at large.

https://www.lg.com/us/monitors/lg-27gr95qe-b
https://www.rtings.com/monitor/reviews/lg/27gr95qe-b

DSR is certainly an option. 2560x1440 scales nicely to 5120x2880 (5K) since it's an integer multiple.
With DLDSR he could run 4K in games, and with that it doesn't really matter that 4K doesn't divide perfectly into 1440p the way 5K does.
 
I had my eye on that LG monitor, but after some YouTubers made videos about it I sort of dropped it.
The screen apparently looks like an oil painting, with oily-looking, somewhat blurry text; the matte coating introduces haziness.

Anyone know a good 4K 27"?
 
I don't know about 27", but Samsung just dropped a new 43" mini-LED monitor that looks pretty good.
 
Gosh man! I've never seen a time when PC gamers make gaming seem so freaking hard. This thread is almost painful to read. I myself just purchased an RTX 4090 paired with an i9-10850K and 32GB of RAM, with Windows 11 on a nice 500GB SSD. I have a 2TB SSD for my games. My display resolution is 4320x2560 @ 165Hz using three Samsung Odyssey G50A 1440p monitors in portrait mode. I'm quite happy with this setup and all my games are played maxed out at the full 165Hz my screens support. I don't really care if my 10850K may be holding my GPU back as long as the setup does what I want it to do. Worrying about anything else is just undue stress and defeats the purpose of this hobby, which ultimately is to game and have fun. Stop overthinking and just have fun, man!
 
Well, most people prefer to get the full use out of a part they spend around $1,800 for.
 
Well, most people prefer to get the full use out of a part they spend around $1,800 for.
Of course you do, and rightly so, but it seems nowadays people overthink this hobby and stress over things that get in the way of actually enjoying their PC and having fun while gaming. I'm sure my 10850K holds back my 4090 in some scenarios, but I am still going to enjoy my new card and not lose sleep over it.
 
Of course you do, and rightly so, but it seems nowadays people overthink this hobby and stress over things that get in the way of actually enjoying their PC and having fun while gaming. I'm sure my 10850K holds back my 4090 in some scenarios, but I am still going to enjoy my new card and not lose sleep over it.
But, but, you're not getting the full 500 FPS it's capable of! ;)
 
I found the last posts funny :D

It's true that this is stressful and not making me any younger, and I should just be happy and enjoy my gaming.
But then again I did pay a lot for my 3080 Ti card. In fact, I paid nearly $6,000 for my newest PC in 2022.
And I'm looking at a 4090 card from Gigabyte that costs $2,500 here in Denmark.

So I would very much like to see some data on my current performance before I buy anything.
It's also good to know about this stuff for when I buy a new PC next year.
 
Yes, you might lose a little perf, but it will still be a massive improvement and you might not even notice the difference without benchmarking it. And when you're able to push hundreds of FPS, are you really gonna notice 10 FPS lost?!
 
I'll say what I said before -

Any system, anywhere, at any time, with any potential combination of hardware, will have a bottleneck. If it did not, computation would complete instantly - by definition, there MUST be a bottleneck. This bottleneck may be the CPU, it may be the GPU, it may be the monitor - but somewhere, there is something limiting performance. Always. You worry about bottlenecks when a component is drastically (not even significantly, but drastically) impacting the performance of other hardware - e.g. 4-8GB of RAM in 2023 is a bottleneck for almost everything. Using an FX-8350 CPU is a bottleneck for a 4090. DDR3 is a... and so on.

You're talking about hardware from the last few years - the impact, unless you're chasing benchmark scores or working on world records, is insignificant. Freaking out over bottlenecks is pointless - and if it isn't pointless, it'll be absurdly obvious. Don't worry about it.
 
Of course you do, and rightly so, but it seems nowadays people overthink this hobby and stress over things that get in the way of actually enjoying their PC and having fun while gaming. I'm sure my 10850K holds back my 4090 in some scenarios, but I am still going to enjoy my new card and not lose sleep over it.
True. If you're on 4K you can just crank the settings up to max fidelity and even the playing field.
 
I'll say what I said before -

Any system, anywhere, at any time, with any potential combination of hardware, will have a bottleneck. If it did not, computation would complete instantly - by definition, there MUST be a bottleneck. This bottleneck may be the CPU, it may be the GPU, it may be the monitor - but somewhere, there is something limiting performance. Always. You worry about bottlenecks when a component is drastically (not even significantly, but drastically) impacting the performance of other hardware - e.g. 4-8GB of RAM in 2023 is a bottleneck for almost everything. Using an FX-8350 CPU is a bottleneck for a 4090. DDR3 is a... and so on.

You're talking about hardware from the last few years - the impact, unless you're chasing benchmark scores or working on world records, is insignificant. Freaking out over bottlenecks is pointless - and if it isn't pointless, it'll be absurdly obvious. Don't worry about it.
That's it in a nutshell....... so a message to all of you boys and girls out there.... eat your spinach....... don't talk to strangers....... and stop running endless loops of this year's hottest benchmark if all you want to do is pew-pew the bad guys. Lastly, enjoy your PC and don't sweat the small stuff! :)
 
So I should buy a 4K monitor with a 4090 card and still have a bottleneck, but it will be better than my 3080 Ti card at 4K?

I have been looking at this monitor; that should be a 2K, right?
But it seems like I'm going to need a 4K.
https://www.asus.com/dk/displays-desktops/monitors/tuf-gaming/tuf-gaming-vg28uql1a/

There just aren't any 27" 4K OLED monitors out yet that I know of, unless I wait forever...
Or should I just get this:

https://rog.asus.com/articles/gaming-monitors/rog-swift-oled-pg27aqdm-intro/
You can also keep your existing monitor and use NVIDIA's DLDSR to supersample your image up from 1080p to 2880 x 1620 and then display it on your 1080p monitor. You can also use regular DSR to render at 4K and display that on the same monitor. 2880 x 1620 DLDSR gives you basically the same result as 4K DSR at much less performance cost. So there IS in fact a use for all that GPU power.

I'd get the 4090 and take your time looking for the perfect 4K OLED to go with it, using DLDSR in the meantime. Cheers!
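
Rough pixel math behind that suggestion (the resolutions are the standard DSR/DLDSR factors from a 1080p desktop; the quality comparison is the poster's claim, not something this arithmetic proves):

```python
# Pixel counts for supersampling a 1920x1080 desktop.
native     = 1920 * 1080   # ~2.07 million pixels
dldsr_225x = 2880 * 1620   # 2.25x DLDSR -> ~4.67 million pixels
dsr_4x     = 3840 * 2160   # 4.00x DSR   -> ~8.29 million pixels

print(dldsr_225x / native)  # 2.25
print(dsr_4x / native)      # 4.0

# 2.25x DLDSR renders a bit more than half the pixels of 4x DSR, which is
# where the "similar result at much less performance cost" idea comes from.
```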
 
Since you're splashing out for a 4090... are you sure you don't want that extra 1% of average and minimum framerates by going with a motherboard that will offer you the full x16 lanes?

To really utilize a 4090 on a DDR4 Intel platform like your Z590, you would need to tighten the subtimings on your dual-rank DDR4 RAM to the absolute limit, at as close to 4,000+ MT/s as you can manage in Gear 1. The lower the overall latency, the better.

With Intel and DDR5... you're looking at a silicon-lottery win that will let you run 8,000 or better with tight timings. On the AMD side of things, a 1:1 ratio at 6,000 with super tight subtimings.

CPU and RAM matter far more for you than total lanes.

You should ask Professor Monkey-for-a-head for an appropriate CPU and motherboard combo if you are really trying to get every last frame.
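
If you want an easy way to compare kits on "general latency" before diving into subtimings, first-word CAS latency in nanoseconds is the usual back-of-envelope metric. A small sketch with hypothetical example kits:

```python
# First-word CAS latency in nanoseconds: ns = 2000 * CL / (data rate in MT/s)
def cas_latency_ns(data_rate_mt_s: int, cl: int) -> float:
    return 2000 * cl / data_rate_mt_s

# Hypothetical example kits (not recommendations):
kits = {
    "DDR4-3600 CL16": (3600, 16),
    "DDR4-4000 CL15": (4000, 15),  # the kind of tuned Gear 1 target mentioned above
    "DDR5-6000 CL30": (6000, 30),
    "DDR5-8000 CL38": (8000, 38),
}

for name, (speed, cl) in kits.items():
    print(f"{name}: {cas_latency_ns(speed, cl):.2f} ns")

# Lower is better. This is only the first-order number; subtimings, Gear 1 vs
# Gear 2, and dual-rank behaviour all matter too, but it's a quick sanity
# check before chasing every last setting.
```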
 
Professor Monkey-for-a-head :D Earthworm Jim reference :p
So many funny comments in here, making me laugh and stress less about it :D

Btw, what is this NVIDIA DLDSR? I've heard of it but don't know how it works...

I don't know how to squeeze my RAM timings for more MHz. I have G.Skill Trident Z Royal DDR4-3600.
CPU-Z shows an NB frequency of 4000.

And yes, I would like a new motherboard and CPU, but one year after I bought my board and CPU the new boards were released...
Also, I heard you can only really run Win 11 on these new chipsets, and I have a strong bond with my Win 10. So that's also a thing that holds me back.
 
Professor Monkey-for-a-head :D Earthworm Jim reference :p
So many funny comments in here, making me laugh and stress less about it :D

Btw, what is this NVIDIA DLDSR? I've heard of it but don't know how it works...

I don't know how to squeeze my RAM timings for more MHz. I have G.Skill Trident Z Royal DDR4-3600.
CPU-Z shows an NB frequency of 4000.

And yes, I would like a new motherboard and CPU, but one year after I bought my board and CPU the new boards were released...
Also, I heard you can only really run Win 11 on these new chipsets, and I have a strong bond with my Win 10. So that's also a thing that holds me back.
Google DLDSR.
Don't mess with your RAM; you won't notice a difference unless you're playing benchmarks.
Yes, Win 11 is needed to correctly use all the E-cores on the newer Intel chips (12th gen and later); on Win 10 the scheduler might try to push "performance" work onto the "eco" cores.
Upgrade and get used to it. It's really not that much different; moving the start button back to the left is half of it...
 
The thing is, I actually use Classic Shell on Windows to emulate the Win 7 start menu :D
Cuz that was the best menu of 'em all.
 
The thing is, I actually use Classic Shell on Windows to emulate the Win 7 start menu :D
Cuz that was the best menu of 'em all.
Meh, I spend so little time in the start menu it really doesn't matter much. I have desktop shortcuts and taskbar pins, which are way faster. And if I do have to hit the start menu, I click it and type the first three or four characters of what I want and poof, it's there.
 
Meh, I spend so little time in the start menu it really doesn't matter much. I have desktop shortcuts and taskbar pins, which are way faster. And if I do have to hit the start menu, I click it and type the first three or four characters of what I want and poof, it's there.
Exactly.
 
So that's why it looks like a mess today... people don't use 'em, haha. Makes sense.
 
Google DLDSR.
Don't mess with your RAM; you won't notice a difference unless you're playing benchmarks.
Yes, Win 11 is needed to correctly use all the E-cores on the newer Intel chips (12th gen and later); on Win 10 the scheduler might try to push "performance" work onto the "eco" cores.
Upgrade and get used to it. It's really not that much different; moving the start button back to the left is half of it...

One of several reasons I went with a 7950X rather than a 12900K is that I want to avoid Win 11 for as long as possible, and scheduling with E-cores is a mess on 10.

I wouldn't say that RAM speed and timings are only useful for benching with DDR4. Lots of reviews show memory scaling having a meaningful effect on framerates in games. The biggest issue is staying in Gear 1, next is keeping latency down, and third is whatever combination of speed, timings, and subtimings gets you there.
 
Sure, speed helps, but fiddling with all the settings to squeeze out that last tiny bit is usually pointless for normal use, and you won't see or feel it unless you're playing benchmarks.
Under most circumstances it is fairly pointless, but the 11900K already leaves a lot on the table compared to the next two generations. Ignoring JEDEC specs, fiddling with RAM can net up to about 5% more frames and further help with frame lows compared to untuned but otherwise decent memory. That's something potentially worth doing, as opposed to trying to get a full x16 set of lanes on Z590 for the card.
 
Under most circumstances it is fairly pointless, but the 11900K already leaves a lot on the table compared to the next two generations. Ignoring JEDEC specs, fiddling with RAM can net up to about 5% more frames and further help with frame lows compared to untuned but otherwise decent memory. That's something potentially worth doing, as opposed to trying to get a full x16 set of lanes on Z590 for the card.
And when you're able to push hundreds of FPS, are you really gonna notice 10 FPS lost?!
If he just wants to play and enjoy games and isn't shooting for top-ten scores, it's fine as is, IMO.
 
Like I said earlier... don't sweat the small stuff!

From the latest TPU review, minimum FPS at 4K: I currently have a 4090 paired with a 10850K and game in triple-portrait NV Surround at 4320x2560 @ 165Hz. There's not too much of a difference between the 10850K/10900K and the latest and greatest at these high resolutions; not enough to justify changing my entire platform for either Intel or AMD. https://www.techpowerup.com/review/amd-ryzen-7-7800x3d/22.html
 

Like I said earlier... don't sweat the small stuff!

From the latest TPU review, minimum FPS at 4K: I currently have a 4090 paired with a 10850K and game in triple-portrait NV Surround at 4320x2560 @ 165Hz. There's not too much of a difference between the 10850K/10900K and the latest and greatest at these high resolutions; not enough to justify changing my entire platform for either Intel or AMD. https://www.techpowerup.com/review/amd-ryzen-7-7800x3d/22.html

So with my 11900K, a 4090 card, and a 4K monitor I will get 105-ish FPS?
 