Which is fine. Things like cars and other integrated electronics don’t need the most advanced node. The only things that truly need the most advanced node are those where power usage is a factor.
I've looked into the situation and tried to analyze it to the best of my ability based on previous GPU launches.
Nearly 2.5 years without a new flagship/halo GPU. The RTX 4090 has been the "it" card for a long time now, so it has created insane demand for something faster, even if it isn't...
The biggest issue IMO is that there is a massive gulf between the 5080 and 5090. 16 -> 32GB, double the cores, double the price, etc. The 4090 would do amazingly well right in the middle, but Nvidia has stopped production of it. I just don't get their strategy.
There are already people claiming that the 5080 is "technically" faster due to Multi-Frame Generation. Do these people not realize that FG and MFG actually hurt performance in a competitive game scenario? They also lower the base framerate compared to a game that only renders the base frames, not...
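The point above can be sketched with some rough arithmetic. This is a minimal illustration, not measured data: the per-frame overhead figure is an assumption, and real frame-generation costs vary by game and GPU. The idea is just that the generation pass adds time to every rendered frame, so the rendered (base) framerate drops even as the displayed framerate climbs.

```python
# Hedged sketch: why frame generation can lower the base (rendered)
# framerate. The overhead value is an assumption for illustration only.

def frame_gen_estimate(native_fps: float, overhead_ms: float, multiplier: int):
    """Estimate base and output framerates with frame generation enabled.

    native_fps  -- framerate with frame generation OFF
    overhead_ms -- assumed per-rendered-frame cost of the generation pass
    multiplier  -- frames displayed per rendered frame (2 = FG, 4 = MFG)
    """
    native_frametime_ms = 1000.0 / native_fps
    # Each rendered frame now also pays the generation overhead.
    base_fps = 1000.0 / (native_frametime_ms + overhead_ms)
    # Displayed framerate is the base rate times the multiplier.
    output_fps = base_fps * multiplier
    return base_fps, output_fps

base, output = frame_gen_estimate(native_fps=120.0, overhead_ms=2.0, multiplier=4)
print(f"base: {base:.1f} fps, output: {output:.1f} fps")
```

With these assumed numbers, a game that natively runs at 120 fps drops to roughly 97 rendered fps once the 2 ms overhead is paid, even though the counter shows ~387 fps. Input latency tracks the base rate, not the output rate, which is why it matters in competitive play.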
We are living in a strange world. Once the 5090 and 5080 release, the 4090 will still be the 2nd fastest GPU in the world, even after being out for 2.5 years. What in the world...
I've seen some people asking "When can I get a 4090 for under $1000?"
At this rate, never; it's still going to be a halo card with only the 5090 being faster. What a weird time we live in.
Incredible design... except for the fact that it's going to be dumping 575 watts or more of heat back into the case. That's sure to warm up your components a bit.
I'm one of those people who said my 3080 was good enough, but then the 4090 turned out to be an absolute monster, so I bought one. If the 5090 turns out the same, I will buy one.
Who is defending the RTX 5090? Seems like most of this thread is people who are really skeptical about the RTX 5090's actual performance uplift over the 4090. And yes, it can be a prosumer card, but like the 4090, it is very much a gaming GPU with prosumer features on it. It offered a massive...
Everyone is starting to wake up to the fact that the RTX 5090 probably isn't as powerful as everyone wants it to be. It's definitely an iterative performance increase, which is sad considering I've had my RTX 4090 since launch day over 2 years ago.
I remembered after I posted that Nvidia marketing slide that Far Cry 6 is usually CPU limited, so I did find it odd that Nvidia would use that in their marketing slides. However, at the bottom, it says they're using a 9800X3D for the gaming benchmarks, so it's possible that CPU limitation has...
That's where I'm leaning, too. I was thinking "$2000 isn't as bad as I thought" but then remembered that the 4090 was like 70% faster than the 3090 in rasterization, and much faster in RT/AI. I don't think we're getting that this time.
Just FYI, Nvidia has deployed "Hype Bots" to a bunch of different platforms to try and build up excitement. These are usually new accounts with low friend counts. I would not be surprised if they try and do the same thing here. It's going to be weird with this GPU launch.
As has been echoed here, if you can get your hands on the Intel Arc B580, that's the winner. It just spanks everything else up and down the block in its price class. Been testing mine for a couple of days, and I'm impressed by the level of performance, especially at $250.
As weird as this may sound, I have had an RTX 4090 since launch, and I have been getting more enjoyment out of my PS5 than I have my PC as of late. I can no longer justify the insane $1600+ for flagship GPUs. At this point, and for the foreseeable future, I've bowed out of the high-end GPU race...
Been playing with my B580 for a couple of days now, and I must say, it's one solid card. There are a few bugs that I'm sure will get ironed out with future drivers, but I'm seeing great performance in every game I'm testing. I can do Crysis Remastered at 4K "Can it run Crysis" settings (no RT)...
https://www.cnbc.com/amp/2024/02/14/nvidia-passes-alphabet-market-cap-now-third-most-valuable-us-firm.html
Guys… big green is closing in on Microsoft and Apple. This is nuts!
Nvidia’s stock is up over 250% from just 1 year ago. Intel and AMD have been left for dead.
AI makes Nvidia a lot of money. It’s an emerging market, and for the moment, Nvidia is leading the way. I don’t think they’ve abandoned or don’t like gamers, but it is all but guaranteed that Nvidia is throwing most of their capital at AI development and research.
I’ve started experimenting with this monitor and different input sources. I recently tried an Apple TV 4K (Gen 3) on it, and holy hell, it looks good. Ironically, my old Samsung Q90R also looks better with the Apple TV 4K.
This tells me something; it’s not the monitor that’s the issue… it’s the...