Also, in Fortnite’s case, most players will not be willing to sacrifice frame rate for ray tracing. It’s not the type of game where you want to do that.
In any case, the repeated claim that AMD can’t do ray tracing is somewhat erroneous, because it really depends on how it’s done. They’re...
It depends a lot on the use case, in my opinion. I bought a 7900 XTX because it was on sale and I needed something to pair with my 4K monitor. The 4080 (only on sale, not at its ridiculous MSRP) would have been my first choice because I was interested in playing with RTX features, but after...
It’s part of Unreal Engine 5, so it’s a safe bet we’ll be seeing a lot of Lumen use in the future. From what I’ve seen, it looks more or less as good as RT without nearly as much of a performance hit.
I think people need to separate "ray tracing" in general from the RTX-style implementations we usually see benchmarked, because those are one implementation of it, designed specifically for Nvidia hardware, so no wonder they favour Nvidia. Technologies like Lumen look fantastic and AMD hardware is every bit as good as Nvidia's in benchmarks that I've seen...
Zero chance any regulator would allow this to happen. I mean cloud gaming is what’s holding up the Activision Blizzard acquisition, and almost no one actually cares about cloud gaming. Can you imagine what the competition bureau would say about Nintendo given that? Even if Nintendo was ok...
Gaming is about $8 billion per year of revenue for Nvidia. They are not going to exit the space. People need to remember that AI is a huge part of their business: over the most recent two quarters, Nvidia's revenue guidance grew from $11 billion to $16 billion. Clearly, their primary...
Their drivers have come a long way with Arc to be fair, but they also know they can't sell Arc for a high price not only because of the driver inferiority, but also because it's simply markedly less powerful. They're competing on the middle to lower end right now, which really isn't a bad place...
I hope this happens, but I'm not holding my breath. Also, if they get to 4080 performance, it won't be half the price. My guess is that they'll pull an AMD and do 4080 - 10%, or 4080 - 15%. As much as we gamers want some competitor to come in and flood the market with cheap GPUs, it's not...
How many Canadians love Rogers, Bell, or Telus? How many Canadians love dealing with their big 5 bank? How many Americans love Comcast? How many remain customers in spite of all that and have been for years? I deal with companies I hate all the time for various reasons, typically because I...
Agreed. With AMD on consoles and companies needing a way to enable better performance on weaker console hardware, FSR is the easy way for them to go. We’ve all seen how most studios have PC gaming as an afterthought as it is.
Whatever, you get my point. This is a ridiculous thing to be this angry about. Just use FSR and move on with your life guys, or don’t, whatever. This is being made out to be a way bigger deal than it actually is. My advice is to spend your time enjoying the game instead of white knighting a...
I really can’t believe the passion and anger some folks have over the exclusion of a frame generation technique in a video game. Honestly, if you’re this angry and personally offended about a computer hardware and/or software corporation not including your preferred upscaling technology in a...
It’s around RTX 3090 to RTX 3090 Ti performance, so certainly no slouch, but a generation behind Nvidia. That’s for RT titles though. In Lumen, they’re right in line with Nvidia last I checked.
The follow up question is whether or not that matters to you. If you play a lot of single player...
I'm sure we'll see more in the near future as UE5 adoption increases, but based on the design principles of both Lumen and RT, I would say it would be basically impossible, in a like-for-like comparison, for Lumen to consume more computational power than RT, based on what I've read. The...
I’m not a technology expert, so I would need to defer to a software developer who’s more familiar with the nuts and bolts, but based on everything I’ve read about how Lumen works vs RT, yes, it is computationally less expensive, which is one of its advantages, but the trade-off being made is that...
The ASUS TUF was the 3080 I was interested in, but the 10GB VRAM buffer gave me pause, and then the market went haywire shortly after. I think it was the best bang-for-buck card in the series though; that was a good buy.
There was a benchmark here for the reference 7900 XTX. These guys are saying...
CDPR is really trying to push the graphical envelope with that game, and I love them for it. Really great way to see what the future of gaming can look like.
Lumen looks impressive and is far less computationally expensive than RT. RT is technically “better”, but will it be enough for developers to want to put it in when they can just use Lumen and get close enough at a fraction of the computational cost? In some cases, I’m sure, especially studios...
Anything less than a 4090, at this point, is not a 4K ray tracing card. Even the 4090 takes a massive hit, but it's typically playable (I'm looking at you, Cyberpunk). DLSS or FSR help immensely, but those features aren't free. You'll take a latency hit and it remains an approximation. An...
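To put numbers on why DLSS/FSR "help immensely": they render internally below output resolution and upscale. A rough sketch using the published FSR 2 scale ratios (DLSS's exact per-mode factors may differ slightly; the numbers here are the FSR 2 documented ratios):

```python
# FSR 2 per-axis scale ratios per quality mode (from AMD's published spec).
SCALE = {"quality": 1.5, "balanced": 1.7, "performance": 2.0, "ultra_performance": 3.0}

def internal_resolution(out_w, out_h, mode):
    """Resolution the GPU actually shades before upscaling to output."""
    r = SCALE[mode]
    return round(out_w / r), round(out_h / r)

def pixel_savings(out_w, out_h, mode):
    """Fraction of shaded pixels saved vs. native rendering."""
    w, h = internal_resolution(out_w, out_h, mode)
    return 1 - (w * h) / (out_w * out_h)

w, h = internal_resolution(3840, 2160, "quality")
print(w, h)  # 2560 1440
print(f"{pixel_savings(3840, 2160, 'quality'):.0%} fewer shaded pixels")  # 56% fewer shaded pixels
```

So even "Quality" mode at 4K shades roughly half the pixels, which is where the big frame-rate recovery comes from; the upscale pass and any added latency are the cost of that approximation.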
They also corrected the vapour chamber problem that plagued earlier production units, as far as I'm aware, so it's probably a fine choice. Pound for pound though, it's not the quietest solution either way, if that matters to you, although one thing I did see with the reference model that was...
Noise will be a function of the cooler on the particular card you’re looking at more than anything else. Most of these heatsinks now are 3-slot+, so yes, you’ll have a card that kicks out a lot of heat, but with a properly designed heatsink, you won’t have a whole lot of noise to deal with, at...
That's one input, but Nvidia's margins say this is mostly on Nvidia. Frankly, that's fine. They're a publicly traded company; it's their job to charge what the market will bear. It's up to consumers to tell them the price is too high, and so far we haven't.
Why is that a weird mindset? The performance gap between the 7900 XTX and the 4090 is massive. They're not even in the same league. Intel has offerings for certain use cases that compete well enough with Ryzen on the top end that you can consider both depending on the use case (although...
Power consumption is a really minor factor to consider in my opinion, at least if you live in a market where electricity is relatively inexpensive. Generally, the primary reason to consider it is PSU size and heat, unless you live in a market where electricity prices are insane like they are in...
If they have no option capable of competing with the 4090 and openly state they have no intention of delivering one then yes, bringing AMD into the discussion is pointless because 4090 shoppers will not consider them. You will be buying the 5090 because that will be your only option if you want...
The price is relevant because AMD stated they're not trying to compete in that category. At the top of the chain, Nvidia is unchallenged, and they charge accordingly for that. There's no need to bring AMD into the discussion because they won't be considered by that type of buyer anyway. That...
Yes, the price argument, because price is always a factor. The 4090 currently has no challenger and the 7900 XTX is not intended to be one; AMD said it isn't. You're comparing a $1000 product to a $1600 product, which is an irrelevant comparison. You might as well be comparing a Corvette to a...
The 7900 XTX falls short compared to the 4090 in raster...
...it also falls short on price by $600 MSRP, so I would hope the 4090 would be a better product.
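The price argument is really a perf-per-dollar argument. A minimal sketch at MSRP; the relative raster number for the 4090 below is an assumed placeholder, not a measured benchmark:

```python
# Hypothetical perf-per-dollar comparison at MSRP.
# relative_perf values are illustrative placeholders, not benchmark data.
cards = {
    "7900 XTX": {"msrp": 1000, "relative_perf": 100},
    "RTX 4090": {"msrp": 1600, "relative_perf": 120},  # assumed ~20% faster in raster
}

def perf_per_dollar(card):
    return card["relative_perf"] / card["msrp"] * 1000  # perf per $1000

for name, card in cards.items():
    print(f"{name}: {perf_per_dollar(card):.1f} perf per $1000")
```

Under those assumed numbers, the 4090 would need to be 60% faster, not ~20%, just to match the 7900 XTX on value; buyers at that tier are paying for the absolute performance crown, not value.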
People lined up on launch day to buy a $1600 RTX 4090, which was Quadro-pricing a relatively short time ago, as you pointed out. As long as consumers are willing to do that, Nvidia will be happy to feed it to them. They're clearly doing a price-discovery exercise this release to see what the...