GPU | Fire Strike Ultra | vs 3090 | Time Spy Extreme | vs 3090 | Port Royal | vs 3090 |
NVIDIA GeForce RTX 4090 AIB | 25,708 | 2.04 | 19,431 | 1.89 | 25,414 | 1.86 |
NVIDIA GeForce RTX 4090 Stock | 25,256 | 2.00 | 18,892 | 1.84 | 24,886 | 1.82 |
NVIDIA GeForce RTX 4080 16GB | 17,465 | 1.39 | 13,977 | 1.36 | 17,607 | 1.29 |
NVIDIA GeForce RTX 3090 Ti | 14,007 | 1.11 | 10,709 | 1.04 | 14,851 | 1.09 |
NVIDIA GeForce RTX 3090 | 12,607 | 1.00 | 10,293 | 1.00 | 13,642 | 1.00 |
NVIDIA GeForce RTX 3080 Ti | 12,457 | 0.99 | 10,042 | 0.98 | 13,226 | 0.97 |
NVIDIA GeForce RTX 3080 12GB | 11,593 | 0.92 | 9,358 | 0.91 | 12,151 | 0.89 |
AMD Radeon RX 6950 XT | 15,038 | 1.19 | 10,644 | 1.03 | 10,788 | 0.79 |
AMD Radeon RX 6900 XT | 14,333 | 1.14 | 9,952 | 0.97 | 10,398 | 0.76 |
AMD Radeon RX 6800 XT | 12,950 | 1.03 | 9,203 | 0.89 | 9,536 | 0.70 |
AMD Radeon RX 6800 | 10,568 | 0.84 | 7,648 | 0.74 | 7,812 | 0.57 |
Average across the three tests (vs 3090 stock):
4090 AIB | 1.93 |
4090 Stock | 1.89 |
4080 16GB | 1.34 |
3090 Ti | 1.08 |
3090 stock | 1.00 |
3080 Ti | 0.98 |
3080 12GB | 0.91 |
6950 XT | 1.00 |
6900 XT | 0.95 |
6800 XT | 0.87 |
RX 6800 | 0.71 |
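The ratios above can be sanity-checked from the raw scores. A minimal Python sketch (card subset and scores copied from the table above; everything is normalized to the stock 3090):

```python
# Raw 3DMark scores from the table above:
# (Fire Strike Ultra, Time Spy Extreme, Port Royal)
scores = {
    "RTX 4090 AIB":   (25708, 19431, 25414),
    "RTX 4090 Stock": (25256, 18892, 24886),
    "RTX 4080 16GB":  (17465, 13977, 17607),
    "RTX 3090":       (12607, 10293, 13642),
    "RX 6950 XT":     (15038, 10644, 10788),
}

baseline = scores["RTX 3090"]  # stock 3090 is the 1.00 reference

for card, s in scores.items():
    ratios = [a / b for a, b in zip(s, baseline)]
    avg = sum(ratios) / len(ratios)
    print(f"{card:15s} " + " ".join(f"{r:.2f}" for r in ratios) + f" | avg {avg:.2f}")
```

Computed this way the numbers match the summary to within rounding: the 4080 16GB only comes out at 1.34 when you average the unrounded per-test ratios, while the 6950 XT lands at 1.01 here rather than the quoted 1.00, presumably rounding slop in the original post.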
> Nvidia is really going for the "Oh well I guess I'll just buy a 4090" crowd this time. Good lord.

Gotta reel 'em in somehow.
41% more performance for 33% more money. On a flagship GPU... Value gets better as you go up the line... It's backwards.
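The "41% for 33%" arithmetic checks out if you pair the averaged 3DMark ratios with the announced US MSRPs ($1,599 for the 4090 and $1,199 for the 4080 16GB; those prices are not from this thread, so treat them as an assumption):

```python
# Averaged 3DMark ratio vs a stock RTX 3090, paired with announced US MSRPs.
cards = {
    "RTX 4090":      {"perf": 1.89, "msrp": 1599},
    "RTX 4080 16GB": {"perf": 1.34, "msrp": 1199},
}

perf_gain  = cards["RTX 4090"]["perf"] / cards["RTX 4080 16GB"]["perf"] - 1
price_gain = cards["RTX 4090"]["msrp"] / cards["RTX 4080 16GB"]["msrp"] - 1

print(f"4090 vs 4080 16GB: {perf_gain:+.0%} performance for {price_gain:+.0%} money")
# → 4090 vs 4080 16GB: +41% performance for +33% money

for name, c in cards.items():
    print(f"{name}: {c['perf'] / (c['msrp'] / 1000):.2f} perf per $1000")
# → RTX 4090: 1.18 perf per $1000
# → RTX 4080 16GB: 1.12 perf per $1000
```

So by these leaked numbers the flagship really does deliver more performance per dollar than the card below it, which is the "backwards" part.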
Rumors from people who reviewed the card, still under embargo:
- 4090 about 80% above a 3090 in pure raster, sometimes higher
- In ray tracing, often over double a 3090
- 4090 probably safe to drop into a system built for a 3090, without the power-spike issues, which makes it significantly easier to drive than a 3090 Ti
- Lovelace perf/watt should be excellent, and the card would keep much of it if you cap the power lower, which would be especially good for the laptop version
> For many of us, this is not AMD vs. Nvidia, but AMD/Nvidia vs. consumers. Hardware Unboxed got a lot of trouble from Nvidia when they chose to review the cards for how they perform in most games, rather than putting a heavy focus on limited RTX features in games. They stood their ground and became a more reliable source than many of the other reviewers that were RTX- and especially ray-tracing-heavy. Even though I did buy a 2080 Ti card myself, it was never for the ray tracing performance, but for the VR performance coming from a GTX 1080.

Imho RT performance should remain hidden in reviews, tucked away on a separate page and otherwise ignored, but only while Ada Lovelace has better RT performance than RDNA3.
Should AMD manage to improve RT performance of RDNA3 past Nvidia levels then it should go in to overall performance charts and RT performance presented as the biggest advantage.
It is very simple. What do you not get?
You all feel the same way, no matter your GPU vendor preference. Even you Jensen lovers, silly you.
Agreed. Someone did a breakdown of it here. Of the games that have ray tracing, most are indie titles with relatively few or even poor reviews. Of the more popular titles, a lot (like WoW) slap on ray tracing mostly for bragging rights rather than a significant graphics improvement.
RT is the future, and it has to start somewhere. Pretty much how all new technologies in rendering have begun. Never great the first few years and then picks up over time.
The number of titles that actually use ray tracing well is pretty small.
I get that. Invest in it when it makes sense to. I just refuse to believe the sheer number of people who say they need ray tracing actually use it significantly.
I use it in every game that I own if it supports it.
Right, I would too if I already had the card. But I'd hardly be buying one so I can play WoW with slightly better shadows.
Ray tracing is a bonus feature at best, worthless at worst. It was highlighted as something important and worthwhile during the launch of the 2XXX series, but has been pretty much worthless in most cases. With my 2080 Ti, it has been something I have turned on, just to see, and then turned off again when I wanted to play the game. The only game I have found worthwhile playing with it is Minecraft RTX. As long as it's something that "might" be useful mostly on halo products, I don't see it becoming a "necessary to have" feature in the near future. Ray tracing is still out of reach for the masses, so developers can't make it essential in games. A mere marketing feature, like it was when first introduced with the 2XXX series IMO. Some reviewers' focus on the value of ray tracing performance in the previous generation made their reviews misleading. Let's hope they have a more realistic and sane approach to it with the 4XXX series.
> 3DMark scores expressed in absolute terms and as a ratio to the 3090:
> https://videocardz.com/newz/nvidia-...cores-leaked-at-least-82-faster-than-rtx-3090

One 12 GB SKU conveniently absent from these tables...

Everyone knows it's shit already.
> The 4080 pricing is trash for that performance increase. Everyone is def hopping on the 4090 once more benches come out.

That is their master plan lol. The 4090 probably has higher margins anyway.
I feel like that is what it has been as of now, and many Turing buyers will change their card before having had significant benefit from it. In some titles RT has a big enough impact that it becomes possible to see whether it is on or off without having to look at the menu or be an expert who knows what to look for, say Control:
The Spider-Man remaster does give some hope (if this is really real time and not some trick; considering it will be launched as a playable demo in November, I imagine there is not too much trickery, just a lot of camera focus and blur tricks to make it possible; still, I feel it is not far from what a mid-budget rendered movie could have looked like just 15 years ago):
The day game studios save a lot of dev time and assets by not having to do all the cheating to simulate light, a lot of value could be passed back to RT buyers (and by then path tracing, Pixar-like render quality in real time, will be a thing). But in the in-between period, when games must include everything for RT-off and just mix RT vs. regular rendering when it is on, the value is less obvious. DLSS's value was more obvious (and many different equivalents were already in place for that reason), so it exploded much faster.
Now it depends on what we mean by near future. Within the expected useful lifetime of a 4090 (7-8 years), maybe it will, with the next generation of consoles having a word to say about it being my guess, if we are already here today:
We could be a single doubling of RT performance away, in just 2 years, from making it viable and potentially game-changing, with the PlayStation 6 and next Xbox having enough of that power to make it fully mainstream in 2026. The way AI is progressing, what the denoising and prediction side will make possible with just 4 times the raw power available by then is hard to predict.
> The simplicity in this is that reviews should reflect the value any feature would have for consumers, not the marketing value of the feature for the GPU vendor, regardless of who it is. Anyone who bought the 2XXX series due to its ray tracing performance can look back on it now and see how little it mattered compared to how much some of the reviews painted it out to be.

Is it Turing rage all over again?
> I've had only Nvidia cards the last 5-6 years at least. But I will have no issues going AMD either. Not all of us have some heavy preference or loyalty to either of the companies, and we normally see things from a consumer perspective.

Yes you are.
I disagree.

Edit: I do get your post was a bit tongue in cheek, but ray tracing and its importance have been so misrepresented that a lot of reviewers and even forum members should be ashamed of pushing it as being of dealbreaking importance with the 2XXX and even the 3XXX series. At least if they dare to look back and see how little impact it has had in the last years.
Why do people constantly equate "RT simply isn't as important NOW as it's made out to be" with "RT will never be important"?

Nvidia chose RT as the natural way to move graphics forward.
Is it 'needed' for games? If we want graphics to stop looking more fake than KK butt then yes we need RT.
Reviews rarely reflect the top games played. Look at the Steam top 100 games being played and count the RT-capable ones (which does not mean RT is being used by the player if the game has it). Those are the games the majority of Steam users, and thus PC gamers in general, want to play and are playing. Now, I was surprised as hell seeing Cyberpunk at #10 when previously it had drifted off the top 100 chart, a game I got bored with and left some time ago. Anyway, a reviewer taking notice and testing what is needed to get a good experience in the top 100, being in touch with what people actually want and are playing, would be very useful and would knock it out of the park. Virtually none of those games need a 40-series card, or even a 30-series card, for a great gaming experience.
Nobody is saying ray tracing isn't the future. But how many of the top played games even have it? How about have it and actually look good?
As usual, we'll be using RX 7xxx and RTX 4xxx cards mostly for rasterized work. I bet that will still hold true when the next generation comes out. That is why it is still by far the most important metric for most people when determining what they'll get out of their card in real performance.
Reviews and marketing should reflect that.
> Exactly! For Turing, even though some reviewers had an extreme focus on ray tracing, it has been pretty much a sidenote. Same goes for Ampere, and this generation it's the same. Ray tracing is not there yet for it to be more than a sidenote, and reviews should reflect that for consumers' sake. Some years from now it might have more value for the general consumer, but reviewers really should tone it down, so as not to give consumers the wrong impression of its value. If not, we'll at least see which reviewers consumers can trust in how they review the products, as shown earlier with Hardware Unboxed.

It is not easy for most reviewers to make any future-type projection (even if they made their own 3D engine or played with UE to do it, a studio's result could end up being quite a bit more optimized).
Cyberpunk got some revival from a Netflix tv show I think.
So right there. It is harder for folks to see what is missing than what is right in front of them. It would not be that difficult to import 5-million-plus-polygon models into Unreal to see how a GPU handles very dense models or scenes, then add lights and increase the ray count, then make mega shaders (multiple shaders in one, which benefits how well a cache works). Today, reviewers are not pushing the understanding or improving the reader except on rare occasions. Years past, when everything was new, reviewers did push the understanding and general ability of their readers on PC tech and building. Totally stagnant today.
Take how much focus some reviewers put on the 16 GB of VRAM offered by AMD versus 8-10 GB on Nvidia's direct competition: as of now it has been a complete sidenote. Would it still be true in 2023? Would it have been as true with COVID and supply issues?
That is more the job of a technologist than of the reviewers we have in mind, who usually have little more of a clue than their potential buyers. They can show how much visual quality it adds in current titles at what cost, pop up a list of known upcoming games, and let buyers decide, and that is usually what they do.
One issue is Nvidia's hold over the industry and tight lips in general. Maybe there is a world where, when Reflex, DLSS 3, Resizable BAR, cache instead of high memory bandwidth, ray tracing, or amounts of VRAM differ significantly between two potential consumer options, the reviewer gets an interview with some of the best people working on Frostbite, Unreal, 4A, RED, Crytek, id, Sony-Microsoft-Ubisoft, and driver teams who had access to engineering samples, giving us their informed opinion on the tech and on the near and mid future. But usually it is a big nothing. A lot of those reviewers feel like they never coded a shader in their life and yet review those products simply because they loved playing games and playing with computers when they were young. The conversation among them about Resizable BAR was a glaring example of that (as was the one around DirectStorage); those things stay impressively unclear in the online world, in part because of how unclear they seem to be to reviewers (and how rarely they have their own little 3D engine around to try those techs and explain to us the differences they see).
TLDR: Turing and Ampere reviewers for the most part had no idea beyond current tech demos, current games, and announced games. It will be the same for Lovelace.
I agree, a balance of pushing the card and also showing what is useful would be a better approach in general. I wonder how many will buy a 4090 but predominantly play CS:GO or some other game that is way beyond any usefulness of the 4090, where a 3070 would already be overkill? My view: if your GPU is rendering faster than your monitor's capability to show that data, then that is mostly wasted performance, unless one thinks tearing is useful and that mismatched data showing multiple frames on one displayed frame can enhance reaction time due to the faster rendering (I would say it is mostly detrimental at this stage).
I think it is OK for reviews to include some of the top games played (a la GTA 5), but the top games played worldwide will be significantly different from what a potential buyer (or tech-enthusiast watcher) is interested in when they look at a review of a 4090. A series on how much better a 4090 does than a 3080 Ti or 6900 XT in Dota 2, Apex, Counter-Strike, Team Fortress, Fallout 4, or Destiny 2 would get old fast.
They are probably more interested in what the hard-to-run 2022-2023 titles (Stalker 2, The Witcher 4, Ark 2, Plague Tale 2) will do than in old games, a bit like CPU reviews do not spend a lot of space on very common tasks that a 10-year-old mid-range CPU already does very well and fast enough.
I'm sad to see PC gaming spinning completely out of whack from a value perspective. This has always been an expensive hobby; there has always been better value in gaming consoles, at least when they are somewhat new. This value lowers year over year, as the price of the console does not decrease linearly with whatever hardware it features. But I am an enthusiast who has always been willing to put a whole bunch of money into my computer rig to stay on top of things. I love everything new in tech, and I really dislike fiddling with optimising my settings to achieve good performance. I've always spent top dollar on whatever graphics card will provide me with Very High / Ultra settings at my display resolution while also giving me a stable 60 FPS. In later years my bar has moved from a rig that gives me a stable 2560x1440@60FPS experience to something that will provide a stable 3440x1440@120FPS experience, still at Very High+ settings, and if there is some ray tracing goodness available I prefer to be able to use it without my FPS plummeting.
Seems like this is growing more and more unlikely by the day. This RTX 4090 seems like my perfect graphics card. It's the very first card on the market where I can feel comfortable that I should be able to achieve 3440x1440@120FPS regardless of my in-game settings. But this price point is getting ridiculous.
I have never been about getting the most for my money. There is no such thing as being an enthusiast and getting great value for money. I've paid for the top-of-the-lines cards in the past, even though the performance per dollar drops a lot when moving into the most expensive cards. The whole reason has been that this is my primary hobby and I don't mind paying extra for getting that little bit of extra performance that ensures I'll hit my performance goals without having to optimise a lot of settings and sacrifice visual quality in the games I'm playing.
Normally, if you were an XX80 Ti user, you would look at whatever is the new generation of XX80 Ti. It would normally be in somewhat the same ballpark in price compared to your previous XX80 Ti model, making it all about generational improvements. Will this new generation of XX80 Ti, the card within my price range, offer a decent enough improvement to warrant an upgrade? And I would mostly conclude with yes, yes it does. These models were basically "price brackets". And I feel it was like this forever, until the RTX 3000 series, where we had a price hike and XX80 Ti became XX90 at a much higher cost. That felt somewhat okay to me, considering inflation and whatnot. But having yet another price hike this time around makes RTX 4000 feel extremely awkward to me. The performance gains are truly impressive, and efficiency is looking very good even though the max power usage of the card is kind of ridiculous on its own. But most of the generational gains go down the drain when NVIDIA yet again gives it a massive increase in price.
I could get the RTX 3090 at launch for about 15'000 NOK; the RTX 4090 costs 21'000 NOK. This comes off as extremely strange to me. It makes the improvements from one generation to the next feel almost irrelevant when NVIDIA is pricing the card at levels where one would expect an "RTX 3095" with almost similar performance to exist. When looking at this card as an RTX 4090, the performance uplift is very impressive. But when the card does not keep the XX90 pricing levels from the previous generation, it doesn't feel that impressive anymore.
The generational leap in performance would be comparing the RTX 3090 to the RTX 4080 16GB, as they both exist within the same price bracket. And I suspect that outside of improvements to RT, that won't be all that impressive. One thing is for certain: if the future puts even high-end (XX80) levels of performance at 15'000 NOK price points, my days as a PC enthusiast are for sure over.
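Spelling out that arithmetic with the rough NOK figures above (the 1.89x performance figure is the averaged 3DMark ratio vs a stock 3090 from the leaked scores earlier in the thread, so treat this as a back-of-the-envelope sketch):

```python
# Rough launch prices in NOK, as quoted above.
rtx_3090_nok = 15_000
rtx_4090_nok = 21_000

hike = rtx_4090_nok / rtx_3090_nok - 1   # generational price increase
perf_uplift = 1.89                       # avg 3DMark ratio vs a stock 3090

# How much of the performance gain survives the price hike:
value_gain = perf_uplift / (1 + hike) - 1

print(f"price hike: {hike:.0%}")               # 40%
print(f"perf-per-NOK gain: {value_gain:.0%}")  # 35%
```

So in NOK the 4090 is roughly 40% more expensive for roughly 89% more performance, which is still a perf-per-krone improvement, just a much smaller one than the raw uplift suggests.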
> (OK they are better at running 1080p/1440p in some games, but those resolutions have been trivial to run for the last few generations of cards, so who cares?)

Uh, most people? 1440p/144Hz can still take a pretty good card to actually run without relying on stuff like DLSS.
> Today's driver update adds new device IDs, including a GA102 3070 Ti.
> https://videocardz.com/newz/nvidia-...3070-ti-ga102-and-new-rtx-3060-series-support

Hmm, so the 'new' GA102 3070 Ti compared to the regular GA104 3070 Ti has the exact same number of CUDA cores, the exact same memory speed and amount of VRAM, and the exact same bus width and bandwidth, but I'm guessing a higher price?
Well, we shall worry about it over time then.
The pattern I typically see is:
-nvidia releases a new, proprietary technology to push GPU sales
-AMD releases a half-baked version of it in their next-gen
-eventually, 3-4 gens later it works and is open source and no one uses the nvidia solution anymore
So it's like... thanks for pushing development / new features nvidia, but since it's probably just a tech demo I won't base any purchasing decisions around it (except rare cases like minecraft/quake where the base game is so easy to run that you can actually benefit from it now)
see also: hairworks
There is nothing proprietary about ray tracing. There are vendor-agnostic extensions in Vulkan now, and it is baked into DirectX 12 Ultimate (feature level 12_2).