UnknownSouljer
> Soooo... Who's going to be buying a 7900?

No one with any sense. Not until we see reviews.
The issue they will have is simply zero experience with such designs.
> Next few gens of CPUs and GPUs are going to be fun. Both Intel and Nvidia will be going that way as well. AMD will either reap the benefit of a lead... or perhaps lose it completely. I hope they make hay while they can the next 1-3 years in both CPU and GPU.

That's what I keep coming back to. AMD threaded the needle and found technology to make -- let's be honest -- outdated, older nodes perform damn near as well as the cutting edge, all while using less silicon and less power. Sure, at the added cost of construction. But that's something Intel and Nvidia will have to learn to do, too. Mixing and matching is the way forward.
And they have years of experience implementing it that Intel and Nvidia just can't even.
It's AMD 3D chess at this point. To further mix metaphors, they've turned a sow's ear into a silk semiconductor.
There is also always the possibility that NV and or Intel fall on their face and have a generation that stinks.
> DLSS 3.0 looks like trash with its interpolation. FSR's not bad, it basically does the same thing, and they announced FSR 3.0 will be dropping in 2023, which they're saying can give up to 2x fps at 4K, and then they've got this Hyper-RX thing coming too.

Yeah, but their FSR 3 gets that 2x FPS increase thanks to their new "Motion Engine", aka their AI frame generator, just like Nvidia's Optical Flow. I can't imagine AMD's AI-generated frames will look any better, let alone the latency issues it's bound to have as well.
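The latency worry above is easy to see with a toy model: an interpolated frame sits between real frames N and N+1, so real frame N can't be shown until frame N+1 has already been rendered. A minimal sketch, with made-up numbers (not measured DLSS 3 or FSR 3 figures):

```python
# Toy model: interpolation-based frame generation doubles displayed
# frames but holds each real frame back by roughly one render interval.
# Illustrative only; real pipelines differ.

def present_times(render_fps, interpolate=False):
    """Return (effective display FPS, extra latency in ms)."""
    frame_ms = 1000.0 / render_fps
    if not interpolate:
        return render_fps, 0.0
    # Displayed frames double, but real frame N waits for frame N+1.
    return render_fps * 2, frame_ms

fps, lag = present_times(60, interpolate=True)
print(fps, lag)  # 120 displayed fps, roughly one 60 fps frame-time of added lag
```

So a 60 fps game can present 120 frames per second, but input-to-photon latency gets worse, not better, which is why frame generation can't substitute for real rendering performance.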
just nvidia thinking $1600 is a great price to charge for graphics cards is enough to "woo" me to AMD
> I'm out of the loop. How bad is the raytracing on RDNA 3? Can it at least match a 3090? And what about some of the more active raytracing games like Control or 2077?

Honestly? It's going to be best to wait for reviews.
> Yeah, but their FSR 3 gets that 2x FPS increase thanks to their new "Motion Engine", aka their AI frame generator, just like Nvidia's Optical Flow. I can't imagine AMD's AI-generated frames will look any better, let alone the latency issues it's bound to have as well.

Is AMD only doing a version of DLSS 3.0, or are they going to do the same style as DLSS 2.0 also?
The 7900 XT, or the card formerly known as the 4080 12GB, are the cards that interest me, but I'm still eagerly awaiting reviews.
> or are they going to do the same style as DLSS 2.0 also

FSR 2.1 is a thing.
> I'm out of the loop. How bad is the raytracing on RDNA 3? Can it at least match a 3090? And what about some of the more active raytracing games like Control or 2077?

Honestly, if you are at 1440p then the 3090 is more than good enough at ray tracing, but for all their talk of 4K and 8K I don't see how these are going to stack up there. But 4K is a 3% market, and that is handily going to Nvidia with the 4090 for now, so AMD can talk a big game there, but I think their "8K" is what most others I know call 6K, which these will also do fine.
> When it comes to Nvidia, that's how it's usually been. If you want the best you have to pay for the best. It's the reason I got a 3090 on release day. Same goes for the 4090 RTX. It is the fastest card and will be once the 7900 XTX is released imo.

If you have the $$ for the best, you pay for the best. The rest of the world does some sort of performance/price equation that makes sense in reality for them (wife/kids/bills etc... don't want to bore ya with reality for the 99%, ya).
> I'm out of the loop. How bad is the raytracing on RDNA 3? Can it at least match a 3090? And what about some of the more active raytracing games like Control or 2077?

Depends what we mean by that, but if you mean should a card as strong as the 7900 XTX beat a 3090 with RT enabled in some games, yes; it looks like a possibly mixed result in the more RT-heavy titles:
> Honestly, if you are at 1440p then the 3090 is more than good enough at ray tracing, but for all their talk of 4K and 8K I don't see how these are going to stack up there. But 4K is a 3% market, and that is handily going to Nvidia with the 4090 for now, so AMD can talk a big game there, but I think their "8K" is what most others I know call 6K, which these will also do fine.

I'll likely be getting the 7900 XTX either way, but I'm on a 4K OLED CX10 for my PC display, and after this I'm never going back to non-emissive displays. But I am going to continue to need more performance to drive this screen. It looks like the 4090 would be a better card for my display, but I'm not going that route.
Honestly, I love ray tracing. It is the new thing, but it's still not totally here yet, and won't be until the next major console refresh. Until the consoles can do ray tracing, the industry must be raster first, ray-traced second.
The 7900 cards are what AMD needs, they are “cheap” to both produce and sell, they force Nvidia to respond and generate a lot of positive stuff for AMD on both the consumer and investor side of things. Assuming AMD can get the chips out in good volumes they utterly ruin Nvidia’s ability to sell the 3000 series overstock and that’s a big deal for AMD.
> No one with any sense. Not until we see reviews.

With how hard it seems to be to get these cards, and with how easy return policies tend to be (or how easy it is to resell them used), I'm not sure it's at all a bad strategy to pre-order one before reviews, if that were an option for someone.
> With how hard it seems to be to get these cards, and with how easy return policies tend to be (or how easy it is to resell them used), I'm not sure it's at all a bad strategy to pre-order one before reviews, if that were an option for someone.

You can always play the return song and dance. Also generally not worth it. At the end of the day, whether you're looking to flip the card or use it, when you pre-order you're just gambling, hoping it's worth your time and cash. Because ultimately you do not know what it is you're even buying.
> So this means the 7900 XT would've been around 50% faster than the 4080 12 GB for the same price.

I still don't see how they are going to keep the 4080 16GB at $1199 compared to the 7900 XTX. And I have no doubt, with how shitty Nvidia is with slower CPUs, that the 7900 XTX will match or beat the 4090 with some of the CPUs people are actually using. I think at 1440p the 7900 XTX is going to be faster than the 4090 in many if not most games, even with a top-end CPU. On TechPowerUp the 4090 is only 30% faster than the 6950 XT at 1440p.
> I still don't see how they are going to keep the 4080 16GB at $1199 compared to the 7900 XTX. And I have no doubt, with how shitty Nvidia is with slower CPUs, that the 7900 XTX will match or beat the 4090 with some of the CPUs people are actually using. I think at 1440p the 7900 XTX is going to be faster than the 4090 in many if not most games, even with a top-end CPU. On TechPowerUp the 4090 is only 30% faster than the 6950 XT.

At 1440p, either the 7900 XTX or the 4090 will leave you CPU-bottlenecked even with a 13900K. So no difference between them at 1440p. They are 4K cards through and through.
Everything changes with ray tracing though as AMD is WAY behind.
> At 1440p, either the 7900 XTX or the 4090 will leave you CPU-bottlenecked even with a 13900K. So no difference between them at 1440p. They are 4K cards through and through.

Not anywhere near as much on AMD, if you actually look at how AMD has been scaling at lower resolutions compared to Nvidia. The last two generations of top-end Nvidia cards do poorly at 1440p relative to AMD because Nvidia has piss-poor hardware scheduling. There are cases where an AMD card that is not even half as fast as an Nvidia card at 4K can match or beat it at lower resolutions without a top-end CPU. Of course it can vary wildly depending on the game.
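The bottleneck argument both posts are making can be sketched in a couple of lines: the frame rate you actually see is capped by whichever of the CPU or GPU is slower. All numbers here are made up for illustration, not benchmarks:

```python
# Toy model of a CPU bottleneck: effective FPS is the minimum of what
# the CPU can feed and what the GPU can render. Numbers are invented.

def effective_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

cpu_limit = 240   # hypothetical CPU feed rate at 1440p
card_a = 260      # hypothetical fast card
card_b = 310      # hypothetical even faster card

print(effective_fps(cpu_limit, card_a))  # 240
print(effective_fps(cpu_limit, card_b))  # 240 -- no visible difference
```

Which is why two cards with very different 4K numbers can tie at 1440p, and why a driver with more CPU overhead (effectively lowering `cpu_limit`) hurts the faster card first.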
> If you have the $$ for the best, you pay for the best. The rest of the world does some sort of performance/price equation that makes sense in reality for them (wife/kids/bills etc... don't want to bore ya with reality for the 99%, ya).

I like how you're telling Brackle how the forums "used to be" and then calling him "bruh".
[H] used to be THE place for the most from the least. Not the most from the most with effort ZERO!
Enjoy your card bruh
> Anybody else annoyed he keeps saying "x" instead of "times"?

Either I'm seriously out of the loop, or this is the funniest dig at Nvidia I've seen all week XD
Super insightful. Thanks for contributing.
> What's the review date on these?

The release date is December 13, so I would expect the embargo to lift close to that date.
> What's the review date on these?

December 13th.
> The release date is December 13, so I would expect the embargo to lift close to that date.

LTT has a video out today, and IIRC he said right out that's when the reviews drop.
> Maybe a dumb question, but what's the HDMI audio-out like on AMD cards these days? Do they support 7.1, Atmos, DTS:X, etc.? What about instant-on audio? My last experience with an AMD card was 5 years ago and all of those things were issues. It didn't support all of the standard formats, and there was a roughly 1-2 second delay with all new audio sources.

I'm happily using audio via HDMI on my Samsung QN90B from my RX 6800 XT. I also occasionally plug in a generic "ECCO" brand TV to watch streaming via PC in bed.

HTH.
> Cool, so does it output bitstreamed versions of Atmos/DTS/etc. without any delays? When I searched for it online, all I saw were threads like this... which seem to have all sorts of conflicting information: https://community.amd.com/t5/driver...porting-hdmi-audio-drivers-anymore/m-p/364732

*deer in headlights look*
> In a universe just some months different, these seem like they would have been a hell of good mining cards.

Oh no, what a shame...
> *deer in headlights look*

There was an older installer package issue where it wouldn't install things if they weren't detected at the time of installation. So if you didn't have your equipment all plugged in when you installed the drivers, it would leave things out, and you would have to do a full uninstall and reinstall of the driver package on site once things were all hooked up.
I don't even know what you mean. It games great and streams movies great. Audio great.
It also Words and Excels and Visual Studios great. ;-)
Rumored AIB models with higher voltage options should be about 15-20 percent faster:
The architecture was designed to run at 3GHz.
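The "15-20 percent" figure is roughly what you get if performance scales linearly with clock speed (an optimistic assumption). Taking the rumored 3 GHz target against an assumed ~2.5 GHz reference clock, purely for illustration:

```python
# Back-of-envelope clock scaling. Assumes performance scales linearly
# with clock, which real games rarely achieve; 2.5 GHz reference clock
# is an assumption for illustration, not a confirmed spec.

reference_clock_ghz = 2.5
aib_clock_ghz = 3.0

speedup = aib_clock_ghz / reference_clock_ghz - 1
print(f"{speedup:.0%}")  # 20%
```

Memory bandwidth and power limits usually keep the real-world gain below this linear estimate.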
> batching deployments

Not my part of the playground, so I'll take your word for it.
> Not my part of the playground, so I'll take your word for it.

Yeah, it was an annoyance at worst, but once we knew it was a thing we just made sure to have dummy equipment in place so the installer packages could "detect" the components and would install drivers accordingly. AMD has been making good progress on their installers in general for the past few years, but for a solid 18 months or so they weren't great. It was a 2018-2020 issue.