So buy Ampere or wait for AMD response?

RDNA 2 won't be announced until October 28th, so you'll be lucky to see the things in any appreciable quantities before December/January. They will probably have the 3060 out by then.


AMD just made waiting for RDNA2 pointless :rolleyes:

I figured they would prioritize Zen 3 over RDNA 2, as Intel has them jumping for once with Tigerlake (and we all know how much more AMD makes on CPUs), but almost a whole month is embarrassing!
 
This is what I've been hearing for years from those who are still stuck on 1080p.

Oh hey, assumptions ;). My main monitor is 1440p 27". I've used 4K monitors before and ended up selling them - mainly because I was tired of 60Hz. I know what good DPI looks like on 27-32" 4K, and while it's fine, I don't think it should be the absolute goal, like many are making it. Again, see the backlash against the latest Halo for focusing solely on 4K. Better lighting and more complex AI and worlds are way more important than more pixels after you've surpassed ~100 DPI. Can it look better at 4K? Of course. Would I sacrifice everything else for it? No way.

As for antialiasing, I've found that good implementations of TAA or TXAA eliminate pixel crawl completely, and they're way cheaper than rendering at 4K. I'm not saying let's ban 4K, I'm saying let's not focus so much on 4K and the utter stupidity of 8K, and focus on the actual games.

I'm quite happy, in fact, because either the 3060 or whatever RDNA 2 card comes at $200-300 will definitely let me play games at 1440p and 95Hz (and often at ultrawide 1080p, which is even easier to do and the DPI remains the same, since 2560 is the native horizontal of a 1440p 16:9 panel). 10-bit, ~108 DPI and 95Hz is why I bought the Pixio I'm using - a perfect middle point (for my usage). Could've easily bought a 4K monitor a couple months ago, but it just wasn't worth it in comparison.
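For anyone wondering where numbers like ~108 DPI come from, the math is simple - pixels along the diagonal divided by the diagonal size in inches. A quick Python sketch (the panel sizes are just common examples, not any specific monitor):

import math

def ppi(width_px, height_px, diagonal_in):
    # Pixels per inch = pixel count along the diagonal / diagonal length in inches
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 27), 1))  # ~108.8 - a 27" 1440p panel, the ~108 DPI figure above
print(round(ppi(3840, 2160, 27), 1))  # ~163.2 - a 27" 4K panel
print(round(ppi(3840, 2160, 32), 1))  # ~137.7 - a 32" 4K panel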

As someone who started gaming at 256 x 192 in the 1980s and then gradually worked my way up the ladder as far as 1600p on my personal desktop, those initial jumps were massive. But once I hit anything close to 1000p they became negligible.

You and I come from the same generation and had similar experiences growing up. 320x240 gaming was plenty detailed, even at 15fps, which was quite normal in many games. I can still remember the first time I fired up Tomb Raider at 480p on my PC - I couldn't believe my eyes. Once we got to 1080p, everything else seemed superfluous. Sure, I can see the benefits with my current 1440p monitor, and even the 4K monitors I've had, for static content. When gaming? Do I really need a few thousand more pixels to render leaves at the edge of the frame? Am I ever going to see that? Nope. 8K is now the new idiocy that has literally no effect on how you game at humanly appreciable pixel densities. It all seems to be about status signaling (my resolution is bigger than yours, ergo, I am better than you!). How about we forget 8K and focus on making those MicroLED displays a reality? Even a 1080p MicroLED display would look miles better than any 4K LED panel, because our eyes are built to discern contrast, not extreme detail. Sometimes I wonder if we're just too old to understand these new "trends" in graphics (I highly suspect heavy marketing tactics to indoctrinate young users into buying ultra expensive hardware they don't need).


RDNA 2 won't be announced until October 28th, so you'll be lucky to see the things in any appreciable quantities before December/January. They will probably have the 3060 out by then.

AMD just made waiting for RDNA2 pointless :rolleyes:

Frankly I hope you're kidding. How is it not perfectly logical to release a 3060 equivalent when the 3060 comes out? Am I insane? Are people unable to wait a few weeks? Where is this absolute rush to buy shit coming from? Did everyone go crazy? You know you're not forced to only buy within 1 month of release, you can buy cards while they're in market, right? So how is it pointless to have 2 vendors releasing competing products???

I'm amazed at what I read every time one or the other vendor releases cards. Some of y'all... I don't understand your "priorities" or logic in life. Perhaps I'm just old - not even 40 yet. But I just don't get this new attitude of I NEED IT RIGHT NOW and anything else is irrelevant. :rolleyes:

---

Think about this. Say next month you can buy a 3060 for $350. A month later, AMD brings out a 6500 for $300, same performance, similar features (raytracing, image reconstruction, etc). Would anyone really want to buy a 3060 mere weeks earlier without seeing what AMD comes out with? If you want a 3090, sure, I highly doubt they'll be able to offer anything at that level. But anything below it - seriously, are you THAT desperate for something new that you just can't wait a month or two? I'm still on a 1060 3GB and I can run literally all AAA games at 2560x1080 between 60-95Hz with zero problems, so why the rush? I'm starting to believe some people here are paid Nvidia or AMD influencers mainly pushing everyone to BUY NOW RIGHT THIS SECOND OR PREORDER DO IT NOW NOW NOW.
 
Oh hey, assumptions ;). My main monitor is 1440p 27". I've used 4K monitors before and ended up selling them - mainly because I was tired of 60Hz. I know what good DPI looks like on 27-32" 4K, and while it's fine, I don't think it should be the absolute goal, like many are making it. [...]

I must be weird. I think many if not most here agree with you, but I much prefer 4K 32" 60Hz to 27" 1440p 120Hz. I don't play many twitch-type games, and I just find native 4K overcomes the limitations of the frame rate. Ultimately I would love 4K 144Hz, but that might really need even the next generation of GPUs. Often when I have to downgrade a game I will play at 1080p instead of 1440p - maybe there is just something off to my eye about 1440p.

I REALLY want to upgrade, but I will wait for the AMD announcement. Raja the hype machine is gone, and I just get this feeling that RDNA 2 will be way better than people are expecting. Plus I would not put it past them to have some sort of enhanced SSD features for Ryzen 3 coupled with RDNA 2. There seems to be an air of confidence about them and this next offering. Plus, if the leaked 3080 benchmarks from today were real, it's about 30% above the 2080 Ti, and I would expect at minimum that RDNA 2 will be at or above that, likely with more RAM.
 
I use a 4K monitor at 60Hz, a 3440x1440 ultrawide at 60Hz, and a 144Hz HDR 1440p display - wish I could combine them all into one damn super monitor. Flight Simulator looks great on the ultrawide, Dirt Rally looks better on the 4K than the ultrawide, and the 144Hz HDR 1440p looks better in HDR and for playing Doom, Tomb Raider, etc. Not sure what to think. I'm 61 years old, been through everything from no computers and typewriters to word processing machines up through the ages to now, and I can sure tell the difference between 4K and not, HDR and not, 144Hz compared to 60Hz. Some games are better on one more than the other, but overall I would want everything combined: 144Hz, HDR1000, VRR, ultrawide, fast pixel response, etc. 5K would be nice, except nothing is there monitor-wise. Need DP 2.0 to go with it too - not convinced DSC is visually lossless.
 
I use a 4K monitor at 60Hz, a 3440x1440 ultrawide at 60Hz, and a 144Hz HDR 1440p display - wish I could combine them all into one damn super monitor.

There's no one perfect monitor, eh? Everything's a compromise. At least both Ampere and RDNA 2 will support high refresh rates and VRR - HDMI 2.1 is guaranteed, but I'm not sure they'll also have DP 2.0. I don't think Nvidia mentioned it, though I might be wrong. AMD definitely hasn't said anything about DP 2.0 - or about anything else, really.
 
Oh hey, assumptions ;). My main monitor is 1440p 27". I've used 4K monitors before and ended up selling them - mainly because I was tired of 60Hz. [...] Better lighting and more complex AI and worlds are way more important than more pixels after you've surpassed ~100 DPI. Can it look better at 4K? Of course. Would I sacrifice everything else for it? No way. [...]
It's a little ironic because you're making the assumption here. All I said is that you've reiterated the same old argument on behalf of people arguing for 1080P, because after all - that's literally the comparison you made, not a comparison between 1440P and 4k.

The framing of the argument in your post (specifically the bolded part) is that you have to give up everything if you want 4k, which simply isn't reality. As I mentioned previously, you can have *both*, and that's been the case for a couple of years. The particular point I want to make is that with the 3080, a 4k 120Hz display should be the sweet spot. 1080P is simply in the past. I'm also not surprised you sold your 4k monitors because, to be honest, gaming monitors are pretty dinky for their price, particularly when compared to 4k OLEDs. It's impressive to me that people (once again, speaking generally, not specifically about you) spend $xxxx on their rig and then view it on a $xxx monitor with $xxx headphones/speakers. You can heighten the whole experience significantly if you're actually willing to dig into the enthusiast territory of A/V. The difference between 4k and 1080P in my setup is night and day, on either a 65" or 77" OLED. 1440P on my Acer 27" IPS 144Hz monitor is *okay*, but really any monitor technology is so far behind TV panel technology in terms of IQ that it's not even funny.

But as you originally mentioned - preference heavily factors into everything. I'm only here to make the argument that given where GPUs are about to be, 4k is far from "diminishing returns" territory and is closer to just being the standard. Nearly everything is focusing on and taking 4k into consideration. Someone could pull out a small list of unoptimized games that don't reach 4k 120Hz even with the new cards, but that would be missing the point by a large degree.
 

We actually agree on many points, but we are focusing on different things - I don't mean to criticize your preferences at all. Much of it comes down to what each of us is sensitive to, visually. After 1440p I'm more sensitive to movement than I am to even more detail, so I choose my hardware accordingly - meaning I don't need to spend $800+ on a GPU, because $200-300 more than delivers what I need at this point. I still fundamentally disagree that 1080p is the past - the Steam hardware survey says it all:

(attachment: Steam Hardware Survey resolution chart)


We can do better, but right now if you want to get everything - the highest resolution at the highest framerate - that means spending ~$1000. I'm looking at things from a cost/benefit perspective, not from a performance-at-any-cost viewpoint. From my perspective, 1440p is good enough detail and lets you get high fps while spending ~$300. If one has to triple the expense to get the same framerate at 4K, the cost outweighs the performance (that's MY preference of course). So bringing it back to our thread topic: if you know Ampere and RDNA2 can give you similar performance, I don't see why anyone wouldn't wait for AMD to announce their stuff. I've read here today that it's "embarrassing" for AMD to take an extra month (seriously, WTF) and that waiting is stupid, just buy NOW. Considering the price/performance at which we expect AMD to release, it just doesn't make any sense to me not to wait. Only if you want 3090 performance levels does it make sense to buy on release (at MSRP, which, good luck finding that). Even then, I'd highly question why on earth anyone would buy a 3090 for gaming, which it's clearly not designed for - but Nvidia, smart as they are, are shoehorning it into the "GeForce" category because they know a minority will pay the obscene price for it, even as a pre-order.
 
IMO Resolution vs Framerate depends on the type of games you play.

Since you can't have your cake and eat it too (yet), you have to choose between one and the other.

Since I don't do any competitive FPS, I really don't need 200+fps, and ever since I started using 4K, I just can't go back to 1440p or lower if only because upscaled resolutions don't look as good as native. So for me resolution is more important and if I have to give up a few FPS to stay at 4K, I'm fine with it.
 
I'll take 1440p at 144Hz all day long over 4K @ 60Hz. All day...long.

Seeing 144fps/144Hz and then looking at 60Hz - no way am I going back. 4K at 100fps or 1440p at 150fps? I think I'll choose 1440p @ 150fps. Only my opinion.
 
This is what I've been hearing for years from those who are still stuck on 1080p. I used to believe this too, because this is what many view distance charts would suggest. But in person, the difference is pretty massive - even 6-10 ft away. And what you're saying is more accurate in the case of say - Blu-Ray movies. But with games, where the 3d rendering exhibits aliasing and shimmering, it's even more important.

I've been using 1080P temporarily - and boy, it's very rough and lacking in sharpness compared to 4k. Foliage is a horror-show with any amount of AA. With the new cards, we don't even have to choose between HFR and 4k - you can have both, and there's really no good reason not to, unless you're just trying to save money. It's really what they would describe as a "next-gen" experience.

Yeah for movies 1080p + HDR would've been fine. There is a very noticeable difference between 1080p and 4K gaming on a big screen but in my experience 1620p or 1800p is just as good.
 
I'll take 1440p at 144Hz all day long over 4K @ 60Hz. All day...long.

Seeing 144fps/144Hz and then looking at 60Hz - no way am I going back. 4K at 100fps or 1440p at 150fps? I think I'll choose 1440p @ 150fps. Only my opinion.

Thankfully we don't live in the dark ages anymore and have moved on from that craptastic 1440peee into high refresh 4k displays!
 
Thankfully we don't live in the dark ages anymore and have moved on from that craptastic 1440peee into high refresh 4k displays!

Well right now, for me, there isn't anything nice out there that I would want at 4k. Until 32 to 43 inch IPS 4k 144+Hz G-Sync monitors are the norm, I am fine with where I am at. The offerings for 4k gaming displays are really lackluster. When good shit comes out, I'll def upgrade.
 
This is what I've been hearing for years from those who are still stuck on 1080p. I used to believe this too, because this is what many view distance charts would suggest. But in person, the difference is pretty massive - even 6-10 ft away. And what you're saying is more accurate in the case of say - Blu-Ray movies. But with games, where the 3d rendering exhibits aliasing and shimmering, it's even more important.

Gaming and movies will be quite different here, I suspect.

Because with movies, right now higher resolution always comes at the cost of more compression. Apparently in blind tests people prefer 2K to 4K at 50 Mbps or less, and 4K only starts to be a clear winner around 80 Mbps, so it would not be surprising if, for something like a Blu-ray or a 25 Mbps Netflix stream, 2K > 4K - let alone what a 2K TV could do at the same price as a 4K TV.

There is a bit of the same going on with video games, with PC performance becoming the bottleneck instead of bandwidth. Maybe a 1080p monitor at the same price point as the 4K one, with the graphics cranked high enough that the game runs at the same FPS at 1080p as it would at 4K, would in reality be the better experience - but that is not an option people are offered. The same goes for movies: nobody ships high-bitrate 1080p, so people end up comparing higher-bitrate 4K to lower-bitrate 1080p content, which is just an absurd comparison. The equivalent would be comparing 4K at only 60 fps against 1080p at a higher FPS, then cranking the graphics quality until it drops to the same FPS, and seeing whether the "cost" of going 4K is worth it.
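To put rough numbers on the compression trade-off: at a fixed bitrate, every extra pixel means fewer bits per pixel for the encoder to spend. A quick back-of-the-envelope sketch in Python (24 fps assumed, and it ignores codec efficiency entirely, so treat it as illustration only):

def bits_per_pixel(bitrate_mbps, width, height, fps=24):
    # Average encoded bits available per pixel per frame at a given bitrate
    return bitrate_mbps * 1_000_000 / (width * height * fps)

print(round(bits_per_pixel(25, 1920, 1080), 3))  # ~0.502 bits/px - a 25 Mbps 1080p stream
print(round(bits_per_pixel(25, 3840, 2160), 3))  # ~0.126 bits/px - the same 25 Mbps spread over 4K

Same bitrate, a quarter of the bits per pixel at 4K - which is why the blind-test results above aren't that surprising.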
 
Because with movies, right now higher resolution always comes at the cost of more compression. Apparently in blind tests people prefer 2K to 4K at 50 Mbps or less, and 4K only starts to be a clear winner around 80 Mbps [...]
What's the 4K Blu-ray bitrate? Wasn't it like 50 Mbps? Or am I thinking of regular Blu-ray?
 
I'll take 1440p at 144Hz all day long over 4K @ 60Hz. All day...long.

Seeing 144fps/144Hz and then looking at 60Hz - no way am I going back. 4K at 100fps or 1440p at 150fps? I think I'll choose 1440p @ 150fps. Only my opinion.

And..... you got lots of likes because I think we all agree.

As bad as I want 4K 120Hz, the damn screens, where available, are $1500+++++ in price.

Asus has a 43" 4K 120Hz TV-like monitor, but I'm not sure I want that.

There are lots of 1440p 144/165Hz options available for $500 to $700, if I'm looking correctly.
 
I'm waiting to see what AMD does, plus they have better drivers. I'm also waiting for somebody to release a game I actually want to play.
 
I'll probably wait to see what Linux driver support looks like.

But, if Cyberpunk comes out first, well I'll boot into Windows for that.
 
I can't find a certified Ultra High Speed HDMI cable that is proven to carry 48 Gbps. The Monoprice cables on Amazon claim it but don't deliver it, as they don't have the certification label that can be scanned with the HDMI organization's app to confirm they're officially certified - we're supposed to camera-scan these to verify. So how do I run 4K 120fps on the C9 without those cables? I'd like to try it on my old games to see how they do.
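For reference, here's the rough math on why 4K 120Hz needs an HDMI 2.1 class link in the first place - a quick sketch of just the active-pixel data rate, assuming 10-bit RGB/4:4:4 (blanking intervals and link encoding add more on top, so the real requirement is higher):

def video_gbps(width, height, refresh_hz, bits_per_channel=10, channels=3):
    # Raw active-pixel data rate in Gbit/s, ignoring blanking and link-encoding overhead
    return width * height * refresh_hz * bits_per_channel * channels / 1e9

print(round(video_gbps(3840, 2160, 120), 2))  # ~29.86 Gbps - 4K 120Hz 10-bit 4:4:4
print(round(video_gbps(3840, 2160, 60), 2))   # ~14.93 Gbps - 4K 60Hz 10-bit 4:4:4

Even the bare pixel data for 4K 120 blows well past HDMI 2.0's 18 Gbps, hence the need for HDMI 2.1 and a cable that can actually carry it.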
 
As bad as I want 4K 120Hz, the damn screens, where available, are $1500+++++ in price. [...] There are lots of 1440p 144/165Hz options available for $500 to $700, if I'm looking correctly.


I have a 1440p 144Hz LG I just bought and I love it. Cost was $799.99 at Microcenter. It's a great monitor compared to my older 75Hz LG screen (also 34 inch curved). I really don't want to use a 4K monitor - this is my perfect screen. I drive it with a 1080 Ti. I believe there is an updated version of this one out now that is better, if anyone is looking to get one like it.

LG 34GK950F-B 34
 
Do a lot of PC gamers actually use a TV as their computer monitor?...is that really a popular thing?...seems like a console thing...isn't a G-Sync/FreeSync 144Hz+ dedicated monitor better?...I know LG added G-Sync support last year, but is it really better than a dedicated monitor?...I have an LG OLED but I don't use it for gaming - I use it for my 4K movies/Blu-ray/Netflix etc...I have a dedicated 27" G-Sync 144Hz monitor for PC gaming
 
Do a lot of PC gamers actually use a TV as their computer monitor?...is that really a popular thing?...seems like a console thing...isn't a G-Sync/FreeSync 144Hz+ dedicated monitor better?...I know LG added G-Sync support last year, but is it really better than a dedicated monitor?...I have an LG OLED but I don't use it for gaming - I use it for my 4K movies/Blu-ray/Netflix etc...I have a dedicated 27" G-Sync 144Hz monitor for PC gaming

I used a Samsung 4K 40" TV from when 4K 60Hz over HDMI first became possible, 5+ years. In July, I went with a 3440x1440 144Hz monitor.
I wouldn't go back to a 4K TV or monitor. Speed is more important right now. If you play multiplayer, you'll be at a great disadvantage because the competitive players have an FPS edge.

4K 120Hz will take too much power to drive in every game. I'd rather get a 3090 and run near max settings at 3440x1440.
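The pixel-count math backs that up - GPU load scales roughly with the number of pixels rendered, all else being equal. A quick sketch:

resolutions = {
    "1920x1080": 1920 * 1080,
    "2560x1440": 2560 * 1440,
    "3440x1440": 3440 * 1440,
    "3840x2160": 3840 * 2160,
}
four_k = resolutions["3840x2160"]
for name, pixels in resolutions.items():
    # Megapixels per frame and share of the 4K pixel count
    print(f"{name}: {pixels / 1e6:.2f} MP ({pixels / four_k:.0%} of 4K)")

3440x1440 is only ~60% of the pixels of 4K, so the same card has a lot more headroom for settings and frame rate.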
 
Do a lot of PC gamers actually use a TV as their computer monitor?...is that really a popular thing?...seems like a console thing...isn't a G-Sync/FreeSync 144Hz+ dedicated monitor better?...I know LG added G-Sync support last year, but is it really better than a dedicated monitor?...I have an LG OLED but I don't use it for gaming - I use it for my 4K movies/Blu-ray/Netflix etc...I have a dedicated 27" G-Sync 144Hz monitor for PC gaming
My daughter has a 43" Samsung 4K hooked up to a 1600 AF / RX 580 8GB.. she likes to play Sims in 4K 60Hz. I thought about switching her over to the RX 5500 XT single-fan model, as her only complaint was the fans on the RX 580 being so loud.. lol
 
Samsung has had FreeSync TVs for a few years; they were pretty nice for seeing 4K HDR gaming for the first time.

Ultimately, I felt 60 fps was too low to enjoy, and the performance requirements of 4K are kinda crazy for what you're getting, but I wouldn't mind giving it a shot again on the high refresh panels coming soon with HDMI 2.1.

For now, though, I'm happy at 3440x1440.
 
I expected AMD to make some noise, or else they don't have a competitor.

However...
There are a few important clues.

Their announcement is late October.
Rumors of a 3070 Ti/Super.
Rumors of a 3080 20GB.
No rumors about AMD 6xxx performance.

I can safely assume that the top-tier AMD card coming this year will have more VRAM than the 3080, but will be slower (or close).
It will beat the 3070 at a slightly higher price point, hence the rumors about a 3070 Ti/S with 16GB of VRAM before the 3070 is even out. Although I'm not sure this even is a rumor - more like a leak from a major PC manufacturer.

So my guess is, the top AMD card will be within 10-15% of the 3080 but with quite a bit more VRAM.
It will compete with the 3070 Ti/S price- and performance-wise.
It won't beat the 3080, but because it's so close to the 3080 and has more VRAM, it will most likely eat a lot of 3080 10GB sales, so Nvidia will be forced to release a 3080 with 20GB at a higher price point.
 
The X35, Asus PG35VQ and the AG353UCG have all the features I want - 200Hz, 1440p UW, HDR1000 - but the price, though.... Hopefully they will launch new monitors that support HDMI 2.1 or DP 2.0 so 200Hz 4:4:4 10-bit will be possible. Maybe in 2022.....
 
I cannot stress this enough. AMD is good! AMD is SOO GOOD!
People need to hold off on Ampere and WAIT, WAIT, WAIT!

That way I can order my 3090 in peace and not have to deal with F5'ing my face off!

LMAO, you had me at the top part - I thought it might be some other MLID wannabe who thinks he has some insider information.
Anyway, waiting for AMD might take a long time now. Sony is barely scraping by with 50% yields on the PS5 APU, so things likely aren't looking too great for Navi 21.
 
RDNA 2 won't be announced until October 28th, so you'll be lucky to see the things in any appreciable quantities before December/January. They will probably have the 3060 out by then.


AMD just made waiting for RDNA2 pointless :rolleyes:

I figured they would prioritize Zen 3 over RDNA 2, as Intel has them jumping for once with Tigerlake (and we all know how much more AMD makes on CPUs), but almost a whole month is embarrassing!


According to supply-chain sources, it will be far later than that before AMD has any appreciable amount of product to release to the public. They haven't even put in invoices for third-party parts orders to build the cards at scale, let alone told any AIB partners anything about the cards yet. IgorLabs and others say there is usually a three-month lead time to ramp up, from when those invoices start going out to when anything hits the market. So you're looking at after the new year in 2021 to even try to get your hands on an AMD card. That is going to cost them a ton of sales.
 
So my guess is, the top AMD card will be within 10-15% of the 3080 but with quite a bit more VRAM.
It will compete with the 3070 Ti/S price- and performance-wise.
It won't beat the 3080, but because it's so close to the 3080 and has more VRAM, it will most likely eat a lot of 3080 10GB sales, so Nvidia will be forced to release a 3080 with 20GB at a higher price point.

It will be interesting to see whether GDDR6X, being a new and untested product, actually provides more FPS than regular GDDR6 VRAM. 10GB of GDDR6X could be equivalent to 16GB of regular-speed GDDR6; it just hasn't been independently tested yet.
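Worth noting that capacity and bandwidth are separate things here. Peak memory bandwidth is just bus width times per-pin data rate - a quick sketch (the 320-bit / 19 Gbps numbers are the advertised 3080 spec; the 16GB GDDR6 config below is purely hypothetical, for comparison only):

def mem_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    # Peak memory bandwidth in GB/s = bus width (bits) * per-pin data rate (Gbps) / 8
    return bus_width_bits * data_rate_gbps / 8

print(mem_bandwidth_gbs(320, 19))  # 760.0 GB/s - 10GB of GDDR6X as configured on the 3080
print(mem_bandwidth_gbs(256, 16))  # 512.0 GB/s - a hypothetical 16GB GDDR6 card at 256-bit / 16 Gbps

So a smaller, faster pool can easily out-bandwidth a bigger, slower one - whether that shows up as FPS is the part that still needs independent testing.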
 
4K 120Hz will take too much power to drive in every game. I'd rather get a 3090 and run near max settings at 3440x1440.

My 2080 Ti could max any game @ 3440x1440 @ 100Hz, so with a 3090 make sure you get a really high refresh 3440x1440 version. I think there are some 165Hz panels out there.
 
I have given relief aid to people in a warzone, and those people acted with more calm and reason than a lot of the posters in this thread.
It is a GPU.
A hobby thing.
I am planning on a RTX 3090.
I will buy it when CyberPunk2077 launches.

Until then I will just chill and be slightly amused at all the "haste and panic"...
 
LMAO, you had me at the top part - I thought it might be some other MLID wannabe who thinks he has some insider information.
Anyway, waiting for AMD might take a long time now. Sony is barely scraping by with 50% yields on the PS5 APU, so things likely aren't looking too great for Navi 21.

Umm, 99% of the market is not [H]. The people on this site project the ultra-enthusiast mentality onto the regular world. Fact is, whether AMD sells a card now or in December, it doesn't matter at all. We at [H] do not even define a market that matters. We're like a decimal point of a percent.

The 2060, 1050 Ti and RX 470/580 dominate the gaming PC market. 99% of all PC gamers buy a prebuilt with one of those included GPUs.

If the ultra-enthusiast consumer suddenly vaporized, nV and AMD wouldn't even flinch in the big picture.

AMD releasing their cards far after nV means nothing. It's the 20,000,000 HPs and Dells and other brands that have 3060s and Little Navi strapped in them that matter.

Just because a hardware junkie who wants a $1500 GPU can't find one due to low volume - a bleeding-edge card that 99.9% of regular people would have an aneurysm over if they even knew a GPU that expensive existed - doesn't mean the company is in ruins.
 
If you're concerned about price and are patient, NVIDIA will likely respond to a competitive AMD offering. I'm trying to buy a card to power a 77" LG OLED that can go up to 120Hz at 4K, so I think I'll be stuck weighing the value of the 3080 against the raw power of the 3090.
Hopefully with a 3080ti for $999.
 
Any word on if the 3090 will break motherboards in half? That thing hanging off sideways does not look too appealing...
 
Any word on if the 3090 will break motherboards in half? That thing hanging off sideways does not look too appealing...

I'd strongly recommend using the screws to secure it to the IO shield; and unless your mobo has a reinforced port don't leave the card installed if shipping it. (Both of these are really good ideas with any 2 slot card from the last 10+ years) Beyond that ... ¯\_(ツ)_/¯
 
I'd strongly recommend using the screws to secure it to the IO shield; and unless your mobo has a reinforced port don't leave the card installed if shipping it. (Both of these are really good ideas with any 2 slot card from the last 10+ years) Beyond that ... ¯\_(ツ)_/¯

I got one of these harnesses to secure my 3090!
 