So buy Ampere or wait for AMD's response?

Look into the 5700 XT on the used market after everything is released. It's incredibly powerful and fun to game on. I have both a 2080 Ti and a 5700 XT, and I game on either with the utmost enjoyment.
That would be funny, as I recently sold my RX 5700. I might grab one again in 6 months when they're dirt cheap.
 
Now that the Xbox Series S is official at $300, if you want next-gen performance at 1080p-1440p, it's certainly quite a good idea to wait. At that price, I'd figure there's roughly a $200 GPU inside, in PC-equivalent terms.
 
Now that the Xbox Series S is official at $300, if you want next-gen performance at 1080p-1440p, it's certainly quite a good idea to wait. At that price, I'd figure there's roughly a $200 GPU inside, in PC-equivalent terms.
Would make sense for console-centric games, but otherwise, you're committing to the '60FPS or less with fiddly sticks' experience. And once you've committed to 120FPS or higher...
 
'Significantly' is way exaggerating it... yes, Intel leads in paper benchmarks and holds the overall gaming performance crown, but 98% of people will not notice any difference in real-world usage... Zen 3 will be better, and you're right that it makes no sense not to wait, but let's not go crazy with the hyperbole.

No, it's pretty significant, especially for video game emulation. We're talking 20%+ differences in many cases.
 
Would make sense for console-centric games, but otherwise, you're committing to the '60FPS or less with fiddly sticks' experience. And once you've committed to 120FPS or higher...

A) You don't know that, and we know so little you can't even make an educated guess.
B) A GPU will always perform better when not constrained by console thermals.
C) I'd expect 2060-like performance at $200-250, and 2070-like levels at $350. Depending on what you're playing, that'll easily get you 120 FPS at 1080p.

Unless you're looking for 3090 performance levels, it seems stupid not to wait and see. At worst we'll have options for similar performance on everything below a 3080; that much is for certain. More options never hurt any consumer.
 
TL;DR for everyone: he basically just said don't go out and buy an RTX 2000 series card or a Ryzen 3000 series processor because new stuff is coming in a couple of weeks. I think all of us here know that.

He did not say anything about waiting to buy an RTX 3000 series card.

Yeah, his video was a waste of 10 minutes, but he got his clicks.
 
So buy Ampere or wait for AMD's response?

What a fitting and accurate title for a thread.

If you have enough money - buy Ampere.
If Big Navi is better, sell Ampere and buy Navi.
If you have no money to lose - wait.

I prefer not to buy anything before the middle of November.
 
TL;DR for everyone: he basically just said don't go out and buy an RTX 2000 series card or a Ryzen 3000 series processor because new stuff is coming in a couple of weeks. I think all of us here know that.

He did not say anything about waiting to buy an RTX 3000 series card.
He is a visionary.

His stuff is made to get clicks, and it's designed for Reddit luddites. In this respect, he does exemplary work. And I suppose he is getting a valid message across, albeit one that could be stated in about 1:30 including context.

But yeah, for anyone halfway knowledgeable, these videos are entirely rudimentary and can seem like they're meant for mouthbreathers. At least Tech Jesus and that other Aussie lad don't dumb it down too much.
 
I used to stick my nose up in the air when it came to YouTube tech reviews, but Gamers Nexus and Hardware Unboxed have won me over. Gamers Nexus, in particular, can be every bit as critical of hardware as Old Bennett and his [H] boys were back in the day.

Oh, except for Tiny Tom Logan - liked that guy from the start. Funny dude.
 
Now that the Xbox Series S is official at $300, if you want next-gen performance at 1080p-1440p, it's certainly quite a good idea to wait. At that price, I'd figure there's roughly a $200 GPU inside, in PC-equivalent terms.

Technically, a high-end PC with something like the 3090 is the only way you're going to get "next-gen" performance. 1440p isn't next-gen. 8K at 60fps+ is "next-gen."

Consoles haven't been "next-gen" for decades. Last time I checked, a Zen 2 CPU was "last gen."
 
Technically, a high-end PC with something like the 3090 is the only way you're going to get "next-gen" performance. 1440p isn't next-gen. 8K at 60fps+ is "next-gen."

Consoles haven't been "next-gen" for decades. Last time I checked, a Zen 2 CPU was "last gen."

I would argue that 8K is "future gen," as in virtually nobody is going to attempt to play games at 8K. At best you might have some gimmicky upscaled DLSS "8K." A better "next-gen" standard is high-refresh 4K, and I'm sure the 3090 will do that fine (or at least better than anything else). I can only find one 8K monitor for sale anyway (lots of "5K" widescreens).
 
I wouldn't be hyped for AMD; they ALMOST ALWAYS disappoint, especially in the GPU market. The 3090 isn't worth getting even for 8K, which no one but the 1% can afford, and any gaming at 60 FPS is utter cancer to my eyes; I need 90+ FPS at ALL times. They can't even make a good enough 1440p monitor that supports the latest tech. Hell, OLED has been around for 10+ years and only now is it starting to come down in price, but then again it's still $1000-1500 for anything good. The 3090 isn't for gaming unless you need the RAM. Personally, I'd like to see a 3080 Ti with 20 GB; that would be enough for any game coming in the next 3-5 years.
 
I wouldn't be hyped for AMD; they ALMOST ALWAYS disappoint, especially in the GPU market

I don't know. Even their most recent release wasn't all hype... Navi-based cards are pretty good for what they are. They generally held their own against the 2060/2070, usually for somewhat less money. I think you're going to get exactly what they said: a 2080 Ti killer. Unfortunately, that's now in the $499 range rather than the $1199+ range.

I might be completely off base here, but the Raja hype train has migrated to Intel.
 
I used to stick my nose up in the air when it came to YouTube tech reviews, but Gamers Nexus and Hardware Unboxed have won me over. Gamers Nexus, in particular, can be every bit as critical of hardware as Old Bennett and his [H] boys were back in the day.

Oh, except for Tiny Tom Logan - liked that guy from the start. Funny dude.

Me too. I used to scorn YouTubers completely, but there are some quality outlets, including the two you mentioned. I still prefer written articles, but I've added a few specific YT channels to the rotation as well - GN, HU and DF.
 
I will be rocking my 1080 Ti until solid reviews come out for both sets of products. I don't care about the hype. AMD is playing its cards close to the chest this time and might have something that can compete. The 3090 is appealing if it's the monster they say it is, but I am in no rush to upgrade.
 
Me too. I used to scorn YouTubers completely, but there are some quality outlets, including the two you mentioned. I still prefer written articles, but I've added a few specific YT channels to the rotation as well - GN, HU and DF.

My core problem has never been the inherent quality of audio/video vs. text content. It's that I read fast and people talk slowly. This is compounded by the lack of an equivalent to Ctrl-F keyword searching (or an eyeball search at 5-10x normal reading rate if I don't have suitable keywords but would recognize my search target on sight), so I don't have to listen to or watch everything before what I'm looking for.
 
1440p isn't next-gen. 8K at 60fps+ is "next-gen."

I fundamentally disagree with that statement. The number of pixels rendered, or the speed at which they're rendered, isn't what defines a graphics generation. The quality of the pixels, of the worlds, of the gaming experience is what marks an advancement noticeable enough to be considered next-gen.

I don't want 8K. I don't even want 4K. I'm perfectly happy at 1440p, and I'm usually happy gaming at 1080p. With image reconstruction techniques such as DLSS, brute resolution has become a notion of the past that we really need to reconsider. I want games that look amazing and that finally take advantage of the ridiculous NVMe bandwidth. Resolution alone doesn't do that; notice the backlash against the newest Halo because, even at super resolution, it just didn't look as good as most expected (current quality at more resolution = meh, improved quality at current resolution = yay, especially when you'll be sitting at enough distance that you'll never see those extra pixels).

If a $200-300 RDNA2 card will get me that, that's all that many (most?) gamers want for a generational upgrade. It's why I'd absolutely wait a month and see what AMD has to offer, and see what x60 Ampere cards Nvidia plans on releasing.
 
My core problem has never been the inherent quality of audio/video vs. text content. It's that I read fast and people talk slowly. This is compounded by the lack of an equivalent to Ctrl-F keyword searching (or an eyeball search at 5-10x normal reading rate if I don't have suitable keywords but would recognize my search target on sight), so I don't have to listen to or watch everything before what I'm looking for.

Agreed, it's much nicer when there is a text article to go with it.

Though GN and HWUB have started using extensive indexes now, so it is a LOT easier to skim their videos at least, and both tend to back up a number of their videos with articles as well, though the articles tend to lag the videos. GN makes it clear that text earns them practically nothing compared to the videos, which is probably why everyone wants to do video now.
 
I fundamentally disagree with that statement. The number of pixels rendered, or the speed at which they're rendered, isn't what defines a graphics generation. The quality of the pixels, of the worlds, of the gaming experience is what marks an advancement noticeable enough to be considered next-gen.

I don't want 8K. I don't even want 4K. I'm perfectly happy at 1440p, and I'm usually happy gaming at 1080p. With image reconstruction techniques such as DLSS, brute resolution has become a notion of the past that we really need to reconsider. I want games that look amazing and that finally take advantage of the ridiculous NVMe bandwidth. Resolution alone doesn't do that; notice the backlash against the newest Halo because, even at super resolution, it just didn't look as good as most expected (current quality at more resolution = meh, improved quality at current resolution = yay, especially when you'll be sitting at enough distance that you'll never see those extra pixels).

If a $200-300 RDNA2 card will get me that, that's all that many (most?) gamers want for a generational upgrade. It's why I'd absolutely wait a month and see what AMD has to offer, and see what x60 Ampere cards Nvidia plans on releasing.

Nah. 320x240 -> 640x480 was earth-shattering. Resolution actually matters. If you think about it, one of the most horrible aberrations in graphics is aliasing, and the only way to completely eliminate it is absurdly high resolution (anti-aliasing is a hack that will never completely solve it). And guess what? As you raise the triangle count, you run into more trouble with aliasing because you effectively end up with more edges.
 
I would argue that 8K is "future gen," as in virtually nobody is going to attempt to play games at 8K. At best you might have some gimmicky upscaled DLSS "8K." A better "next-gen" standard is high-refresh 4K, and I'm sure the 3090 will do that fine (or at least better than anything else). I can only find one 8K monitor for sale anyway (lots of "5K" widescreens).

For actual next-gen games (not Doom Eternal), you're basically going to need 3090 power to do 4K @ 120 FPS (if it can even do it at all).
 
So this really hints at AMD aiming at the 3070, at least if you look at dates.

Frank Azor on Twitter:
https://twitter.com/AzorFrank/status/1302272863425134592
We are on different schedules that's all. It's not like this is a race that everyone starts on the same day. We each have our own schedules and we are each working hard to meet them. In the meantime there's a lot of interest so we are going to share what we can when we can.

Pretty much what I expected, and posted earlier. There is really no reason to doubt this. Expecting them to deliver simultaneously is not realistic.

This is not like the Super Cards that NVidia dumped on the Navi launch. Those already existed. They just had to tweak some parameters on them.

This is a new generation, and it can only launch when it's ready.
 
Nah. 320x240 -> 640x480 was earth-shattering. Resolution actually matters. If you think about it, one of the most horrible aberrations in graphics is aliasing, and the only way to completely eliminate it is absurdly high resolution (anti-aliasing is a hack that will never completely solve it). And guess what? As you raise the triangle count, you run into more trouble with aliasing because you effectively end up with more edges.
What you're talking about is basic super-sampling, which is the least efficient form of anti-aliasing. What you'd ideally want would be some form of machine-learning algorithm to reduce higher render resolutions down to appropriate output resolutions.

Also, aliasing can happen with even basic geometric detail. Crawling and pixel-popping artifacts result regardless of the triangle count and are more scene-specific; they can even happen just as easily with textures alone.
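As a rough illustration of why plain super-sampling is considered the brute-force option, here's a minimal sketch in Python/NumPy (purely hypothetical, not tied to any real engine or API): render at k times the output resolution per axis, then box-filter the result down, so the shading work grows with k².

```python
import numpy as np

def box_downsample(hires: np.ndarray, k: int) -> np.ndarray:
    """Average k x k blocks of a super-sampled frame down to the output resolution.

    hires: (H*k, W*k, 3) array rendered at k times the target resolution per axis.
    Returns an (H, W, 3) anti-aliased frame. The shading work scales as k**2,
    which is why 4x or 8x SSAA gets expensive so quickly.
    """
    h, w, c = hires.shape
    assert h % k == 0 and w % k == 0
    return hires.reshape(h // k, k, w // k, k, c).mean(axis=(1, 3))

# Example: 2x-per-axis (4x total) super-sampling of a 1080p frame
hires_frame = np.random.rand(2160, 3840, 3)   # stand-in for a 3840x2160 render
frame_1080p = box_downsample(hires_frame, 2)  # anti-aliased 1920x1080 output
```

Reconstruction approaches like DLSS exist precisely to dodge that quadratic cost by shading fewer samples and inferring the rest, which is the trade-off argued over later in the thread.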
 
Frank Azor on Twitter:
https://twitter.com/AzorFrank/status/1302272863425134592


Pretty much what I expected, and posted earlier. There is really no reason to doubt this. Expecting them to deliver simultaneously is not realistic.

This is not like the Super Cards that NVidia dumped on the Navi launch. Those already existed. They just had to tweak some parameters on them.

This is a new generation, and it can only launch when it's ready.

They seem pretty serious about not stirring up any hype.

It's all good.

 
What you're talking about is basic super-sampling, which is the least efficient form of anti-aliasing. What you'd ideally want would be some form of machine-learning algorithm to reduce higher render resolutions down to appropriate output resolutions.

Also, aliasing can happen with even basic geometric detail. Crawling and pixel-popping artifacts result regardless of the triangle count and are more scene-specific; they can even happen just as easily with textures alone.

Nah. As we've already seen with DLSS actually altering scenes and removing details, we don't want any form of machine-learning algorithm that will be realistic within the next decade at least. DLSS is snake oil.
 
I am going to wait for the 3080 Ti/Super. I can afford the 3090, but I just can't justify the price over a 3080. A $1000 3080 Ti/Super will hit the sweet spot. I'll also be able to see what AMD has up their sleeve by then.
 
Nah. As we've already seen with DLSS actually altering scenes and removing details, we don't want any form of machine-learning algorithm that will be realistic within the next decade at least. DLSS is snake oil.
DLSS 2.0 adds detail rather than removing it.

Unfortunately, it seems none of the DLSS games are being updated to DLSS 2.0, which is a shame. I think the only game that was updated to DLSS 2.0 was Control, but don't quote me on that one.
 
DLSS 2.0 adds detail rather than removing it.

Unfortunately, it seems none of the DLSS games are being updated to DLSS 2.0, which is a shame. I think the only game that was updated to DLSS 2.0 was Control, but don't quote me on that one.

Yup, it was Control (and MechWarrior) - DLSS is far from snake oil.

 
If you're concerned about price and are patient, NVIDIA will likely respond to a competitive AMD offering. I'm trying to buy a card to power a 77" LG OLED that can go up to 120 FPS at 4K, so I think I'll be stuck weighing the value of the 3080 versus the raw power of the 3090.
 
Nah. 320x240 -> 640x480 was earth-shattering. Resolution actually matters.

Two words: diminishing returns.

Going from 240p to 480p meant jumping from a measly 76800 pixels to 307200. Going from 1080p to 4K means going from 2073600 pixels to 8294400. Once you're already at 2 million pixels, sure, things can look better, but they don't look bad at all. Even if you quadruple the pixel count for 4K, if you're sitting 6-10 ft away from your TV, there is no way you can see that extra detail; the pixels are too small. Heck, even if from 2-3 ft away you can notice the jump from QHD to 4K, there's no way you're noticing that crispness while you're running around a game, much less if you're playing shooters. I realize we all have different visual acuity, but I'd much rather have my monitor display QHD or even 1080p at 95 Hz (the max it can do) than 4K at 60 Hz.

Give me an Ampere or RDNA2 card that can run games at a smooth 95 Hz at QHD or 1080p and I'm sold. That's WAY more of an improvement than more pixels. Especially if you render the pixels the same way we have for the past decade, games don't actually change much. I want better worlds (NVMe storage) and more fluid gaming (95 Hz), but I don't need more pixels. They'd be nice, of course, but if they mean sacrificing the other two, then I'll pass. If a $200-300 Ampere or RDNA2 card can get us that kind of performance - which I think they will - then I see no point in buying anything better. Of course other people will have their preferences. Either way, for anything below a 3090, it just doesn't make sense NOT to wait a month and see.
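For anyone who wants to sanity-check that arithmetic, a throwaway Python snippet (nothing project-specific, just the standard resolutions): each step up multiplies the pixel count by roughly 4-7x, while the visible payoff keeps shrinking.

```python
# Back-of-the-envelope pixel counts for the resolution jumps discussed above
# (purely illustrative; the usual full-frame resolutions are assumed).
resolutions = [
    ("240p (320x240)",     320,  240),
    ("480p (640x480)",     640,  480),
    ("1080p (1920x1080)", 1920, 1080),
    ("4K (3840x2160)",    3840, 2160),
    ("8K (7680x4320)",    7680, 4320),
]

prev = None
for name, w, h in resolutions:
    pixels = w * h
    note = f"  ({pixels / prev:.1f}x the previous step)" if prev else ""
    print(f"{name:>19}: {pixels:>10,} pixels{note}")
    prev = pixels
```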
 
Two words: diminishing returns.

Going from 240p to 480p meant jumping from a measly 76800 pixels to 307200. Going from 1080p to 4K means going from 2073600 pixels to 8294400. Once you're already at 2 million pixels, sure, things can look better, but they don't look bad at all. Even if you quadruple the pixel count for 4K, if you're sitting 6-10 ft away from your TV, there is no way you can see that extra detail; the pixels are too small.
This is what I've been hearing for years from those who are still stuck on 1080p. I used to believe this too, because this is what many viewing-distance charts would suggest. But in person, the difference is pretty massive - even 6-10 ft away. And what you're saying is more accurate in the case of, say, Blu-ray movies. But with games, where the 3D rendering exhibits aliasing and shimmering, it's even more important.

I've been using 1080p temporarily, and boy, it's very rough and lacking in sharpness compared to 4K. Foliage is a horror show with any amount of AA. With the new cards, we don't even have to choose between HFR and 4K - you can have both, and there's really no good reason not to unless you're just trying to save money. It's really what they would describe as a "next-gen" experience.
 
If you're concerned about price and are patient, NVIDIA will likely respond to a competitive AMD offering. I'm trying to buy a card to power a 77" LG OLED that can go up to 120 FPS at 4K, so I think I'll be stuck weighing the value of the 3080 versus the raw power of the 3090.

The prices NVidia started with leave scant room for AMD to undercut them. Even if AMD does, it won't be by enough of a margin for NVidia to care. So I expect NVidia's prices (FE cards) will hold solid for many months into the future. AIB cards might float high on short supply and then trend toward FE prices later.

Just like Microsoft released the XBSX at $500 without worrying whether Sony will go lower, that price will hold steady, with no need to retaliate with price cuts.
 
I have known Frank for over two decades.

That tweet was fucking stupid...
 
DLSS 2.0 adds detail rather than removing it.

Unfortunately, it seems none of the DLSS games are being updated to DLSS 2.0, which is a shame. I think the only game that was updated to DLSS 2.0 was Control, but don't quote me on that one.

That's not true. Explain the disappearing light rain in Death Stranding if it "adds detail."

People really need to stop taking marketing BS at face value.
 
Two words: diminishing returns.

Going from 240p to 480p meant jumping from a measly 76800 pixels to 307200. Going from 1080p to 4K means going from 2073600 pixels to 8294400. Once you're already at 2 million pixels, sure, things can look better, but they don't look bad at all. Even if you quadruple the pixel count for 4K...

This.

As someone who started gaming at 256x192 in the 1980s and gradually worked my way up the ladder as far as 1600p on my personal desktop, those initial jumps were massive, but once hitting anything close to 1000p they became negligible.

Really, the same formula that is applied to a "Retina" screen at 10 inches can determine the PPI for any viewing distance beyond which it becomes basically irrelevant to keep pushing resolution higher.

8K will serve no real purpose beyond the ego stroking of having more pixels than your neighbor.
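For reference, a back-of-the-envelope version of that "Retina" calculation in Python, assuming the common ~1 arcminute (20/20 acuity) rule of thumb; the real threshold varies per person and ignores aliasing and shimmering, so treat the numbers as ballpark only.

```python
import math

def acuity_limit_ppi(viewing_distance_in: float, arcmin: float = 1.0) -> float:
    """PPI above which a single pixel subtends less than `arcmin` arcminutes
    at the given viewing distance in inches -- the usual 'Retina' criterion."""
    pixel_pitch = viewing_distance_in * math.tan(math.radians(arcmin / 60.0))
    return 1.0 / pixel_pitch

for label, dist in [("phone, 10 in", 10), ("desktop, 24 in", 24), ("couch, 8 ft", 96)]:
    print(f"{label:>14}: ~{acuity_limit_ppi(dist):.0f} PPI")
```

That works out to roughly 340 PPI at 10 inches, around 140 PPI at desktop distance, and only about 36 PPI from the couch, which is the basis of the diminishing-returns argument above.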
 
but once hitting anything close to 1000p they became negligible.
Draw distance has always been the big one for me. Artifacts related to draw distance limitations can have an effect in specific situations, and that's not something that the majority of games are designed around as they generally have to cater to consoles as well as lower-spec desktops, but it is certainly a realizable advantage to be able to see your targets before they see you.
Really, the same formula that is applied to a "Retina" screen at 10 inches can determine the PPI for any viewing distance beyond which it becomes basically irrelevant to keep pushing resolution higher.
The limit is human visual acuity, which needs to account for sub-pixel aliasing and similar artifacts, but yeah, Retina more or less hits that well enough. Might do us well to exceed it a bit to account for UI variability and stuff like text sharpness, but massively overshooting isn't really going to be helpful.
 