Big Navi is coming

Well, big Navi is a vague term to me - will the 58xx XT use the same core as the 59xx XT, or will there only be a 58xx? I can see a roughly 60 CU part coming in under 300 W, but anything higher - say a 72 CU part and up - would require HBM2 or HBM3. Will AMD compete with the Titans? They tried with the Frontier Edition, and while it was a powerful card in its time, I think they failed.

As for ray tracing support, I am not sure how AMD is going to handle that - chiplets make a lot of sense in that respect. Adding dedicated physics, sound, and encode/decode hardware while keeping the GPU die more CU-oriented would be a very interesting design change. Yet unless that hardware reaches the lower tiers, it won't push ray tracing into the mainstream. RTX, by comparison, I consider a failure so far, but it does establish a base to build upon; I am sure Ampere will turn up the performance and hopefully make it a genuinely desirable, usable feature, which right now it is not. So if AMD ships a ray tracing (and maybe physics and encode/decode) chiplet that is used from the top cards down to the mid-range and below, and that does ray tracing right, mainstream adoption will probably happen. I just don't see that this year.

For this year, big Navi may just be a 60 CU part with a 384-bit memory bus and no ray tracing - I see that as most likely. Next year I expect AMD will have fully functional ray tracing that is actually worthwhile; if it is as anemic as RTX, I would just pass on it.

In the meantime, with the 5700 XT, Lisa Su is kicking ASS! :D:D:D:D
 
Hunh..?
We already know what is coming, based on what came last month. Everyone knows a bigger version of little-navi10 is coming, and based on science and math we can extrapolate what big-navi will do.

Hoping... does no good, you have to work within reality. Dr Su gave us a taste of it, we want more.
I expect Big Navi - whatever the hell it will be - to compete with the 2080 Ti, or at least beat the 2080 Ti Super significantly at a lower cost: 5800 XT >= 2080 Ti at $799, 5800 >= 2080 Super at $599. That would be the best thing for Nvidia worshipers, as it would prompt Nvidia to reduce pricing and release Ampere sooner. If it's good, sign me up for one or the other, or both. I usually end up with cards from both for various reasons, except for Turing this cycle.
 
Hunh..?
We already know what is coming, based on what came last month. Everyone knows a bigger version of little-navi10 is coming, and based on science and math we can extrapolate what big-navi will do.

Hoping... does no good, you have to work within reality. Dr Su gave us a taste of it, we want more.
All I see is guesses and assumptions. We do not know what is coming or when. Frankly, I don't see why Big Navi is not out now, other than it must have had issues. Navi has been worked on for a while, and a slower-moving, higher-end part should be less affected by process availability. There is too much we do not know: will Samsung be involved? Does TSMC have enough capacity for all the 7nm contracts they have? How long will these parts be sold - that is, will AMD go back to a yearly cadence (which is what they need to do to catch up to Nvidia)?
 
Frankly, I don't see why Big Navi is not out now, other than it must have had issues. Navi has been worked on for a while, and a slower-moving, higher-end part should be less affected by process availability.

I don't either. I've repeatedly stated that I absolutely believe that AMD has the technology to compete with Nvidia's high-end, and since both companies are currently using TSMC, it seems like AMD just continuously chooses not to.

And then we see what they did with Navi: overcooked, slow RAM, crappy stock blowers with terrible stock power settings, and no hint of DXR acceleration - not even just enabled in the drivers!

Now, one could take that as, 'hey, maybe they have better stuff coming, right?'. And that would be correct - but AMD has had 'better stuff' coming since they bought ATi, and that graphics division has only lost market share since, never really catching up, let alone surpassing, Nvidia for any notable length of time.


Which leads one to wonder: are they even trying?
 
I expect Big Navi - whatever the hell it will be - to compete with the 2080 Ti, or at least beat the 2080 Ti Super significantly at a lower cost: 5800 XT >= 2080 Ti at $799, 5800 >= 2080 Super at $599. That would be the best thing for Nvidia worshipers, as it would prompt Nvidia to reduce pricing and release Ampere sooner. If it's good, sign me up for one or the other, or both. I usually end up with cards from both for various reasons, except for Turing this cycle.

What is the RTX2080ti Super?

I think you are a tad confused by this latest news.
There was a rumor that AMD was working on Arcturus, an ultra high-end RDNA(2) chip coming out in the middle of 2020. But lately the data doesn't line up, and it seems that Arcturus is based on the Vega architecture and is not a gaming GPU, but one for data center, artificial intelligence, etc... and for enterprise.

AMD has stated they have been working on big Navi right alongside little Navi this whole time, using separate teams that stay in communication and share tidbits on what they are learning. Dr. Lisa Su admitted there was a problem with Navi10 and it had to be re-taped out. She said all went well, and that they are in a better place for having been delayed, as the other teams were able to incorporate more before their final revisions.

If little Navi was delayed and now on sale... then big navi isn't far behind.


Big Navi isn't Arcturus; it is just another Navi chip with more CUs. And next year AMD will release RDNA(2) with ray tracing, etc., when Nvidia releases their first 7nm GPU...
 
What is the RTX2080ti Super?

I think you are a tad confused by this latest news.
There was a rumor that AMD was working on Arcturus, an ultra high-end RDNA(2) chip coming out in the middle of 2020. But lately the data doesn't line up, and it seems that Arcturus is based on the Vega architecture and is not a gaming GPU, but one for data center, artificial intelligence, etc... and for enterprise.

AMD has stated they have been working on big Navi right alongside little Navi this whole time, using separate teams that stay in communication and share tidbits on what they are learning. Dr. Lisa Su admitted there was a problem with Navi10 and it had to be re-taped out. She said all went well, and that they are in a better place for having been delayed, as the other teams were able to incorporate more before their final revisions.

If little Navi was delayed and now on sale... then big navi isn't far behind.


Big Navi isn't Arcturus; it is just another Navi chip with more CUs. And next year AMD will release RDNA(2) with ray tracing, etc., when Nvidia releases their first 7nm GPU...
Yep, mistype - 2080 Ti (not Super), thanks. Yes, separate teams, but I am not sure what they are all working on - you have consoles, the Google Stadia project, Navi, Arcturus. How many teams does all that take? Well, I look forward to big Navi (whatever the hell that will be); if it is released this year, great.
 
I sure hope it's more than that. Releasing a GPU without hardware DXR at a price level above Navi today would be disastrous.
lol, I wonder how many RTX users even bother with RTX? Frankly, I think it is the other way around for Nvidia - an utter lack of enthusiasm from users - but that is only my opinion. If AMD releases a lower-cost, better-performing card without ray tracing, folks will flock to it.
 
lol, I wonder how many RTX users even bother with RTX? Frankly, I think it is the other way around for Nvidia - an utter lack of enthusiasm from users - but that is only my opinion.

I'd say we share the same opinion. However, it's a matter of having the feature on a new card vs. knowing that you're going to want to swap out the card when the games you care about start using said feature. Knowing that AMD has announced support for ray tracing in the next console generation, that's basically every AAA game in the near future.

If AMD releases a lower-cost, better-performing card without ray tracing, folks will flock to it.

But that's the thing: they can't just release at a 'lower cost', they have to release it so low that Nvidia won't put more price pressure on them immediately a la Super- and they need to do it before Ampere hits and Nvidia can push RTX to lower price points and higher performance levels.

Basically, if AMD was to be successful in releasing a higher-performing Navi without hardware DXR acceleration, they should have already done it.
 
lol, I wonder how many RTX users even bother with RTX? Frankly, I think it is the other way around for Nvidia - an utter lack of enthusiasm from users - but that is only my opinion. If AMD releases a lower-cost, better-performing card without ray tracing, folks will flock to it.

Yeah, Gameworks was far more popular. (Better known as Gimpworks.)
 
My guess would be that “little” Navi is out first for the same reason Polaris released before Vega - AMD prioritized the architecture that was to be used in custom SOC markets.
 
I like to look at Amazon sales. Navi so far is not making much of a dent, but really this needs to be looked at again once the AIBs release their cards. When a particular 2080 Ti is far outselling the first Navi card, some red flags go up. Off subject: I find these numbers way more significant than the Steam survey if one were looking at buying stock.

https://www.amazon.com/Best-Sellers-Computers-Accessories-Computer-Graphics-Cards/zgbs/pc/284822

Even more off subject: CPUs
https://www.amazon.com/gp/bestselle...838-b28a9b7fda30&pf_rd_r=9AC5V8E8HRY5TZBDPF2W

Ahmm AMD is really kicking ASS here!
 
As an Amazon Associate, HardForum may earn from qualifying purchases.
Yep, mistype - 2080 Ti (not Super), thanks. Yes, separate teams, but I am not sure what they are all working on - you have consoles, the Google Stadia project, Navi, Arcturus. How many teams does all that take? Well, I look forward to big Navi (whatever the hell that will be); if it is released this year, great.

Gaming GPU = RDNA.
RDNA is a scalable architecture, and Navi10 is 251mm^2... and equals the 2070 SUPER in gaming performance, for $100 less.


Big navi:
If you do a basic rough calculation and add 50% more CUs to Navi10's 40.... you have a GPU with 60 CUs and more ROPs, etc. But adding 20 more CUs doesn't increase the size of the chip all that much, so even a 60 CU Navi chip will still be a pretty small chip - perhaps as small as 335mm^2-ish. If so, given economies of scale, AMD will be able to sell these bigger Navi chips (5800 series) at a price much cheaper than Vega20's Radeon VII.... $499 and $599, while offering a 35%+ increase over the 5700 series.
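That back-of-the-envelope math can be sketched out. The 60% CU-area share below is my own guess - AMD hasn't published a die breakdown - so treat the output as illustrative only:

```python
# Rough die-size estimate for a 60 CU Navi part, scaled from Navi10.
# Assumption (mine, not AMD's): Navi10 is 251 mm^2 with 40 CUs, and
# roughly 60% of that area is the CU array, while the rest (memory
# PHYs, display, media, I/O) stays mostly fixed.

NAVI10_AREA_MM2 = 251.0
NAVI10_CUS = 40
CU_AREA_FRACTION = 0.6  # guess: share of the die taken by the CU array

def estimate_area(target_cus: int) -> float:
    """Scale only the CU portion of the die; keep the rest fixed."""
    cu_area = NAVI10_AREA_MM2 * CU_AREA_FRACTION
    fixed_area = NAVI10_AREA_MM2 - cu_area
    return fixed_area + cu_area * (target_cus / NAVI10_CUS)

print(f"60 CU estimate: {estimate_area(60):.0f} mm^2")
```

With that assumed 60% CU share, a 60 CU part lands around 326 mm^2 - the same ballpark as the 335mm^2-ish figure above, and still a small chip by high-end standards.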
 
Off subject: I find these numbers way more significant than the Steam survey if one were looking at buying stock.

If the Steam Hardware Survey is not quite perfectly representative, Amazon lists are completely useless. Actually, I take that back- they can be used to wildly misconstrue sales data. And this goes every way you can imagine.
 
Runs great on AMD hardware when said hardware is designed to meet DX spec- where AMD falls short, well, that's obvious too :D
Yes, and Nvidia did some good IQ stuff that helped gamers - it was their overuse of ridiculous amounts of tessellation that brought the flak, even to the point that Nvidia was hurting their own cards over crap like HairWorks.
 
Yes, and Nvidia did some good IQ stuff that helped gamers - it was their overuse of ridiculous amounts of tessellation that brought the flak, even to the point that Nvidia was hurting their own cards over crap like HairWorks.

That was also part of the developer's implementation - as with anything, if the developer gets heavy-handed with processing-intensive stuff, things go awry.

Still, the increase in realism was there- but the performance hit in that scenario as specified by the developer was certainly pretty rough all around.
 
If the Steam Hardware Survey is not quite perfectly representative, Amazon lists are completely useless. Actually, I take that back- they can be used to wildly misconstrue sales data. And this goes every way you can imagine.
Yes, neither shows accurately what the real numbers are - Nvidia and AMD probably do know real usage via drivers, more so Nvidia with its telemetry. The Steam survey is utter crap; Amazon sales tell you what is hot right now, and Amazon is a very broad-based international seller - probably the best we have for such.
 
Steam Survey is utter crap

I do recommend looking into statistics. It's not perfect, but it's far from crap.

Amazon sales tell you what is hot right now, and Amazon is a very broad-based international seller - probably the best we have for such.

Given that 'what's hot right now' is beyond undefined, I stand by my comment that Amazon lists are useless for discussion.

If Amazon were willing to publish sales numbers, then sure. But we don't get that.
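To illustrate the statistics point: even without published counts, a random sample pins down a share pretty tightly. Here's the standard 95% margin-of-error formula for a proportion - the sample size below is purely hypothetical, since Valve doesn't publish theirs:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for an observed share p from n random respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical: a GPU shows a 10% share in a 100,000-respondent sample.
moe = margin_of_error(0.10, 100_000)
print(f"+/- {moe:.2%}")  # about +/- 0.19%
```

The catch, of course, is that this only holds if the sample is reasonably random - whether Steam's opt-in prompt achieves that is the real debate, not the sample size.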
 
Gaming GPU = RDNA.
RDNA is a scalable architecture, and Navi10 is 251mm^2... and equals the 2070 SUPER in gaming performance, for $100 less.


Big navi:
If you do a basic rough calculation and add 50% more CUs to Navi10's 40.... you have a GPU with 60 CUs and more ROPs, etc. But adding 20 more CUs doesn't increase the size of the chip all that much, so even a 60 CU Navi chip will still be a pretty small chip - perhaps as small as 335mm^2-ish. If so, given economies of scale, AMD will be able to sell these bigger Navi chips (5800 series) at a price much cheaper than Vega20's Radeon VII.... $499 and $599, while offering a 35%+ increase over the 5700 series.
Fair enough, but it is still guesswork; while that seems like what makes sense, unless it hits the pavement it is meaningless. I am with you on the prediction - my only difference is that the price is too low if it is 35% faster than the 5700 XT, which it should be. 330-350mm^2 would be the sweet spot as well.
 
I do recommend looking into statistics. It's not perfect, but it's far from crap.



Given that 'what's hot right now' is beyond undefined, I stand by my comment that Amazon lists are useless for discussion.

If Amazon were willing to publish sales numbers, then sure. But we don't get that.
Nah, don't have time for that; I'll game instead on my GPU - spent too much time here already :). Yes, numbers would be nice, but seeing AMD with the top-selling CPU and 7 out of 10 of the top-selling CPUs, you know AMD is utterly kicking ass. Plus another retailer has AMD CPUs way outselling Intel; even the 3700X alone is almost outselling all Intel CPUs combined at that retailer:
https://www.techradar.com/news/amd-...hit-it-almost-outsold-intels-entire-cpu-range

lol on retail sales being useless, even when they represent actual products people are buying - to each their own.

Time to game
 
Nah, don't have time for that; I'll game instead on my GPU - spent too much time here already :).

I game on all my GPUs ;). I have a 1080TI in the gaming rig per sig, and whatever I replace it with will have hardware DXR support.

lol on retail sales being useless, even when they represent actual products people are buying - to each their own.

But you don't have retail sales data. If you did, then we'd be cracking!

[to wit- no one here has that data, that's essentially proprietary information, which is what makes the Steam survey so useful]
 
I still don't get the crowd that keeps saying

"It has to beat nVidia ~and~ be a lot cheaper"

I never understood why it has to be both, other than the crowd just ~wants~ it to be both. If it's faster, I don't expect it to be cheaper. Although I agree it would be nice, I can't see the shareholders backing that play without a very sound strategy behind it.

And a lot of the time (ok, all of the time, because I'm cheap) I'm perfectly ok not buying the fastest card available, so it better darn well be a good value if I'm not going cream of the crop.
 
lol on retail sales being useless, even when they represent actual products people are buying - to each their own.

I agree with IdiotInCharge here. Amazon is just showing you what sold well in the last... hour? day? week? month? No way to know, other than it says "updated hourly". It also doesn't aggregate based on chipset - so a card like the AMD RX 580 can simultaneously occupy several spots.

Maybe AMD has a sale on 580s, so they show up temporarily as the #3/#4/#5 cards on that Amazon list - but they sold like crap for months leading up to that and certainly aren't the #3/#4/#5 most popular cards in all desktops used for gaming.

(That's totally just a fictitious illustration by the way - for those of you that want to read into that literally. I just picked the 580 because it was near the top of the list and happened to fit my purposes for illustration)

Whereas the SHS would (allegedly, for all the non-believers) show all of that... since it's a snapshot of the total installed base of computers which (presumably, at least part of the time) get used for gaming. And it doesn't reset over some arbitrary timetable - it gets retaken every month, with the results re-posted so you can see trends.

Also - apparently Dude Wipes are the #5 top accessory & component for your PC. Yeah... you know who you are.
 
It will likely be 10% slower, around the same power draw and hopefully will not have a shitty blower cooler.

Agree, one thing I have never understood is AMD's love affair with blowers..... someone high up over there has got to be tired of hearing "why did you go with the blower system" over and over again....
 
I agree with IdiotInCharge here. Amazon is just showing you what sold well in the last... hour? day? week? month? No way to know, other than it says "updated hourly". It also doesn't aggregate based on chipset - so a card like the AMD RX 580 can simultaneously occupy several spots.

Maybe AMD has a sale on 580s, so they show up temporarily as the #3/#4/#5 cards on that Amazon list - but they sold like crap for months leading up to that and certainly aren't the #3/#4/#5 most popular cards in all desktops used for gaming.

(That's totally just a fictitious illustration by the way - for those of you that want to read into that literally. I just picked the 580 because it was near the top of the list and happened to fit my purposes for illustration)

Whereas the SHS would (allegedly, for all the non-believers) show all of that... since it's a snapshot of the total installed base of computers which (presumably, at least part of the time) get used for gaming. And it doesn't reset over some arbitrary timetable - it gets retaken every month, with the results re-posted so you can see trends.

Also - apparently Dude Wipes are the #5 top accessory & component for your PC. Yeah... you know who you are.
Updated hourly; the sales are listed right there with the product. It is only good for seeing what is selling the most at the moment, which will vary. Why don't you buy Dude Wipes? :p Check every few days and see what the trend is - AMD CPUs will probably still hold 7 or more of the top 10 spots - and look at motherboards as well: only one Intel board in the top ten at this time lol :LOL:. Yep, a lot of Intel builders. Yes, it can change, but it is what it is. As for Steam statistics, nope - we would need to know the sample size, how many households had systems counted twice, how long data is used, etc.; we know virtually nothing about it -> crap statistic. Of course, these are my opinions, speculation, and analysis. What does this have to do with Big Navi? Absolutely nothing.

All we know is Big Navi is on track. She did indicate upcoming quarters, so maybe that means this year, officially.
 
Agree, one thing I have never understood is AMD's love affair with blowers..... someone high up over there has got to be tired of hearing "why did you go with the blower system" over and over again....
Well, in the 5700 and 5700 XT's case, a blower cooler is probably better for most buyers - agreed that on higher-end cards that consume more power, blower coolers become more limiting. The 5700 XT is around 200W; even overclocked it is pulling less than 220W in games. It is not as if that is the only choice, so I am not even sure what the point is. AIB partners have a bunch of cards coming out this month without blowers. Even if Big Navi had a super big blower cooler on it, people would wait for other AIB options. Also, OEMs like blower-type cards, which is where AMD would be more directly involved.
 
I expect Big Navi - what ever hell it will be - better compete with the 2080 Ti or at least beat the 2080 Ti Super significantly at a cheaper cost. 5800 XT >or = 2080 Ti $799, 5800 >= 2080 Super $599. Would be the best thing for Nvidia worshipers, will prompt Nvidia to reduce pricing and to release Ampere sooner. If good sign me up for one or the other or both. I usually end up with cards from both for various reasons, except for Turing this cycle.

Big Navi won't touch the 2080 Ti. If it does, Nvidia will release a 2080 Ti SUPER and everyone will want that.
 
I don't either. I've repeatedly stated that I absolutely believe that AMD has the technology to compete with Nvidia's high-end, and since both companies are currently using TSMC, it seems like AMD just continuously chooses not to.

And then we see what they did with Navi: overcooked, slow RAM, crappy stock blowers with terrible stock power settings, and no hint of DXR acceleration - not even just enabled in the drivers!

Now, one could take that as, 'hey, maybe they have better stuff coming, right?'. And that would be correct - but AMD has had 'better stuff' coming since they bought ATi, and that graphics division has only lost market share since, never really catching up, let alone surpassing, Nvidia for any notable length of time.


Which leads one to wonder: are they even trying?

Overcooked? Slow RAM? ... ok, blowers - most of us don't love them, but they are perfectly logical for reference cards.

https://www.phoronix.com/scan.php?page=article&item=navi-august-2019&num=6

I mean, what are some of you folks smoking, though... AMD just released a card for $450 that is 15% slower than Nvidia's $699 part.
15% lower performance.... for almost 40% less money. It's so good they retired their months-old Radeon VII, because frankly it's also about 15% slower than that part and, again, 40% less expensive.
Their $380 5700 has the same performance as the two-year-old Vega 64 (which debuted at 500 bucks); granted, that isn't a massive jump... still, it's a solid move in performance per dollar - something NV hasn't given us for a long time now.
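A quick sketch of that math, for anyone who wants to check it. The prices are launch MSRPs; the performance index is a rough relative placeholder (2080 = 100), not measured benchmark data:

```python
# Perf-per-dollar comparison at launch MSRPs. The perf numbers are a
# rough relative index (RTX 2080 = 100), not benchmark results.
cards = {
    "RX 5700 XT": {"price": 450, "perf": 85},   # ~15% slower than the 2080
    "RTX 2080":   {"price": 699, "perf": 100},
}

for name, c in cards.items():
    print(f"{name}: {100 * c['perf'] / c['price']:.1f} perf per $100")

# Price gap: 1 - 450/699, i.e. roughly 36% - 'almost 40% less money'.
gap = 1 - cards["RX 5700 XT"]["price"] / cards["RTX 2080"]["price"]
print(f"price gap: {gap:.0%}")
```

Even with a generous index for the 2080, the perf-per-dollar column comes out well ahead for the 5700 XT, which is the whole point being made above.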

Granted, NV offers the RTX vaporware stuff. (I joke - it's not vaporware, but with what, 3 or 4 actual games using it after a year, it may as well be.) It may have been a major sticking point a year ago when RTX hype was at its peak; today? Few people really care... I think most of us realize we aren't getting ray tracing right now unless we spend over a grand on a video card... because even the RTX 2080's performance is suspect. Ray tracing may well be a killer feature next generation... and more realistically, the generation after next is probably when it will become usable.

The 5700 is a killer part. The incoming wave of aftermarket cooling will be great; hopefully there are some options that don't kill the bang-for-the-buck factor. The 5800, if AMD is going for the moral win in the extreme performance segment, doesn't need DXR either. Equal the 2080 Ti in good old raster performance, sell it for 100-200 bucks less, and they will move. I know I'm strange, but I'm more looking forward to the 5600s... it's time to put poor Polaris out of its misery. Retire the 570/580/590... I would be very much in on a 5690 with Vega 56-level performance and a $279 MSRP.
 
Well, in the 5700 and 5700 XT's case, a blower cooler is probably better for most buyers - agreed that on higher-end cards that consume more power, blower coolers become more limiting. The 5700 XT is around 200W; even overclocked it is pulling less than 220W in games. It is not as if that is the only choice, so I am not even sure what the point is. AIB partners have a bunch of cards coming out this month without blowers. Even if Big Navi had a super big blower cooler on it, people would wait for other AIB options. Also, OEMs like blower-type cards, which is where AMD would be more directly involved.

That is a good point... OEM vendors probably do like reference cards with blowers. It makes it easier for them to sell higher-margin aftermarket cooler options. If the reference design has a top-notch cooling setup... why would consumers pay anything extra for OEM designs?
 
  • Like
Reactions: noko
like this
Yes, and Nvidia did some good IQ stuff that helped gamers - it was their overuse of ridiculous amounts of tessellation that brought the flak, even to the point that Nvidia was hurting their own cards over crap like HairWorks.

Tessellation is set by the developer... the "too much tessellation" thing was nothing but an AMD whine over the fact that their solution SUCKED in performance compared to NVIDIA.... I wrote this once, because I hate lies like that:
https://hardforum.com/threads/no-amd-300-series-review-samples.1865193/page-7#post-1041667964

LOL

This is hilarious, but bear with me.

All the ocean simulation is run on a small square and after that tiled out.
So the mesh you are seeing is not a true indication of the load when the game runs.
After that, the game engine does a z-buffer writeback for something called a "GPU occlusion query".
It's not rendered.
It's occluded, ergo it is NOT drawn.

I would almost call that article dishonest.
Why?
Because they:
- Isolate the mesh
- Render the mesh at maximum tessellation
- Disable occlusion culling
- Disable dynamic tessellation
- Disable adaptive tessellation

And then present it as "This is how the game works!"

But this is not what the game does.
The claim smells like another PR FUD thing.

It's all documented here:
http://docs.cryengine.com/display/SDKDOC4/Culling+Explained

People should read that... and if they do not understand it... they should stop posting about the topic.
Simple as that.
If you post about stuff you don't know anything about, in a negative manner... you become a FUD'ster.

(Besides, I doubt NVIDIA had anything to do with that "ocean"... it seems more like something coded by the devs themselves, to be frank.)


 
That was also part of the developer's implementation- as with anything, if the developer gets heavy-handed with processing intensive stuff, things get awry.

Still, the increase in realism was there- but the performance hit in that scenario as specified by the developer was certainly pretty rough all around.
Revisionist history? We all know what went down as it was well documented.
 
The usual suspect pushing muh partial scene DXR at peasant resolution being the greatest thing ever (it's not.. we got at least 20 more games and 5-10 years to go to see what RT can actually do when supported properly), in order to justify AMD being priced six million dollars less than their competitor.
In fact, some here would wish that AMD pay you to take their GPU, that's how slow, hot, loud, poorly drivered and designed they are. In fact, you should all just buy 780Tis because AMD can't compete anywhere outside of low end. Like with S3 or something.

Remember every time you read this tripe; nearly a year out there are 2 or 3 RT titles tops. Playing at 1080p on a 1200 USD GPU with partial scene lighting is not how RT will succeed. Nor will 720p on the midrange/entry high end stuff.
Supporting RT at this point is meh and is definitely not a useful feature considering the absolute lack of support and peasant speeds available. It's like making a card with DP 3.0 now because latest and greatest and if you don't have that then you are nothing. Meanwhile, nothing uses it. But at least you can run around screaming 'AMDEEEE DONT HAVEEE DEEEPEEEETHREEEEEEeeeeeeeeeeeehhh'.
Sure you got to start somewhere, but that somewhere was not now. A generation too soon.

Remember folks, the 5700XT does not compete with the 2070S. Never did and never will, it's literally not RTX so it sucks. And it literally will move your computer around with the 1000ft/lb turbojet blower.

P.s. blowers suck
Just in case you blower guys didn't hear that? I am legally deaf now thanks to my vega apparently.

Remember, Nvidia is better than even your dad, at everything, in the snow uphill both ways.
 
The usual suspect pushing muh partial scene DXR at peasant resolution being the greatest thing ever (it's not.. we got at least 20 more games and 5-10 years to go to see what RT can actually do when supported properly), in order to justify AMD being priced six million dollars less than their competitor.
In fact, some here would wish that AMD pay you to take their GPU, that's how slow, hot, loud, poorly drivered and designed they are. In fact, you should all just buy 780Tis because AMD can't compete anywhere outside of low end. Like with S3 or something.

Remember every time you read this tripe; nearly a year out there are 2 or 3 RT titles tops. Playing at 1080p on a 1200 USD GPU with partial scene lighting is not how RT will succeed. Nor will 720p on the midrange/entry high end stuff.
Supporting RT at this point is meh and is definitely not a useful feature considering the absolute lack of support and peasant speeds available. It's like making a card with DP 3.0 now because latest and greatest and if you don't have that then you are nothing. Meanwhile, nothing uses it. But at least you can run around screaming 'AMDEEEE DONT HAVEEE DEEEPEEEETHREEEEEEeeeeeeeeeeeehhh'.
Sure you got to start somewhere, but that somewhere was not now. A generation too soon.

Remember folks, the 5700XT does not compete with the 2070S. Never did and never will, it's literally not RTX so it sucks. And it literally will move your computer around with the 1000ft/lb turbojet blower.

P.s. blowers suck
Just in case you blower guys didn't hear that? I am legally deaf now thanks to my vega apparently.

Remember, Nvidia is better than even your dad, at everything, in the snow uphill both ways.

Quoted for posterity
 
The usual suspect pushing muh partial scene DXR at peasant resolution being the greatest thing ever (it's not.. we got at least 20 more games and 5-10 years to go to see what RT can actually do when supported properly), in order to justify AMD being priced six million dollars less than their competitor.
In fact, some here would wish that AMD pay you to take their GPU, that's how slow, hot, loud, poorly drivered and designed they are. In fact, you should all just buy 780Tis because AMD can't compete anywhere outside of low end. Like with S3 or something.

Remember every time you read this tripe; nearly a year out there are 2 or 3 RT titles tops. Playing at 1080p on a 1200 USD GPU with partial scene lighting is not how RT will succeed. Nor will 720p on the midrange/entry high end stuff.
Supporting RT at this point is meh and is definitely not a useful feature considering the absolute lack of support and peasant speeds available. It's like making a card with DP 3.0 now because latest and greatest and if you don't have that then you are nothing. Meanwhile, nothing uses it. But at least you can run around screaming 'AMDEEEE DONT HAVEEE DEEEPEEEETHREEEEEEeeeeeeeeeeeehhh'.
Sure you got to start somewhere, but that somewhere was not now. A generation too soon.

Remember folks, the 5700XT does not compete with the 2070S. Never did and never will, it's literally not RTX so it sucks. And it literally will move your computer around with the 1000ft/lb turbojet blower.

P.s. blowers suck
Just in case you blower guys didn't hear that? I am legally deaf now thanks to my vega apparently.

Remember, Nvidia is better than even your dad, at everything, in the snow uphill both ways.

I run 3440x1440 with RT at 80 FPS in BFV. Metro runs at 100 FPS. That's about 2.4x the pixels of 1080p. Stop being so dramatic. Really looking forward to Cyberpunk 2077. Even “partial” RT on the lowest setting looks vastly better than the traditional rasterized methods.

I do agree with AMD skipping RT on the 5700/5700 XT. Any card priced higher than that, though, I would want to have RT. I am not replacing an $800 “big Navi” card in 9 months with a proper RT card...
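For reference, the pixel math behind that resolution comparison:

```python
# Pixel counts: 3440x1440 ultrawide vs. standard 1080p.
ultrawide = 3440 * 1440   # 4,953,600 pixels
full_hd = 1920 * 1080     # 2,073,600 pixels
print(f"{ultrawide / full_hd:.2f}x")  # 2.39x
```

So an ultrawide 1440p frame is about 2.4 times as many pixels as a 1080p frame - a meaningful RT workload, even if it isn't quite 4K.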
 
We know...DXR is “useless”...until AMD supports it.../yawn
No, DXR is useless until we have good implementations of it... If a couple of AAA games get released that do a good job in the very near future, AMD will be caught without support and give people a reason to stick with Nvidia. If DXR sucks in the current gen, and the only way to use it is at 720p on a 2070+ or 1080p on a $1k card, with "meh" results, then AMD is fine until said games start coming or hardware performance is good enough to support it.
This love/hate stuff is so old; get over it. AMD made a choice not to support it. Their cards are mid-level, competing against other cards that *currently* can't handle DXR effectively. This is subject to change at any time, but to this day we haven't seen its usefulness. If some games actually implement it effectively, then it will no longer be useless. If AMD adds support at that time, it won't be the AMD support that makes it no longer useless; it will be that the implementation (in both hardware AND software) is finally useful.
As of right now, it's still useless. If AMD implements it in next gen and it still sucks, then it's still useless. I don't care who supports it if I can't use it.
 