Poll: AMD RX 5700 Series vs NVidia RTX Super. Which gets your money?

AMD RX 5700 Series vs NVidia RTX Super. Which gets your money?

  • NVidia RTX 2060 (old 6GB version)

    Votes: 0 0.0%
  • AMD RX 5700

    Votes: 18 7.2%
  • NVidia RTX 2060 Super

    Votes: 12 4.8%
  • AMD RX 5700 XT

    Votes: 81 32.5%
  • NVidia RTX 2070 Super

    Votes: 30 12.0%
  • Already have close enough/better AMD card for now, waiting on next big thing

    Votes: 19 7.6%
  • Already have close enough/better NVidia card for now, waiting on next big thing

    Votes: 89 35.7%

  • Total voters
    249
I got my order in for the EVGA 2070 Super XC Gaming card; July 31 delivery date!
The EVGA website has been sold out of the 2070 Super since launch day.
Got in on a pre-order (in-stock) deal from Amazon.
It was up for a few hours, but the restock order is now closed.
I guess I lucked out.
My son got in on the same type of deal for the XC Ultra Gaming card.

Watched the EVGA 20th Anniversary stream and got an EVGA gaming mouse for $9.99.
 
You are full of shit:

Code:
Direct3D 12 feature checker (July 2019) by DmitryKo (x64)
https://forum.beyond3d.com/posts/1840641/

Windows 10 version 1903 (build 18362.239 19h1_release) x64
Checking for experimental features SM6 TR4 FL1 META

ADAPTER 0
"NVIDIA GeForce RTX 2080"
VEN_10DE, DEV_1E87, SUBSYS_1E8710DE, REV_A1
Dedicated video memory : 8010.0 MB (8399093760 bytes)
Total video memory : 40734.5 MB (42713180160 bytes)
Video driver version : 26.21.14.3136
Maximum feature level : D3D_FEATURE_LEVEL_12_1 (0xc100)
DoublePrecisionFloatShaderOps : 1
OutputMergerLogicOp : 1
MinPrecisionSupport : D3D12_SHADER_MIN_PRECISION_SUPPORT_NONE (0) (0b0000'0000)
TiledResourcesTier : D3D12_TILED_RESOURCES_TIER_3 (3)
ResourceBindingTier : D3D12_RESOURCE_BINDING_TIER_3 (3)
PSSpecifiedStencilRefSupported : 0
TypedUAVLoadAdditionalFormats : 1
ROVsSupported : 1
ConservativeRasterizationTier : D3D12_CONSERVATIVE_RASTERIZATION_TIER_3 (3)
StandardSwizzle64KBSupported : 0
CrossNodeSharingTier : D3D12_CROSS_NODE_SHARING_TIER_NOT_SUPPORTED (0)
CrossAdapterRowMajorTextureSupported : 0
VPAndRTArrayIndexFromAnyShaderFeedingRasterizerSupportedWithoutGSEmulation : 1
ResourceHeapTier : D3D12_RESOURCE_HEAP_TIER_2 (2)
MaxGPUVirtualAddressBitsPerResource : 40
MaxGPUVirtualAddressBitsPerProcess : 40
Adapter Node 0:    TileBasedRenderer: 0, UMA: 0, CacheCoherentUMA: 0, IsolatedMMU: 1, HeapSerializationTier: 0, ProtectedResourceSession.Support: 1
HighestShaderModel : D3D12_SHADER_MODEL_6_4 (0x0064)
WaveOps : 1
WaveLaneCountMin : 32
WaveLaneCountMax : 32
TotalLaneCount : 2944
ExpandedComputeResourceStates : 1
Int64ShaderOps : 1
RootSignature.HighestVersion : D3D_ROOT_SIGNATURE_VERSION_1_1 (2)
DepthBoundsTestSupported : 1
ProgrammableSamplePositionsTier : D3D12_PROGRAMMABLE_SAMPLE_POSITIONS_TIER_2 (2)
ShaderCache.SupportFlags : D3D12_SHADER_CACHE_SUPPORT_SINGLE_PSO | LIBRARY (3) (0b0000'0011)
CopyQueueTimestampQueriesSupported : 1
CastingFullyTypedFormatSupported : 1
WriteBufferImmediateSupportFlags : D3D12_COMMAND_LIST_SUPPORT_FLAG_DIRECT | BUNDLE | COMPUTE | COPY | VIDEO_DECODE | VIDEO_PROCESS | VIDEO_ENCODE (127) (0b0111'1111)
ViewInstancingTier : D3D12_VIEW_INSTANCING_TIER_3 (3)
BarycentricsSupported : 1
ExistingHeaps.Supported : 1
MSAA64KBAlignedTextureSupported : 1
SharedResourceCompatibilityTier : D3D12_SHARED_RESOURCE_COMPATIBILITY_TIER_1 (1)
Native16BitShaderOpsSupported : 1
AtomicShaderInstructions : 0
SRVOnlyTiledResourceTier3 : 1
RenderPassesTier : D3D12_RENDER_PASS_TIER_0 (0)
RaytracingTier : D3D12_RAYTRACING_TIER_1_0 (10)
AdditionalShadingRatesSupported : 1
PerPrimitiveShadingRateSupportedWithViewportIndexing : 0
VariableShadingRateTier : D3D12_VARIABLE_SHADING_RATE_TIER_2 (2)
ShadingRateImageTileSize : 16
BackgroundProcessingSupported : 1
Metacommands enumerated : 8
Metacommands [parameters per stage]:
Conv (Convolution)
[84: DescIn.DataType, DescIn.Layout, DescIn.Flags, DescIn.DimensionCount, DescIn.Size[0], DescIn.Size[1], DescIn.Size[2], DescIn.Size[3], DescIn.Size[4], DescIn.Stride[0], DescIn.Stride[1], DescIn.Stride[2], DescIn.Stride[3], DescIn.Stride[4], DescFilter.DataType, DescFilter.Layout, DescFilter.Flags, DescFilter.DimensionCount, DescFilter.Size[0], DescFilter.Size[1], DescFilter.Size[2], DescFilter.Size[3], DescFilter.Size[4], DescFilter.Stride[0], DescFilter.Stride[1], DescFilter.Stride[2], DescFilter.Stride[3], DescFilter.Stride[4], DescBias.DataType, DescBias.Layout, DescBias.Flags, DescBias.DimensionCount, DescBias.Size[0], DescBias.Size[1], DescBias.Size[2], DescBias.Size[3], DescBias.Size[4], DescBias.Stride[0], DescBias.Stride[1], DescBias.Stride[2], DescBias.Stride[3], DescBias.Stride[4], DescBias.IsNull, DescOut.DataType, DescOut.Layout, DescOut.Flags, DescOut.DimensionCount, DescOut.Size[0], DescOut.Size[1], DescOut.Size[2], DescOut.Size[3], DescOut.Size[4], DescOut.Stride[0], DescOut.Stride[1], DescOut.Stride[2], DescOut.Stride[3], DescOut.Stride[4], Mode, Direction, Precision, Stride[0], Stride[1], Stride[2], Dilation[0], Dilation[1], Dilation[2], StartPadding[0], StartPadding[1], StartPadding[2], EndPadding[0], EndPadding[1], EndPadding[2], DimensionCount, OutputPadding[0], OutputPadding[1], OutputPadding[2], OutputPadding[3], OutputPadding[4], GroupCount, Activation.Function, Activation.Params.first, Activation.Params.second, Activation.IsNull, BindFlags]
[1: PersistentResource]
[6: InputResource, FilterResource, BiasResource, OutputResource, PersistentResource, TemporaryResource]

CopyTensor
[3: DstLayout, SrcLayout, BindFlags]
[1: Unused]
[31: DstDesc.DataType, DstDesc.Layout, DstDesc.Flags, DstDesc.DimensionCount, DstDesc.Size[0], DstDesc.Size[1], DstDesc.Size[2], DstDesc.Size[3], DstDesc.Size[4], DstDesc.Stride[0], DstDesc.Stride[1], DstDesc.Stride[2], DstDesc.Stride[3], DstDesc.Stride[4], DstResource, SrcDesc.DataType, SrcDesc.Layout, SrcDesc.Flags, SrcDesc.DimensionCount, SrcDesc.Size[0], SrcDesc.Size[1], SrcDesc.Size[2], SrcDesc.Size[3], SrcDesc.Size[4], SrcDesc.Stride[0], SrcDesc.Stride[1], SrcDesc.Stride[2], SrcDesc.Stride[3], SrcDesc.Stride[4], SrcResource, TemporaryResource]

2x2 Nearest neighbour Upsample
[15: SrcDesc.DataType, SrcDesc.Layout, SrcDesc.Flags, SrcDesc.DimensionCount, SrcDesc.Size[0], SrcDesc.Size[1], SrcDesc.Size[2], SrcDesc.Size[3], SrcDesc.Size[4], SrcDesc.Stride[0], SrcDesc.Stride[1], SrcDesc.Stride[2], SrcDesc.Stride[3], SrcDesc.Stride[4], BindFlags]
[1: Unused]
[2: DstResource, SrcResource]

MVN (Mean Variance Normalization)
[67: DescIn.DataType, DescIn.Layout, DescIn.Flags, DescIn.DimensionCount, DescIn.Size[0], DescIn.Size[1], DescIn.Size[2], DescIn.Size[3], DescIn.Size[4], DescIn.Stride[0], DescIn.Stride[1], DescIn.Stride[2], DescIn.Stride[3], DescIn.Stride[4], DescScale.DataType, DescScale.Layout, DescScale.Flags, DescScale.DimensionCount, DescScale.Size[0], DescScale.Size[1], DescScale.Size[2], DescScale.Size[3], DescScale.Size[4], DescScale.Stride[0], DescScale.Stride[1], DescScale.Stride[2], DescScale.Stride[3], DescScale.Stride[4], DescScale.IsNull, DescBias.DataType, DescBias.Layout, DescBias.Flags, DescBias.DimensionCount, DescBias.Size[0], DescBias.Size[1], DescBias.Size[2], DescBias.Size[3], DescBias.Size[4], DescBias.Stride[0], DescBias.Stride[1], DescBias.Stride[2], DescBias.Stride[3], DescBias.Stride[4], DescBias.IsNull, DescOut.DataType, DescOut.Layout, DescOut.Flags, DescOut.DimensionCount, DescOut.Size[0], DescOut.Size[1], DescOut.Size[2], DescOut.Size[3], DescOut.Size[4], DescOut.Stride[0], DescOut.Stride[1], DescOut.Stride[2], DescOut.Stride[3], DescOut.Stride[4], Precision, CrossChannel, NormalizeVariance, Epsilon, Activation.Function, Activation.Params.first, Activation.Params.second, Activation.IsNull, BindFlags]
[1: PersistentResource]
[6: InputResource, ScaleResource, BiasResource, OutputResource, PersistentResource, TemporaryResource]

This is the part that debunks your sorry lies:
RaytracingTier : D3D12_RAYTRACING_TIER_1_0 (10)
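
If anyone wants to verify that line on their own card instead of taking a forum paste on faith, this is roughly how a checker like DmitryKo's reads the tier out of D3D12. A minimal sketch of my own (not his actual source), with adapter enumeration kept bare and error handling trimmed:

Code:
// Minimal sketch: query the DXR RaytracingTier through plain D3D12.
// Build against the Windows 10 SDK; link d3d12.lib and dxgi.lib.
#include <d3d12.h>
#include <dxgi.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device))))
            continue;

        // D3D12_FEATURE_DATA_D3D12_OPTIONS5 is the struct that carries RaytracingTier
        // (alongside RenderPassesTier and SRVOnlyTiledResourceTier3 in the dump above).
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
        if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                                  &opts5, sizeof(opts5))))
        {
            DXGI_ADAPTER_DESC1 desc = {};
            adapter->GetDesc1(&desc);
            // D3D12_RAYTRACING_TIER_NOT_SUPPORTED = 0, D3D12_RAYTRACING_TIER_1_0 = 10
            wprintf(L"%s : RaytracingTier = %d\n",
                    desc.Description, static_cast<int>(opts5.RaytracingTier));
        }
    }
    return 0;
}

On an RTX card that prints 10, i.e. the D3D12_RAYTRACING_TIER_1_0 line in the dump above.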

You posted a lot of nothing.
Jensen came on stage and told us all they were working with DICE... and those vids showed how they were working with Nvidia's proprietary methods.

Soon after RTX's release, it became immediately clear to Nvidia that the industry wasn't buying it, and Jensen has since decided to use Turing's capabilities to just implement industry-standard DirectX Ray Tracing, etc., instead of trying to shoehorn everyone's game into RTX and paying a team to work with those developers for "RTX On" marketing BS. Now Nvidia is proudly showing off demos of their stuff on DX.

While backtracking on earlier promises, there is news this week of more Nvidia backtracking and belaying RTX bullshit, most likely in favor of DirectX ray tracing.

"We’re working together with Nvidia on that, but raytracing won’t be available at launch. The engineers at Nvidia are still hard at work getting that solution to look as good as possible for the game, and the date is still to be determined. But from what we’ve seen so far, it’ll be good. "
 
You posted a lot of nothing.
Jensen came on stage and told us all they were working with DICE... and those vids showed how they were working with Nvidia's proprietary methods.

Soon after RTX's release, it became immediately clear to Nvidia that the industry wasn't buying it, and Jensen has since decided to use Turing's capabilities to just implement industry-standard DirectX Ray Tracing, etc., instead of trying to shoehorn everyone's game into RTX and paying a team to work with those developers for "RTX On" marketing BS. Now Nvidia is proudly showing off demos of their stuff on DX.

While backtracking on earlier promises, there is news this week of more Nvidia backtracking and belaying RTX bullshit, most likely in favor of DirectX ray tracing.

"We’re working together with Nvidia on that, but raytracing won’t be available at launch. The engineers at Nvidia are still hard at work getting that solution to look as good as possible for the game, and the date is still to be determined. But from what we’ve seen so far, it’ll be good. "

Do you realize all RTX ray tracing uses DXR and always has? DXR is “DirectX ray tracing”. nVidia has toolkits to make implementation easier / more efficient, but it’s always on DXR.

I agree with Factum’s aggressiveness for once... good job.
 
Do you realize all RTX ray tracing uses DXR and always has? DXR is “DirectX ray tracing”. nVidia has toolkits to make implementation easier / more efficient, but it’s always on DXR.

I agree with Factum’s aggressiveness for once... good job.

I don't think Factum should ever be encouraged with the way he treats people, even someone like Gamer X.
 
5150Joker, you just can't talk bad about AMD or recommend AMD when they have a bad/good product. The fanbois won't allow it.

You are right. Seeing that I have several Nvidia GPUs lying around and not a single Radeon GPU in my main gaming rig, I 100% agree that Nvidia cheerleaders don't like to see AMD's dominance asserted.



I don't like to see AMD's dominance either... or do I?

I am in the market for a ton of hardware this year and AMD makes me extremely happy to spend. No matter how many memes you want to post, it is nothing other than your inability to actually form a rebuttal that gets me laughing. As if a handful of forum cheerleaders here matter. Get out into the real world (Microcenter) and see all the AMD stuff people are walking out the door with. There was a line of people waiting for a 1pm delivery at my local store this past week. Everyone was there for AMD stuff.

Again… you can't come up with one rebuttal? Instead, you jump on the bandwagon of cheerleaders throwing up at everyone's party because your team is losing, while others (real people) are sitting back watching the game and about to go buy a new winner of a card. These people (like me) don't care that you are here on the forums mocking them for being logical. Sadly, it's the same lonely people here, unable to come to terms with the new reality and still trying to mock the messenger.


Here, watch this video (if you can stomach it). It illustrates essentially how everyone feels, except a small nugget of fanbois.

 
Do you realize all RTX ray tracing uses DXR and always has? DXR is “DirectX ray tracing”. nVidia has toolkits to make implementation easier / more efficient, but it’s always on DXR.

I agree with Factum’s aggressiveness for once... good job.


Yes.
Turing can use DXR and that is what I am getting at.

Do you realize that RTX is not DXR..? That is Nvidia's own proprietary method, which they are no longer marketing or intending to use (too costly to hand-do), in favor of the industry standards. Jensen can no longer afford to MARKET his own form of raytracing... as he tried to push it on the industry... and the industry said NOPE. That is why Nvidia's own paid engineers are working FOR developers to get their RT to work. Otherwise, these developers would just use the DXR API, as most will be doing in the future. And as such, if they have issues they would be working with Microsoft.



Additionally, nobody cares one smidgen about Nvidia's deep learning AI and all that bullshit either. It wasn't a factor when I bought my RTX 2080. Frames per second is all that matters to me and most everyone else. Nvidia's ray tracing and tensor cores don't do anything for gamers.
 
I guess this is too complicated for you to understand, but I will keep repeating it!

RaytracingTier : D3D12_RAYTRACING_TIER_1_0 (10)

And I will expand with this:
[attached screenshot: upload_2019-7-23_7-42-33.png]


You are full of either lies or ignorance.

And RTX is the name NVIDIA gives to the SKUs that support hardware raytracing via DXR.
It is really simple:
RTX = hardware support for DXR
GTX = No hardware support for DXR.

What will your next lie be?

(And I think you are the one that got banned for a month from Beyond3D's forum, because they didn't care for your lies... right?)
 
Yes.
Turing can use DXR and that is what I am getting at.

Do you realize that RTX is not DXR..? That is Nvidia's own proprietary method, which they are no longer marketing or intending to use (too costly to hand-do), in favor of the industry standards. Jensen can no longer afford to MARKET his own form of raytracing... as he tried to push it on the industry... and the industry said NOPE. That is why Nvidia's own paid engineers are working FOR developers to get their RT to work. Otherwise, these developers would just use the DXR API, as most will be doing in the future. And as such, if they have issues they would be working with Microsoft.



Additionally, nobody cares one smidgen about Nvidia's deep learning AI and all that bullshit either. It wasn't a factor when I bought my RTX 2080. Frames per second is all that matters to me and most everyone else. Nvidia's ray tracing and tensor cores don't do anything for gamers.
Read and learn:
https://developer.nvidia.com/rtx/raytracing/dxr/DX12-Raytracing-tutorial-Part-1

You seem to be unable to understand the difference between PR names (RTX) vs the actual technology being used (DXR)...
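
To make that concrete: every raytracing entry point a game calls on Turing lives in Microsoft's d3d12.h, not in some NVIDIA-only header. A small sketch (my own hypothetical helper, assuming a device and command list already exist):

Code:
// Sketch only: the DXR interfaces are part of plain D3D12, i.e. Microsoft's API.
// ID3D12Device5 / ID3D12GraphicsCommandList4 are what expose CreateStateObject,
// BuildRaytracingAccelerationStructure and DispatchRays.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Returns true if the runtime hands back the DXR-capable interfaces.
bool HasDxrInterfaces(ID3D12Device* device, ID3D12CommandList* cmdList)
{
    ComPtr<ID3D12Device5> device5;
    ComPtr<ID3D12GraphicsCommandList4> cmdList4;
    return SUCCEEDED(device->QueryInterface(IID_PPV_ARGS(&device5))) &&
           SUCCEEDED(cmdList->QueryInterface(IID_PPV_ARGS(&cmdList4)));
}

Getting those interfaces back only tells you the OS runtime is new enough; whether the hardware actually accelerates it is the RaytracingTier check from earlier in the thread.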
 
Yes.
Turing can use DXR and that is what I am getting at.

Do you realize that RTX is not DXR..? That is Nvidia's own proprietary method, which they are no longer marketing or intending to use (too costly to hand-do), in favor of the industry standards. Jensen can no longer afford to MARKET his own form of raytracing... as he tried to push it on the industry... and the industry said NOPE. That is why Nvidia's own paid engineers are working FOR developers to get their RT to work. Otherwise, these developers would just use the DXR API, as most will be doing in the future. And as such, if they have issues they would be working with Microsoft.



Additionally, nobody cares one smidgen about Nvidia's deep learning AI and all that bullshit either. It wasn't a factor when I bought my RTX 2080. Frames per second is all that matters to me and most everyone else. Nvidia's ray tracing and tensor cores don't do anything for gamers.

That's still not what RTX is, regardless of how many times you say it.
 
Read and learn:
https://developer.nvidia.com/rtx/raytracing/dxr/DX12-Raytracing-tutorial-Part-1

You seem to be unable to understand the difference between PR names (RTX) vs the actual technology being used (DXR)...


Why do you keep repeating yourself? We know that Turing CAN do DXR, but it doesn't just work.....

You are avoiding the actual problem here. And once again, avoiding my links and actual quotes from developers. I think Nvidia should have 40+ teams working with all the game developers, making DXR work for all their games. That way no game developer ever has to work with, or develop, ray tracing, because Nvidia's teams will do it for them. (We are not talking about drivers here, but game development.)

So the game developers don't have to ever worry about it...
 
Why do you keep repeating yourself? We know that Turing CAN do DXR, but it doesn't just work.....

Citation needed...

You are avoiding the actual problem here. And once again, avoiding my links and actual quotes from developers. I think Nvidia should have 40+ teams working with all the game developers, making DXR work for all their games. That way no game developer ever has to work with, or develop, ray tracing, because Nvidia's teams will do it for them. (We are not talking about drivers here, but game development.)

So the game developers don't have to ever worry about it...

NVIDIA has always had people helping developers code for their hardware.
Be it DirectX 10, Hairworks, Gameworks, PhysX... DXR.
It's part of their ecosystem.

You are either the most stupid, ignorant, retarded potato I have encountered on this forum...or you are the biggest fanboy liar...which is it?
 
Here, this should start to give you an idea of how Turing's architecture is being shoehorned into trying to make real-time ray tracing a thing... on an expensive CAD/pro chip not meant for real-time ray tracing.

It just doesn't work...


Hairworks, Gameworks, PhysX: all proprietary stuff. Nobody is falling for Jensen's traps anymore... (see SONY, Microsoft & Google, please)
 
Here, this should start to give you an idea of how Turing's architecture is being shoehorned into trying to make real-time ray tracing a thing... on an expensive CAD/pro chip not meant for real-time ray tracing.

It just doesn't work...


Hairworks, Gameworks, PhysX: all proprietary stuff. Nobody is falling for Jensen's traps anymore... (see SONY, Microsoft & Google, please)

Dude. I'm bout to buy a 2080 Super just to spite you.
 
Why do you keep repeating yourself? We know that Turing CAN do DXR, but it doesn't just work.....

You are avoiding the actual problem here. And once again, avoiding my links and actual quotes from developers. I think Nvidia should have 40+ teams working with all the game developers, making DXR work for all their games. That way no game developer ever has to work with, or develop, ray tracing, because Nvidia's teams will do it for them. (We are not talking about drivers here, but game development.)

So the game developers don't have to ever worry about it...

nvidia has been working with dozens of developers since like forever, helping them implement nvidia technologies in their games/game engines. So what's your point?
 
Here, this should start to give you an idea of how Turing's architecture is being shoehorned into trying to make real-time ray tracing a thing... on an expensive CAD/pro chip not meant for real-time ray tracing.

It just doesn't work...


Hairworks, Gameworks, PhysX: all proprietary stuff. Nobody is falling for Jensen's traps anymore... (see SONY, Microsoft & Google, please)


From the link:

"NVIDIA’s RTX acceleration is directly tied into Microsoft’s DirectX Raytracing API (DXR), which is a part of the DirectX 12 API. Because NVIDIA’s hardware directly interfaces with DXR to accelerate ray traced effects, only games running on DirectX 12 can make use of the technology. As Vulkan is a competitor to DirectX 12, id Software must find a way to have Doom Eternal and its game engine leverage the RTX hardware without using DXR. "

DO
YOU
READ
THE
SHITE
YOU
LINK
?

LOL
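
Side note on the Vulkan angle in that quote: there is no DXR on Vulkan, so a 2019-era engine like id Tech reaches RTX hardware through the VK_NV_ray_tracing extension instead. Roughly what the capability check looks like (my own sketch, assuming a VkPhysicalDevice has already been picked):

Code:
// Sketch: check whether a Vulkan physical device advertises the NVIDIA
// ray tracing extension (VK_NV_ray_tracing), the 2019-era path outside DXR.
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

bool HasNvRayTracing(VkPhysicalDevice gpu)
{
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());
    for (const auto& e : exts)
        if (std::strcmp(e.extensionName, VK_NV_RAY_TRACING_EXTENSION_NAME) == 0)
            return true;
    return false;
}

That extension is vendor-authored, which is exactly why the article says id has to sort out a path that isn't DXR.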
 
None for the time being. The 1080 Ti is serving me well at 1440p. My upgrade path for a GPU will be sometime next year when next gen drops, and it'll be nVidia, simply because I already have a pair of 27" 1440p G-Sync displays that I don't intend to upgrade anytime soon.
 
nvidia has been working with dozens of developers since like forever, helping them implement nvidia technologies in their games/game engines. So what's your point?



My point is, none of that past practice matters anymore. SONY & Microsoft & Google are all under the RDNA flag. Mainstream is where everyone's at...

Nvidia doesn't have industry pull with game developers anymore! Because Nvidia has shown gamers over the last 5 years that they care more about non-gaming GPUs than gaming GPUs (mining, AI, etc.). Nvidia just isn't a gaming company anymore, and Jensen has much egg on his face (RTX and then back to GTX 1660?). I have already shown where several developers won't be releasing their "RTX solution" at launch, but are waiting for a date in the future... with full-featured DXR, not the hobbled mishap that is Turing's RTX.

It's like you guys keep telling me how great Nvidia is for doing this, while not accepting how great it is for everyone else under the AMD umbrella of RDNA gaming and helping everyone. AMD brings open standards into the fold, instead of forcibly trying to make the industry use their proprietary solution.



You don't have to argue with me, just find another RTX 2080 owner and ask them... did they buy it for ray tracing, or for the performance? Nvidia's next GPU in late 2020 might be able to do real-time ray tracing, but let's not talk about Nvidia's current crop of cards; they are simply a flop all the way around.

And I bought one for $800... because 3440x1440 G-Sync. (I suggest anyone who has a widescreen G-Sync monitor push it with a 2080.)
 
I had a 5700XT, but I got fed up with the driver issues and returned it. I ended up "downgrading" to a 2060 (non-super). I really wanted to like the 5700XT, but if AMD can't get their shit together with games and crashing... I'll still probably pick up an aftermarket 5700XT in a few months when the drivers have matured.

That being said, I still loathe Nvidia's business practices and pricing. I wouldn't give them full price for one of their overpriced cards ;). And I didn't buy it for RTX "features" either. But if they are going to offer it for the same price as the 1660Ti, I'll play around with it.
 
From the link:

"NVIDIA’s RTX acceleration is directly tied into Microsoft’s DirectX Raytracing API (DXR), which is a part of the DirectX 12 API. Because NVIDIA’s hardware directly interfaces with DXR to accelerate ray traced effects, only games running on DirectX 12 can make use of the technology. As Vulkan is a competitor to DirectX 12, id Software must find a way to have Doom Eternal and its game engine leverage the RTX hardware without using DXR. "

DO
YOU
READ
THE
SHITE
YOU
LINK
?

LOL


Yes I read it, and if you were not so obnoxious about it, you might want to look at the word "interfaces"....


Plz stop with your attacks and understand that Nvidia themselves said, in an early interview with a developer when RTX was being released, how they tie in Turing's tensor cores and such to make ray tracing piggyback off new technology (i.e. DXR). He avoided the question of whether it does DXR natively. It can't! That's the reason Nvidia's engineers have to spend 5 months working with a game title to get their "RTX On" working, and why there is a 30% performance hit. (Turing barely has tier-1 DXR, & chokes on it.)

It doesn't just work...
 
I had a 5700XT, but I got fed up with the driver issues and returned it. I ended up "downgrading" to a 2060 (non-super). I really wanted to like the 5700XT, but if AMD can't get their shit together with games and crashing...
I don't think this is a common story. I did have 1 crash since I got the 5700XT card, but that was on the first day and hasn't happened since I updated drivers. I also had black flashes, so I have to disable Afterburner.

Aside from that, the card has been working great. I had far more problems with my 2080 Ti at launch, so don't think Nvidia is immune from bugs and issues.
 
My point is, none of that past practice matters anymore. SONY & Microsoft & Google are all under the RDNA flag. Mainstream is where everyone's at...

Nvidia doesn't have industry pull with game developers anymore! Because Nvidia has shown gamers over the last 5 years that they care more about non-gaming GPUs than gaming GPUs (mining, AI, etc.). Nvidia just isn't a gaming company anymore, and Jensen has much egg on his face (RTX and then back to GTX 1660?). I have already shown where several developers won't be releasing their "RTX solution" at launch, but are waiting for a date in the future... with full-featured DXR, not the hobbled mishap that is Turing's RTX.

It's like you guys keep telling me how great Nvidia is for doing this, while not accepting how great it is for everyone else under the AMD umbrella of RDNA gaming and helping everyone. AMD brings open standards into the fold, instead of forcibly trying to make the industry use their proprietary solution.



You don't have to argue with me, just find another RTX 2080 owner and ask them... did they buy it for ray tracing, or for the performance? Nvidia's next GPU in late 2020 might be able to do real-time ray tracing, but let's not talk about Nvidia's current crop of cards; they are simply a flop all the way around.

And I bought one for $800... because 3440x1440 G-Sync. (I suggest anyone who has a widescreen G-Sync monitor push it with a 2080.)

You keep saying that RTX is sort of a "subset" of DXR features, but it's already been established that the RTX series fully supports ALL DXR features. You also say Nvidia doesn't have industry pull with game developers, yet Nvidia has more game developer programs (hence the accusations of pushing GameWorks, HairWorks, RTX, etc...) than AMD could ever dream of.

Nvidia's main income is STILL gaming cards. Yet it's the main player in most of the markets it's in (datacenter, AI, media creation, driving, etc.). So it does care about gaming.
 
My point is, none of that past practice matters anymore. SONY & Microsoft & Google are all under the RDNA flag. Mainstream is where everyone's at...
AMD is in the PS4 and Xbox One, and it didn't really give AMD any advantage over Nvidia in discrete GPUs for over 6 years, so what makes you think RDNA will be any different when Nvidia still owns the mainstream market? You say past practice doesn't matter, but it is past practice that brought AMD into its current position, where they have had trouble competing with NVidia at every tier of the product stack for 4 years.
 
I don't think this is a common story. I did have 1 crash since I got the 5700XT card, but that was on the first day and hasn't happened since I updated drivers. I also had black flashes, so I have to disable Afterburner.

Aside from that, the card has been working great. I had far more problems with my 2080 Ti at launch, so don't think Nvidia is immune from bugs and issues.

In my case it locked up Assassin's Creed Odyssey so it was unplayable. I swapped in a 1060 I had sitting around and it worked fine. I never installed Afterburner, so I can't comment on that.
 
As stated earlier -- still holding out for an AIB 5700 XT. My plan is to do an all-AMD personal rig again after parting out my Vega 64/X470 box. I want to wait for X570 BIOSes to get better, number one, and for the 3900X to come back in stock, number two, so there's no real rush. Eyeing the MSI MEG Ace mobo right now also.

'Fanboy' has been tossed around a lot in the last page or so of this thread, and personally AMD will always hold a place in my heart since I cut my teeth building on Slot A and up through Phenom II. Picked it back up with Ryzen. In the end I'm a fanboy of good hardware at decent prices, and while RTX pricing as a whole seems a touch high, there can be deals if you look around. At the end of the day, buy the shit you like at the price you are willing to pay and FFS stop shitting on other people's choices -- even when they are wrong.. :ROFLMAO:


So I had to go to Microcenter today on a surprise iPad replacement trip for my wife. They had an EVGA 2060 Super Ultra open box for $356.96 and I picked it up. Personally, I think the 2060 Super under $400 is a pretty solid card, and I am 'super' pleased with it so far. With 5 PCs in the house between mine and the kids', it will get some good use whether I run it for a while or for a day, and well.. I do love a deal. :happy:

this one:

https://www.evga.com/products/product.aspx?pn=08G-P4-3067-KR
 
AMD is known to overbuild so that would be my first choice for one I'd place a liquid cooler on.
 
AMD is known to overbuild so that would be my first choice for one I'd place a liquid cooler on.
Not really 'overbuilding'. They NEED to use heavier duty VRMs to withstand the torture their cards may go through. Once OC'd, power demands go through the roof. See the GN vid towards the end.
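
For anyone wondering why "through the roof" isn't an exaggeration: dynamic power scales roughly with frequency times voltage squared, so a modest clock bump plus the voltage to hold it compounds quickly. A back-of-the-envelope sketch (the stock/OC numbers below are made-up placeholders, not measured 5700 XT figures):

Code:
// Rough illustration of OC power scaling: P ~ C * V^2 * f (dynamic power only).
// The clock and voltage values are illustrative assumptions, not real card data.
#include <cstdio>

int main()
{
    const double stockClk = 1900.0, stockVolt = 1.10;  // assumed stock MHz / volts
    const double ocClk    = 2100.0, ocVolt    = 1.20;  // assumed overclocked values

    const double scale = (ocClk / stockClk) * (ocVolt / stockVolt) * (ocVolt / stockVolt);
    std::printf("~%.0f%% more dynamic power for a ~%.0f%% clock bump\n",
                (scale - 1.0) * 100.0, (ocClk / stockClk - 1.0) * 100.0);
    return 0;
}

With those placeholder numbers you get roughly a third more power for about an 11% clock increase, which is the kind of headroom the VRMs have to be built for.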
 
The aftermarket RTX 2060 Supers are putting down some impressive numbers for $400. TPU and Guru3D have them matching the RTX 2070. Overclocking gets you to 1080 Ti performance.

For almost twice the price, you would think Nvidia could have given the 2080 Super a 384/352-bit bus. I am thinking it would have been MUCH closer to 2080 Ti performance. Basically a 1660 Ti (x2) across the board.
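
The quick math behind the bus-width complaint, for reference (the memory speeds are my assumed figures for illustration; GB/s = bits ÷ 8 × Gbps):

Code:
// Memory bandwidth arithmetic: GB/s = (bus width in bits / 8) * data rate in Gbps.
// The configurations below are assumptions for illustration.
#include <cstdio>

static double BandwidthGBs(int busBits, double gbps) { return busBits / 8.0 * gbps; }

int main()
{
    std::printf("2080 Super, 256-bit @ 15.5 Gbps  : %.0f GB/s\n", BandwidthGBs(256, 15.5));
    std::printf("2080 Ti,    352-bit @ 14.0 Gbps  : %.0f GB/s\n", BandwidthGBs(352, 14.0));
    std::printf("Hypothetical 384-bit @ 14.0 Gbps : %.0f GB/s\n", BandwidthGBs(384, 14.0));
    return 0;
}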
 
You keep saying that RTX is sort of a "subset" of DXR features, but it's already been established that the RTX series fully supports ALL DXR features. You also say Nvidia doesn't have industry pull with game developers, yet Nvidia has more game developer programs (hence the accusations of pushing GameWorks, HairWorks, RTX, etc...) than AMD could ever dream of.

Nvidia's main income is STILL gaming cards. Yet it's the main player in most of the markets it's in (datacenter, AI, media creation, driving, etc.). So it does care about gaming.

Wow... talk about someone who has a Kool-Aid mustache.

Respectfully, you are wrong.
I appreciate how you are trying to knock me, but it is futile, my friend. I've already provided proof and you haven't bothered to read it, or don't care. It just doesn't work...

Compliance? I am not going to argue with you when there is a chart (in this thread, I believe) showing YOU that Turing only supports level-1 basic DirectX ray tracing protocols, not all of them. Not to mention it can't do any of them in real time.




Point is (again) that Turing, and thus RTX's ray tracing, is a moot point. The RTX 2000 series will always do ad-hoc ray tracing and be snickered at and joked about. Jensen is 100% boxed out of the upcoming holiday buying season. I am sure Dr. Lisa Su will have the 5800 series out in full force, for high-end gamer dollars.

Jensen l8t...
 
Wow... talk about someone who has a Kool-Aid mustache.

Respectfully, you are wrong.
I appreciate how you are trying to knock me, but it is futile, my friend. I've already provided proof and you haven't bothered to read it, or don't care. It just doesn't work...

Compliance? I am not going to argue with you when there is a chart (in this thread, I believe) showing YOU that Turing only supports level-1 basic DirectX ray tracing protocols, not all of them. Not to mention it can't do any of them in real time.




Point is (again) that Turing, and thus RTX's ray tracing, is a moot point. The RTX 2000 series will always do ad-hoc ray tracing and be snickered at and joked about. Jensen is 100% boxed out of the upcoming holiday buying season. I am sure Dr. Lisa Su will have the 5800 series out in full force, for high-end gamer dollars.

Jensen l8t...

Not trying to knock you; you are doing a great job knocking yourself out. :D:D

BTW, Turing fully supports the LEVEL 0 tier, which is the current DXR spec. No card supports the level 1 tier yet (I don't think it's even finalized).

Anyway, this has been fun. Sorry your crusade against RTX didn't find any followers. Better luck next time.

But hey, maybe Navi 20 will bring full support for DXR in real time and finally take back the crown. I mean, you can always dream... :rolleyes::rolleyes:
 
This whole thread is just so bizarre. Typically companies do a better job of hiring more believable "social media influencers" :ROFLMAO:

I will be sitting out the AMD GPU party for a while since I don't tend to buy mid range GPUs. The AMD processors on the other hand...
 
This whole thread is just so bizarre. Typically companies do a better job of hiring more believable "social media influencers" :ROFLMAO:

I will be sitting out the AMD GPU party for a while since I don't tend to buy mid range GPUs. The AMD processors on the other hand...

I think they spent all the money on Ryzen + Ryzen marketing so Team Red had to go searching on Craigslist for social media viral marketing lol.
 
This whole thread is just so bizarre. Typically companies do a better job of hiring more believable "social media influencers" :ROFLMAO:

I will be sitting out the AMD GPU party for a while since I don't tend to buy mid range GPUs. The AMD processors on the other hand...
I only buy mid- and low-end stuff, so this stuff is important to me. I have like 4-5 boxes I try to keep in decent shape, so spending a lot on one isn't always my top priority.
 
Not trying to knock you; you are doing a great job knocking yourself out.

BTW, Turing fully supports the LEVEL 0 tier, which is the current DXR spec. No card supports the level 1 tier yet (I don't think it's even finalized). Anyway, this has been fun. Sorry your crusade against RTX didn't find any followers. Better luck next time. But hey, maybe Navi 20 will bring full support for DXR in real time and finally take back the crown. I mean, you can always dream...


So you admit knowing, yet lying?

You do know that is considered trolling, right? If you know Nvidia's current crop of RTX cards don't support all of DXR, then why did you try to rebut an argument with a lie? All you are doing is showing that you need to lie for Nvidia… why?


Who are these people supporting RTX..? I own one and I don't care one bit about my card's ability to do ad-hoc raytracing, for goodness' sake. I have at least a dozen friends who own an RTX card or are trying "RTX On" out, and not a single one uses "RTX On" in gaming. As such, nobody touting RTX uses it...

Do any of you use RTX On in these games..? Then how are you defending it..?




That is why your attacks on my character are fake and laughable. AMD is dominating Nvidia in gaming hardware and there is nothing SUPER can do about it. Jensen tried, but Dr Su has more coming. RDNA FTW...
 
I only buy mid- and low-end stuff, so this stuff is important to me. I have like 4-5 boxes I try to keep in decent shape, so spending a lot on one isn't always my top priority.

I'm of a similar mindset... sure, having top-of-the-line shit was awesome when I was younger, but these days GPUs are starting to surpass the software, and even the mid-range stuff can run most games over 100 fps @ 1080p/1440p.
 