And now AMD asks you to spend a lot of money to get nothing, no ray tracing, no AI upscaling. Zero, nil. Who is the better value now?
I would take 16GB of 1TB/s HBM2 VRAM over ray tracing (supported on one whole game thus far) any day.
If they can drop the price by another $100, this would be a huge win for AMD.
 
I would take 16GB of 1TB/s HBM2 VRAM over ray tracing (supported on one whole game thus far) any day.
16GB of anything gets you nothing. RTX and DLSS are tangible features with tangible effects, and they're more future-proof. You get DXR support. Vastly less power consumption, too.
 
16GB of anything gets you nothing. RTX and DLSS are tangible features with tangible effects, and they're more future-proof. You get DXR support. Vastly less power consumption, too.
RT is next to worthless on the 2080 and barely passable on the Ti. There’s nothing future proof about it. The people buying these cards generally have more than enough power to spare, but yes that is definitely something NVIDIA is more efficient with. Defending ray tracing at this point is just comical. It’s not a selling point. NVIDIA used it as marketing hype and it fell very short. It has potential but it’s not ready for prime time. Maybe next gen. “16 GB of anything gets you nothing” is also utterly incorrect.
 
16GB of anything gets you nothing. RTX and DLSS are tangible features with tangible effects, and they're more future-proof. You get DXR support. Vastly less power consumption, too.

Well we don't really know much about power consumption quite yet. When Kyle gets his hands on a VII and writes one way or the other on the subject I'll take his word as truth.

As for the future proof nature of Tensor cores... I'm not so sure about that.

So far Nvidia has been the only company to release tensor cores on consumer parts. Perhaps AMD adds tensor flow to Navi... perhaps not. If they don't, RTX and DLSS die. It's that simple. If neither of those technologies finds its way into the PS5 / next Xbox / streaming services, then they are a lost cause. At that point they will always be a feature with little support... half-assed implementations... and a great way to halve your frame rates for questionable IQ gains.

On the other hand, if they do find their way into Navi, and thus into the next-gen consoles and streaming servers... then Nvidia's cores will likely be the inferior (or at least less optimized) option... and the current generation will likely be substandard in comparison. Nvidia's own 7nm replacement down the road may make RTX cores a feature worth having... but again, only if game developers actually target the feature, which leaves the fate of RTX up to AMD.
 
RT is next to worthless on the 2080 and barely passable on the Ti. There’s nothing future proof about it. The people buying these cards generally have more than enough power to spare, but yes that is definitely something NVIDIA is more efficient with. Defending ray tracing at this point is just comical. It’s not a selling point. NVIDIA used it as marketing hype and it fell very short. It has potential but it’s not ready for prime time. Maybe next gen. “16 GB of anything gets you nothing” is also utterly incorrect.
You're judging RTX based on one game; wait for more before you judge anything. Also, DXR support is not a joke, it's part of DirectX. The 2080 also has Tensor Cores for AI acceleration.
but again, only if game developers actually target the feature, which leaves the fate of RTX up to AMD.
No, DXR is an industry standard now, and the industry is moving in the direction of ray tracing. AMD is just late to the party, as usual.
 
No, DXR is an industry standard now, and the industry is moving in the direction of ray tracing. AMD is just late to the party, as usual.

AMD has already nailed down the PS5, the next-generation Xbox... and Google's streaming server business.

Like it or not, if AMD doesn't support tensor cores, real-time ray tracing in games will not be a thing for YEARS.

I believe Navi will in fact feature tensor hardware... but at this point they haven't announced that.

OpenGL / DX are filled with feature flags that aren't used because they simply didn't catch on with developers. So far, with one shipping RTX game... I'm not convinced it's a feature with legs.
 
Power consumption is a joke (two 8-pin connectors), and no USB Type-C either.



https://www.techpowerup.com/251390/amd-radeon-vii-hands-on-at-ces-2019
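For context on that jab: per the standard PCIe power ratings, the slot supplies up to 75 W and each 8-pin connector up to 150 W, so two 8-pins imply a fairly high power ceiling. A quick sketch of the connector math:

```python
# Power-delivery ceiling implied by the Radeon VII's connector layout,
# using the standard PCIe ratings (a budget, not a measured draw).
SLOT_W = 75        # PCIe x16 slot, per spec
EIGHT_PIN_W = 150  # each 8-pin PCIe power connector, per spec

max_board_power = SLOT_W + 2 * EIGHT_PIN_W
print(max_board_power)  # 375
```

That 375 W ceiling is what the connectors allow, not what the card necessarily draws.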
 
AMD has already nailed down the PS5, the next-generation Xbox... and Google's streaming server business.

Like it or not, if AMD doesn't support tensor cores, real-time ray tracing in games will not be a thing for YEARS.

I believe Navi will in fact feature tensor hardware... but at this point they haven't announced that.

OpenGL / DX are filled with feature flags that aren't used because they simply didn't catch on with developers. So far, with one shipping RTX game... I'm not convinced it's a feature with legs.

This is a really good point.
 
16GB of anything gets you nothing. RTX and DLSS are tangible features with tangible effects, and they're more future-proof. You get DXR support. Vastly less power consumption, too.

My 2070 has done zero ray tracing and zero DLSS so far, and it does not look like it will be doing those things any time soon; so much for tangible.
The only RTX feature I've used is space invaders, and that makes me feel it is about as future-proof as my GTX 480 that died two weeks after the warranty was out.
 
It is entertaining to read these comments. I have seen some embarrassingly stupid videos on YouTube where people keep yelling Nvidia marketing propaganda: because AMD does not have the Nvidia marketing gimmicks, AMD supposedly does not compete and the card is underwhelming.

From a technical perspective, 16GB of HBM2 and its throughput is a pretty good feature to have: at higher resolutions your GPU becomes the bottleneck, and all of the features which use frame buffers are also affected by this.
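For what it's worth, the advertised 1 TB/s figure checks out against the card's published HBM2 configuration (four 1024-bit stacks at 2.0 Gbps per pin); a quick sanity check:

```python
# Radeon VII memory bandwidth from its HBM2 configuration.
stacks = 4
bits_per_stack = 1024   # interface width per HBM2 stack
pin_speed_gbps = 2.0    # per-pin data rate

bus_width_bits = stacks * bits_per_stack              # 4096-bit bus
bandwidth_gb_s = bus_width_bits * pin_speed_gbps / 8  # bits -> bytes
print(bandwidth_gb_s)  # 1024.0
```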

That is how it should be: hardware features being used 100% of the time. None of these features are promises that maybe one day, when there is enough software, you might get to use some cores you bought a lifetime ago.

The promise of ray tracing in itself is good, but the company that provides it in hardware has already had to patch the game and drivers to use the hardware less extensively and tone down the number of rays being traced. When you read that, you know the hardware is not going to impress you. Talk about underwhelming...

You get to pay $700 for a product without promises, without the need for developers to implement checkbox features.
 
RT is next to worthless on the 2080 and barely passable on the Ti. There’s nothing future proof about it. The people buying these cards generally have more than enough power to spare, but yes that is definitely something NVIDIA is more efficient with. Defending ray tracing at this point is just comical. It’s not a selling point. NVIDIA used it as marketing hype and it fell very short. It has potential but it’s not ready for prime time. Maybe next gen. “16 GB of anything gets you nothing” is also utterly incorrect.

I bought an RTX 2080 Ti on day zero... waited several hours outside Microcenter, in fact. I never ever... EVER thought I would get to use RT with it, lol. I know better.

The smart kids bought the RTX 2080 ti for the raw horse powah! Not 1st generation RTX
 
Well, he does have a point. It really doesn't have anything new and the performance is still below that of a 1080 ... around the performance of a 1070 ti ... maybe?

For those of you that don't really know your numbers, and there are a surprising amount of you on here (I witness it every single day), or you rely on secondhand hearsay... the 1080 Ti is actually 2% faster than the nVidia 2080. So AMD says this new card is just below a 2080... that puts it at about a 1070 Ti... right?

Proof - https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-vs-Nvidia-GTX-1080-Ti/4026vs3918
So you act like the rest of the world ignorantly "thinks" the 2080 is slightly faster than the 1080 ti and then you set us straight with your useless link to userbenchmark as your "proof"? :rolleyes:

Maybe bother to look at an actual review using several games?

https://www.techpowerup.com/reviews/NVIDIA/GeForce_RTX_2080_Founders_Edition/33.html
 
It is entertaining to read these comments. I have seen some embarrassingly stupid videos on YouTube where people keep yelling Nvidia marketing propaganda: because AMD does not have the Nvidia marketing gimmicks, AMD supposedly does not compete and the card is underwhelming.

From a technical perspective, 16GB of HBM2 and its throughput is a pretty good feature to have: at higher resolutions your GPU becomes the bottleneck, and all of the features which use frame buffers are also affected by this.

That is how it should be: hardware features being used 100% of the time. None of these features are promises that maybe one day, when there is enough software, you might get to use some cores you bought a lifetime ago.

The promise of ray tracing in itself is good, but the company that provides it in hardware has already had to patch the game and drivers to use the hardware less extensively and tone down the number of rays being traced. When you read that, you know the hardware is not going to impress you. Talk about underwhelming...

You get to pay $700 for a product without promises, without the need for developers to implement checkbox features.

Lol, if anything RTX is maturing faster than DX12, although BF5 is a bad showcase game for RTX (even its DX12 implementation is shit); had it been Cyberpunk 2077, people would have been a lot more tolerant of the 60fps at 1080p.
 
Lol, if anything RTX is maturing faster than DX12, although BF5 is a bad showcase game for RTX (even its DX12 implementation is shit); had it been Cyberpunk 2077, people would have been a lot more tolerant of the 60fps at 1080p.
Absolutely not. I'm not paying 1200€ to play at 1080p/60, even if the card pleasures my willie while running.
 
When Nvidia talks smack and backs it up, we see all the negativism shown in this thread. When AMD spews delusions of grandeur with their GPU offerings, and hasn't been able to back it up since the 9800 PRO days (the same went for the CPU division until recently), we have everybody foaming at the mouth, cheering and egging them on. What a double standard.

Intel and Nvidia for me until the performance states otherwise.
 
I bought an RTX 2080 Ti on day zero... waited several hours outside Microcenter, in fact. I never ever... EVER thought I would get to use RT with it, lol. I know better.

The smart kids bought the RTX 2080 ti for the raw horse powah! Not 1st generation RTX
Yes, the anti-Nvidia crowd will continually try to frame the argument as if the only thing RTX cards can do is ray tracing.

Raytracing has always been the holy grail. And if it were AMD instead of Nvidia doing the heavy lifting of blazing this trail, creating this bleeding-edge market segment, and taking action to break what would otherwise be a chicken-and-egg cycle of it never happening, the raytracing antagonists wouldn't shut up about how wonderful it is.
 
It's a foreign concept to him that a person would rather call an Uber or risk driving themselves to a hospital than call an ambulance and eat a four-figure medical bill.

It's a foreign concept to me that you even have to make that choice. Thank goodness for the NHS!


If all there is in the $350 price area is a 2060 then i'll have a Vega 56 instead please.
 
Well, he does have a point. It really doesn't have anything new and the performance is still below that of a 1080 ... around the performance of a 1070 ti ... maybe?

For those of you that don't really know your numbers, and there are a surprising amount of you on here (I witness it every single day), or you rely on secondhand hearsay... the 1080 Ti is actually 2% faster than the nVidia 2080. So AMD says this new card is just below a 2080... that puts it at about a 1070 Ti... right?

Had AMD said it would perform at the level of a 2080 Ti ... for $499 or $599 .. I would sell my RTX 2080 Ti in a heartbeat lol.

Speak for yourself? A V64 is already as fast as a 1080 (faster than a 1070 Ti), and you're saying a 7nm VII that's 25% faster is 'around' a 1070 Ti now?
And you really think AMD would release a card as fast as a 2080 Ti for 30-40% of the price? Lol, come on. They ain't giving that shit away to you.
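Making that napkin math explicit (a sketch using this thread's ballpark claims, plus my own round ~30% figure for a 1080 Ti's lead over a 1080; none of these are measured benchmarks):

```python
# Relative-performance ladder with GTX 1080 = 100.
gtx_1080 = 100.0
vega_64 = gtx_1080             # "V64 is already as fast as a 1080"
radeon_vii = vega_64 * 1.25    # claimed ~25% uplift for the 7nm VII
gtx_1080_ti = gtx_1080 * 1.30  # commonly cited ~30% over a 1080

print(radeon_vii, gtx_1080_ti)  # 125.0 130.0
```

On those numbers the VII lands just under a 1080 Ti / 2080, nowhere near 1070 Ti territory.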

And now AMD asks you to spend a lot of money to get nothing, no ray tracing, no AI upscaling. Zero, nil. Who is the better value now?
Yeah, Mr. Objective here: DLSS is nothing (FFXV demo, lol!) and RTX is a low-fps, low-res clusterfuck which doesn't even ray trace much of the scene. I bet you said the same about the 780 Ti vs the 290X - look at where that extra GB of RAM got 290X users over the lifetime of ownership; the 290X is noticeably faster in later years, especially when VRAM comes into play. Calling it now: the VII will definitely age better than the 2080.
 
If Cryptocurrency had never used GPUs the high end 'battle' would be largely irrelevant. It would be like when Mercedes unveil their latest S Class, a few folks would go "Ohhh that's nice! I look forward to that in my $20000 car in a few years time!"

The real battle is the mid to low end offerings that ordinary folks with bills and mortgages buy.

I'm wondering if the next gen of cards will actually cut back on a lot of the 'computational add ons' if crypto continues to crash.
 
Perhaps AMD adds tensor flow to Navi...
FYI, TensorFlow works on GCN cards, even the 290X etc.
Another thing you might find interesting: people have gotten ray tracing working on the prior-generation Titan V... the Tensor cores in 'RTX' are just a marketing name; nothing much changed.
 
Lol, he must feel threatened if he has to trash-talk a competitor to this level.

yup

And now AMD asks you to spend a lot of money to get nothing, no ray tracing, no AI upscaling. Zero, nil. Who is the better value now?

Underwhelming, yes, but general shit-talking as a CEO isn't very professional.
Well, the die is small, and it's the first 7nm product.
AI upscaling (DLSS) looks horrible, ray tracing isn't much of a value-add, and the performance of the VII is not bad, but the price is too high.
Dismissing and agreeing here: you don't buy an RTX 2080 for its "RTX" functions but rather for its traditional graphics performance.
The RTX 3080 will probably be a different story as games and tech mature.

I'm pretty much saying the RTX lineup is underwhelming and the VII is even more so, but I didn't expect AMD to do anything useful in the graphics space, so I wasn't disappointed either.
With my screens now working on an Nvidia card, I might wait for the RTX 3080 and buy that once it comes :)
 
Well, nice drama from Jensen. Meanwhile, both companies work together behind the scenes on how they can make better profits, haha.
 
Yeah mr objective here, DLSS is a nothing (FFXV demo lol!) and RTX is a low fps, low res cluster fuck which doesn't even ray trace much of the scene. I bet you said the same about the 780Ti vs the 290X - look at where that extra gb of ram got 290X users over lifetime of ownership, 290X is noticeably faster in later years especially when VRAM comes in to play. Calling it now, VII will definitely age better than the 2080.
36 games are set to receive DLSS; FF15 already received it, and BFV and Anthem are getting it this year.
And ray tracing adds visual quality and adheres to the DXR standard. Your Vega 2 can't even run DXR; its future usefulness amounts to zero! It's like buying a DX9 card when we have DX12. Many more games are set to use DXR: Metro, Control, Tomb Raider, and Vega won't even run them. With an RTX 2080 you get options: enhance image quality with RT, get more fps with DLSS, or just play the game as a console port. With Vega you get nothing but higher power consumption.
 
As a proud owner of an R9 390, a FreeSync display, and a custom watercooling loop, I'd say the V7 is pretty impressive.

Does nVidia offer more performance, and better efficiency at a higher price?
Absolutely.
Do they have a total dick as a CEO?
Absolutely.
Do I care about PR of companies I buy products from?
Absolutely.

I might pick one up but I'm probably going to hold out for some more details on Navi.
 
So you act like the rest of the world ignorantly "thinks" the 2080 is slightly faster than the 1080 ti and then you set us straight with your useless link to userbenchmark as your "proof"? :rolleyes:

Maybe bother to look at an actual review using several games?

https://www.techpowerup.com/reviews/NVIDIA/GeForce_RTX_2080_Founders_Edition/33.html


Brushing this aside, I think the real story is the $700 to $800 these... cough... cough... Radeon 7s are gonna cost. The 7 stands for the 7 in 1070 Ti performance, I heard. Not sure if that's true, but it makes sense.

Getting back to nVidia, tho, you're right, they are killing it man.
 
And now AMD asks you to spend a lot of money to get nothing, no ray tracing, no AI upscaling. Zero, nil. Who is the better value now?

You don’t get nothing. You get, apparently, 2080 raster performance and 16GB of VRAM for $699. That’s not nothing, particularly considering early benchmarks of the 2080 are showing you “get ray tracing”, but at a performance hit significant enough that most gamers are either not going to use it, or will be limited as to where they actually can. I don’t consider ray tracing in this generation of Nvidia cards as something that would make me want to go out and make a purchase given how terrible it’s performing.
 
Brushing this aside, I think the real story is the $700 to $800 these... cough... cough... Radeon 7s are gonna cost. The 7 stands for the 7 in 1070 Ti performance, I heard. Not sure if that's true, but it makes sense.

Getting back to nVidia, tho, you're right, they are killing it man.

I have to ask: Is this trolling or intentional Nvidia shilling? Is it paid for?

Why would AMD name their GPU after an Nvidia GPU's performance? That would be like AMD proclaiming they're a follower of Nvidia, and of a mid-tier GPU from their previous generation. That would be self-opposing PR.

According to the benchmarks shown, Radeon 7 is equal to RTX 2080 performance, or slightly higher in performance than an RTX 2080. It isn't GTX 1070 Ti performance. The GTX 1070 Ti is 2 tiers of GPU performance beneath the RTX 2080.

The price of Radeon 7 does suck though - just like the prices of Nvidia's RTX cards.
 
36 games are set to receive DLSS; FF15 already received it, and BFV and Anthem are getting it this year.
And ray tracing adds visual quality and adheres to the DXR standard. Your Vega 2 can't even run DXR; its future usefulness amounts to zero! It's like buying a DX9 card when we have DX12. Many more games are set to use DXR: Metro, Control, Tomb Raider, and Vega won't even run them. With an RTX 2080 you get options: enhance image quality with RT, get more fps with DLSS, or just play the game as a console port. With Vega you get nothing but higher power consumption.

LOL! Man, you sure do love your Nvidia. DLSS is of no value on the hardware that exists, unless you have no issue running your games at 1080p on a $1200 card. :D Now, the fact that some bought this card with a promise of what might be is on them. Me? I would rather have a card that can be fully utilized on all games now and not be limited in the future. Heck, and this VII is just a refresh and improvement on what already exists, makes me look forward to the new stuff that should be out later this year. (Either that, or you got jealous that Lisa's leather jacket was better.) :D

Edit: Right, games that add DXR as an afterthought will not run on VII cards because, reasons.........
 
You're judging RTX based on one game; wait for more before you judge anything. Also, DXR support is not a joke, it's part of DirectX. The 2080 also has Tensor Cores for AI acceleration.

No, DXR is an industry standard now, and the industry is moving in the direction of ray tracing. AMD is just late to the party, as usual.

Nvidia certainly has a better version of Space Invaders right now. AMD needs to follow on closely.
 
As an AMD fanboi, I'm underwhelmed by the VII.

7nm but not really any better price/performance to show for it.

I might still buy it when it comes out, to replace my Vega 64, but can't say I'm excited about the launch.

Anyways, I'll wait to see [H]'s review, then decide.

EDIT: any chance this card is the equivalent of the Vega 56, and there is actually a 2nd card that is more powerful?
 
You're judging RTX based on one game; wait for more before you judge anything. Also, DXR support is not a joke, it's part of DirectX. The 2080 also has Tensor Cores for AI acceleration.

No, DXR is an industry standard now, and the industry is moving in the direction of ray tracing. AMD is just late to the party, as usual.

I bet you were one of those guys who bought a standalone PhysX card, weren't ya? :D
 