lostin3d
[H]ard|Gawd
Joined Oct 13, 2016 · Messages 2,043
I dismiss Jensen and his jacket as underwhelming.
You don’t get nothing. You get, apparently, 2080 raster performance and 16GB of VRAM for $699. That’s not nothing, particularly considering early benchmarks of the 2080 are showing you “get ray tracing”, but at a performance hit significant enough that most gamers are either not going to use it, or will be limited as to where they actually can. I don’t consider ray tracing in this generation of Nvidia cards as something that would make me want to go out and make a purchase given how terrible it’s performing.
That's because you're hearing two different crowds. One wants top-of-the-line performance; they already have the best, so when the next gen doesn't improve other than some features you can't use yet, it's disappointing. The other is looking to upgrade; they're a few generations back and just want as close to the best as they can get for their money. One cares more about performance, the other cares more about money, features be damned either way.

When it released, people were complaining that the 2080 was only trading blows with the 1080 Ti in most games. And with AMD you get 1080 Ti performance, the same feature set, a couple of years late, and all for the same price... Yay.
You're also praising Nvidia cards to high heaven based on RTX support in 15 games that hasn't yet materialized in any meaningful way. Wait until RTX comes out before calling it the second coming. I mean, I'm sure it'll be great, but I'm not fooling myself either. I can play games just fine without it.

You judge RTX based on one game; wait for more before you judge anything. Also, DXR support is not a joke, it's a DirectX version. The 2080 also has Tensor Cores for AI acceleration.
No, DXR is an industry standard now, the industry is moving into the direction of ray tracing, AMD is just late to the party as usual.
The only thing Jensen got right was the last part.
I'm personally not arsed about RT/DLSS either, I just want as much rasterisation performance as possible for my money. But objectively speaking they do add value, it's just that value ranges from "Nothing" to "Must own" depending on the person.
Frankly, as much as it hurts to say, depending on the actual retail prices (vs the MSRP), the RTX 2080 actually is better value. If this came in at $600 it would be a much more interesting launch for like-minded people. $500 and the internet would be on fire.
This is DirectX10 all over again, where Nvidia was first to have it, but turning on DX10 shadows made games run twice as slow. Sound familiar? Then AMD released their DX10 cards, but they were DX10.1, and that gave a massive performance boost to games that utilized it, like Assassin's Creed; except that magically the support for DX10.1 was removed and never seen again.

You judge RTX based on one game; wait for more before you judge anything. Also, DXR support is not a joke, it's a DirectX version. The 2080 also has Tensor Cores for AI acceleration.
I see you know nothing of graphics history. AMD/ATI has been late a number of times, but they're also innovators most of the time. They have to be, as Nvidia has more market share. The few times that Nvidia beat AMD/ATI were DirectX8 and DirectX10. AMD had to create Mantle to push the industry to make Vulkan and DX12.

No, DXR is an industry standard now; the industry is moving in the direction of ray tracing. AMD is just late to the party as usual.
Minor correction: DX12 was in development concurrently with Mantle. When Mantle came out, many ideas were taken from it and from DX12 as it then was, to create what we know today as DX12 and Vulkan.

AMD had to create Mantle to push the industry to make Vulkan and DX12.
As for ray tracing, you don't really need DXR to do it. Hybrid ray tracing is not a new thing and could be done on DX11, as shown by this Japanese demo. Both the Unreal Engine guys and the Battlefield V devs said they had ray tracing working without Nvidia's RTX cards. Makes one wonder if you just need a good CPU instead of wasting half a GPU to do ray tracing?
As an AMD fanboi I'm underwhelmed by the VII.
7nm but not really any better price/performance to show for it.
I might still buy it when it comes out, to replace my Vega 64, but can't say I'm excited about the launch.
Anyways, I'll wait to see [H]'s review, then decide.
EDIT: any chance this card is the equivalent of the Vega 56, and there is actually a 2nd card that is more powerful?
No, if AMD did this instead of NVIDIA, we’d still be bashing it. It’s not anti-something just because people complain about something. It’s a legitimate concern.

Yes, the anti-Nvidia crowd will continually try to frame the argument like the only thing RTX cards can do is raytracing.
Raytracing has always been the holy grail. And if it was AMD instead of Nvidia doing the heavy lifting of blazing this trail and creating this bleeding edge market segment and taking action to break what would otherwise be a chicken/egg cycle of it never happening, the raytracing antagonists wouldn't be shutting up about how wonderful it is.
Speak for yourself? A V64 is already as fast as a 1080 (faster than a 1070Ti) and you're saying a 7nm VII at 25% faster is 'around' a 1070Ti now?
And you really think AMD would release a card as fast as a 2080Ti for 30-40% of the price? Lol come on. They ain't giving that shit away to you.
Yeah, Mr. Objective here: DLSS is a nothing (FFXV demo, lol!) and RTX is a low-fps, low-res clusterfuck which doesn't even ray trace much of the scene. I bet you said the same about the 780 Ti vs the 290X. Look at where that extra GB of RAM got 290X users over the lifetime of ownership; the 290X is noticeably faster in later years, especially when VRAM comes into play. Calling it now: the VII will definitely age better than the 2080.
Fun side note: Jensen, himself, is an alumnus of AMD -- so that makes all 3 places he describes (AMD, NVIDIA, Intel) "holders" of AMD alumni ...
Family rivalry.

He's also Lisa Su's uncle.
I think the strong selling points for the VII are the 16 GB of HBM 2 and FP64 performance. The gaming performance should be nearly identical to the RTX 2080.
I don't think Nvidia has comparables in the same price range.
Radeon VII
FP16 (half): 27,648 GFLOPS (2:1)
FP32 (float): 13,824 GFLOPS
FP64 (double): 864.0 GFLOPS (1:16)

RTX 2080
FP16 (half): 20,137 GFLOPS (2:1)
FP32 (float): 10,068 GFLOPS
FP64 (double): 314.6 GFLOPS (1:32)

Quadro RTX 5000, $2,300 USD (16GB RAM)
FP64 (double): 348.5 GFLOPS (1:32)

Titan V is an FP64 monster though, at $3,000 (12GB RAM)
FP64 (double): 7,450 GFLOPS (1:2)
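Side note on those figures: the FP64 numbers are just the FP32 rate divided by the ratio in parentheses. A quick sanity check in Python, using the numbers quoted above (from the post, not official spec sheets):

```python
# Sanity-check the FP64 figures: FP64 throughput is the FP32 rate
# divided by the FP64 ratio in parentheses (figures copied from this post).
specs = {
    "Radeon VII": (13_824.0, 16),  # FP32 GFLOPS, FP64 ratio 1:16
    "RTX 2080": (10_068.0, 32),    # FP32 GFLOPS, FP64 ratio 1:32
}

for name, (fp32, ratio) in specs.items():
    fp64 = fp32 / ratio
    print(f"{name}: FP64 = {fp64:.1f} GFLOPS")
# Radeon VII: FP64 = 864.0 GFLOPS
# RTX 2080: FP64 = 314.6 GFLOPS
```

The FP16 numbers work the same way in reverse (2:1 means double the FP32 rate), give or take rounding.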
Not with G-Sync; he prefers a good old CRT.

You don't like playing Space Invaders?
Jensen pretty much told the truth. I don't like the demeanor or tone, but everything he said was pretty much how I felt. AMD releasing what is pretty much a 1080 Ti with more VRAM in 2019 at $700 is very underwhelming indeed. I was really psyched up this CES to make an upgrade from the 1080 Ti I have and was looking forward to what AMD had for me. I didn't expect Nvidia to allow FreeSync on their GPUs, nor AMD to follow Nvidia with their pricing schemes.
If the Radeon 7 were cut down to 8GB of HBM2 and $499, we'd have a different story here; unfortunately that isn't the case. BTW, I have found a few 2080s at the $699 pricing. The 2080 Ti still seems to hover at $1,200+, though; just can't find one at $999.
You keep harping on about a temporary deal that’s now gone. The minimum is currently $699: https://www.nowinstock.net/computers/videocards/nvidia/rtx2080/

Exactly, this is Vega with a 20% performance bump, no new features, and a big increase in cost. There was an MSI 2080 in the Hot Deals section for $630 that came with Anthem and BFV; I'm guessing that NV has a bit of wiggle room on the price if they need it.
I have to ask: Is this trolling or intentional Nvidia shilling? Is it paid for?
Why would AMD name their GPU after an Nvidia GPU's performance? That would be like AMD proclaiming they're a follower of Nvidia, and of a mid-tier GPU from their previous generation. That would be self-opposing PR.
According to the benchmarks shown, Radeon 7 is equal to RTX 2080 performance, or slightly higher in performance than an RTX 2080. It isn't GTX 1070 Ti performance. The GTX 1070 Ti is 2 tiers of GPU performance beneath the RTX 2080.
The price of Radeon 7 does suck though - just like the prices of Nvidia's RTX cards.