Here are AMD's Radeon VII Benchmarks

I am honestly really surprised they managed to tweak this thing enough to get another 29% or so on average. I mean, if it can overclock a little more and get another 5-10%, that's the cherry on top lol.

I commend AMD for really being able to do this even though it was a test run on 7nm. I think that is the only reason they released it; otherwise we wouldn't have seen it.

Everyone has to understand this is literally Vega done right on 7nm. I mean, no one was expecting their next gen, right? LOL! So no need to be disappointed.
 
Great step forward for AMD compared to their current-gen, and a huge leap forward for closing the gap on competition.

Yeah, I'd buy an AMD card that's equivalent to a 1080 Ti, and I haven't bought anything but Nvidia since the 680 days. Ray tracing may be all that and a bag of chips one day, but it's been years since DX12 was introduced and game developers still don't utilize that entire feature set. If AMD can just focus on getting some performance cards into the hands of the masses, and keep an eye on the ray-tracing ball without allocating half the chip to something that might not be picked up by developers for a card generation or longer, they can sell some cards.

And about DX12 and AMD: one of the very first big DX12 headlines I remember was how DX12 opened the door to spreading a bunch of work across the CPU, and the prospect of multi-core CPUs (more than 4 cores, then rare in home PCs) being a potential source of vast improvement in games.

And here we are: AMD makes the dangdest big ol' huge whopping Threadripper with cores out the wazoo, and uses a 4c/8t chip to showcase a card about 4 years after DX12 was announced. Yeah, not all games use DX12, but the whole mess just seems like it played out sloppy.
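That DX12 multi-core pitch was specifically about parallel command-list recording: worker threads each build their own command list, and submission stays one cheap serial call. A language-agnostic sketch of the idea (plain Python threads standing in for a real D3D12 engine; all names here are hypothetical, not real API calls):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for recording one D3D12 command list on one thread.
# In a real engine each worker would fill its own ID3D12GraphicsCommandList.
def record_command_list(chunk):
    return [f"draw({obj})" for obj in chunk]

scene = list(range(8000))                  # objects to draw this frame
chunks = [scene[i::4] for i in range(4)]   # split the work across 4 cores

with ThreadPoolExecutor(max_workers=4) as pool:
    command_lists = list(pool.map(record_command_list, chunks))

# Submission stays serial and cheap, like ExecuteCommandLists() in D3D12.
frame = [cmd for cl in command_lists for cmd in cl]
print(len(frame))  # 8000 draws recorded in parallel, submitted once
```

The point is that the expensive part (recording) parallelizes across however many cores you have, which is why DX12 and many-core CPUs were pitched together.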
 
And here we are: AMD makes the dangdest big ol' huge whopping Threadripper with cores out the wazoo, and uses a 4c/8t chip to showcase a card about 4 years after DX12 was announced. Yeah, not all games use DX12, but the whole mess just seems like it played out sloppy.

4 years of DX12 and it still routinely runs worse than DX11. I think I’m glad it isn’t featured in all games.
 
Yada, yada, who cares about the 2080, not supporting my narrative, yada, yada, AMD sucks and is stuck in the past, yada, yada. Yawn. They released what appears to be a very solid product and upgrade for more than just gamers. But hey, I guess you are expecting miracles just barely a year after the debacle left behind by the other dude that got canned. (And barely two years after AMD just started turning things around.)

Would I buy the specific card? Probably not, but that is because of what I already have and the fact that although I game more than I used to, I do not game nearly enough to buy one. (Vega 56, RX 580 and RX 570 owner.) What do you expect AMD to do, take it on the chin at a loss? LOL :eek::LOL::rolleyes:

It's a solid product, sure. It's a generation behind nVidia in terms of performance, and it does nothing to compete with nVidia in terms of price. While not quite like the Bulldozer debacle, at least AMD priced those processors where they should be, with the exception of the FX-9xxx junk.

So? Few people spend $1k+ on a graphics card, especially with mining going away. Why make a product that few will buy? Even this $700 card is too much. People buy cards priced like the 1060 and 1070. AMD should have announced those instead.

Why? Because it matters more than just for gamers. It matters in compute, where the big money is. The halo effect of having the best card does affect mid-range and low-end sales, whether you want to admit it or not. The vast majority of buyers do not spend hours combing online reviews of the particular card they're looking to buy. They see nVidia is the best and naturally gravitate to the nVidia name. Same thing that Intel pulled (among other deceptive practices) to almost crush AMD.

The crapping in this thread is legendary. AMD made a card that competes with nVidia's RTX 2080; that is where this was aimed, to give people a choice at that price point. It is not AMD's fault that the high-end RTX 2080 is equal in performance to a 1080 Ti and priced higher than last gen's 1080; for that you need to take your vitriol to nVidia.

AMD sets the MSRP on their cards, and that is 100% their fault. You don't pin that on nVidia. AMD massively undercut Intel with Ryzen, why couldn't they do the same against nVidia?

So, essentially you guys want AMD to price this card lower, so that nvidia lowers prices and in turn you can buy nvidia cheaper, so then in your eyes this card is a success. This Vega VII is perfect. It's faster than the V64, it's got more memory, and it doubles the bandwidth while at it.
It's based on 7nm tech, it's 30% faster in gaming on a first driver and over 60% faster in compute workloads, and it's going to be only $50 more. Did I say it's perfect... oh, and we don't give a shit if you guys have to pay $2,400 next go-round, because we will not be buying it. It's exactly what I said would happen in my earlier threads and exactly what I and many others wanted....:D Keep on hating.

People were expecting AMD to help put a stop to nVidia's price gouging, like they did with Intel. AMD 100% failed on that front.
 
People wanted a new card; some were hoping for mid-range Navi, others for a Vega refresh. I just fail to see the sense in the drunk hopes of AMD trying to control the pricing of its competitors. WTF is next? You people will want AMD to pay for your divorces...:D
 
People were expecting AMD to help put a stop to nVidia's price gouging, like they did with Intel. AMD 100% failed on that front.
Maybe you are; I'm expecting them to price their product to make a profit, and to make a product worth buying. Looks like they did a pretty good job of the second; we'll have to wait until 2H19 to find out whether their pricing paid off.
 
People wanted a new card; some were hoping for mid-range Navi, others for a Vega refresh. I just fail to see the sense in the drunk hopes of AMD trying to control the pricing of its competitors. WTF is next? You people will want AMD to pay for your divorces...:D

People were hoping for RTX 2080 performance at $500 prices, and lower tier cards at competitive prices. Vega VII is an indicator that Navi will not be price competitive.

Maybe you are; I'm expecting them to price their product to make a profit, and to make a product worth buying. Looks like they did a pretty good job of the second; we'll have to wait until 2H19 to find out whether their pricing paid off.

Worth buying... only if you did not buy a GTX 1080ti or an RTX 2080. At $500 or $550, I would agree with you, it's worth buying. At $700... no, it's not, especially when we've had the same performance level at $700 for two years already.
 
Great step forward for AMD compared to their current-gen, and a huge leap forward for closing the gap on competition.

Looks more like they end up in the same relative spot vs. the competition as they were vs. the 10 series.

then:
Vega 64 ~= GTX 1080

now:
Vega 7 ~= RTX 2080.

In both generations AMD managed to match nVidia's second best.
 
People were hoping for RTX 2080 performance at $500 prices, and lower tier cards at competitive prices. Vega VII is an indicator that Navi will not be price competitive.



Worth buying... only if you did not buy a GTX 1080ti or an RTX 2080. At $500 or $550, I would agree with you, it's worth buying. At $700... no, it's not, especially when we've had the same performance level at $700 for two years already.
They aren't completely braindead at AMD. They know someone will buy it at that price or they wouldn't have set it there. What they don't know is how long they can keep that price before it's no longer profitable (see point 2).
 
and lower tier cards at competitive prices. Vega VII is an indicator that Navi will not be price competitive

Navi will have the exact performance of the RTX 2060 at the same price (the market price of the RTX 2060) when Navi is released in October.
 
Will the VII even be a real release? Looks like it's being released because it tested well gaming-performance-wise, so I'm guessing the number of VII cards being made is really low. Otherwise AMD would have kept them as MI50 Instinct cards. Too bad the MI60 is 32GB; I would like to see what its rebadged Radeon with all 64 CUs would have benchmarked at.
 
Some folks, I think, are forgetting that AMD will have other versions of 7nm Vega, like a 52 CU version with 8 or 16 GB of HBM, plus probably a 16/32 GB 64 CU version to replace the Vega FE.

Each should just pick what best suits them and not worry that someone else chooses differently. Vega VII is an utter beast! 1 TB/s memory bandwidth, 60% better compute! Combined with HBCC, this will make an incredible video-editing powerhouse. Not to mention ProRender options that are well supported.

Also, crypto mining performance on Cryptonight algorithms, which are designed to be very memory-bandwidth-heavy, will make this untouchable. Maybe not good news for some.
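The reason Cryptonight leans on memory: its core loop walks a 2 MB scratchpad with dependent, pseudo-random reads and writes, so running many instances in parallel on a GPU is bounded by memory bandwidth rather than ALU throughput. A toy Python sketch of that access pattern (the scratchpad size is real; the mixing function is a made-up LCG stand-in for the actual AES-based rounds):

```python
# Toy model of a CryptoNight-style scratchpad walk: every access depends on
# the previous result, so the memory system, not the ALUs, sets the pace.
SCRATCHPAD_BYTES = 2 * 1024 * 1024            # CryptoNight's 2 MB scratchpad
WORDS = SCRATCHPAD_BYTES // 8                 # model it as 64-bit words

scratchpad = [0] * WORDS
state = 0x9E3779B97F4A7C15                    # arbitrary nonzero seed

for _ in range(100_000):                      # the real loop runs ~524k times
    idx = state % WORDS                       # next address depends on result
    state = ((state ^ scratchpad[idx]) * 6364136223846793005
             + 1442695040888963407) % (1 << 64)
    scratchpad[idx] = state                   # read-modify-write, random order

print(WORDS)  # 262144 64-bit words
```

Because each address depends on the previous result, the accesses can't be prefetched or batched, which is exactly the property that makes raw bandwidth the bottleneck.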

For me the Vega VII makes the 2080 look like junk, and it's not even close. Still prudent to wait for the launch and see real results. I was not planning on yet another video card, but this is changing my mind. No problem ditching one or two of the 1080 Tis to help pay for this.
 
"Moar Features" - things that cannot be effectively used until the next generation or two of cards past this one, yep.

Trying to forecast the future to avoid facts again, are we?

Vega 7 does exactly 0FPS in ray tracing, and that isn't going to change ;)
 
Trying to forecast the future to avoid facts again, are we?

Vega 7 does exactly 0FPS in ray tracing, and that isn't going to change ;)

Forecast the future? Dude, [H]'s own testing bears out what I just said. But hey, I guess paying $1200 for not-all-that-playable 1080p RTX stuff is worth it to you. Oh, and you are right, the Radeon 7 will not do ray tracing; so? It has features that can be fully utilized today, not some half-baked features. (By the way, there is no such thing as the Vega 7; it does not exist.)
 
Trying to forecast the future to avoid facts again, are we?

Vega 7 does exactly 0FPS in ray tracing, and that isn't going to change ;)
Good thing raytracing isn't used in video editing and rendering...well not yet at least. That's what noko was praising it for.
 
All this talk about Ray Tracing. I haven't seen anything about it that would remotely justify the purchase price. Reflections in water puddles? Big whoop! Also, we've been gaming on rasterized cards since their inception; a few more years aren't gonna hurt us. Besides that, there are two games that use it, BFV and Anthem. I have a funny feeling that Ray Tracing is going to be the next PhysX in terms of usage by developers, at least until it's used on the next tier of consoles.
 
Forecast the future? Dude, [H]'s own testing bears out what I just said. But hey, I guess paying $1200 for not-all-that-playable 1080p RTX stuff is worth it to you. Oh, and you are right, the Radeon 7 will not do ray tracing; so? It has features that can be fully utilized today, not some half-baked features. (By the way, there is no such thing as the Vega 7; it does not exist.)

What's funny is how great he says the 2000-series chips are, yet he clings to his 1080 Ti. But mostly he's just good for trying to derail threads with his useless input.
 
Junk because it brings more features for the price?

:ROFLMAO:
Nah, just junk compared to the Vega VII from my perspective.

For me the Vega has more usable features: faster RAM, real support for FreeSync and HDR, and it can effectively address memory way beyond what is on the card, which is already double what nVidia is willing to give you for the price. It has more feature-packed drivers with a UI of today versus some slapped-together browser interface that looks like something from 2001. It works with VRR UHDTVs, and it comes from a company that does not intentionally bend you over. Some folks think that is OK; to each their own, I suppose, on what they like.
 
Will the VII even be a real release? Looks like it's being released because it tested well gaming-performance-wise, so I'm guessing the number of VII cards being made is really low. Otherwise AMD would have kept them as MI50 Instinct cards. Too bad the MI60 is 32GB; I would like to see what its rebadged Radeon with all 64 CUs would have benchmarked at.
I think it'll be a real release, but in fairly limited quantities. That's fine, too, as I see this card as a bit of a stop-gap: AMD need a card that competes with Nvidia near the top-end now (and while it would be great if they had a product to match the 2080Ti, I'm happy if they can match the 2080) and then hopefully we'll see what happens with the new architecture rather than just a shrink.

Given that AMD repurposed their MI50 to produce the VII, it's a no-brainer really. It gives them a competitive gaming product at the 2080 level, a bit of kudos for moving to 7nm first, and minimal expense on their part. On the other hand, it being a pro card in different clothing could well mean that supply is limited if boards are mainly being held back for the MI50.

And as ever, I'll wait for reputable sites to review the card before I draw any conclusions, but fingers crossed that AMD have a competitive product here. If so, it's a step in the right direction even if Nvidia still hold the top performance crown.
 
Alright, I'll play your kiddie game.
Two years from now would I rather have a card with 16GB or an 8GB card with first gen RT?

Do I want a 16GB Re-Re-Spin of FuryX, or a new design with new video [de|en]coding, new execution engines (RT and Tensor), new rendering features (Variable rate shading), new connector (USB-C), when they cost the same and perform the same?

I realize some people might choose RAM over a new feature set, but I would lean toward the new feature set.
 
Do I want a 16GB Re-Re-Spin of FuryX, or a new design with new video [de|en]coding, new execution engines (RT and Tensor), new rendering features (Variable rate shading), new connector (USB-C), when they cost the same and perform the same?

I realize some people might choose RAM over a new feature set, but I would lean toward the new feature set.

This is not a re-re-spin of the Fury X. o_O Oh, and not a very useful feature set.
 
This is not a re-re-spin of the Fury X. o_O Oh, and not a very useful feature set.

They sure look like respins to me: GCN with the same core configs, just a die shrink, higher clocks, and a few tweaks to improve performance.

If you could run the Fury X at 1800 MHz and get the memory bandwidth up, it would perform very similarly to Vega 7.
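Rough math on that claim, assuming perfect clock scaling (which real GPUs never achieve, so treat it as an upper bound; the 1800 MHz figure is a rough Radeon VII boost clock):

```python
# Naive clock-scaling estimate: Fury X reference core clock vs. a rough
# Radeon VII boost clock. Assumes performance scales linearly with clock,
# which ignores memory limits and the architectural tweaks in between.
fury_x_mhz = 1050
radeon_vii_mhz = 1800
speedup = radeon_vii_mhz / fury_x_mhz
print(round(speedup, 2))  # 1.71, i.e. roughly +71% from clocks alone
```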
 
Just read the Eurogamer article with their 8700K numbers, and if the final numbers are similar, this thing will be a gigantic flop. No wonder Jensen called it underwhelming.
 
They sure look like respins to me: GCN with the same core configs, just a die shrink, higher clocks, and a few tweaks to improve performance.

If you could run the Fury X at 1800 MHz and get the memory bandwidth up, it would perform very similarly to Vega 7.

Nope. I am sure there are other tweaks, but the 128 ROPs are a significant part of why the Radeon 7 is so much better.
 
Two years from now I'd want a faster GPU. 1080Ti performance from 2017 in 2021? No thanks ;)
In all your fanboy glory, you know damn well the 1080 Ti doesn't have RT but does have 11GB. Clearly the discussion, and you know it, was a comparison with Nvidia's 8GB 2080.
But ya, a sideways non-answer as usual. I should have known better, as evidenced by your extensive collection of posts here.
 
In all your fanboy glory, you know damn well the 1080 Ti doesn't have RT but does have 11GB. Clearly the discussion, and you know it, was a comparison with Nvidia's 8GB 2080.
But ya, a sideways non-answer as usual. I should have known better, as evidenced by your extensive collection of posts here.
The 1080 Ti has been addressed many times here. Guess what? They aren't making them anymore. The 2080 is too expensive and too immature.
 
In all your fanboy glory, you know damn well the 1080 Ti doesn't have RT but does have 11GB. Clearly the discussion, and you know it, was a comparison with Nvidia's 8GB 2080.
But ya, a sideways non-answer as usual. I should have known better, as evidenced by your extensive collection of posts here.

You're aware that the Radeon 7 doesn't have RT either...?

That's why the comparison was made?

Further, the 1080 Ti has 11GB not because it needs 11GB, but because that's what was needed to have the necessary memory bandwidth with GDDR. A case of having 'enough' bandwidth. The Radeon 7 has significantly excessive bandwidth, which increases cost unnecessarily.
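The numbers behind that: on GDDR, capacity and bandwidth are coupled, because each extra chip adds both 32 bits of bus width and 1 GB of capacity, while HBM2 gets its bandwidth from very wide stacks. Rounded spec-sheet figures:

```python
# GTX 1080 Ti: 11 GDDR5X chips, 32-bit interface each, 11 Gbps per pin.
# You can't get the 352-bit bus without also buying 11 GB of capacity.
gddr_bus_bits = 11 * 32
gddr_gbps_per_pin = 11
gddr_bw_gbs = gddr_bus_bits * gddr_gbps_per_pin / 8   # GB/s

# Radeon VII: 4 HBM2 stacks, 1024-bit interface each, ~2 Gbps per pin.
hbm_bus_bits = 4 * 1024
hbm_gbps_per_pin = 2
hbm_bw_gbs = hbm_bus_bits * hbm_gbps_per_pin / 8      # GB/s

print(gddr_bw_gbs, hbm_bw_gbs)  # 484.0 1024.0
```

So the 1080 Ti lands at ~484 GB/s, and the four 4 GB HBM2 stacks that give the Radeon VII its ~1 TB/s also lock it into 16GB of capacity.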
 
Further, the 1080 Ti has 11GB not because it needs 11GB, but because that's what was needed to have the necessary memory bandwidth with GDDR. A case of having 'enough' bandwidth. The Radeon 7 has significantly excessive bandwidth, which increases cost unnecessarily.

Wait a second, we know it needs 16GB for the HBM2 stacks. Wasn't there a report that GDDR6 costs 70% more than GDDR5, with GDDR6 being used in the 2080 series? Was that necessary?
 
Wait a second, we know it needs 16GB for the HBM2 stacks. Wasn't there a report that GDDR6 costs 70% more than GDDR5, with GDDR6 being used in the 2080 series? Was that necessary?

Given the supposed almost identical performance between 1080ti, 2080, and Vega VII, I would say no.
 