Just how far away is Vega gaming ed.?

tangoseal

[H]F Junkie
Joined
Dec 18, 2010
Messages
9,743
Really...? About how far out are we? I'm very excited to see how these will run with Threadripper when they're mated.

I have no real idea when these cards are actually dropping. The gaming cards, not Frontier, of course.
 
Neither does anyone else, but the current rumors are late July, early Aug.

Why would these run better with a TR CPU though? Vega and TR aren't related in any way from what I know.
 
My comment has nothing to do with better. Just with. I am going to trade up my Ryzen 7 for a Threadripper and am curious how well the card will run when mated with that processor. I anticipate Threadripper will drop before Vega. It has nothing to do with better or worse.
 
My point is that Vega will run just as well on your Ryzen 7 as on a TR as on an i5, i7 or i9. The card's performance is going to be independent of the CPU.
 
Bro, you're running in circles with me/us. We know that. I was just letting you know I'm going to be running Threadripper soon, and I'm excited about the claimed performance of the cards. And yes, theoretically the Vega cards, when in a pair, will in fact have the potential to run better on Threadripper, since both CrossFire cards run at a native x16 instead of x8/x8. I was ambiguous: I meant CrossFire Vega, not a single card. In either case, x8 vs. x16 has shown zilch of a difference on PCI-E 3.0. But maybe with Vega it finally will, who knows. It's doubtful.
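For reference, the raw numbers behind x8 vs. x16 (a quick theoretical sketch of per-direction link bandwidth; this ignores protocol overhead, so real throughput is somewhat lower):

```python
# Theoretical one-way PCI-E bandwidth per generation and lane count.
# Gen 1.1/2.0 use 8b/10b encoding; Gen 3.0 uses 128b/130b.
GEN = {
    "1.1": (2.5, 8 / 10),    # (GT/s per lane, encoding efficiency)
    "2.0": (5.0, 8 / 10),
    "3.0": (8.0, 128 / 130),
}

def bandwidth_gbps(gen: str, lanes: int) -> float:
    """Theoretical one-way bandwidth in GB/s for a link of the given width."""
    rate, eff = GEN[gen]
    return rate * eff * lanes / 8  # GT/s * efficiency -> Gbit/s, /8 -> GB/s

for gen, lanes in [("3.0", 16), ("3.0", 8), ("3.0", 4), ("2.0", 8)]:
    print(f"PCI-E {gen} x{lanes}: ~{bandwidth_gbps(gen, lanes):.1f} GB/s")
```

So 3.0 x8 (~7.9 GB/s each way) still has roughly the headroom of 2.0 x16, which is part of why the single-card x8 vs. x16 difference has been so small.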
 
I think the confusion for me is how you're saying "mated to that processor". I read it like you were implying there was some advantage to running an AMD CPU with an AMD GPU. Since that's clearly not the case from this explanation you just wrote, we are probably on the same page here.
 
As soon as they launch you get to fight with miners everywhere for them at inflated prices.
 
I don't get mining... how many blocks can possibly be left, lol... the only people finding coins now are the farms burning 1+ MW a month of power. I think it was fun, but now it's just stupid.
 
Pools like NiceHash are still working well for steady, reliable income (you won't ever find a coin yourself, so to speak, but you get a portion of the profit as the pool finds them), so long as cryptocurrency values stay inflated.
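The payout model is just proportional sharing; here is a toy sketch (the fee and the example numbers are made-up assumptions, not NiceHash's actual terms):

```python
# Hypothetical proportional-payout pool: your cut of each block reward equals
# your share of the pool's total hashrate, minus the pool's fee.
def pool_payout(my_hashrate: float, pool_hashrate: float,
                block_reward: float, pool_fee: float = 0.02) -> float:
    """Expected share of one block reward for a miner in the pool."""
    return block_reward * (my_hashrate / pool_hashrate) * (1 - pool_fee)

# One GPU at 30 MH/s in a 30 TH/s pool, with a 3-coin block reward:
print(pool_payout(30e6, 30e12, 3.0))  # a millionth of the reward, minus the fee
```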
 
Pre-order @ MSRP or forever regret it. And if you don't like it, sell it for more :p

Ain't this the damn truth. People are missing the point here. Vega may or may not perform as well as NVIDIA cards, but they are going to sell out like crazy. Pre-order these bad boys, play with them for a while, then sell them off. If you can't sell them when ETH drops, you still have a lot of GPU power in your hands.

More importantly, use the ETH craze to invest in AMD and make money off this.
 
Or take that money and buy like a dozen Vega cards. The used market price for these things will be WAYYYY higher than what they cost at launch, and who doesn't want a dozen of these?
 
AMD needs to disable mining capability and make a gamer-ONLY card. How is far beyond my understanding, but dammit, make it.
 
I will just put this here:
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080_PCI_Express_Scaling/24.html

Amazing that the GTX 1080 only takes a 4% hit at PCI-E x4! x8 matches x16 at all resolutions.

You are looking at it the wrong way. Try again comparing on a per-game basis and you will be surprised: not every game behaves the same with different PCI-E bandwidth. Some games take a huge hit, others aren't affected; a performance summary isn't going to reflect that.
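A made-up example of how a summary average hides per-game hits (the numbers are illustrative, not from the review):

```python
# Relative FPS at x8 vs. x16 for five hypothetical games: the average looks
# like a ~2% hit, but one game quietly loses 10%.
relative_perf = {"GameA": 1.00, "GameB": 0.99, "GameC": 1.00,
                 "GameD": 0.90, "GameE": 1.00}

avg = sum(relative_perf.values()) / len(relative_perf)
worst = min(relative_perf, key=relative_perf.get)
print(f"average: {avg:.3f}  worst: {worst} at {relative_perf[worst]:.2f}")
```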
 
So essentially x16 is a waste of lanes for GPUs?
Not quite; the graph is an overall summary of the average performance hit across all lane arrangements.

Some games show barely any change until you hit the bottom of the barrel (1.1 x4); some games show hits even at 2.0 x8.

I wouldn't call x16 a strict requirement, but personally, looking at the graphs, 3.0 x8 is the lowest I'm willing to go.

With setups that have lanes to spare, especially HEDT ones, it's not much of an issue; with Ryzen setups running at x8, it could be problematic, but again, it's game specific.
 
Araxie, for x4 that is true, but at x8 all of the games look to take no hit, since it was 100% of x16 performance at all resolutions.

I would suspect that if you use cards with double the performance of a GTX 1080, you will see the same hit at x8 as you did with the GTX at x4.

Then again, the performance hit at x4 was no worse on the 1080 than it was on the 290X:
http://www.tomshardware.com/reviews/graphics-performance-myths-debunked,3739-3.html

HOWEVER, in an SLI setup, the performance impact at x4 was massive, and even x8 was measurable:
http://www.guru3d.com/articles_pages/pci_express_scaling_game_performance_analysis_review,17.html

That was tested on an X58 system with 40 or so PCIe lanes, so the GTX 980 cards were getting the full x8 lanes each. What about a system with 16 lanes? Will each card only get x6, since they still have to share with the SSD and USB 3.1? Please correct me if I am off base.
 
Again, it all depends on the games tested. Look at this, for example, with an old GTX 980:

[chart: Wolfenstein at 1920×1080, PCI-E scaling with a GTX 980]


Some games are really bandwidth sensitive, and they aren't always the ones tested by reviews.

In the more recent GTX 1080 review, there are again 4-5 FPS differences in certain games.

[chart: Warhammer at 1920×1080, PCI-E scaling]


[chart: Just Cause 3 at 1920×1080, PCI-E scaling]


[chart: Far Cry Primal at 1920×1080, PCI-E scaling]


When you are at the point where you are bandwidth limited by the bus, you face things that aren't strictly reflected in FPS averages: stuttering, more frequent FPS drops, more pronounced drops, and jittery frametimes, so a worse experience in general. Those things are usually not visible in a single average-FPS chart, but they are very noticeable to the end user in gameplay; sometimes games are not as smooth as someone may think. That's how you end up with people running a 1080 Ti on a good old 2600K, which is aging well in raw power but not at the platform level, being limited to PCI-E 2.0. It's worse if they're running SLI, and worse yet with CrossFire, as AMD cards run CrossFire through the PCI-E bus, adding more saturation to the already starved bus.

about your last sentence:

What about a system with 16 lanes? Will each card only get x6, since they still have to share with the SSD and USB 3.1? Please correct me if I am off base.

Typical Intel platforms with 16 CPU lanes also have dedicated chipset lanes for NVMe/M.2, Thunderbolt, LAN cards, sound cards, and so on. In some cases there are separate PCI-E 3.0 lanes that run through the CPU, additional PCI-E 3.0 lanes that run through the chipset, and even PCI-E 2.0 lanes that also run through the chipset. Other boards have an additional PLX chip that doubles the native PCI-E lanes, so someone with a CPU that supports only 16 lanes can run SLI or CrossFire with both cards at x16. PCI-E x1 slots typically run through the chipset, so they don't affect the CPU's native x8 and x16 lanes. Standard SSDs and USB 3.1 have no effect on CPU PCI-E lanes; those capabilities depend mostly on the chipset features. That's why, for example, an i7-5820K is supposed to support only 28 lanes, while the X99 chipset supports an additional x8 of PCI-E lanes that can be used for add-on cards without affecting the CPU's PCI-E lanes, so you can still run x16 and x8 in multi-GPU, or x8/x8/x8 if desired.
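The split described above can be sketched as a toy model (the lane counts are illustrative of typical boards, not exact specs for any one platform):

```python
# GPUs draw from the CPU's graphics lanes; M.2/LAN/USB controllers hang off
# separate chipset lanes, so they don't subtract from the GPU allocation.
def split_gpu_lanes(cpu_lanes: int, num_gpus: int) -> list:
    """How typical boards bifurcate CPU lanes between one or two GPUs."""
    if num_gpus == 1:
        return [min(cpu_lanes, 16)]
    if num_gpus == 2:
        if cpu_lanes >= 32:      # 40-lane HEDT chips: full x16/x16
            return [16, 16]
        if cpu_lanes >= 24:      # 28-lane parts like the i7-5820K: x16/x8
            return [16, 8]
        return [8, 8]            # mainstream 16-lane CPUs: x8/x8
    raise ValueError("only 1-2 GPUs modeled here")

print(split_gpu_lanes(16, 2))  # [8, 8]  -- never an odd split like x6/x10
print(split_gpu_lanes(28, 2))  # [16, 8]
```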
 
Also, PCI-E never gets shared as "x6" lanes. The PCI-E arrangements I have seen so far are, without exception, x16, x8, x4, and x1. I can't remember if I ever saw an x2, but I never saw an x6.

A moot point when dealing with SLI, though, as x6 and x4 are about equal: not happening.
 