"How's your crow taste, Shintai?"

He is out there covering his short position on the stock - he's been short since it was $6.
"Seems like they got their hands on a few 8-Hi stacks."

Don't tell that to Shintai - he says it is impossible, and he will point out the error of your ways. Then he will hashtag you with a tweet a la Trump (he could be Trump).
Over what? Some limited edition card that ticks the checkboxes, with a price tag to follow? And the HBM2 is clocked at 1875 MHz. Thank god for Samsung, eh?
Seems like you are going to wait a bit longer than expected for your gaming Vega. Good thing you are used to #waitforvega!
"Shintai said it would not be possible for a gaming card, to my understanding."

He has said a number of things on the topic, including that no memory maker has them.
Where are all the people who claimed that 16GB HBM2 was a foolish fanboy dream now?
Or he might just go back and edit his posts. Change the white to black.
Whoops, that isn't the right response.
The correct answer is...
"I'm sorry guys, I clearly got this one wrong"
Shintai
Nevertheless, it shows you that you don't need to buy new hardware to do VR. Most gamers seem to be enjoying 1080.
Because, going by the actual catalogue that came out this month, there is no 8GB (8-Hi) HBM2 stack, and AMD only showed a 2-stack GPU at their event, with the 4-stack part referenced as 2018 by the same 'leak' slides that most are happy to use when it is other data.
Meaning there is a high chance it is faked (especially as it has zero real benchmark figures for a card that is listed, unlike Fiji when that leaked on there), and the figure for the only test provided in the whole suite is, tbh, very wrong.
So it makes it difficult to know how to treat the 16GB/1600MHz rumour generated from that benchmark entry.
Cheers
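Since the 8-Hi question keeps coming up, here is the back-of-the-envelope capacity math - a rough sketch only, assuming the standard 1 GB (8Gb) per HBM2 die, which is background knowledge rather than a figure from this thread:

```python
# Rough HBM2 capacity math for the 8-Hi debate.
# Assumes the standard 8Gb (1 GB) per HBM2 die; not a figure from this thread.
GB_PER_DIE = 1

def card_capacity_gb(num_stacks, stack_height):
    """Total VRAM for a package with num_stacks HBM2 stacks of the given height."""
    return num_stacks * stack_height * GB_PER_DIE

print(card_capacity_gb(2, 4))  # 2 stacks of 4-Hi ->  8 GB
print(card_capacity_gb(2, 8))  # 2 stacks of 8-Hi -> 16 GB (needs 8-Hi on a 2-stack package)
print(card_capacity_gb(4, 4))  # 4 stacks of 4-Hi -> 16 GB (but AMD only showed a 2-stack GPU)
```

Which is why a 16GB card on a 2-stack package implies 8-Hi stacks, and why the catalogue question matters.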
"Crank them down enough and you can use your phone!"

Could you elaborate on how fast is fast? I have been waiting for a long time now, so I am not sure what fast means.
I'd agree that a 1080 is 'enough', for now - but VR is going to push the requirements fast.
It's a financial call; they ain't sharing shit other than "Vega is on track for Q2 release".
"Let me get my crystal ball..."

Right, clear as FUD, I mean mud. Really, give me your guesstimate for a time frame.
Look, the trend in VR is greater resolution. We've talked about 1440x1440x2, but it'll keep going up; further, game graphics will keep getting pushed, making each of those pixels more costly, meaning more hardware needed to hit 90FPS. This isn't trivial, and we've only barely started to realize the capabilities of modern gaming engines, which can scale well beyond what current hardware can handle.
The only moderating factor, as it always is, will be the console development cycle, but even that is being accelerated!
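To put rough numbers on "requirements will go up fast", here is a quick pixel-throughput sketch using the 1440x1440-per-eye / 90 FPS figures above; the 1080p/60 baseline is just an assumed reference point, not something from the thread:

```python
# Pixel-throughput comparison: assumed 1080p/60 baseline vs the VR target
# discussed above (1440x1440 per eye, two eyes, 90 FPS).
def pixels_per_second(width, height, views, fps):
    return width * height * views * fps

baseline = pixels_per_second(1920, 1080, 1, 60)
vr_target = pixels_per_second(1440, 1440, 2, 90)

print(f"1080p @ 60 Hz       : {baseline / 1e6:.0f} Mpix/s")   # ~124 Mpix/s
print(f"1440x1440x2 @ 90 Hz : {vr_target / 1e6:.0f} Mpix/s")  # ~373 Mpix/s
print(f"ratio               : {vr_target / baseline:.1f}x")   # ~3x, before heavier per-pixel shading
```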
I'm a bit disappointed no talk of the consumer cards, but at least we know some of the leaks were true.
"Seriously, you're trolling now - there's no 'fear', or 'uncertainty', or 'doubt' concerning the vector of VR."

You vehemently argued that it is coming fast. But when I asked you to quantify that, guesstimate even, you turn tail and run. Be brave, give a number so we can have it for the record and laugh at you when you are wrong.

And there is FUD. Fear of missing it, falling behind, or wasting money. Uncertainty as to compatibility, platform longevity, etc. Doubts in lots of forms.
Someone earlier asked how 16 GB of VRAM could even be usable. I was just pointing out that newer VR headsets are entering the 4K realm, and I think 4K resolution per eye would require a lot of VRAM. I don't think that is derailing the topic.
Do you?
E.g. currently, to drive a 4K high-refresh-rate device you're looking at a minimum of 500 USD (even at moderate volumes) just to drive it.
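To put a number on "a lot of VRAM", here is a framebuffer-only sketch for a hypothetical 4K-per-eye headset; the render-target count, pixel format, and supersampling factor are illustrative assumptions, and texture/geometry memory is not counted:

```python
# Framebuffer-only VRAM estimate for a hypothetical 4K-per-eye HMD.
# Render-target count, bytes per pixel, and supersampling factor are assumptions.
def render_target_mb(width, height, eyes, bytes_per_pixel, targets, supersample=1.0):
    pixels = width * height * eyes * supersample
    return pixels * bytes_per_pixel * targets / (1024 ** 2)

# 3840x2160 per eye, RGBA8 (4 bytes/pixel), six G-buffer/post targets, 1.4x supersampling.
mb = render_target_mb(3840, 2160, eyes=2, bytes_per_pixel=4, targets=6, supersample=1.4)
print(f"~{mb:.0f} MB for render targets alone")  # roughly half a gigabyte
```

Render targets alone are only a fraction of 16 GB; most of the budget goes to textures and geometry, which also tend to grow with target resolution.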
20 TFLOPS GPUs are a market need; without them, there will not be much demand for such boards.
"They've proven you all wrong."

By releasing a dual 8-pin card that performs like GP104? They have proven us all right, in fact!
"Are you trusting AMD's notoriously bad PR images or your own eyes? I see 8+6."

My own eyes, and I was asleep. Yeah, 8+6, of course. Still a 250W+ TDP card with GP104 (and even cut GP104) performance in geometry-heavy loads.
Saw another pic with 8+8, so who really knows. I prefer 8+8; don't want to be lacking on power.
"Where are all the people who claimed that 16GB HBM2 was a foolish fanboy dream now?"

Some of us, including myself, were saying that it cannot be 8-Hi for several reasons: from an electronics perspective, the heat generated for the bottom die and logic (plus the height), and also it being taken out of all the SK Hynix memory catalogues and part references, where other products up to Q4'17 such as GDDR6 are given.
I think 8+6 is 300W in this instance with Frontier.
I am... less than impressed, to say the least. Granted, as was said, AMD always sucked at that, so I guess it should match full GP104 in games... I hope, at least.
Also, I am glad "Fool's Edition" now applies to AMD cards too.
"Rumor: Full Vega Lineup And Release Date Revealed!" (self.Amd)

Statements like that are blatant bait for flame wars.
I wonder if AMD released this BS?
Well, this is no Vega for consumers and won't be for a long time... over a year and still tooting their horn... AMD is such a loser company.
Seems like they got their hands on a few 8-Hi stacks. The PCI-E Tesla P100 is rated for 9.6 teraflops FP32. Vega Frontier Edition (I won't even bother, lol), with its ~13 TFLOPS of FP32 and ~25 TFLOPS of FP16 (more like 12.5 FP32, apparently), is ~30% faster in DeepBench, which is FP32-based as per their description on GitHub.
Interestingly enough, if you go to the Vega website they suddenly stop comparing Vega to the Tesla P100 and switch to the Titan Xp when it comes to SPECviewperf. Hmm, I'm sure that's unrelated to wanting to compare pro drivers on AMD vs GeForce drivers on NVIDIA, because AMD is ethical.
A more honest comparison would have pitted the Quadro P6000 against Vega Frontier Edition: full 3840 cores running at considerably higher clock speeds than the Tesla P100, which is a cut die running pretty low clocks because it has to fit a tight power budget (it's a 1:2 DP chip, don't forget) and has no AIO watercooling.
Depending on which slide you look at, the P100 scores 122 or 133.
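For context, a quick ratio check on the throughput figures above (numbers taken from the post; nothing new measured here):

```python
# Ratio check on the quoted figures: Vega Frontier Edition ~13 TFLOPS FP32
# vs the PCI-E Tesla P100 at 9.6 TFLOPS FP32, against the ~30% DeepBench claim.
vega_fe_fp32 = 13.0   # TFLOPS, from the post above
p100_pcie_fp32 = 9.6  # TFLOPS, from the post above

raw_advantage = vega_fe_fp32 / p100_pcie_fp32 - 1
print(f"raw FP32 advantage: {raw_advantage:.0%}")  # ~35%, so the ~30% DeepBench lead
                                                   # roughly tracks raw FP32 throughput
```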
"Goal post shifting much, brah?"

Nope, just stating how it is.
"You guys were wrong on 16GB HBM2, didn't even admit it, and have the gall to try and shift the discussion into BS territory again."

Quote me. Oh, you cannot - how bad.
"Performs like GP104? LOL. Prepare to be wrong again, but you naysayers will find something else to twist with your negativity, won't you?"

It's a fact: it performs like cut GP104 and GM204 (rofl) in SPECviewperf.
Thanks for the charts. However, from the presentations it seemed like he was saying this card is replacing the Radeon Fury X. Page 33 of the PDF. http://phx.corporate-ir.net/Externa...WxkSUQ9LTF8VHlwZT0z&t=1&cb=636305760276823867