Vega Rumors

The Fury X gave the 980 Ti a run for its money and often did better in CrossFire.

Vega only matches a 1080 despite being a year newer. Not a good scenario at all.

Still up in the air whether Vega or X299 is the bigger failure....
 
I'd argue it's Vega.

Reason: X299, if it is going to be used for ANY length of time, will have to be improved upon by Intel. If Intel doesn't, the board manufacturers will probably want to do something with it themselves. I.e., a botched launch, but one that will be corrected by the end of the year.

With Vega? It doesn't look like ANYTHING will save it.
 
True. If you could drop Coffee Lake-X into an X299 board after having some fun with a 7740X for a year, it would be pretty neat.

Still, I would love to pair a 7900X clocked at 4.8 GHz with tri-fire Vegas. I always wanted to open a dimensional portal to hell.
 
You need a 7980X at 4.6 (to account for the silicon lottery, you know) and quad-fire Vegas for the portal to hell to open, actually. Of course with Prime95 and FurMark running at the same time.

Source: am literally devil.
 
Woo nelly, this is turning out to be a wild ride. I might need to sit down and fan myself.

So about the same as the scores that popped up a few weeks back. Marginally worse score than an aftermarket 1080.
You need a 7980X at 4.6 (to account for the silicon lottery, you know) and quad-fire Vegas for the portal to hell to open, actually. Of course with Prime95 and FurMark running at the same time.

Source: am literally devil.

Additionally, you'll need to recite aloud from the Necronomicon, chapter 4, verse 7. It's the passage entitled "The Invocation of the Raja".
You need a 7980X at 4.6 (to account for the silicon lottery, you know) and quad-fire Vegas for the portal to hell to open, actually. Of course with Prime95 and FurMark running at the same time.

Source: am literally devil.

You'll also need to recite aloud chapter 7, verse 8 from the Necronomicon. It's the passage entitled "Invocation of the Raja".
 
RX Vega FreeSync vs. GTX 1080 Ti G-Sync blind gaming test coming very soon. All done at my house, with gamers who have a couple hundred years of combined twitch-gaming experience. All system UEFI and OS setup done by me personally.
 
Officially one generation behind.

Anyone here have a nagging feeling that AMD should have just gone with GDDR5X instead of waiting on delayed HBM2?

I don't think they could. If RX Vega lands in roughly the 1080 performance bracket and needs 480 GB/s of bandwidth, they wouldn't be able to get that from GDDR5X without an increased die size due to the wider memory bus.
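
For a rough sense of the arithmetic behind that bandwidth point: peak memory bandwidth is just bus width times per-pin data rate. A small sketch, using illustrative 2017-era figures (the per-pin rates here are assumptions for the example, not AMD or Nvidia specs):

```python
def peak_bandwidth_gbps(bus_width_bits: int, data_rate_gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: pin count * per-pin rate, divided by 8 bits per byte."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

# GDDR5X at ~10 Gb/s per pin on a 256-bit bus (GTX 1080-class):
gddr5x_256 = peak_bandwidth_gbps(256, 10.0)       # 320 GB/s -- short of 480
# Reaching ~480 GB/s with GDDR5X would need a wider, costlier 384-bit bus:
gddr5x_384 = peak_bandwidth_gbps(384, 10.0)       # 480 GB/s
# HBM2 with two stacks (2048-bit total) at ~1.9 Gb/s per pin:
hbm2_two_stacks = peak_bandwidth_gbps(2048, 1.9)  # ~486 GB/s on a narrow die perimeter
```

The narrow-bus-at-high-width trade is the point of the post above: HBM2 gets its width from stacked dies on an interposer rather than from extra pads on the GPU die edge.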
 
Does somebody already have their grubby little fingers on a RX Vega? ;)
IMG_20170722_175313.jpg
 
You need a 7980X at 4.6 (to account for the silicon lottery, you know) and quad-fire Vegas for the portal to hell to open, actually. Of course with Prime95 and FurMark running at the same time.

Source: am literally devil.
Hang on, only one CPU?

I thought you'd need at least a dual-socket setup for that! What happened, did traffic slow down so much that you're running limited-time discounts?
 
Was really hoping they would keep the aluminum shroud with the R cube in the corner.

How is the build quality on 'em?

I think there'll be a special edition in that super-glowy-fold-out box we've seen teased.

And then this is the generic reference.
 
Was really hoping they would keep the aluminum shroud with the R cube in the corner.

How is the build quality on 'em?

Picture on the news page.

If this is a Pepsi Challenge, will we get the video before launch, I wonder?

Yes.


Earlier you mentioned your sources put the performance between the 1080 Ti and the 1080, but current news and rumors have it at 1080 level. What do you think now?

Reviews @ July 31?
And what's with the RX2 sticker?

Engineering Sample.

I can't tell if he's being serious, taking a jab at AMD's ridiculous marketing stunt of the past week, or both.

We took AMD's framework, and put our own spin on it.
 
So no review soon? Is the card loaned from AMD just to do the blind test?
AMD hand-delivered this card to my house on Saturday morning and took it with them Saturday evening. All that said, this card was an engineering sample, but I was specifically told that it was representative of retail product. So basically a "reference" card built outside of mass production.

EDIT: I mistyped previously. ASUS has nothing to do with this.
 
Were you able to conduct any VR demos, or was it too time-constrained?
No, we were packed in VERY tight for what we were doing. Had to narrow down to one game. :\

So this is not close to an end-all be-all comparison at all. Just a quick look.
 
I am really looking forward to the results of this Vega FreeSync vs 1080 Ti G-Sync shootout.

Here's my $1 bet:
- The GTX 1080 Ti will pump out better objectively measured framerates at 1080p, 1440p, and 4K and use less juice doing it.
- At all resolutions, with FreeSync and G-Sync enabled, not even the keen-eyed twitch gamers will be able to tell the setups apart. That is, they will both look outstanding.

I like this comparison's setup. When people see longer bar graphs at 150 fps vs. shorter bar graphs at 120 fps, they usually think the former is better, without realizing that they almost certainly can't perceive any difference once both are above roughly 90 fps.
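
The perception point can be put in numbers: what the eye experiences is frame time, and the per-frame delta for a fixed fps gap shrinks as framerates climb. A quick sketch of the arithmetic:

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent displaying each frame at a given framerate."""
    return 1000.0 / fps

# The 150 vs 120 fps gap that looks large on a bar graph:
delta_fast = frame_time_ms(120) - frame_time_ms(150)  # ~1.7 ms per frame
# Versus the same 30 fps gap down at low framerates:
delta_slow = frame_time_ms(30) - frame_time_ms(60)    # ~16.7 ms per frame
```

Ten times the per-frame difference for the same-sized bar-graph gap, which is why the high-end comparison is so hard to see.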

Vega will be less expensive than the GTX 1080 Ti, and the price disparity only increases factoring in the lower cost of FreeSync vs G-Sync. It will perform equivalently in real world scenarios but lose in lab tests. It will use more power.

I'll wait to see how much less expensive and how much more power will be needed for the flagship Vega card to compete with the 1080 Ti.

Thanks, Kyle.
 
The power of Kyle. Here I have trouble convincing the post office to deliver anything at all.


Please let there be numbers.
Sorry, there will be ZERO objective data. But that is exactly what this was all about. Should be fun.

I'm guessing Kyle's blind test is that he set up two systems with different monitors but the same hardware. That would be hilarious.
LOL! That is an idea......I like it.
 
Can't we at least hope for your methodology of "max playable settings" to be present in this comparison?
One game: DOOM, running under Vulkan. Ultra settings at 3440x1440. We used the preset so no "funny business" could be charged.
 
Well, using DOOM will really give us an idea of how drivers have changed performance over the past 6 months :)
 
The problem with DOOM is that it already runs so fast on even Maxwell that a blind challenge wouldn't show any difference. Using a more taxing game where fps matters would certainly be more realistic, since these are the best from both companies.
 
Then this fully subjective preview we pulled off will in no way interest you. Your thoughts are noted.
 
The problem with DOOM is that it already runs so fast on even Maxwell that a blind challenge wouldn't show any difference. Using a more taxing game where fps matters would certainly be more realistic, since these are the best from both companies.

Doesn't really matter at the end of the day. We basically already know it won't be competing with Nvidia's high end, and they will have to rely on subjective tests to seem competitive. The raw performance won't be there.
 
Was it Roy who dropped it off for you? Rumour has it that you two are BFFs now and were even spotted golfing together.
 
The problem with DOOM is that it already runs so fast on even Maxwell that a blind challenge wouldn't show any difference. Using a more taxing game where fps matters would certainly be more realistic, since these are the best from both companies.

I only got into sync displays about two months ago, and for me, subjectively, they're both nice in the "sweet spot" 85-120 FPS range (IMHO), but I find G-Sync superior on the fringes (30-60 FPS, 120+ FPS, at least on a 144-165 Hz display). So if you get two competing cards and screens that consistently fall into the desirable range, I think subjectively you won't see much difference. Nevertheless, I am interested in seeing a quick preview such as this test suggests.
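
The low-framerate fringe mentioned above is where low framerate compensation (LFC) comes into play: when fps drops below the panel's minimum variable refresh, the driver repeats each frame at a multiple that lands the effective rate back inside the VRR window. A simplified sketch of the idea; the 48-144 Hz window is a made-up example, not any specific monitor's spec:

```python
def lfc_refresh(fps: float, vrr_min_hz: float, vrr_max_hz: float) -> float:
    """Refresh rate actually driven for a given framerate (simplified LFC model)."""
    if fps >= vrr_min_hz:
        # Inside the VRR window: refresh simply tracks the framerate,
        # capped at the panel's maximum.
        return min(fps, vrr_max_hz)
    # Below the window: repeat each frame until the effective refresh
    # is back at or above the panel's minimum.
    multiple = 2
    while fps * multiple < vrr_min_hz:
        multiple += 1
    return fps * multiple

# e.g. 25 fps on a 48-144 Hz panel is driven at 50 Hz, each frame shown twice
```

This frame-multiplication trick needs the window's max/min ratio to be wide enough to have somewhere to land, which is one reason the fringes behave differently across monitors.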
 
Kyle seems rather available at his keyboard today.... mayhaps waiting for video to finish processing and then upload?


EDIT: Ah I see in the other thread he's still editing.... I look forward to solid viewing tomorrow... maybe.
 