Vega Rumors

Kyle seems rather available at his keyboard today.... mayhaps waiting for video to finish processing and then upload?


EDIT: Ah I see in the other thread he's still editing.... I look forward to solid viewing tomorrow... maybe.
 
Radeon RX 480 reference

Radeon-RX480-Reference-Card.jpg

It was also the 390 reference design that never actually made it to stores, as I showed a few pages back.
 
From their press release.

Current Outlook

AMD’s outlook statements are based on current expectations. The following statements are forward-looking, and actual results could differ materially depending on market conditions and the factors set forth under “Cautionary Statement” below.

For the third quarter of 2017, AMD expects revenue to increase approximately 23 percent sequentially, plus or minus 3 percent. The midpoint of guidance would result in third quarter 2017 revenue increasing approximately 15 percent year-over-year. AMD now expects annual revenue to increase by a mid to high-teens percentage, compared to prior guidance of low double digit percentage revenue growth.


https://venturebeat.com/2017/07/25/...spite-its-most-competitive-chips-in-a-decade/

Advanced Micro Devices posted a GAAP second-quarter loss of 2 cents a share and a profit of 2 cents a share on an adjusted basis, even as it continued the launch of its most competitive chips in a decade. But in the third quarter, AMD expects revenue to rise 23 percent above Q2 levels.

Analysts expected AMD to report a loss of 3 cents a share on a GAAP basis, and a breakeven quarter on an adjusted basis. In the same quarter a year ago, AMD reported a profit of 8 cents a share.

Analysts also expected AMD to report sales of $1.16 billion. Actual revenue in the second quarter was $1.22 billion, up 19 percent from a year ago. Sales were driven by the first full quarter of Ryzen processor sales. Average selling prices were up significantly from a year ago.

The GAAP net loss was $16 million compared to net income of $69 million a year ago and a net loss of $73 million in the prior quarter. Loss per share was 2 cents, compared to 8 cents a year ago. On an adjusted basis, net income was $19 million compared to a net loss of $40 million a year ago and a net loss of $38 million in the prior quarter. Diluted earnings per share was 2 cents, compared to a loss per share of 5 cents a year ago and a loss per share of 4 cents in the prior quarter.
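As a quick sanity check on the guidance figures quoted above, here is a rough Python sketch that works the numbers from the article itself (Q2 2017 revenue of $1.22 billion, +23 percent sequential guidance, plus or minus 3 percent). The revenue figure is rounded as reported, so the results are approximate:

```python
q2_2017 = 1.22e9  # Q2 2017 revenue from the article, rounded
growth_mid, tolerance = 0.23, 0.03  # sequential guidance: +23% +/- 3%

q3_low = q2_2017 * (1 + growth_mid - tolerance)
q3_mid = q2_2017 * (1 + growth_mid)
q3_high = q2_2017 * (1 + growth_mid + tolerance)

print(f"Q3 2017 guidance: ${q3_low/1e9:.2f}B - ${q3_high/1e9:.2f}B "
      f"(midpoint ${q3_mid/1e9:.2f}B)")

# The ~$1.50B midpoint against the stated ~15% year-over-year increase
# implies Q3 2016 revenue of roughly q3_mid / 1.15, i.e. about $1.30B.
```

The midpoint lands at roughly $1.50 billion, which is consistent with the "approximately 15 percent year-over-year" line in the press release.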

Listening to the earnings call: AMD increased R&D spending but is still careful to keep it within operating expenses, and AMD is also careful not to give mining a long-term place in its focus or earnings outlook.
 
I figured he did it for the no excuses angle. Especially since AMD has been showing off Doom for months.

If he did anything but Doom or BF all you'd hear is "it's not optimized yet!"

No. The card is supplied by AMD to Kyle, and before the NDA lifts he is told he can only publish this, this, and this. I don't think he can do otherwise, or he wouldn't be able to bring it to us at all. I do, however, think AMD is really going after this blind test HARD. The card is probably going to be close to the 1080. Shit, the way they are advertising the blind test, I wouldn't be surprised if it's 10% slower than the 1080.
 
I would like to see more blind tests conducted with various gamers. Just looking at numbers does not always convey the gaming-experience side of performance. There is a point where increased performance does little to nothing for an individual playing games, though it could extend the viability of the card over time. I look very much forward to this test and hope it will push more testing in this area. FreeSync and G-Sync do make a huge difference in smoothness, at least FreeSync in my case.
 
The whole blind test thing never worked for Pepsi, lol, and that was subjective; graphics card performance isn't subjective, it's definitive.
 
graphics card performance isn't subjective, it's definitive.
Not entirely; there is a subjective aspect. Many people still doubt the value of high-refresh monitors and can't seem to tell the difference. To me (or someone else with a discerning eye) there is an obvious improvement from 60Hz to 120Hz, but someone else says they don't see a difference. We know the monitor is physically pumping out more frames, but to someone who can't experience the extra frames, that monitor (and the GPU to power it) would be a waste. Some people go crazy over input lag, whether from software or hardware, while others (like me) barely notice anything. While you can objectively measure lag, that still doesn't explain why some people can "feel" it and some cannot.
 
Performance is not subjective; either it's there or it's not. Subjective things like color, smoothness, etc., sure, those exist, but when it comes to performance, there is nothing subjective about it.
 
Yes, performance can be measured and compared, but what that performance means to the buyer or to the gaming experience is the answer some would be looking for, I think.

I am very sensitive to input lag and have a hard time understanding how others do not notice it, yet while I can tell the difference between 60fps and 120fps, the benefits of 120fps just do not seem like much to me. Folks differ in what they think is better, in other words. I am also very sensitive to weak colors, which for me is the weakest thing about the Vive. When I get into a game, FPS does not matter that much to me if it stays above 30fps. I do like the smoothness of vertical sync kept at 60fps using adaptive sync, of >40fps FreeSync, or of 90fps in VR. So some serious blind studies on gaming would yield excellent data, not only for users deciding what to buy but for developers as well.
 
Well, it's more dependent on the person. For me, from 30 all the way up to 120 I can tell the difference, but anything above 75 is not bad for me. This also depends on the game and the API used. OpenGL tends to be worse than DX when running around 30 FPS. For fast-paced shooters, higher than 75 is always better for me. For driving sims, higher than 120.

Games like D3 or StarCraft 2: anything above 30.

So it's really all over the map.

I've tested graphics engines back in the day when playing at 25-30 FPS; it was actually OK in FPS games under DX, but in OpenGL you would get tearing and stuttering. Back then we couldn't test for frame times, but I'm pretty sure that was what was going on in the OpenGL versions. That problem is still present today. Haven't really tested Vulkan in UE4 yet.

Monitors make a difference too, in their refresh ability. Some of the Dell professional monitors I've used work better than gaming monitors, even though their refresh rates are lower.

There are so many subjective variables when it comes to smoothness that it's really up in the air.

But one thing is for certain: if minimum frame rates are good, average frame rates are good, and frame times aren't jumping all over the place, there is a good chance your gaming experience will be better. Those are performance indicators, and they can be quantified with valid data backing them up.
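Those three indicators are easy to compute from a frame-time log. A minimal Python sketch (the per-frame times below are invented for illustration, and the percentile is a crude nearest-rank version):

```python
# Per-frame render times in milliseconds; two spikes simulate stutter.
frame_times_ms = [16.7, 16.9, 16.5, 33.4, 16.8, 17.0, 16.6, 16.7, 45.1, 16.8]

# Average FPS: total frames over total time.
avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))

# Minimum FPS: the single worst (longest) frame.
min_fps = 1000 / max(frame_times_ms)

# Crude nearest-rank 99th-percentile frame time: one way to quantify
# "frame times jumping all over the place" with a single number.
sorted_times = sorted(frame_times_ms)
p99 = sorted_times[min(len(sorted_times) - 1, int(0.99 * len(sorted_times)))]

print(f"avg {avg_fps:.1f} fps, min {min_fps:.1f} fps, 99th pct {p99:.1f} ms")
```

Note how two stutter spikes barely dent the average (~47 fps) but drag the minimum down to ~22 fps, which is exactly why minimums and frame-time consistency matter more than averages.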
 
Solid CRT monitors back in those days were far superior to almost anything pre-100Hz+ among very recent LCD techs.

If not even better (at 60-75Hz CRT), because GTG times were nil.

EDIT: Although I do remember being able to look off to the side in the dark with my CRT and see the pulsing refresh of light/dark out of the corner of my eye, where rods outnumber cones.
 
Actually, that campaign worked really well; what failed was that Pepsi sucked at capitalizing on it. http://juiceboxinteractive.com/ideas/pepsi-lost-challenge-won-battle/

It failed.

When a person tried drinking Pepsi they liked it, but it was temporary; it was sweeter than Coke, but after a while those people went back to Coke. Didn't need an article to know that, man.

The article you linked doesn't go into how, when a person eats or drinks the same things day in and day out, something new does taste better. It's psychological on top of physical: your body and mind are so used to something that it's just bland, so something new changes that.

Just try this: eat the exact same food (like frozen pizza, same brand, same toppings) for one month straight. See if you get sick of it because it's just bland. Then try the same type of food made by someone else; it will taste much, much better.
 
The article goes into making it a habit and how Pepsi should have given out a free case or two of Pepsi to build one. The greatest difficulty when facing a massive competitor that dominates the market is getting people to even try your product. The campaign also made Coke doubt itself and caused it to rush out a new formula similar to Pepsi, called New Coke, which made people angry since quite a few still wanted the original formula. That fractured Coke's customer base to the point that Pepsi got tons of business, and it forced Coke to release the original formula as Coke Classic. Coke saved itself from destruction, but it lost tons of customers to Pepsi during that turmoil, and New Coke faded into history. People tend to be creatures of habit, and the hardest part is getting them to change; Pepsi failed at that, but it succeeded in making its competitor do something stupid and stole tons of their business that way. Now if only I could convince my wife to like Coke instead of Pepsi :)
 
Now I'll just grab some popcorn and wait & watch...

I wonder if the green-side bois will call you a sellout now, and I also wonder if the red-side bois will suddenly find your test(s) fair.

:ahappy:
I have already been called an "AMD shill" multiple times for even talking about this.
 
But... we all know you're an Nvidia shill for criticizing the Fury Nano, etc. And an Intel shill for not talking about how awesome Bulldozer was. I just don't know what to believe anymore. :(
Your puny mind just can't grasp that manly Texans can be shills for shills while at the same time counter shilling for completely different shills and secretly in league with the Shill Lords of Santa Clara.

On another note - is there an ETA for the article? I really need to get some work done today and pressing F5 isn't terribly productive.
 
Shillception?
 
But... we all know you're an Nvidia shill for criticizing the Fury Nano, etc. And an Intel shill for not talking about how awesome Bulldozer was. I just don't know what to believe anymore. :(
Let's just conclude that Kyle is a shill; what kind of shill, anyone can decide according to their fanboy classification, applying the name of the rival team :)
 
On another note - is there an ETA for the article? I really need to get some work done today and pressing F5 isn't terribly productive.
I have to do the news this morning while I am writing this thing up, but I am hoping to have it up by lunchtime PDT... The video is done though, and most of this is video.

upload_2017-7-26_8-54-53.png
 
Will the video be screen-capture or recorded via a camera?
You have never seen DOOM before? I spent no time worrying about showing gameplay. It is all just explanation and interview.
 
I have seen Doom before (the original on my i486 DX) just wondered how the test was recorded ;)
We recorded nothing. We let people play on both stations back to back, then back to back again, pulled these guys and interviewed them.

And we actually kept them on the same map over and over so they would get a good feel for what was going on so they could get some good gaming going on. Couple of guys were playing on Ultra-Violence as well. Damn kids.
 
My one question to you is, why are you not showing any hard data during this comparison?

Card is under NDA for another week.
Why don't you just show us hard data and put your livelihood at stake?

I can relate, all this stuff really appeals to my sense of schadenfreude, but it's not that hard to figure out. This isn't Wccftech.
 
But... we all know you're an Nvidia shill for criticizing the Fury Nano, etc. And an Intel shill for not talking about how awesome Bulldozer was. I just don't know what to believe anymore. :(
Indeed. Should be interesting...
 
Kyle, can you comment on an RX Vega x2 type SKU? (i.e., was that RX2 an oops?)

I am oldschool and have been around. There are too many smug smiles in certain corners at these events, too many little hints and subtleties, too many stars aligning and insider comments. And no leaks. Makes me wonder... Is "TitanRipper" true? Are we going to see AMD put two Vega dies together on one SoC, as some sort of "ALL IN!" style of card? (People without 1,200-watt PSUs trying to min/max their buying power need not apply.)

As a gamer, I really don't care about Cinebench-style non-gaming benches; I'd rather see over-time studies and histograms, etc. Minimums are what matter most in my games, not maximums. Plus, I have my own non-gaming rigs for hobby-horse-style computer craft, OC, Folding, Mining, etc. I am here strictly for and focused on GAMING!

Or, if one likes, a high-end gamer (HEG) dreaming about a HEDT GPU that can power 4K gaming beyond 100Hz+.

I am getting close to 50 years now (*waves at Kyle), and Doom isn't getting any better at only 34" widescreen. Where are the 40" OLED 4K HDMI 2.1 @ 144Hz gaming monitors? And even more important than that: game latencies, where less is more. I cannot get my head around why nobody is discussing FreeSync 2, which is exactly what I, as a gamer, want.

I have $$ and am retiring in 10 years. Where is the beef!

~ sine wave ~

RX2 is an engineering sample that should be representative of the final product.
 
He already said what the RX2 is: just an internal numbering system, i.e., Vega RX prototype #2. It's unlikely we'll see a Vega X2 variant, mainly due to power draw.
 
Kyle, can you comment on a RX Vega x2 type SKU ? (ie: was that RX2 an oops?)
As already mentioned above.

I went 48" 4K JS9000 (certainly less expensive options out there now) on my desktop after having 34" Eyefinity/Surround for 6 years. I love it. Just make sure you get a panel with as little lag possible.
 
LOVE the high-end Samsung panels. Shame that wasn't out when I bought my F8500 2.5 years ago.

Looking forward to the read, Kyle. Thanks.
 