New Intel Core Processor Combines High-Performance CPU with Custom Discrete Graphics from AMD

It's faster than GDDR how? And the CPU is still connected over old-fashioned PCIe x8.

In terms of power, in terms of per-pin speed, in terms of smaller packaging, in terms of shorter/fewer traces, in terms of linear access. It loses out on cost and random access.
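
As a rough back-of-envelope illustration of why the packaging/trace point matters: HBM gets its bandwidth from a very wide bus at modest per-pin rates, while GDDR uses a narrow bus at high per-pin rates. A quick sketch using nominal spec figures (treat the exact per-pin rates as assumptions):

def bandwidth_gbs(bus_width_bits, per_pin_gbps):
    # Peak bandwidth in GB/s = (bus width in bits * per-pin rate in Gb/s) / 8
    return bus_width_bits * per_pin_gbps / 8

hbm2_stack = bandwidth_gbs(1024, 2.0)  # one HBM2 stack: 1024-bit bus @ ~2 Gb/s per pin
gddr5_chip = bandwidth_gbs(32, 8.0)    # one GDDR5 chip: 32-bit bus @ ~8 Gb/s per pin

print(f"One HBM2 stack : ~{hbm2_stack:.0f} GB/s")
print(f"One GDDR5 chip : ~{gddr5_chip:.0f} GB/s")
print(f"GDDR5 chips (and their traces) needed to match one stack: {hbm2_stack / gddr5_chip:.0f}")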
 
Perhaps I misunderstood what you meant when you said "so NVidia can compete with AMD"? G-Sync has been superior in performance to FreeSync; I can't say it will be superior to FreeSync 2, since I haven't seen the numbers and use cases on it yet. Cost-wise, I also feel that G-Sync is a superior option unless a person is already invested in FreeSync-capable components, or unless you scale back your performance needs for an "affordable solution", similar to when I build a machine specifically for 1080p gaming and want to avoid the higher costs associated with more advanced display options.

So to be clear, when someone says "compete", it would be normal to think the speaker is referring to price points and performance. HDR is a feature, but I don't know anyone who would make a major buying decision based solely on it; it's just not that groundbreaking compared with adaptive sync, high refresh rates, etc., which have a tremendous impact on gameplay and not just how pretty the lighting is. HDR is a fad in comparison.

Of course some people must have the latest even if it is a relatively minor feature that requires full support of the display, the display adapter, and the software.

BTW, does idiot know that you quoted him?

Well, I think it all depends on the monitor, the res you play at, and the card. It's hard to make an honest apples-to-apples comparison. In some ways I think FreeSync is better because it's part of a broader open standard and cheaper to implement. NVIDIA has to reengineer new features every time a new standard pops up. That means you are locked into NVIDIA's hardware and your quality of video display is dependent on them. AMD can hand it over to the chip manufacturers, who might make a better processor or scaler. But FreeSync is not superior, especially at ultra-low refresh rates (<30 fps) where tearing starts. Even LFC doesn't help much there.
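
As a rough illustration of what LFC is trying to do below the panel's VRR floor (a minimal sketch with a made-up 48-100 Hz window; real driver logic is more involved):

# Low Framerate Compensation sketch: when the frame rate drops below the
# panel's minimum VRR refresh, repeat each frame so the effective refresh
# stays inside the supported window. The 48-100 Hz range is an assumption.
VRR_MIN, VRR_MAX = 48.0, 100.0

def effective_refresh(fps):
    """Return (refresh_hz, repeats_per_frame) the driver would target."""
    if fps >= VRR_MIN:
        return min(fps, VRR_MAX), 1      # in range: refresh tracks the frame rate
    multiplier = 2
    while fps * multiplier < VRR_MIN:    # below range: repeat frames until back in range
        multiplier += 1
    return fps * multiplier, multiplier

for fps in (90, 60, 40, 25, 15):
    hz, n = effective_refresh(fps)
    print(f"{fps:>3} fps -> panel driven at {hz:.0f} Hz ({n}x frame repeat)")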
 
Nvidia popularized GPU compute and made it accessible/easier for programmers. However, they in no way pioneered it. Not even close. Big-iron or otherwise.

They made it available. And just like today, AMD is attempting to 'innovate ahead' by including features that drive up die sizes and TDP that just aren't used in the consumer market; they did that with the first Radeon, and every one after that. Features that were mostly ignored until developers demanded standardization, just like today.

I guess it might be better to say that Nvidia pioneered the GPU compute market, if that makes you feel better. They were all working on these technologies, sort of, but Nvidia literally designed the reference GPUs for DX10, and then wrote the language of big iron GPU compute, before AMD even had a competitive DX10 part.
 
Well, I think it all depends on the monitor, the res you play at, and the card. It's hard to make an honest apples-to-apples comparison. In some ways I think FreeSync is better because it's part of a broader open standard and cheaper to implement. NVIDIA has to reengineer new features every time a new standard pops up. That means you are locked into NVIDIA's hardware and your quality of video display is dependent on them. AMD can hand it over to the chip manufacturers, who might make a better processor or scaler. But FreeSync is not superior, especially at ultra-low refresh rates (<30 fps) where tearing starts. Even LFC doesn't help much there.

Switch from theory to reality: the reality is that if you buy AMD you're stuck with FreeSync, if you buy Nvidia you're stuck with G-Sync, and vice-versa in both cases.

Which company is more likely to regularly deliver top-end performing GPUs? Be honest.


[this is why I didn't mind jumping on the G-Sync bandwagon to give VRR a try; if I had to make a bet, I'd bet on Nvidia, even as I hope that AMD keeps up]
 
They made it available. And just like today, AMD is attempting to 'innovate ahead' by including features that drive up die sizes and TDP that just aren't used in the consumer market; they did that with the first Radeon, and every one after that. Features that were mostly ignored until developers demanded standardization, just like today.

I guess it might be better to say that Nvidia pioneered the GPU compute market, if that makes you feel better. They were all working on these technologies, sort of, but Nvidia literally designed the reference GPUs for DX10, and then wrote the language of big iron GPU compute, before AMD even had a competitive DX10 part.

I was working with early-access OpenCL when I was using GPUs to do similarity analysis (vector math) on real-time sensor data in mass specs. While NVIDIA might have had input, they certainly didn't write it.
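
For anyone curious what that kind of workload looks like, here is a stripped-down sketch of the vector-math core (cosine similarity of one incoming spectrum against a library of references). The NumPy version and the array sizes are my own illustration, not the original OpenCL code; the same dot products and norms are what a GPU would parallelize.

import numpy as np

rng = np.random.default_rng(0)
library = rng.random((10_000, 2_048))   # 10k reference spectra, 2048 bins each (made-up sizes)
incoming = rng.random(2_048)            # one real-time spectrum

def cosine_similarity(refs, query):
    """Similarity of `query` to every row of `refs`."""
    dots = refs @ query
    norms = np.linalg.norm(refs, axis=1) * np.linalg.norm(query)
    return dots / norms

scores = cosine_similarity(library, incoming)
best = int(np.argmax(scores))
print(f"Best match: reference #{best}, similarity {scores[best]:.4f}")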
 
Switch from theory to reality: the reality is that if you buy AMD you're stuck with FreeSync, if you buy Nvidia you're stuck with G-Sync, and vice-versa in both cases.

Which company is more likely to regularly deliver top-end performing GPUs? Be honest.


[this is why I didn't mind jumping on the G-Sync bandwagon to give VRR a try; if I had to make a bet, I'd bet on Nvidia, even as I hope that AMD keeps up]

If you have an endless budget, great. If you don't have the means to spend $700 on a video card plus another $600 on a G-Sync monitor, a Vega 56 might provide an acceptable lower-cost/lower-res entry price point. 1080p, 1200p, or 1440p is acceptable for many. At the middle ground, $200 goes a long way toward buying a more powerful card over a more expensive monitor.
 
I was working with early-access OpenCL when I was using GPUs to do similarity analysis (vector math) on real-time sensor data in mass specs. While NVIDIA might have had input, they certainly didn't write it.

I'm not speaking against your experience, but I am talking about CUDA here.

If you have an endless budget, great. If you don't have the means to spend $700 on a video card plus another $600 on a G-Sync monitor, a Vega 56 might provide an acceptable lower-cost/lower-res entry price point. 1080p, 1200p, or 1440p is acceptable for many. At the middle ground, $200 goes a long way toward buying a more powerful card over a more expensive monitor.

Price is certainly a concern, and it's clear that Nvidia hasn't really tried to cater to budget users with G-Sync, so I would certainly agree that you might be able to get a very good VRR experience for less with AMD and FreeSync. I was only speaking for myself; I'd already invested in Nvidia GPUs at the time, and FreeSync displays comparable to my requirements weren't out yet (while second-gen G-Sync displays were going on sale...).

If you're going to try it on a budget today, though, I'd point you toward whatever AMD setup you can afford.
 
This is a much different class of iGPU from the one in Raven Ridge. Raven Ridge has 10 Vega compute units, while this thing has 24 Polaris CUs, so technically a 140% bigger GPU (Vega is a newer arch, but we can ignore that for the most part). Also, this product will have HBM2, while Raven Ridge has to contend with sharing system memory for its VRAM, and GPUs love memory bandwidth. Raven Ridge targets 15 watts, and this will probably be more like 65 watts.

Basically, they are for entirely different markets. Raven Ridge is for ultra-portable laptops capable of gaming, while Kaby Lake-G is more for professional workstation replacements or gaming laptops.

Are you saying that AMD would never launch any Raven Ridge processors in configurations other than low power?

...and that there would never be high performance Raven Ridge processors?
 
The volume of HBM2 supply is critical for it to be a consumer budget option, and there is no indication that this is any different for Intel than for AMD. Secondly, why would AMD help launch a product that will screw their own new https://www.anandtech.com/show/1196...md-apus-for-laptops-with-vega-and-updated-zen

Makes no sense whatsoever ....

How do you know it will screw them in the future? They don't currently have any product in that stack. Who's to say that if this venture isn't profitable, they cannot stick an even more powerful Vega part in with their own CPU later? Who's to say the downsides are not worth the upside? Their stock has already started to completely change its footprint on Wall Street, which sees this as a stamp of approval of their tech. Last week you had tools from Morgan Stanley downgrading AMD, and this week they are being targeted at $20 and upgraded to buy status. It speaks volumes that Intel chose AMD. I'm not even going to get into what Intel can do for AMD with their fab abilities...
 
In terms of power, in terms of per-pin speed, in terms of smaller packaging, in terms of shorter/fewer traces, in terms of linear access. It loses out on cost and random access.
HBM2 only has an ECC and size advantage over GDDR5X/GDDR6. Even Hynix states so.
 
HBM2 only has an ECC and size advantage over GDDR5X/GDDR6. Even Hynix states so.

You're incorrect: you need quite a few chips of GDDR to get the bandwidth, while HBM can get it with one stack, and this has been an obvious end goal for HBM despite you, Factum, and others saying it would never happen. Your buddy Juangra even said it would not use EMIB, and he was wrong, as usual. It's actually Intel's EMIB which makes this simple and more cost-effective, which just has to be killing you. You're even down to bashing KBL to try to discredit this product, and we're all having a good laugh watching you melt down over it.
 
You're incorrect: you need quite a few chips of GDDR to get the bandwidth, while HBM can get it with one stack.

Hence the size advantage you somehow missed reading :)

But from a consumer standpoint, that's it. Then the downsides, like cost, begin.
 
Hence the size advantage you somehow missed reading :)

But from a consumer standpoint, that's it. Then the downsides, like cost, begin.

With the cost of DDR these days, I think the gap is far smaller now. We also have no idea what HBM is being sold to AMD for; it's all just speculation.
 
Well, I think it all depends on the monitor, the res you play at, and the card. It's hard to make an honest apples-to-apples comparison. In some ways I think FreeSync is better because it's part of a broader open standard and cheaper to implement. NVIDIA has to reengineer new features every time a new standard pops up. That means you are locked into NVIDIA's hardware and your quality of video display is dependent on them. AMD can hand it over to the chip manufacturers, who might make a better processor or scaler. But FreeSync is not superior, especially at ultra-low refresh rates (<30 fps) where tearing starts. Even LFC doesn't help much there.

I'd like to clarify a few things to make sure we are on the same page. Name a non-ATI/AMD card that does FreeSync and tell me again how "open and non-proprietary" this standard is. If there are only two options, NVidia w/G-Sync or AMD w/FreeSync, then just how non-proprietary is your standard?

I already laid out parameters for the comparison and specified that an AMD option could be a better choice for someone who is already invested in that solution (maybe they already have a powerful PSU and an ATI card, so buying a FreeSync monitor and a second card would be cheaper than buying an NVidia card and a G-Sync monitor). Being willing to accept lower resolutions, graphics settings, etc. could all be factors allowing for cost savings.

I am running a Founder's 1070 card on an Acer X34, playing World of Tanks, MechWarrior Online, Fallout 4, Skyrim, and a few other motley titles. These are not all killer graphics-intensive titles, but they all run perfectly at 3440x1440 @ 75 Hz, and settings don't seem to matter much: for example, sitting around with nothing happening I get over 100 FPS, in the middle of the action it drops to the 50s, and I don't get a single repeated frame, torn image, hiccup, or pause; there is no graphics lag or slowness. Sitting on the side, running TeamSpeak and an open browser, is my Dell S2417DG monitor at 2560x1440 with G-Sync on as well, because some games just don't do 21:9 well, like StarCraft Remastered for example.

The X34 was $1,000 and the 1070 cost me around $450; my PSU is a 650 W SeaSonic that probably cost me about $120.
If you can find a 21:9 FreeSync monitor comparable to my X34, you will need at least two ATI cards to match my experience and a beefier PSU. And unless you already own some of these components, the price will be almost identical. In simple terms, the second card you need to match the 1070's performance costs just as much as the "G-Sync tax". And as I said earlier, the reality is you're just as locked into one manufacturer as you are with the other; this open standard isn't something anyone can actually realize. If you can, then I am wrong and I'd want to be enlightened, so point out that non-AMD card that does FreeSync.
 
I'd like to clarify a few things to make sure we are on the same page. Name a non-ATI/AMD card that does FreeSync and tell me again how "open and non-proprietary" this standard is. If there are only two options, NVidia w/G-Sync or AMD w/FreeSync, then just how non-proprietary is your standard?

I already laid out parameters for the comparison and specified that an AMD option could be a better choice for someone who is already invested in that solution (maybe they already have a powerful PSU and an ATI card, so buying a FreeSync monitor and a second card would be cheaper than buying an NVidia card and a G-Sync monitor). Being willing to accept lower resolutions, graphics settings, etc. could all be factors allowing for cost savings.

I am running a Founder's 1070 card on an Acer X34, playing World of Tanks, MechWarrior Online, Fallout 4, Skyrim, and a few other motley titles. These are not all killer graphics-intensive titles, but they all run perfectly at 3440x1440 @ 75 Hz, and settings don't seem to matter much: for example, sitting around with nothing happening I get over 100 FPS, in the middle of the action it drops to the 50s, and I don't get a single repeated frame, torn image, hiccup, or pause; there is no graphics lag or slowness. Sitting on the side, running TeamSpeak and an open browser, is my Dell S2417DG monitor at 2560x1440 with G-Sync on as well, because some games just don't do 21:9 well, like StarCraft Remastered for example.

The X34 was $1,000 and the 1070 cost me around $450; my PSU is a 650 W SeaSonic that probably cost me about $120.
If you can find a 21:9 FreeSync monitor comparable to my X34, you will need at least two ATI cards to match my experience and a beefier PSU. And unless you already own some of these components, the price will be almost identical. In simple terms, the second card you need to match the 1070's performance costs just as much as the "G-Sync tax". And as I said earlier, the reality is you're just as locked into one manufacturer as you are with the other; this open standard isn't something anyone can actually realize. If you can, then I am wrong and I'd want to be enlightened, so point out that non-AMD card that does FreeSync.

A Vega 56 matches a 1070 or slightly outperforms it, and a 650 W power supply will work for it as well.
 
I'd like to clarify a few things to make sure we are on the same page. Name a non-ATI/AMD card that does FreeSync and tell me again how "open and non-proprietary" this standard is. If there are only two options, NVidia w/G-Sync or AMD w/FreeSync, then just how non-proprietary is your standard?

I already laid out parameters for the comparison and specified that an AMD option could be a better choice for someone who is already invested in that solution (maybe they already have a powerful PSU and an ATI card, so buying a FreeSync monitor and a second card would be cheaper than buying an NVidia card and a G-Sync monitor). Being willing to accept lower resolutions, graphics settings, etc. could all be factors allowing for cost savings.

I am running a Founder's 1070 card on an Acer X34, playing World of Tanks, MechWarrior Online, Fallout 4, Skyrim, and a few other motley titles. These are not all killer graphics-intensive titles, but they all run perfectly at 3440x1440 @ 75 Hz, and settings don't seem to matter much: for example, sitting around with nothing happening I get over 100 FPS, in the middle of the action it drops to the 50s, and I don't get a single repeated frame, torn image, hiccup, or pause; there is no graphics lag or slowness. Sitting on the side, running TeamSpeak and an open browser, is my Dell S2417DG monitor at 2560x1440 with G-Sync on as well, because some games just don't do 21:9 well, like StarCraft Remastered for example.

The X34 was $1,000 and the 1070 cost me around $450; my PSU is a 650 W SeaSonic that probably cost me about $120.
If you can find a 21:9 FreeSync monitor comparable to my X34, you will need at least two ATI cards to match my experience and a beefier PSU. And unless you already own some of these components, the price will be almost identical. In simple terms, the second card you need to match the 1070's performance costs just as much as the "G-Sync tax". And as I said earlier, the reality is you're just as locked into one manufacturer as you are with the other; this open standard isn't something anyone can actually realize. If you can, then I am wrong and I'd want to be enlightened, so point out that non-AMD card that does FreeSync.

FreeSync was actually part of the HDMI spec.

Not only that, but AMD doesn't charge a royalty to implement it. With Nvidia you are forced to buy the input/scaler/buffer boards from them.

So yes, AMD is open in that anyone can implement it without paying a surcharge.

And for the record, the 1070 is about even with a Vega 56 in terms of price and performance. And power consumption isn't insane.

And a G-Sync monitor may not always win against a FreeSync one. The FreeSync monitor may have better specs.

I'm not knocking your choice, but you are paying a premium for it.


$500 34" (Generic) 3440x1440 100Hz https://www.newegg.com/Product/Product.aspx?Item=9SIA6BM63P7359
$749 34" LG 3440x1440 75Hz https://www.newegg.com/Product/Product.aspx?Item=N82E16824025429
$799 34" ASUS 3440x1440 100Hz https://www.newegg.com/Product/Product.aspx?Item=N82E16824236803

The list goes on and on.
 
A Vega 56 matches a 1070 or slightly outperforms it, and a 650 W power supply will work for it as well.

So take my challenge: build two identical systems with one exception, one using AMD and FreeSync and the other NVidia and G-Sync, and post the costs. Because although your new card, if you can buy it, is faster than my 8-month-old card, there are still faster NVidia cards available at the same price point as yours. All you have to do is be reasonably fair and see how the numbers stack up.

From what I saw just now, the Vega 56 is faster than my 1070 but costs more, a 1070 Ti costs about the same but is faster than the Vega 56, the 1080 is faster and cheaper than a Vega 64, and the 1080 Ti just walks away with it all.

But that isn't really the point. The point, as DiggitalGriffen mentioned earlier, also calls into question what results a buyer wishes to achieve, so set realistic goals and go for them. But telling me you found a card that is a little better but costs more doesn't go very far in a discussion about competitive pricing. I already stated my card does the job; the Vega 56 could maybe do it better, but will it be cheaper if you try to match my Acer X34?
 
So take my challenge: build two identical systems with one exception, one using AMD and FreeSync and the other NVidia and G-Sync, and post the costs. Because although your new card, if you can buy it, is faster than my 8-month-old card, there are still faster NVidia cards available at the same price point as yours. All you have to do is be reasonably fair and see how the numbers stack up.
Are you saying that the [H] gaming blind test was not "reasonably fair"?
 
FreeSync was actually part of the HDMI spec.

Not only that, but AMD doesn't charge a royalty to implement it. With Nvidia you are forced to buy the input/scaler/buffer boards from them.

So yes, AMD is open in that anyone can implement it without paying a surcharge.

And for the record, the 1070 is about even with a Vega 56 in terms of price and performance. And power consumption isn't insane.

And a G-Sync monitor may not always win against a FreeSync one. The FreeSync monitor may have better specs.

I'm not knocking your choice, but you are paying a premium for it.


$500 34" (Generic) 3440x1440 100Hz https://www.newegg.com/Product/Product.aspx?Item=9SIA6BM63P7359
$749 34" LG 3440x1440 75Hz https://www.newegg.com/Product/Product.aspx?Item=N82E16824025429
$799 34" ASUS 3440x1440 100Hz https://www.newegg.com/Product/Product.aspx?Item=N82E16824236803

The list goes on and on.

I find some problems with your statement that "Freesync is part of the HDMI spec", because my understanding from reading up on it was that AMD convinced the standards people to include code that allows FreeSync to function, and that is what is embedded in the DisplayPort (1.2a) specification, not the entirety of FreeSync. That means FreeSync itself is not the Adaptive-Sync standard, FreeSync is not part of DisplayPort or any VESA standard, and it only functions using code that is part of the standard, which is a different animal, my friend.
https://en.wikipedia.org/wiki/DisplayPort#1.2a
1.2a
DisplayPort version 1.2a may optionally include VESA's Adaptive Sync.[15] AMD's FreeSync utilizes the DisplayPort Adaptive-Sync feature for operation.

Alas, I can't get those pages to load from this government computer, and without knowing where the prices are coming from I can't guess the model numbers for them. It does look to me like the color capabilities of the ASUS are inferior if Newegg's information is correct: 16.7 million colors vs 1.07 billion on all the others. None of the remaining LG monitors can hit 100 Hz, only the generic one; I'll have to see if I can look at that one. In the end, they aren't bad, but the ASUS monitor is not only G-Sync but better, and you usually have to pay for better.

I understand that you are not beating me up on this, and yes, I will admit that recent releases have made my argument weaker than it was over half a year ago, when I was buying my card and monitor, before we knew what Vega would become. But I am not completely swayed, because you can't show actually comparable performance at a significantly better price; at best you can save a couple hundred bucks and settle for second best, even buying just one card and leaving PSUs out of it. You spend as much or more for the card for a meager performance increase that is easily bested by the competition at the same price point, and you save about $200 on a monitor that is second best at best.

Now I just got home, logged in to check this, and realized that I never finished posting this reply, though it was mostly typed up. Now that I have changed it, I could easily have something off or difficult to understand; apologies if it's worded screwy somewhere. I need to read chithanh's post because I have not seen what he is referring to, so I need to do my homework and see if I have crow to eat.
 
I find some problems with your statement that "Freesync is part of the HDMI spec",
DisplayPort Adaptive-Sync is a new addition to the DisplayPort 1.2a specification, ported from the embedded DisplayPort v1.0 specification. DisplayPort Adaptive-Sync provides an industry-standard mechanism that enables real-time adjustment of a monitor’s refresh rate of a display over a DisplayPort link.

I stand corrected. DisplayPort was the first to introduce it, not HDMI. HDMI didn't support it until 2.1.

It kind of went down like this (chronologically speaking)

NVIDIA announced G-Sync first.
The AMD Radeon group looked at it and went, "I know, that's an optional part of the DP spec." No one had really looked at it twice, to be honest, before AMD. This is how AMD counter-fired so quickly: it was already there in the spec. It just needed to be implemented in the decoder/scaler hardware by the screen manufacturer. And it's likely the reason they couldn't charge royalties for it, because it is native to the standard.

I'm pretty sure others here will back me up on this.


I will admit there were some pretty @#$@# bad implementations of FreeSync. And this is one of the things FreeSync 2 was designed to address, and why there is a more stringent certification program now.

But I am not completely swayed, because you can't show actually comparable performance at a significantly better price; at best you can save a couple hundred bucks and settle for second best, even buying just one card and leaving PSUs out of it. You spend as much or more for the card for a meager performance increase that is easily bested by the competition at the same price point, and you save about $200 on a monitor that is second best at best.

That ASUS FreeSync panel is the exact same one used in your X34, IIRC. The cheapest Vega 56 is $399, and it runs on a 650 W supply. That $200 in savings could have upgraded you to a Vega 64, albeit with a bigger generic power supply to replace your Seasonic (a Rosewill 850 W would have done the job; while I'm not a big fan of them, they will get it done with plenty of headroom).
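
Tallying the thread's own numbers, card plus monitor only and PSUs left out (a rough sketch; these are just the prices quoted above, and street prices obviously move around):

# Prices taken from this thread, not current market data.
gsync_setup = {"Acer X34 (G-Sync)": 1000, "GTX 1070": 450}
freesync_setup = {"ASUS MX34VQ (FreeSync)": 799, "Vega 56": 399}

for name, parts in (("G-Sync", gsync_setup), ("FreeSync", freesync_setup)):
    print(f"{name} setup: ${sum(parts.values())}")
print(f"Difference: ${sum(gsync_setup.values()) - sum(freesync_setup.values())}")
# G-Sync: $1450, FreeSync: $1198 -> about $250 apart, roughly the couple hundred dollars being argued over.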

But that's the mea culpa of tech: it keeps advancing. You still have a nice setup.
 

Not reasonably fair?

No, I think Kyle did all he could given his focus for the test, to ensure a level field.

That being said, that is not the entire issue at all and here we go.

First, do you think these gamers didn't recognize if they were on the ASUS PG348 vs the ASUS MX34V? I did.

The ASUS PG348 is the left monitor, the ASUS MX34V is on the right.

Now I am pretty sure that Kyle either missed it or couldn't reasonably do anything about it. But there is an LED showing on the MX34V that any owner of a PG348 would identify as not supposed to be there. Look low and in the middle on the right-side system in the video. The blind test went right out the window at that point for any of these guys who have owned a PG348.

Second, Kyle said he dumbed down the visual quality, but it brings me back to what I saw on Newegg's site when I was looking at the MX34V: 16.7 million colors vs 1.07 billion. What effect does this have on performance? Is it a different bit depth? Does it reduce the workload for a machine using the MX34V so that it can run faster, or does it only affect the color reproduction at the display? I don't know for sure myself, but I do know that visual quality is part of what a buyer pays for, and although I agree that you can run a fair test of pure performance as Kyle did, removing visual quality from the equation, I also feel it isn't right to insist that the price difference between the two monitors is only about performance and that other differences play no role in it. I think it wasn't right to ask the money question. I get where it comes from, but tell me, if he had left it out, would it have had an impact on the outcome?
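
For what it's worth, those two color counts line up with 8-bit vs 10-bit per channel across three channels (my assumption based on the listed specs); a quick back-of-envelope check:

for bits in (8, 10):
    colors = (2 ** bits) ** 3   # three channels at `bits` bits each
    print(f"{bits}-bit per channel: {colors:,} colors")
# 8-bit  -> 16,777,216    (~16.7 million)
# 10-bit -> 1,073,741,824 (~1.07 billion)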

Stripping out visual quality and knowledge of other capabilities, and only showing people what looks and feels almost identical, was intended to isolate the FreeSync and G-Sync performance for an unbiased comparison. But you can't get away from the reality that you are comparing a VA panel to an IPS panel; has no one heard of IPS glow, and would experienced gamers really not recognize an IPS panel from a VA panel?

What about input lag? Does panel type affect responsiveness? I found a report that the PG348 has 13 ms of input lag, but I found no reports on the MX34V other than a Russian site that tested input lag at 1080p and 60 Hz. Maybe that's how you do test input lag, but it sounds wrong to me.

So again, you asked if I thought it was a reasonably fair test, and I think it was as reasonably fair as Kyle could make it, although I think asking the $300 question wasn't right and wasn't needed, because there is more to it and much of that more was hidden. I think a better question would have been to ask them if they could recognize which system they were on by any means.

I do think that Kyle did about the best you could do, except that the panel types needed to be the same because they make a difference, but that's very hard to do at this end of the ultrawide spectrum; there are very few choices.

I hope you accept this as a complete answer, just like I hope you realize that I know a loaded question when I see it. :sneaky:
 
I stand corrected. DisplayPort was the first to introduce it, not HDMI. HDMI didn't support it until 2.1.

It kind of went down like this (chronologically speaking)

NVIDIA announced G-Sync first.
The AMD Radeon group looked at it and went, "I know, that's an optional part of the DP spec." No one had really looked at it twice, to be honest, before AMD. This is how AMD counter-fired so quickly: it was already there in the spec. It just needed to be implemented in the decoder/scaler hardware by the screen manufacturer. And it's likely the reason they couldn't charge royalties for it, because it is native to the standard.

I'm pretty sure others here will back me up on this.


I will admit there were some pretty @#$@# bad implementations of FreeSync. And this is one of the things FreeSync 2 was designed to address, and why there is a more stringent certification program now.



That ASUS FreeSync panel is the exact same one used in your X34, IIRC. The cheapest Vega 56 is $399, and it runs on a 650 W supply. That $200 in savings could have upgraded you to a Vega 64, albeit with a bigger generic power supply to replace your Seasonic (a Rosewill 850 W would have done the job; while I'm not a big fan of them, they will get it done with plenty of headroom).

But that's the mea culpa of tech: it keeps advancing. You still have a nice setup.

Still having issues with what you are saying here.

See, things didn't happen the way you think they did when it came to making FreeSync work with the DisplayPort 1.2 standard. Here is what actually happened: AMD went to the VESA standards board, demonstrated their FreeSync implementation on a laptop, and asked the board to modify the standard so that FreeSync would function.
I had to translate this from French, I believe:
http://www.hardware.fr/news/13545/amd-freesync-proposition-dp-1-2a.html

Did it help that Syed Ather Hussain was a display architect for AMD, and the sitting Vice Chairman of the VESA Board at this time?
“DisplayPort Adaptive-Sync enables a new approach in display refresh technology, ” said Syed Athar Hussain, Display Domain Architect, AMD and VESA Board Vice Chairman.
https://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/

Ah well, although I actually commend AMD for coming up with what is certainly an affordable adaptive sync solution, had they not convinced the VESA board to change the DisplayPort standard so that it would allow FreeSync to function, AMD would have had no way to blunt NVidia's domination of adaptive sync technology short of following NVidia's lead. Since it really didn't cost AMD anything to get FreeSync off the ground, it worked out well: even "giving it away for free" earned them terrific PR and allowed them to fend off NVidia's dominating position.

As for the monitor panel, I'm using Newegg's description of the specs and it says the ASUS MX34VQ has a VA panel. Is this not the monitor you are talking about?
https://www.newegg.com/Product/Product.aspx?Item=N82E16824236803

And that $200 could not have upgraded me to a video card that had not yet been released. When I bought my 1070 they were almost new, coming in stock and going back out as if they were on stoplight controls.
EVGA GeForce GTX 1070 SC GAMING ACX 3.0, 08G-P4-6173-KR, 8GB GDDR5, LED, DX12 OSD Support (PXOC)
• DELIVERED ON: Thursday, June 30, 2016 9:14 AM
• DESTINATION: SIERRA VISTA, AZ
• $444.98

And my Acer X34 was purchased and arrived
  • DELIVERED ON: Wednesday, February 8, 2017 7:54 PM
I believe there is a timing problem:
AMD Vega launches August 14 with RX Vega Nano to follow later on

I had set my choices long before Vega was a reality. My 1070 was in my computer a full year before Vega launched, and the final component, the X34, had already been bought and in use for six months as well.

If I were buying today it could have been different, but in retrospect, the exact card I bought in June of 2016 is on sale today for only $5 less.
https://www.newegg.com/Product/Product.aspx?Item=N82E16814487248

I'm sort of glad that I didn't wait for the price to drop (y)
 
You're incorrect: you need quite a few chips of GDDR to get the bandwidth, while HBM can get it with one stack, and this has been an obvious end goal for HBM despite you, Factum, and others saying it would never happen. Your buddy Juangra even said it would not use EMIB, and he was wrong, as usual. It's actually Intel's EMIB which makes this simple and more cost-effective, which just has to be killing you. You're even down to bashing KBL to try to discredit this product, and we're all having a good laugh watching you melt down over it.
HBM/2 sucks for the consumer. And this is proof that AMD hasn't been focused on the consumer for quite a while now.
 
Yet without it they could not have pushed the frequency as high on Vega; power draw is already too high. They had little choice in the matter, and it cost them as well, since it delayed Vega while waiting on HBM production. I expect Navi to use HBM as well; good or bad, that is what I expect them to do. HBM, and the ability to use it on an APU, has always seemed like the end goal for it, since lacking dedicated memory for the GPU side of an APU has always hurt performance. If they can make this powerful enough then Nvidia could be in real trouble, and that would make Intel and AMD happy. We'll have to see how it turns out.
Yeah, how it turns out... How about where it has been for the last couple of years?
 
Yet without it they could not have pushed the frequency as high on Vega; power draw is already too high. They had little choice in the matter, and it cost them as well, since it delayed Vega while waiting on HBM production. I expect Navi to use HBM as well; good or bad, that is what I expect them to do. HBM, and the ability to use it on an APU, has always seemed like the end goal for it, since lacking dedicated memory for the GPU side of an APU has always hurt performance. If they can make this powerful enough then Nvidia could be in real trouble, and that would make Intel and AMD happy. We'll have to see how it turns out.

From what I pieced together about Navi, I don't believe this is likely. Or at the very least it will be a hybrid approach where the central scheduler deals with HBM and the SP/CUs deal with more conventional memory, embedded cache, and possibly a ring bus implementation. HBM wouldn't work well with multiple chips due to its serial nature and its use of a silicon interposer requiring close proximity. It all depends on how the scheduler chip dishes it out. The truth is, a shader/compute module only needs access to a relatively small subset of memory to render its tile; that would work well off cache. The problem comes when you need data from other tiles, or need to modify data in other tiles. And it's a very different beast from SLI/Crossfire.
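
To make the "each module only touches its own tile" idea concrete, here is a toy sketch (entirely my own illustration, not AMD's design) of splitting a framebuffer into tiles owned by separate compute modules, with cross-tile accesses flagged as the expensive case described above:

# Toy tile-ownership model; resolution, tile size, and module count are made up.
WIDTH, HEIGHT = 3840, 2160
TILE = 256        # tile edge in pixels
MODULES = 4       # hypothetical compute modules/chiplets

def owner_of(x, y):
    """Module that owns the tile containing pixel (x, y)."""
    tx, ty = x // TILE, y // TILE
    tiles_per_row = (WIDTH + TILE - 1) // TILE
    return (ty * tiles_per_row + tx) % MODULES

def is_cross_tile_access(src, dst):
    """True if a pixel in one module's tile needs data owned by another module."""
    return owner_of(*src) != owner_of(*dst)

print(owner_of(100, 100))                            # module owning tile (0, 0)
print(is_cross_tile_access((100, 100), (400, 100)))  # True: neighbouring data lives in another module's tile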
 
I don't agree with that. Intel can play the Nvidia game just as well if it needs to, and it will, seeing as they were willing to make this move.
Cool. It has been a huge cash cow for them in terms of availability and cost for years now. /s
 