Well so much for AMD getting priority with HBM 2

I feel so much better reading this after all the ignorance that has been spewed out by AMD lovers over the past few weeks.
 
It hasn't been a good week for the AMD faithful. First Jim Keller leaves and now their hope/fantasy of HBM 2 exclusivity/priority has been dashed.

The AMD fans have been pretty ridiculous. Of course, according to Kyle I was "trolling" and thus got banned for calling out their bullshit. lol
 
We keep hearing about performance metrics, but what about driver quality, frequency of driver and multi-GPU profile updates, and the overall ecosystem? NVIDIA has AMD whipped in all of those, and that is a large reason people avoid AMD, myself included. NVIDIA also has much better developer relations, which helps a lot with games working properly on release, unlike AMD.

Plus AMD is a follower and releases inferior technology like FreeSync, thinking they can fool us, when in reality it's just their deluded and shrinking fan base that buries their heads in the sand and accepts the BS AMD shovels at them.

Personally, I think the PC gaming market will be better off once AMD finally goes out of business. Then game developers won't have to contend with AMD's second-tier software and hardware and can dedicate their resources to optimizing for one GPU maker. Imagine if Rockstar didn't have to waste their time optimizing their code for AMD's inferior hardware and software: they could have added more features and bug fixes while concentrating only on NVIDIA, and GTA V would probably be a better game and have come sooner.

Oh my, what an ignoramus...

Apart from the driver drivel, which is just wrong, that last part is just... I can't even describe how out of touch with reality it is.
 
There already isn't any competition.
That's not true at all. The 300-series is very competitive with the 970 and 980. The only area where NVIDIA soundly beats AMD right now is with the 980 Ti. If AMD dropped the prices on Fury to more reasonable levels they would have a much more competitive product.

If I was spending $300 right now today I would get a 390.
 
What I find most interesting about this thread is that it's not only old news, but that it was broken by wccftech back on Sept 6th.

AMD already came out and said they weren't going to slow down NVIDIA and that Pascal would ship with it.

All the hardcore are acting like this is new news. AMD isn't NVIDIA and wants the tech to be available.

http://wccftech.com/amd-squashes-rumors-hbm-ip-licensing-fees-memory-standard-free/

HBM was CO-DEVELOPED by AMD and SK Hynix, a company that makes and sells memory for a living. It would make zero sense for SK Hynix to partner up on a long-term endeavor with some stipulation that "you can't sell to company X".

This sort of thing happens all the time in the tech industry. Samsung makes parts that go in iPhones (the iPhone 4 GPU and flash RAM were Samsung), yet these companies compete in the smartphone market. Toshiba makes the Cell processor used in PlayStations, even though they were competitors in the format war (HD DVD vs. Blu-ray). These are just the examples I know off the top of my head. They all buy each other's parts; it's cheaper and more efficient to do so. AMD likely hasn't bought any NVIDIA parts, because NVIDIA is not a manufacturer. Both companies' cards have been made by the same OEM vendor (Flextronics) in the past, back when they didn't manufacture their own cards but wanted to sell their own cards.

Whoever started a rumor otherwise was definitely trolling...
 
It's basically only cooler if you want the same perf as your 980. The high-end cards will likely still put out just as much heat.

I would have to disagree, judging by the historical background and development of computer processing technologies. Take, for example, the Pentium 4 vs. the Core 2 Duo. Pascal will be like something never seen before in the history of video card processing technology.
Try again
 
The biggest thing is... WHO CARES. All the future GPUs will have it, we all knew that, so why does it matter? We already have it on Fury and we see what good that does it performance-wise.
 
The biggest thing is... WHO CARES. All the future GPUs will have it, we all knew that, so why does it matter? We already have it on Fury and we see what good that does it performance-wise.

Just like when AMD was the first to release the AMD64 architecture for their CPUs back in 2003. A lot of good that did for the consumer when everyone was running 32-bit Windows XP.
In 6 months we will have new motherboards that make better use of HBM bandwidth, as well as technologies incorporated within the processor when communicating with the GPU. Either way, I think Pascal will be a big jump for 4K+ monitor owners, enjoying 60-120+ FPS on a single card. We should also see an improvement with next-gen video cards and Windows 10 DX12 if game programmers take advantage of it.

"In addition to the HBM2 bandwidth boost, Nvidia says it will enhance the performance of Pascal GPUs with NVLink. This interconnect technology is said to allow data sharing "at rates 5 to 12 times faster than the traditional PCIe Gen3 interconnect".

Will this technology be incorporated on motherboards? I would assume so...

My point is that AMD has been known to release new technologies to the consumer that don't seem to take off, so to speak. Another case in point is Mantle. I think A-Sync will do well since Intel is backing it.



http://wccftech.com/intel-vesa-adaptive-sync-monitors-amd-free-sync/

Intel’s Skylake units do not contain the necessary hardware to back ASync right now but future processors may do so. Since both GSync mobility and FreeSync mobility appear to use the VESA standard over eDP – it would be possible for future Intel mobility processors to support the mobile version as well – but that is something that still has a long way to go considering the actual pioneering tech isn’t mainstream yet.

 
The biggest thing is... WHO CARES. All the future GPUs will have it, we all knew that, so why does it matter? We already have it on Fury and we see what good that does it performance-wise.

I think the reason this was brought up is that wccftech themselves wrote about AMD having exclusivity, and since then many people quote that exclusivity claim when discussing when Pascal will arrive, some stating that Pascal will be delayed and such due to that exclusivity. So in that sense, it concerns the Pascal release.

Not that I think it ever made any sense, but with that rumor put to rest, at least we now know that the Pascal release won't depend on that, but rather on other factors such as TSMC's ability to produce its 16nm chips.


And yeah, I do think the advantage of HBM was probably misrepresented. Many people expected a huge improvement without realizing that any gains would only come from whatever performance was being held back by memory bandwidth. The actual performance still depends on the GPU; what HBM does is merely remove a bottleneck to that performance.

When we consider how fast GDDR5 is today, along with other ways of mitigating bandwidth bottlenecks such as cache memory and data compression, it's not surprising that we're not seeing the benefits of HBM yet. Maybe when we have GPUs capable of rendering high-quality graphics at 4K we'll need the extra bandwidth.
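To put rough numbers on that bottleneck argument, here's a quick back-of-the-envelope sketch; the bus widths and per-pin rates are ballpark figures for a 980 Ti-class GDDR5 card and a Fury X-class HBM card, just for illustration:

```python
# Back-of-the-envelope peak bandwidth comparison: GDDR5 vs first-gen HBM.
# Figures are approximate, used purely for illustration.

def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps_per_pin):
    """Peak memory bandwidth in GB/s = bus width (bits) * per-pin rate (Gbps) / 8."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

gddr5 = peak_bandwidth_gbs(384, 7.0)    # ~336 GB/s (384-bit bus, ~7 Gbps per pin)
hbm1  = peak_bandwidth_gbs(4096, 1.0)   # ~512 GB/s (4096-bit bus, ~1 Gbps per pin)

print(f"GDDR5 (384-bit @ 7 Gbps/pin):  ~{gddr5:.0f} GB/s")
print(f"HBM1  (4096-bit @ 1 Gbps/pin): ~{hbm1:.0f} GB/s")

# The extra bandwidth only shows up as FPS if the workload was actually
# bandwidth-bound; if the shader cores are the limit, the gain is small.
```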
 
All I care about: GPUs stay competitive and keep driving the industry forward.
 
Consumers (we) lose if AMD exits the discrete GPU market. Less competition means less innovation. Not sure why so many people are excited, if not eager, to dance on AMD's grave.

^^This. Nvidia fanboys = Apple iSheep.
Me personally, I had the Radeon HD 8750M and 7950 Boost, but went with a GIMPED 970 just because it was new in Q3-Q4 2014. Wrong move!
 
I don't understand these deliberate troll attempt threads. I should just quit the internet.
Pascal gets HBM2 on time, great. I will probably buy one.
Then I could take the argument, well Nvidia is a day late and an exclusivity contract short. AMD was first to HBM. :rolleyes:
OP hates AMD. We know. /thread
So AMD did have HBM first, but didn't really do much with it even being first to market.

Well, AMD put HBM on a GPU not designed from the ground up for it and got marginal benefit. I'm hoping the next-gen GPUs from both camps are designed to take full advantage of HBM and that we see something amazing as a result.
With rumors saying Pascal will have 5000-6000 CUDA cores, maybe ROPs will be up in that area too, around 128-150.

What I find most interesting about this thread is that it's not only old news, but that it was broken by wccftech back on Sept 6th.
AMD already came out and said they weren't going to slow down NVIDIA and that Pascal would ship with it.
All the hardcore are acting like this is new news. AMD isn't NVIDIA and wants the tech to be available.
http://wccftech.com/amd-squashes-rumors-hbm-ip-licensing-fees-memory-standard-free/
If I remember right, the standard they used for HBM means they could charge a fee for it. Plus, their whole stance of "free and open standards" would go out the window if they tried to.

Nobody doubted that HBM 2 would come to Pascal. However, WCCFTech themselves and the AMD faithful mistakenly believed that AMD would get some kind of timed exclusive or priority with HBM 2, which was never true or reasonable. That is what Hexus's article addresses. To address the bolded part above, AMD isn't doing this out of the kindness of its heart; it needs NVIDIA to drive the price of HBM 2 down.
HBM was CO-DEVELOPED by AMD and SK Hynix, a company that makes and sells memory for a living. It would make zero sense for SK Hynix to partner up on a long-term endeavor with some stipulation that "you can't sell to company X".
I don't think SK Hynix would have agreed to be exclusive with AMD. NVIDIA has too much money to be locked out like that and would have gone to someone else to make chips to compete.

That's not true at all. The 300-series is very competitive with the 970 and 980. The only area where NVIDIA soundly beats AMD right now is with the 980 Ti. If AMD dropped the prices on Fury to more reasonable levels they would have a much more competitive product.
If I was spending $300 right now today I would get a 390.
Fury already had a price cut from the original price that the rumor stated, and rumors are usually right. Fury was looking like it would have an $850 price tag, but when the 980 Ti dropped at $650 AMD had no choice but to match it, so they already took a profit hit on that; they might only be breaking even at this point.
 
That's not true at all. The 300-series is very competitive with the 970 and 980. The only area where NVIDIA soundly beats AMD right now is with the 980 Ti. If AMD dropped the prices on Fury to more reasonable levels they would have a much more competitive product.

If I was spending $300 right now today I would get a 390.

AMD is only a worthwhile option if you ignore things like power efficiency, cooling, actual playability and driver stability.
 
AMD is only a worthwhile option if you ignore things like power efficiency, cooling, actual playability and driver stability.

You do know that AMD is king with the 390?
Cool, efficient and fast :D
Best card in the world next to Fury

It's so silent and cool that I have tears in my eyes, I am so happy with the card.

I find kids like yourself have no idea how greatness in the world works.
It's called the AMD 390 :D
 
I would have to disagree, judging by the historical background and development of computer processing technologies. Take, for example, the Pentium 4 vs. the Core 2 Duo. Pascal will be like something never seen before in the history of video card processing technology.
Try again

Really? I'm already signed up for a Titan D and all, but I expect it to use just as much power as, if not more than, the Titan X. Plus the power density increases greatly, so I expect these to be much harder to cool. If the memory is on the package, that'll be even more heat to dissipate. My 5960X pulls over 300 watts, and that's the best enthusiast chip.

Will it be cooler per transistor? Sure! But you have twice as many. Look at Intel: did we have some magical jump between node shrinks (factor in a double shrink if you want)? No.

Perhaps lay off the nVidia marketing materials for a little while...
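For what the heat-density argument looks like in numbers, here is a rough sketch; the board power and die areas are made-up round figures, not real Pascal or Maxwell specs:

```python
# Crude power-density comparison for the "same board power, smaller die" case.
# Both parts are hypothetical; the point is the W/mm^2 trend, not the values.

parts = {
    "28nm big die (hypothetical)":    {"power_w": 250, "area_mm2": 600},
    "16nm FinFET die (hypothetical)": {"power_w": 250, "area_mm2": 350},
}

for name, p in parts.items():
    density = p["power_w"] / p["area_mm2"]
    print(f"{name}: {density:.2f} W/mm^2")

# The same ~250 W spread over a smaller die means a higher heat density,
# which is harder to pull out of the silicon even if perf/W improves.
```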
 
You do know that AMD is king with the 390?
Cool, efficient and fast :D
Best card in the world next to Fury

It's so silent and cool that I have tears in my eyes, I am so happy with the card.

I find kids like yourself have no idea how greatness in the world works.
It's called the AMD 390 :D

I can't see the logic in purchasing a 3-year-old rebranded GPU that draws almost 500 W at load and has next to no functionality on Linux.
 
The AMD fans have been pretty ridiculous. Of course, according to Kyle I was "trolling" and thus got banned for calling out their bullshit. lol

"Drivers have been pretty ridiculous going the speed limit. Of course according to the police I was "speeding" and thus got a ticket for passing those slow fucks. lol"

You see how this sounds?
 
Fury already had a price cut from the original price that the rumor stated, and rumors are usually right. Fury was looking like it would have an $850 price tag, but when the 980 Ti dropped at $650 AMD had no choice but to match it, so they already took a profit hit on that; they might only be breaking even at this point.
The card launched at $649. It was never $850. Rumors don't matter.

AMD is only a worthwhile option if you ignore things like power efficiency, cooling, actual playability and driver stability.
Power efficiency isn't great but the 3rd party coolers handle it just fine. I actually have AMD cards and haven't had any issues with stability or playability on them. The girlfriend's PC has an overclocked 290 and she's played through tons of games in the last year without any issues at all.
 
The card launched at $649. It was never $850. Rumors don't matter.

Power efficiency isn't great but the 3rd party coolers handle it just fine. I actually have AMD cards and haven't had any issues with stability or playability on them. The girlfriend's PC has an overclocked 290 and she's played through tons of games in the last year without any issues at all.

I'll agree that I've never seen stability issues from either vendor on a single card. I have rigs with both cards too.

I've seen stability issues from both vendors with multi-GPU.
 
All I care about: GPUs stay competitive and keep driving the industry forward.

No way dude, I'll only feel better about myself when my corporation of choice utterly destroys their competition.
 
I'll agree that I've never seen stability issues from either vendor on a single card. I have rigs with both cards too.

I've seen stability issues from both vendors with multi-GPU.
I've had 5870 CFX, 580 SLI, 680 SLI, and 290X CF.

I never had stability issues except for a bug with the X58 chipset and my 290Xs, which wasn't AMD's fault. The 5870s sucked in CrossFire at the time, but the 290X CF setup was infinitely better. I ended up replacing my 290X CF setup with a 980 Ti to use G-SYNC, but it was a very powerful setup, and the only issues I had were that the reference coolers were super loud, plus a few graphical issues with Witcher 3 in CFX.
 
It has been known for a while now that high-end Pascal cards would use HBM2. I don't see why exactly this is news now. :confused:

That said, I do feel for AMD fans, especially considering that I was an Athlon/Athlon XP/Athlon 64/Athlon 64 X2 user. I completely avoided Intel for almost 8 years because AMD was a viable alternative. Not so much an ATi fan, though, but I do admire their R300, R480, R520 and R580 series GPUs. Also the R100 and R200 series architectures were pretty cool for the time, even if they weren't as fast as their NV1x and NV2x counterparts.

I still say the R300 is one of the best video cards ever designed. I also loved the sweet deal I got by getting the 9500 on the 9700 board, then soft modding it for a free upgrade :D.
 
There is an obvious solution to AMD's issues these days.

Intel needs to up their GPU presence, and Nvidia needs to start making X86 CPUs. :p Competition.
 
Really? I'm already signed up for a Titan D and all, but I expect it to use just as much power as, if not more than, the Titan X. Plus the power density increases greatly, so I expect these to be much harder to cool. If the memory is on the package, that'll be even more heat to dissipate. My 5960X pulls over 300 watts, and that's the best enthusiast chip.

Will it be cooler per transistor? Sure! But you have twice as many. Look at Intel: did we have some magical jump between node shrinks (factor in a double shrink if you want)? No.

Perhaps lay off the nVidia marketing materials for a little while...


Regarding your last sentence: I would just accept it as soon-to-be reality that Pascal will kick the shit out of AMD a lot sooner rather than later if AMD does not come up with something better than a Fury.

A company and organization like NVIDIA does not have to come up with lies to get customers sucked into waiting for its products. AMD on the other hand, with the Mantle fiasco announcement... now that was funny and still is today.

Pascal

TSMC's 16FF+ (FinFET Plus) technology can provide above 65 percent higher speed, around 2 times the density, or 70 percent less power than its 28HPM technology. Compared with 20SoC technology, 16FF+ provides an extra 40% higher speed and 60% power saving.
 
Regarding your last sentence: I would just accept it as soon-to-be reality that Pascal will kick the shit out of AMD a lot sooner rather than later if AMD does not come up with something better than a Fury.

A company and organization like NVIDIA does not have to come up with lies to get customers sucked into waiting for its products. AMD on the other hand, with the Mantle fiasco announcement... now that was funny and still is today.

Pascal

TSMC's 16FF+ (FinFET Plus) technology can provide above 65 percent higher speed, around 2 times the density, or 70 percent less power than its 28HPM technology. Compared with 20SoC technology, 16FF+ provides an extra 40% higher speed and 60% power saving.

I never said anything about AMD so it's funny half your post is bashing AMD... History has proven AMD to be slightly behind in performance. There's no reason to think AMD isn't moving to 16FF+ or something comparable. I would expect the same situation as we have today where AMD is slightly behind.

You just quoted marketing material. Notice the word "can". Don't get me wrong, a double node shrink (skipping 20nm) will do all sorts of good with double the cores, etc. It's still within the realm of physics, though. A maxed-out, overclocked core is going to put out serious heat and have a higher heat density than before, especially with the VRAM on the package. It's a reality for both vendors. Also more fun for us enthusiasts to try and cool. :D
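To illustrate that "can", here's a sketch applying each of TSMC's quoted 16FF+ numbers to a hypothetical 28nm baseline; they are alternative corners of the design space, not gains you get all at once:

```python
# Applying the quoted 16FF+ vs 28HPM figures to a made-up 28nm baseline GPU.
# The figures are marketing alternatives (speed OR density OR power), not a stack.

baseline = {"clock_mhz": 1000, "power_w": 250, "transistors_bn": 8.0}  # hypothetical

speed_corner   = baseline["clock_mhz"] * 1.65       # "above 65 percent higher speed"
density_corner = baseline["transistors_bn"] * 2.0   # "around 2 times the density"
power_corner   = baseline["power_w"] * (1 - 0.70)   # "70 percent less power"

print(f"Speed corner:   ~{speed_corner:.0f} MHz at similar power")
print(f"Density corner: ~{density_corner:.0f} bn transistors in the same area")
print(f"Power corner:   ~{power_corner:.0f} W at similar speed")

# A shipping GPU lands somewhere between these corners, which is why doubling
# the cores at full clocks won't also deliver the full power saving.
```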
 
No way dude, I'll only feel better about myself when my corporation of choice utterly destroys their competition.

Remember the good old days when AMD had the FX chip and the shares were up at $16.00 a share? The present day doesn't look good for AMD, as of 22 Sept:

Advanced Micro Devices, Inc. (NASDAQ: AMD) - Sep 22, 4:09 PM EDT
$1.73, down $0.08 (4.42%)
After-hours: $1.72, down $0.01 (0.45%)

I hope AMD comes back with some good old competition in the processor market again. I still don't understand why they like to use gold pins under their processors. Maybe they can bring back the pencil-lead trick so HardOCP users like ourselves can get an 8-core processor for the price of a dual-core :D
Just go get that number 2 pencil and draw it in. BAM, 8-12 cores for FREE.


https://www.youtube.com/watch?v=WPERGiOegl4
 
I never said anything about AMD so it's funny half your post is bashing AMD... History has proven AMD to be slightly behind in performance. There's no reason to think AMD isn't moving to 16FF+ or something comparable. I would expect the same situation as we have today where AMD is slightly behind.

You just quoted marketing material. Notice the word "can". Don't get me wrong, a double node shrink (skipping 20nm) will do all sorts of good with double the cores, etc. It's still within the realm of physics, though. A maxed-out, overclocked core is going to put out serious heat and have a higher heat density than before, especially with the VRAM on the package. It's a reality for both vendors. Also more fun for us enthusiasts to try and cool. :D

I am just posting facts; you, on the other hand, put words in people's mouths to suit whatever makes you feel good. I have noticed you like to make up theoretical technical issues with Pascal like you are a theoretical physicist.
Perhaps you know this person:


[Attached image: sci-fisciencephysicsoftheimpossible.jpg]


A maxed-out, overclocked core is going to put out serious heat and have a higher heat density than before, especially with the VRAM on the package. It's a reality for both vendors. Also more fun for us enthusiasts to try and cool. :D

And history shows AMD has always had this issue

more often than Intel or NVIDIA.
 
I am just posting facts; you, on the other hand, put words in people's mouths to suit whatever makes you feel good. I have noticed you like to make up theoretical technical issues with Pascal like you are a theoretical physicist.
Perhaps you know this person:


[Attached image: sci-fisciencephysicsoftheimpossible.jpg]




And history shows AMD has always had this issue

more often than Intel or NVIDIA.

I generally use real-world experience... You're the one quoting theoretical marketing numbers. My point was that with Intel's high end, power usage hasn't gone down with node shrinks.

Performance per watt is a lot better and will be with Pascal. Actual power usage likely won't change much.
 
I generally use real-world experience... You're the one quoting theoretical marketing numbers. My point was that with Intel's high end, power usage hasn't gone down with node shrinks.

Performance per watt is a lot better and will be with Pascal. Actual power usage likely won't change much.

I think you have a problem with what NVIDIA is telling us.

Like I said earlier, a company with huge amounts of capital does not need to provide false advertising to the consumer about its future products.
I am talking about the Pascal node shrink and the benefits it will bring: faster, cooler, and less electricity than any card made today. HBM with NVLink will be greater than anything we have seen 6 months from now, and YES, it will be faster and cooler while using a lot less electricity.

Let's talk about Pascal.
I am curious to know when NVLink will be available on consumer motherboards.

http://www.anandtech.com/show/7900/nvidia-updates-gpu-roadmap-unveils-pascal-architecture-for-2016
Coming to the final pillar then, we have a brand new feature being introduced for Pascal: NVLink. NVLink, in a nutshell, is NVIDIA’s effort to supplant PCI-Express with a faster interconnect bus. From the perspective of NVIDIA, who is looking at what it would take to allow compute workloads to better scale across multiple GPUs, the 16GB/sec made available by PCI-Express 3.0 is hardly adequate. Especially when compared to the 250GB/sec+ of memory bandwidth available within a single card. PCIe 4.0 in turn will eventually bring higher bandwidth yet, but this still is not enough. As such NVIDIA is pursuing their own bus to achieve the kind of bandwidth they desire.
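Taking the quoted figures at face value, the arithmetic looks roughly like this (assuming ~16 GB/s for a PCIe 3.0 x16 link, per the article):

```python
# Sanity check on the quoted NVLink claim: "5 to 12 times faster than PCIe Gen3".

pcie3_x16_gbs = 16.0              # approximate PCIe 3.0 x16 bandwidth, per the article
nvlink_low  = 5 * pcie3_x16_gbs   # 80 GB/s
nvlink_high = 12 * pcie3_x16_gbs  # 192 GB/s

print(f"PCIe 3.0 x16:     ~{pcie3_x16_gbs:.0f} GB/s")
print(f"NVLink (claimed): ~{nvlink_low:.0f}-{nvlink_high:.0f} GB/s")
print("Local GDDR5/HBM:  ~250-512+ GB/s")

# Even the high end of the claim is below on-card memory bandwidth, but the
# gap to local memory shrinks a lot compared to PCIe.
```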



Nvidia starts building the 'world's most powerful' supercomputer using Nvlink


http://www.theinquirer.net/inquirer...ilding-the-worlds-most-powerful-supercomputer



This is why we need NVlink and Pascal

http://www.cnet.com/news/lg-outs-imac-with-8k-display-for-later-this-year/

The display maker says 8K means images as sharp as the human eye can discern, and indicates that Apple has an "iMac 8K" coming this year.


Can anyone tell me how AMD plans to release a video card to drive an 8K monitor within the next 6-8 months? I don't see anyone gaming on this monitor with an AMD card anytime soon.
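For scale, here's the raw pixel and uncompressed-signal math for 1080p, 4K and 8K at 60 Hz (pure arithmetic, ignoring link overhead and compression):

```python
# Pixel counts and uncompressed 24-bit, 60 Hz signal rates for common resolutions.

resolutions = {"1080p": (1920, 1080), "4K UHD": (3840, 2160), "8K UHD": (7680, 4320)}

for name, (w, h) in resolutions.items():
    px = w * h
    gbit_per_s = px * 24 * 60 / 1e9   # 24 bits per pixel, 60 frames per second
    print(f"{name}: {px / 1e6:.1f} MP, ~{gbit_per_s:.1f} Gbit/s uncompressed @ 60 Hz")

# 8K is 4x the pixels of 4K and 16x 1080p, so shading cost and display
# bandwidth both scale roughly the same way.
```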
 
Can anyone tell me how AMD plans to release a video card to drive an 8K monitor within the next 6-8 months? I don't see anyone gaming on this monitor with an AMD card anytime soon.

Well, considering that traditionally AMD has better performance in higher resolutions, I don't think it's too hard for them.
 
Well, considering that traditionally AMD has better performance in higher resolutions, I don't think it's too hard for them.

However, in the meantime their shares are currently at a historic record low for the company.

I find it hard to believe right now that this will not be too hard for them, like you say.


18 Sept

http://www.fool.com/investing/gener...-may-not-survive.aspx?source=eogyholnk0000001

Tim Green: Advanced Micro Devices (NASDAQ: AMD) has seen its share of both the CPU and GPU markets collapse over the past few years. The company has been slashing research and development spending as revenue has declined, and the once-profitable business has now reported a string of big losses. During the second quarter, AMD reported a net loss of $181 million, with sales slumping 35% year-over-year.

[Image: AMD A-Series processor. Source: AMD]

Deep problems plague the company. AMD CPUs can't match Intel in terms of performance, and the recent weakness in the PC market has put even more pressure on AMD's core business. In the discrete GPU market, AMD has been bleeding share to NVIDIA for the past year. During the second quarter, AMD shipped just 18% of discrete GPUs, down from about 40% during the second quarter of 2014. AMD's computing and graphics segment, which includes both CPUs and GPUs, reported a 54% year-over-year decline in revenue during the second quarter, and the only thing keeping the company afloat are sales of chips that go into game consoles.

AMD has enough cash, about $829 million, to keep going for a while, but with $2.2 billion of debt and no sign that these losses will let up anytime soon, a turnaround is looking increasingly remote with each passing quarter. The launch of new CPU chips based on the Zen architecture, planned for next year, is AMD's best chance at reversing the troubling trends plaguing its business. But Zen looks to be a Hail Mary for AMD; anything short of perfection may not be enough to save the company.
 
So you are resorting to cursing?
Hmmm.
You are on my ignore list because obviously you are not too educated when it comes to Finance 101.

One of the better ones I've heard this week.

Nope, you're wrong

and guess what, you're now on my ignore list.
Have a nice day.

Touched a nerve?
 
What the fuck does that have to do with anything?


Stock price by itself doesn't have anything to do with it, but if we look at actual assets plus cash and cash equivalents vs. debt, and compare that to AMD's market performance over the past 2 years, that tells us why the stock price is so low.

And at the current rate they are burning cash, they only have about 5 quarters to go before they have no cash. So, UnrealCpu, $829 million in cash isn't much for them at this point.
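A rough check of that runway estimate, using the cash and quarterly-loss figures quoted from the article above (net loss isn't the same as cash burn, so treat it as a ballpark only):

```python
# Ballpark cash-runway estimate from the figures quoted in the article.

cash_on_hand_musd = 829      # cash and equivalents, in $M
quarterly_loss_musd = 181    # Q2 net loss, in $M

runway_quarters = cash_on_hand_musd / quarterly_loss_musd
print(f"Rough runway: {runway_quarters:.1f} quarters")   # ~4.6 quarters
```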
 