From ATI to AMD back to ATI? A Journey in Futility @ [H]

Ford stopped using Microsoft in its Sync systems in 2015. They now use QNX (BlackBerry) as the OS.

Could they have made a worse choice?

The first Sync was built on Windows Mobile 6, and now this. What is with Ford picking up defunct, loser operating systems?
 
Must not be a Monday car lol
My '80 LTD had 500K on it; I moved that 302 to my '80 Fairmont later, but the engine scarred a cylinder from a stuck ring after sitting a tad too long. Eventually I bought this '87 EXP off my dad. Gotta say I am thoroughly impressed with this car. Helps that it is only 2,400 lbs. And I give it absolute hell 5 days a week.
 
Gotta say I am thoroughly impressed with this car. Helps that it is only 2,400 lbs. And I give it absolute hell 5 days a week.

All cars back then were lightweight. But they were also death boxes compared to today's cars.

That said, I owned one of the newer Cougars, a Ford Probe, and two new-style Tauruses. The Probe got to 300,000. It leaked oil like crazy because of the aluminum head on an iron block (darn Mazda design), but it refused to die.
 
I'm going to guess an iMac mini. It would make sense: Apple wants a custom processor because integrated Intel HD or Iris graphics just aren't cutting it. It makes for consistent graphics across the product line when it comes to their push into VR, etc. My question now is, will Intel make a bid for RTG? Maybe even AMD outright… If so, hopefully it's for at least 50 bucks a share :).
 
I'm going to guess an iMac mini. It would make sense: Apple wants a custom processor because integrated Intel HD or Iris graphics just aren't cutting it. It makes for consistent graphics across the product line when it comes to their push into VR, etc. My question now is, will Intel make a bid for RTG? Maybe even AMD outright… If so, hopefully it's for at least 50 bucks a share :).

Not going to happen. Antitrust would kick in on the CPU side. In fact, the sole reason AMD exists is that the US military required two suppliers for x86-based chips.
 
I'm going to guess an iMac mini. It would make sense: Apple wants a custom processor because integrated Intel HD or Iris graphics just aren't cutting it. It makes for consistent graphics across the product line when it comes to their push into VR, etc. My question now is, will Intel make a bid for RTG? Maybe even AMD outright… If so, hopefully it's for at least 50 bucks a share :).


Keep in mind the code name was "Palo Alto". As for a buyout, that would be an antitrust nightmare.
 
Not going to happen. Antitrust would kick in on the CPU side. In fact, the sole reason AMD exists is that the US military required two suppliers for x86-based chips.

At the time that was satisfied by IBM, who still holds a limited Pentium-grade x86 license (re: the Space Shuttle). It isn't in effect anymore; they can shift to openly licensed ARM for all it matters.
 


Keep in mind the code name was "Palo Alto". As for a buyout, that would be an antitrust nightmare.


Oh, I understand the codename stuff, but as far as the antitrust concerns go, it could be argued that at no other point in the last 30 years or so has there been so much competition on the horizon for Intel: ARM, Qualcomm, and other custom chips (Google, Nvidia, etc.). And a business-friendly administration may just look the other way with the "America first" policy. No skin off my back today… 35,000 long ;-).
 
Oh, I understand the codename stuff, but as far as the antitrust concerns go, it could be argued that at no other point in the last 30 years or so has there been so much competition on the horizon for Intel: ARM, Qualcomm, and other custom chips (Google, Nvidia, etc.). And a business-friendly administration may just look the other way with the "America first" policy. No skin off my back today… 35,000 long ;-).

In case you haven't noticed, over 90% of desktops run x86. Given the sheer numbers, yeah, that makes you a trust when you are the sole supplier.
 
In case you haven't noticed, over 90% of desktops run x86. Given the sheer numbers, yeah, that makes you a trust when you are the sole supplier.

Oh, I know that, but if Windows is going to be able to run on ARM (if they can skirt the Intel legal minefield), it wouldn't just be a trickle of phones and laptops adopting it… it would be a tsunami. For my purposes it would be the perfect setup: a phone/tablet running Windows with compatibility with the desktop. As we've seen over the last decade… things can change FAST in tech, whatever the reasons.
 
Windows on ARM isn't new: look at the 2012 Surface running Windows RT. What's new is that this product is emulating x86 on an ARM chip.

Microsoft has a rather extensive history, prior to Windows XP, of having Windows versions running on various chipsets.
 
Windows on ARM isn't new: look at the 2012 Surface running Windows RT. What's new is that this product is emulating x86 on an ARM chip.

Microsoft has a rather extensive history, prior to Windows XP, of having Windows versions running on various chipsets.

I didn't say it was something new. The latest and upcoming ARM designs (and derivatives like the iPad Pro's) are much more powerful than what they tried to do with Windows RT (which was highly neutered/slow/limited compared to full x86). If they can get a seamless workflow between ARM tablet, phone, phablet, etc. and the desktop, it's a killer setup. Microsoft has come up with good ideas before, just with poor implementation, and abandoned them too quickly. Intel is more frightened this time because of the onslaught coming from ARM, whether in desktops, mobile, or servers. Their legal team is going to be getting a lot of business over the next couple of years.
 
I didn't say it was something new. The latest and upcoming ARM designs (and derivatives like the iPad Pro's) are much more powerful than what they tried to do with Windows RT (which was highly neutered/slow/limited compared to full x86). If they can get a seamless workflow between ARM tablet, phone, phablet, etc. and the desktop, it's a killer setup. Microsoft has come up with good ideas before, just with poor implementation, and abandoned them too quickly. Intel is more frightened this time because of the onslaught coming from ARM, whether in desktops, mobile, or servers. Their legal team is going to be getting a lot of business over the next couple of years.

I fully expect this to get squished legally by Intel on some basis we are not currently aware of. But the timing is so sweet... Zen-core CPUs up and down the field that MS can claim it has closely developed for, etc., make Intel squirm. And ARM on low-powered crap like phones.

I actually think this is in response to Intel backing out of low-power development for micro devices. What the hell else can MS do other than be left by the roadside to become irrelevant as Android, iOS, and outliers like QNX start to dominate all the devices in use?
 
I didn't say it was something new. The latest and upcoming ARM designs (and derivatives like the iPad Pro's) are much more powerful than what they tried to do with Windows RT (which was highly neutered/slow/limited compared to full x86). If they can get a seamless workflow between ARM tablet, phone, phablet, etc. and the desktop, it's a killer setup. Microsoft has come up with good ideas before, just with poor implementation, and abandoned them too quickly. Intel is more frightened this time because of the onslaught coming from ARM, whether in desktops, mobile, or servers. Their legal team is going to be getting a lot of business over the next couple of years.

Emulation is a very tricky thicket. x86/x87-specific instructions like AVX, and x87 math coprocessor flags written in assembly, are hard to emulate. All attempts so far have resulted in massive performance drops.
 
Emulation is a very tricky thicket. x86/x87-specific instructions like AVX, and x87 math coprocessor flags written in assembly, are hard to emulate. All attempts so far have resulted in massive performance drops.
I wouldn't say they're hard to emulate, just difficult to execute efficiently without appropriate hardware underneath. You really need a SIMD array to map the instructions to if you want performance. A hardware translator wouldn't hurt either.
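To illustrate that mapping problem, here is a minimal sketch (a toy example of my own, assuming NEON as the host SIMD; it is not taken from any shipping emulator) of how a translator might lower a single 256-bit AVX add onto 128-bit ARM NEON. One guest instruction becomes two host instructions plus register bookkeeping, which is where the throughput goes:

```c
/* Toy illustration only: lowering one 256-bit AVX add (vaddps ymm, ymm, ymm)
 * onto 128-bit ARM NEON. The wider guest register has to be split in half. */
#include <arm_neon.h>

typedef struct {
    float32x4_t lo;   /* emulated ymm register, low 128 bits  */
    float32x4_t hi;   /* emulated ymm register, high 128 bits */
} emu_ymm_t;          /* hypothetical name for this sketch */

static inline emu_ymm_t emu_vaddps_256(emu_ymm_t a, emu_ymm_t b)
{
    emu_ymm_t r;
    r.lo = vaddq_f32(a.lo, b.lo);  /* two NEON adds stand in for one AVX add */
    r.hi = vaddq_f32(a.hi, b.hi);
    return r;
}
```

Multiply that overhead by every hot loop that was hand-tuned for AVX or x87 and the performance gap adds up fast.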
 
I wouldn't say they're hard to emulate, just difficult to execute efficiently without appropriate hardware underneath. You really need a SIMD array to map the instructions to if you want performance. A hardware translator wouldn't hurt either.


Which means they are hard to emulate ;)
 
Which means they are hard to emulate ;)
No, just that the throughput ultimately isn't there. It'd be like emulating a GPU on a CPU: you can get it to execute correctly easily enough, but performance will suck. That'd have nothing to do with the emulator.
 
So, one month AFTER Kyle told us he would be putting an article together, this is still officially bullshit? All we have is a codename that's probably the most generic thing I've ever heard. Place names used by Intel for Code Words? Sure. But nothing that far south.

And the best it could possibly be is a special order by Apple to put a processor, GPU, and local HBM on the same package (connected to the same PCIe lanes that are already on the processor). You know, the same way Apple already puts a processor, GPU, and VRAM on a tiny motherboard?

SIMPLY AMAZING? All this thread has been about is a custom-built package with a little higher integration than Apple could do themselves? And no actual technology transfer, since anyone and their dog can buy an AMD GPU and install it on whatever the fuck they want.

No wonder this thread has become completely fucking distracted, talking about emulators :D
 
No, just that the throughput ultimately isn't there. It'd be like emulating a GPU on a CPU: you can get it to execute correctly easily enough, but performance will suck. That'd have nothing to do with the emulator.

If you want to go that route and only talk about the emulator, it's going to be massively slower, and that is even if it can be emulated at all. It's very difficult to emulate anything, for that matter, without performance degradation. The only time emulators have ever worked well is 5 or 10 years down the road from the original product, because at that point the horsepower of the new chips can overcome the emulation/translation performance hit (pretty much, the chip can translate the code fast enough that it no longer affects performance).
 
So, one month AFTER Kyle told us he would be putting an article together, this is still officially bullshit? All we have is a codename that's probably the most generic thing I've ever heard. Place names used by Intel for Code Words? Sure. But nothing that far south.

And the best it could possibly be is a special order by Apple to put a processor, GPU, and local HBM on the same package (connected to the same PCIe lanes that are already on the processor). You know, the same way Apple already puts a processor, GPU, and VRAM on a tiny motherboard?

SIMPLY AMAZING? All this thread has been about is a custom-built package with a little higher integration than Apple could do themselves? And no actual technology transfer, since anyone and their dog can buy an AMD GPU and install it on whatever the fuck they want.

No wonder this thread has become completely fucking distracted, talking about emulators :D

Indeed, still waiting, but I have to say, something seriously feels up with AMD's graphics division right now...

Make no mistake, the future of RTG is in serious doubt. RX Vega = 1080. We all know it, and so the only choices AMD now has are these:

1. Don't release RX Vega.
2. Release it and seriously undercut Nvidia's equivalent. Basically Polaris all over again.

Frankly, if I were AMD, option 1 would be the wisest choice, followed by spinning off RTG. With the complete lack of current AMD stock on shelves, no confirmation at all of any RX Vega products, and no indication that they'll even sell significantly, it feels to me that what Kyle originally said is pretty much going to play out.
 
Indeed, still waiting, but I have to say, something seriously feels up with AMD's graphics division right now...

Make no mistake, the future of RTG is in serious doubt. RX Vega = 1080. We all know it, and so the only choices AMD now has are these:

1. Don't release RX Vega.
2. Release it and seriously undercut Nvidia's equivalent. Basically Polaris all over again.

Frankly, if I were AMD, option 1 would be the wisest choice, followed by spinning off RTG. With the complete lack of current AMD stock on shelves, no confirmation at all of any RX Vega products, and no indication that they'll even sell significantly, it feels to me that what Kyle originally said is pretty much going to play out.

FFS, unless Raja's rebellion is real and they still want to spin off, why would AMD, who spent the last 2+ years basically saying, "We are competitive, we have CPUs and GPUs, please give us money," spin off RTG now? Does anyone understand how bad it would look? Confidence in AMD would collapse, AMD would lose its theoretical potential in the HPC and GPU markets, and it would still need GPU tech for its CPUs! Why give that tech away? No, unless RTG really thinks it can find a new home, AMD is going to take the damage and move on, promising Navi and hopefully relying on CPU sales to get the money needed to invest in RTG again.
 
Added to that, RTG's IP is spread across all of AMD's brands, so what will AMD do if they spin off RTG? Lose consoles, lose APUs, lose their semi-custom business? They will lose a lot if they spin off RTG. They just have to swallow the pill and hope their CPUs can cover RTG.
 
Added to that, RTG's IP is spread across all of AMD's brands, so what will AMD do if they spin off RTG? Lose consoles, lose APUs, lose their semi-custom business? They will lose a lot if they spin off RTG. They just have to swallow the pill and hope their CPUs can cover RTG.

Well, their CPU division looks to be heading in the right direction. I just wish their GPUs were doing the same.
 
Well, it's an extended problem: with RTG faltering, they won't be able to expand it quickly or increase R&D. Pretty much, the CPU division now has to cover all of RTG's R&D, expenses, and future chips on top of the debt, while increasing its own R&D for future CPUs. It's still an uphill battle; not as bleak as before, but still.
 
My real concern for RTG is that Volta will be the death blow.


Glad you have those concerns, but I don't see AMD losing too much sleep over the "gaming" version of Volta. That is roughly 10-11 months away. Vega is going to have free rein in the gaming market through the 2017 holidays. Cheap 4K FreeSync 2 monitors for everyone, even the basement dwellers. Oprah will be giving them away on TV, an FS2 craze for young gamerz. Ergo: AMD will have mindshare when Volta hits.

But honestly, if AMD is sandbagging their "control fabric" side of things, then Volta might not even stand a chance against a Vega x2, which I think will be released some time in October. That is still in time for the holidays, and it would place AMD on top of the GPU wars, about 40% out in front of the Titan Xp (Pascal). Not forgetting that the Navi uarch is soon coming down the pipe, and the transition into the RX Vega SKU is going to mean a possible Vega x4 (RX4). I really don't think Volta @ 800 mm² is going to be able to compete with AMD's direction and strategy. As facetious as that may sound, it is entirely plausible knowing ALL WE KNOW so far about AMD and their technology. I am just reacting to the cadence of AMD's tick-tock cycles across their various platforms.

RX Vega will be a hit for gamers & a blow for Nvidia.





~ sine wave ~
 
Glad you have those concerns, but I don't see AMD losing too much sleep over the "gaming" version of Volta. That is roughly 10-11 months away. Vega is going to have free rein in the gaming market through the 2017 holidays. Cheap 4K FreeSync 2 monitors for everyone, even the basement dwellers. Oprah will be giving them away on TV, an FS2 craze for young gamerz. Ergo: AMD will have mindshare when Volta hits.

But honestly, if AMD is sandbagging their "control fabric" side of things, then Volta might not even stand a chance against a Vega x2, which I think will be released some time in October. That is still in time for the holidays, and it would place AMD on top of the GPU wars, about 40% out in front of the Titan Xp (Pascal). Not forgetting that the Navi uarch is soon coming down the pipe, and the transition into the RX Vega SKU is going to mean a possible Vega x4 (RX4). I really don't think Volta @ 800 mm² is going to be able to compete with AMD's direction and strategy. As facetious as that may sound, it is entirely plausible knowing ALL WE KNOW so far about AMD and their technology. I am just reacting to the cadence of AMD's tick-tock cycles across their various platforms.

RX Vega will be a hit for gamers & a blow for Nvidia.



~ sine wave ~
So you think AMD's answer to Volta should be a dual Vega card? LOL.
 
A dual GPU is not equal to a single GPU in a direct comparison.

Single-GPU performance and consistency are much preferred, which is why most people take the fastest single card over dual GPUs all day long.

If one prefers playing benchmarks all day, then dual GPUs suffice, lol.
 
I don't care if it's a dual, single, or triple GPU on a single card; if it performs, it performs, and then you can compare it on a price/power/performance level.
 
I don't care if it's a dual, single, or triple GPU on a single card; if it performs, it performs, and then you can compare it on a price/power/performance level.


Well, that is the thing: currently it just won't. It has to use Xfire, and right now, with most engines using deferred rendering, it functions like a single-GPU solution. The cost of such a solution, added to the pitfalls, further decreases its effectiveness. We saw this with all dual-GPU tech in the past; the only time it was feasible was prior to deferred renderers. For VR it will come in handy, but that market is so small right now that, again, the cost of manufacturing might be detrimental to the results.
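For a rough feel of why alternate-frame rendering stops helping once every frame depends on data the other GPU just produced (the usual situation with deferred/temporal pipelines), here is a toy timing model. The frame time, transfer cost, and the whole scheduling loop are illustrative assumptions of mine, not engine or driver code:

```c
/* Toy AFR timing model: two GPUs take turns rendering frames, but each frame
 * must wait for the previous frame's buffers plus a cross-GPU copy. */
#include <stdio.h>

#define FRAMES       8
#define RENDER_MS   10.0  /* assumed time for one GPU to render one frame       */
#define TRANSFER_MS  6.0  /* assumed time to copy previous-frame buffers across */

int main(void)
{
    double gpu_free[2] = {0.0, 0.0}; /* when each GPU next becomes idle         */
    double prev_done   = 0.0;        /* when the previous frame's buffers exist */

    for (int f = 0; f < FRAMES; ++f) {
        int gpu = f % 2;                      /* alternate-frame rendering        */
        double data_ready = prev_done;
        if (f > 0)                            /* dependency sits on the other GPU */
            data_ready += TRANSFER_MS;
        double start = (gpu_free[gpu] > data_ready) ? gpu_free[gpu] : data_ready;
        double done  = start + RENDER_MS;
        gpu_free[gpu] = done;
        prev_done     = done;
        printf("frame %d on GPU %d: start %5.1f ms, done %5.1f ms\n", f, gpu, start, done);
    }
    printf("two GPUs with the dependency: %.1f ms; one GPU alone: %.1f ms\n",
           prev_done, FRAMES * RENDER_MS);
    return 0;
}
```

With the inter-frame dependency in place the second GPU buys nothing (it actually loses time to the transfers); drop that dependency and the same loop scales to roughly 2x, which is why older forward renderers were a much better fit for dual-GPU setups.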
 