Discussion in 'HardForum Tech News' started by Megalith, Apr 14, 2018.
They've been at it since the Intel740 in 1998.
Me, assuming it beats a Ti. I've had dual 295X2 systems that used way more than 500 watts and only supported a select few games.
Don't look at them for IPC; they don't have the budget to chase it. However, they can still stay "competitive" with high-core-count CPUs. And yeah, Intel could absolutely kill them, but they obviously won't, as they're more than happy with the current pace of development. This same pace will continue basically forever, since almost no other company can get into the market. IBM POWER CPUs are decent, but the support behind them isn't anywhere even close to x86, and IPC is more than an issue with those even when they jack the clock wayyy up.
Agreed, but unfortunately it will be quite a bit before any consumers actually see cards. Intel's not doing this for the gaming market; they're doing it to beat Nvidia for AI workloads and similar. Intel already has things that can do that (the x200 Phis are FANTASTIC when optimized and kill GPUs). Intel also makes a card called the VCA2, which was a few Xeon chips Frankensteined together on a PCB with software. I would wildly guess we will see something closer to either of those than a gaming GPU for a long time.
It was stated a few years ago that if either company were bought, the cross-licensing would be terminated.
Which isn't to say a new agreement couldn't be reached.
And with this, we all know this guy does not like AMD. Period. Facts be damned.
Yes, thank you. I don't know how many times I have repeated this on other forums. Nope, they are still all holding out for some hot acquisition or some similar sh*t that will never happen. The x86 licensing "giveaway" by Intel was one of the many stupid things Intel has done, and probably one of the stupidest. They realize that now and relish the thought of termination.
Who the hell would
a) buy AMD and invalidate the license?
(HINT: the co. wouldn't be worth crap then or less than half of what you just paid)
2) Why would AMD give up the x86 license, its major asset?
(HINT: Everyone at AMD secretly works as double agents for Raj and Jensen)
Yeah, a and 2... figure it out.
Sort of, not really. All the Intel stuff, not counting the newest AMD Vega stuff, has been licensed from Nvidia.
Intel can't build new versions of the old "Intel" GPUs, as they were built on NV tech.
Intel and Nvidia cross-licensed patents back in 2004; then, seven years later, NV settled a lengthy legal battle that saw Intel pay NV $1.5 billion (paid out in installments until 2016) in exchange for NV getting out of the chipset business... and they further agreed to continue licensing GPU patents to Intel. Nvidia made Intel pay through the nose for the GPU licenses... and their deal expired in March.
Hence the new Kaby Lake chips with AMD GPUs... and why Intel has been hiring GPU hardware and software engineers as if they were trying to rush-build their own IP or something.
“x86-64/AMD64 was solely developed by AMD. AMD holds patents on techniques used in AMD64; those patents must be licensed from AMD in order to implement AMD64. Intel entered into a cross-licensing agreement with AMD, licensing to AMD their patents on existing x86 techniques, and licensing from AMD their patents on techniques used in x86-64. In 2009, AMD and Intel settled several lawsuits and cross-licensing disagreements, extending their cross-licensing agreements.”
Which makes me wonder why they didn't buy Imagination's PowerVR division, since it is (or recently was) up for sale, because as far as I keep reading, AMD and NV hold pretty much every GPU-related IP.
So what is this thing? An efficient mining card? Uggg. So much for serious competition to drive innovation.
The Intel 740 has nothing to do with NV. Intel is in their tenth gen. video arch. Cross licensing was only for six years, so ok - I'll give ya the last six.
Yep, no way Intel ever renews that deal... at least as it is.
AMD is pretty much unsellable. Having said that, Intel licenses a lot of AMD tech as well. Really, AMD never gets sold without Intel's input and a ton of oversight, as the US authorities will allow no major x86 tech sales to specific countries / ownership groups.
Here is the cross license deal if anyone is interested in boring legal docs.
It's a very well-thought-out extension to create when nobody wants to swallow an Itanium architecture change.
Agreed, unsellable. They CAN merge or acquire. AMD cannot be bought out by its two main competitors, also because of monopoly issues; the US Govt would never approve.
1) it was designed over 20 years ago with very little Intel engineering
2) The tech behind the 740 was actually designed by GE Aerospace for use in simulators. (No, not new shiny Space Shuttle stuff; I'm talking about Lunar Module stuff.) It was sold by GE to Martin Marietta Corporation in '92 (they merged with Lockheed not long after). In '95, Lockheed Martin formed Real3D. Real3D sold some cards for use in video game arcade machines... and later formed a partnership with Intel, and they co-produced the 740.
3) Intel sold it for 18 months and pulled it... and Lockheed shut down Real3D. Intel purchased the patents. Then they were sued by 3DFX (remember those clowns... I joke); that case was settled with Intel selling ALL of those patents to 3DFX. (I don't think I need to explain who owns those patents now, right?)
After that, Intel licensed PowerVR tech from Imagination Technologies. For the last 6 years at least, they have been paying NV a lot of cash for access to their patents. I have no idea how many NV patents they used in their more recent chips... but I would have to assume it would be more work trying to carve them all out than it would be worth. Also, I think I have explained that Intel has no base of patents of their own... they really have never ever ever designed an in-house GPU.
They currently own very very few graphics patents... and none that are responsible completely and alone for a shipping product.
It sounds like they are hiring a massive team though to build something new... but ya they are going to need some serious talent on board cause unless Intel is planning to go on a purchasing streak they need to develop some in house tech.
Don't mean to cut you off... but glad you got to the wiki. The 740: born in the US military-industrial complex, and yes, a long, sordid tale. No Nvidia. Commonly recognised as Intel's first dGPU effort, regardless of the details and eventful history.
Except it (the patents the 740 is based on) ended up in the hands of 3DFX... who was purchased by NV. Hence all their bases belong to NV.
I have no doubt one of Intel's biggest regrets has been not buying 3DFX.
Come on man.. you're going to pull the later patent after 740 link to 3dfx on me.. I didn't know time travel was allowed.
Ah man, the whole thing is a mess, frankly. The Real3D patents, which were the basis of the 740 for sure and I believe (but could be wrong) the bones of things like the 810 chipset GPU stuff etc., were cross-licensed 20 different ways and had ownership stakes. Nvidia factors in a bunch of different ways. Nvidia also picked up SGI's portfolio, and SGI owned 10% of Real3D's patents... and I don't think the exact details of the Intel-3DFX settlement were ever made public, but most reports say Intel basically gave most of those patents to 3DFX in that deal to put an end to the lawsuit. (Some have claimed 3DFX obtained a license to them.)
Bottom line is, however you shake it out... Nvidia wiggled their way into almost every patent Intel was using. Hence the massive licensing deal they got Intel to agree to. Also, I'm sure that's why Intel seems to hate Nvidia so completely.
(really though it would be super interesting to know how much of that 1970s Apollo Simulator tech still lives in modern GPUs lol)
Yes, rather incestuous, the whole industry is. In the end it's all about the cash.
Funny thing, the actual Apollo lander "computer" is like on the scale of a TRS-80.
AMD's best option is a mid-tier card that excels in Vulkan and DX12 and performs well in console ports due to shared tech.
That is all.
If they make a modular design that can scale to different market segments, all the better!
People forget, AMD doesn't have the budget to design 100 CPUs a year, and maybe not even to have something as fast as a 1080 Ti. They have hit price-to-performance and a segment where they can sell good volumes of product. AMD can't always be the best, but if they play their cards right, they can make a lot of money being 2nd twice. The only question that remains is whether they can produce the volume. Remains to be seen.
It's not that AMD can't design a high-end part. They sell Radeon Pros that can more than smack a 1080 Ti around. The issue is supply, marketing... and profit margins. The simple truth is there is almost zero profit margin in the tippy-top of the high-end gaming cards. Those chips can just as easily end up on Radeon Pro / Quadro cards, and the demand AMD is seeing right now for 9100s and Pro SSGs is high; they have a hard enough time producing enough chips to keep those customers happy. Those customers make them far better margins. Nvidia is no different. The reason you haven't seen Volta-based consumer cards... you can blame on low competition, but I think the real truth is Nvidia can't produce them in high enough volume to make it worth starving the higher-end supercomputer/compute markets they are selling them in. Until they can make Volta-based chips for the mid-range, they will never touch the high-end consumer market with them.
The 1080 Tis and other top-end gaming cards only ever get made to win benchmarks and mindshare, which sells the real mass-market mid-range parts. AMD seems to have said let NV have the single-digit benchmark wins for now... and focused on the mid-range segments where all the volume is. The crap end of that is the mining craze seems to be snapping up most of that supply... making it hard for people to get their hands on AMD gaming cards at all.
Huff. Why ChadD, why?
But those are workstation graphics cards. I spent two years managing a CAD lab, and the price of those cards used to shock me, especially since they were built from the same silicon as your standard gamer card. The thing is, they are almost the same cards, but they're built for very different purposes. When a gaming card is rendering the frames of a scene and it encounters an error, it junks the whole thing, the error, the frame, and the associated polygons in its buffer, and then it quickly moves on to rendering the next frame. You'd be surprised how many times a gaming card hits a floating-point error and dumps a half-rendered scene, but you don't see it. On the other hand, if a workstation card is rendering a scene, it must not have an error. Same GPU with a similar workflow, but very different processes used to arrive at the final result.
Students would use gaming cards with 3ds Max because they couldn't afford workstation cards for home computers, but ... You'd extrude an object, splice, splice again, and it'd be all good until a few hours later when you were almost finished with your project and then BAM! the wireframe you were looking at would suddenly explode into something that looked like a bowl of popcorn with some guitar strings shot through it.
So "cheap" performance equivalent to a 1080, with HBM2, not for gaming. So maybe they're just gonna call it a crypto mining card and do a full 180.
Lots of talk about licensing and patents, yet x86-64 is what it is. Add a few registers, increase their size to handle the longer words... wow, that was hard. Funny Intel was able to 'hack' it into Pentium 4s right quick, and then the Core 2 arch (based on the 32-bit Pentium Pro/II/III!) put them three years ahead of AMD again.
Someone doesn't remember IA64.
AMD's solution was better, and Intel dropped their own to use it.
Someone is being really short-sighted. Intel wanted to push IA64 so much that they chose to ignore the obvious easy step to x86-64. They could have done it at any time and chose not to. Nothing hard about increasing register size, lol.
And yet there goes AMD with their AMD64 patents.
Intel really dropped the ball there.
Could have had x86 and x86-64 in the bag.
Oh well hindsight and all.
It's just how much Intel wanted IA64 to be a thing. Hell, I'd still like it to be a thing; it needs great compilers, which we just might get with machine learning becoming a thing.
But it certainly wasn't going to be a thing back then, and AMD made a smart move by taking the obvious step of extending x86 to 64bit right when 2GB of RAM was becoming a real limitation. Microsoft reciprocated with XP 64 and then Vista 64, and that was that.
[it didn't help Intel that AMD also moved to an IMC with the Athlon64 either- absolutely spectacular timing on their part, as the IMC cemented their performance competitiveness until Core 2 was released]
Intel still doesn't really care about the consumer. They care about selling to other companies.
I still think Larrabee was half assed from the start. But that's what I can glean from the Intel people I talked to back in the day.
So will navi with infinity fabric be transparent to game devs, or will it be like everything else, and devs get to choose to support it "Publisher CEO: RELEASE THE GAME NOW!!!!" *dev releases game only supporting 1 out of 4 navi cores*
*nvidia uses said game to show how much better their new gpu is*
Infinity Fabric will probably make multi-die GPUs much easier for AMD; I think a lot more development is needed to see how it affects mGPU / SLI implementation.
If it works similar to the CPU, a GPU with 4 cores that each have 15 compute units should work identical to a single core with 60 compute units. It's possible they make it that each core could be directed at different tasks to limit any performance loss that would occur because of the Infinity Fabric.
I gotta say - my Titan Xp purchase seems better and better. From the rumors I've heard, NVIDIA's next lineup is more in line with a Pascal refresh or Pascal-plus, rather than a full new generational leap. This thing will stay relevant for a long, long while (in computer years, that is).
For me the only problem with videocards atm is the price. I don't need something faster than a gtx 1070, but I will not pay what they cost.
You seem to be overlooking a simple truth. Itanium was never, ever for consumers, or even non-massive server setups. There were no plans to put it in grandma's computer; the main issue is they didn't even have plans to get it into the majority of servers of the day.
It was a bit of an F-U to the server industry, which was mostly running on 32-bit x86... at a time when they were really starting to need the increased memory 64-bit could bring. Consider that AMD was around... and even VIA, who had just acquired the National Semiconductor (Cyrix) guys. The Cyrix team had completely clean-roomed x86 and, due to the purchases, had just scrapped their M3, which I believe may have been the first design to ever incorporate a GPU. It should have been clear to Intel that there were multiple companies out there capable of stepping up to fill the gap. Intel was run by morons thinking someone else wouldn't step in to fill the massive market hole they were ignoring. Pure hubris, and they deserved to have their Itanium plans dashed. Of course, those morons also had a massive bag of cash and a willingness to break the law... so they win, I guess.
So it will be a great gaming card if it is priced right.
It'll be sky high like everything else on the market. That $250 price tag makes me laugh. More like $650.
And if we can get our hands on it in quantity. Which implies uninflated costs.