AMD Radeon Fury X rumored to be limited to just 30,000 units for 2015
http://www.tweaktown.com/news/45835/amd-radeon-fury-rumored-limited-30-000-units-2015/index.html
Because if they sold it for $850 and it was only equal to the 980 Ti (at $650), you'd have to be a little unhinged to pay $200 extra for nothing.
Just like everyone else, we've heard the rumors of AMD's Fury X being limited to 30,000 units for 2015 and coming only with liquid cooling. Simply put, those rumors are nothing more than that: rumors. In fact, it took us less than 40 minutes to confirm with some of our most reliable sources, who are very close to AMD, that this was nothing more than baseless speculation.
Too bad Freesync is such garbage. I might actually want to go back to AMD for cheap performance but G-sync and timely driver releases have me spoiled.
What is wrong with FreeSync? People who have it seem to like it.
SweClockers said it was going to be $849.
Within VERY close distance of the $1,000 Titan X, and above the 980 Ti, according to the leaked 3DMark. Driver tweaks before launch should easily push it above the Titan X, especially at the more demanding settings these cards are meant for anyway. At 4K it should wreck the $1k Titan thanks to its massive HBM bandwidth advantage...
Looks VERY good for AMD. The card is blazing fast, and this time NVIDIA has no "Titan XXX edition" to pull out of its hat, since Titan is already the full chip.
AMD has been gaining on NVIDIA's top performance every generation with much smaller dies. It had to happen that NVIDIA would run out of room on 28nm while AMD still had plenty to spare, so AMD's single-GPU flagship has finally caught or surpassed NVIDIA's top single-GPU card for the first time in ages.
I like how the thread title says "much less Titan" LOL. Considering Titan is a whopping 5% faster than 980Ti. Sigh...
Well... You'd be paying $200 more for worse driver support and the anxiety of gameworks features tanking performance. Sounds like a bargain.
I'd love to wait a few months for Xfire support on AAA games. Sign me the f' up.
NVIDIA has always had good drivers. I've used NVIDIA cards dating back to the GeForce 256 days, and they were reliable even then. They've had a few high profile issues over the years, but those have been few and far between. AMD (then ATI) had some pretty rough drivers for a while, but all-in-all I haven't had problems with them in a long time. The only issue I see is that AMD is generally slower with CrossFire profiles.
For single GPU users (most people) it shouldn't be a concern anymore.
If you haven't used NVIDIA drivers in what sounds like 7 years (GTX 200-series time frame) I don't think you can really make a qualitative judgment of how well they work.
I beg to differ. I used to be a hardcore NVIDIA fan for years and had a host of issues with drivers. No matter how I installed them, they blue-screened pretty regularly.
My first NV card was a GeForce GTS 256... my last one was a GeForce 250 GTX...
I still have a functional AGP 4200 Ti and a 6800 XT sitting around.
If you haven't used NVIDIA drivers in what sounds like 7 years (GTX 200-series time frame) I don't think you can really make a qualitative judgment of how well they work.
No, I definitely agree with you. I used NVIDIA for the 500 and 600 series and I've used AMD for the past 18 months, and more or less the drivers are equally capable, with NVIDIA having marginally better SLI software support.
While I wholeheartedly agree with this statement, I'd say it applies equally to those on the other side of the fence as well. Too bad reality is often a one-way street.
So what's the status with 4K these days? Only the 960 and Tegra X1 support HDMI 2.0 as of now? And the Fiji cards are expected to have HDMI 2.0?
NVIDIA has always had good drivers. I've used NVIDIA cards dating back to the GeForce 256 days, and they were reliable even then. They've had a few high profile issues over the years, but those have been few and far between. AMD (then ATI) had some pretty rough drivers for a while, but all-in-all I haven't had problems with them in a long time. The only issue I see is that AMD is generally slower with CrossFire profiles.
For single GPU users (most people) it shouldn't be a concern anymore.
AMD and NVIDIA drivers these days are about on par. The only thing NVIDIA does better is getting updates out faster and more frequently. That's largely because they have a larger budget, and they use it to buy their way into game developers' hearts, so developers will often help them out and leave AMD completely on its own, even though AMD still represents a very large portion of the installed gaming GPU base.
"Only thing Nvidia does better" -- I'd call getting updates out fast a major thing if not the most important thing. Day one drivers and SLI profiles on major AAA releases is paramount and trying to marginalize the importance of it or making excuses for why AMD can't meet that expectation is absurd.
Maybe instead of blowing one big 8 million dollar chunk in a check to EA to crowbar Mantle into a handful of games, AMD should have spread that money around a bit more or maybe not gutted their driver team. No one to blame but themselves.
They didn't pay 8million to EA...
Let this myth die already.
"Only thing Nvidia does better" -- I'd call getting updates out fast a major thing if not the most important thing. Day one drivers and SLI profiles on major AAA releases is paramount and trying to marginalize the importance of it or making excuses for why AMD can't meet that expectation is absurd.
Maybe instead of blowing one big 8 million dollar chunk in a check to EA to crowbar Mantle into a handful of games, AMD should have spread that money around a bit more or maybe not gutted their driver team. No one to blame but themselves.
They did buy a giant advertisement in Times Square for the 300 series launch. I personally would have preferred they use that money to hire a few more engineers and get their multi-GPU support back up to speed. It wasn't too long ago [H] was praising them for XDMA and smooth CrossFire.
My only hope is that they shifted some of their engineers to VR and DX12, so that when those launch, it'll go smoothly.
The sign, with playback system designed and managed by Diversified Media Group, is a single surface covering a city block in length and stands eight stories high. Driving the visual display are three AMD FirePro professional graphics cards using AMD Eyefinity Technology, with each card powering six sections of the display for a combined resolution of 10,048 x 2,368 pixels. The individual display sections are synchronized across graphics cards and zones using the FirePro™ S400 synchronization module.
Did they pay for that or did they get ad space for supplying the hardware and helping with the implementation? Hell, they probably got paid for it.
Maybe instead of blowing one big 8 million dollar chunk in a check to EA to crowbar Mantle into a handful of games, AMD should have spread that money around a bit more or maybe not gutted their driver team. No one to blame but themselves.
Without knowing Mantle's impact on the industry (API's in particular) it's really short-sighted to call it a failure. Calling it a success, in any capacity, is also presumptuous.
Or... you can make yourself sound like a dick by taking sides on the issue.
Without knowing Mantle's impact on the industry (API's in particular) it's really short-sighted to call it a failure. Calling it a success, in any capacity, is also presumptuous.
Or... you can make yourself sound like a dick by taking sides on the issue.
Pieces of the DX12 manual are cut and paste from the Mantle manual.
Three things.
NVIDIA was able to beat Mantle without paying game developers to use a separate API.
http://www.tomshardware.com/news/nvidia-driver-update-direct3d-optimization,26381.html
This is probably why AMD killed it.
AnandTech
[The 780 Ti is] 11% faster than Radeon R9 290X
Mantle reduces the CPU’s workload by giving developers a way to talk to the GPU directly with much less translation. With less work for the CPU to do, programmers can squeeze much more performance from a system, delivering the greatest benefits in gaming systems where the CPU can be the bottleneck.
Intel Core i7 3930K
Ha. How are those numbers looking now? And by now, I just mean any time in 2015.
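The CPU-overhead argument in the quoted Mantle blurb can be sketched with a toy cost model. Everything here is made up for illustration (the cost constants, function names, and call counts are assumptions, not real driver figures): a traditional high-level API pays driver validation/translation work on every draw call every frame, while a Mantle/DX12-style design lets the application record a command buffer once and replay it cheaply each frame.

```python
# Hypothetical cost model, illustration only. A "high-level" API pays
# driver validation/translation cost per draw call per frame; a
# command-buffer API validates once at record time and amortizes that
# cost across every frame that replays the buffer.

VALIDATE_COST = 10   # assumed CPU cost units to validate/translate one call
SUBMIT_COST = 1      # assumed CPU cost units to submit a recorded call

def frame_cost_high_level(draw_calls: int) -> int:
    # every draw call is validated and submitted again each frame
    return draw_calls * (VALIDATE_COST + SUBMIT_COST)

def frame_cost_command_buffer(draw_calls: int, frames: int) -> float:
    # validation happens once at record time, amortized over all frames
    record = draw_calls * VALIDATE_COST
    replay = draw_calls * SUBMIT_COST * frames
    return (record + replay) / frames

calls, frames = 10_000, 600
print(frame_cost_high_level(calls))              # per-frame cost, old model
print(frame_cost_command_buffer(calls, frames))  # amortized per-frame cost
```

With these assumed numbers the per-frame CPU cost drops by roughly an order of magnitude, which is the shape of the claim being argued about, regardless of whose driver got there first.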
A bankruptcy clause does not fix anything for AMD, Sony, and Microsoft. Console manufacturing relies on a competitive product; previously, with both Intel and NVIDIA, the hardware was so expensive that they took a real financial hit. Competition has solved that part, but in no way do they benefit from a short-term clause. If they wanted to be at the mercy of Intel and NVIDIA, they never would have gone with AMD.
In the end it might even be cheaper to bail out AMD rather than to "let" them go bankrupt. Your CryEngine clause really is peanuts compared to the massive amounts of money in the console industry.
So when AMD goes bankrupt tomorrow, a "contract" appears which says, "Well, thank you AMD, we'll take your chips for free now"? That doesn't really sound plausible when your partners are raking in the money, does it?