There is an SFF subforum on [H]. Few of the people who post there bother with the video card section, and vice versa. There's a reason for that.
Hopefully I'll join there soon; the force is strong in this one (Nano).
Because so far the evidence that Nvidia GPUs are going to "suck" at DX12 is based on one game in its alpha stage. So anyone with some common sense is going to withhold judgment until we actually see more examples of DX12 games.
When it came out it was by far the fastest card, allowing an experience nothing else could provide; there was no competition.
Its award was deserved.
When the Fury cards came out, there were already cards performing the same or much faster.
They did not arrive with the same impact; in fact, the opposite.
They are priced so high it's hard to recommend them, let alone give them an award, especially the Nano.
Award deserved? Really? That's a $1000 card, LOL.
*Wantapple, I'm a little concerned about the 175W though; it will be a real challenge to get that silent without losing too much performance.
Any advice on the best (fastest) 45W CPU out there? Because at 45W you can cool passively with the right gear. But it has to be able to not bottleneck the 175W Nano; I do reckon the Nano will have to be underclocked a bit to get it silent.
Edit: I'll probably have to set up different modes: one for silent/regular use, and one for gaming.
If you can't afford high-end cards, it doesn't mean they don't deserve awards.
When the AMD 295x2 came out it was $1500 and was also given a gold award.
http://www.hardocp.com/article/2014/04/08/amd_radeon_r9_295x2_video_card_review
They didn't test against the Titan Z because it wasn't released.
The way the Kepler, Maxwell, and Maxwell II architectures are designed prohibits them from utilizing all the features of DX12, including async compute and multiple draw calls.
I was thinking of the Core i5-6500T, 35W; hopefully it will fit under my Noctua NH-L9i CPU cooler. That CPU along with the R9 Nano should be about 210W. I'll be using a SilverStone SFF 450W PSU, but if I mod my other little case I'll be using a 280W "bronze" Flex PSU.
35W is a beauty; how powerful is it?
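The power-budget math being discussed above can be sketched quickly. The CPU and GPU figures come from the thread (35 W i5-6500T, 175 W R9 Nano); the allowance for the rest of the system is my own rough assumption, not something stated in the thread:

```python
# Rough system power-budget sketch for the SFF build discussed above.
# CPU_TDP_W and GPU_TDP_W are figures from the thread; REST_OF_SYSTEM_W
# is an assumed allowance for motherboard, RAM, SSD, and fans.
CPU_TDP_W = 35
GPU_TDP_W = 175
REST_OF_SYSTEM_W = 50  # rough estimate, not from the thread

def psu_headroom(psu_rating_w: int) -> float:
    """Return remaining headroom as a fraction of the PSU rating."""
    load = CPU_TDP_W + GPU_TDP_W + REST_OF_SYSTEM_W
    return (psu_rating_w - load) / psu_rating_w

# Compare the two PSUs mentioned in the thread.
for rating in (450, 280):
    print(f"{rating} W PSU: {psu_headroom(rating):.0%} headroom")
```

Under these assumptions the 450 W unit leaves comfortable headroom, while the 280 W Flex PSU would be running close to its limit at full load, which matters for both noise and efficiency.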
Yes, because the Titan Z was $3000.
Anyone waiting for a heavily OCed Nano with a better circuit board will be badly surprised. According to KitGuru, AMD has forbidden its partners to deviate from reference specs. They can only add new coolers while staying within the reference dimensions and TDP.
http://www.kitguru.net/components/g...w-partners-to-modify-specs-of-radeon-r9-nano/
Well that's a lot less rude than what I thought you'd done. My bad then clearly and I apologise. Given what you originally said I think you can understand how I got the wrong end of the stick though.
I'd imagined some poor AMD employee deflated, listening to a dial tone mid-explanation after you'd hung up on them.
Zarathustra[H];1041827073 said: Which is the appropriate thing to do. A review site that goes publishing recommendations or strong opinions based on speculation is going to have a serious credibility problem with its readership over time.
Until there is a series of LAUNCHED titles with DX12 support, any benchmark is nothing but a preview and not to be taken too seriously. [H] has built its reputation on this approach: relying on REAL game performance rather than speculation or canned benchmarks. Hopefully that will never change, because that is what keeps me coming back.
It very well may be that the GeForce 9xx architecture doesn't handle some aspects of DX12 well. Seeing that it was designed before the API spec was completed, this wouldn't be a huge surprise.
On the other hand DX12 has so many features and leaves so much tweaking in the hands of the game developers, that even if the 9xx has dreadful issues in some titles, it might not in others.
That, and as I mentioned above, I'll play tomorrow's games on tomorrow's hardware. A GPU doesn't have a very long lifespan in my system anyway. No matter what happens, I won't be too disappointed in my 980 Ti's. Heck, the newest games I'm even playing right now were launched in 2010 (Civ 5) and 2011 (Red Orchestra 2), so I don't think I'll have too much to worry about short term.
Once DX12 titles become more readily available, I will use benchmarks of those titles with next-gen GPUs from Nvidia and AMD as the deciding factor for my next GPU purchase, which by the looks of things is going to be at least a year away.
Which would seem to apply to both the Titan X and the Nano. Yet the Nvidia Titan X gets an [H] Editor's Choice Gold award while the AMD Nano gets, "I actually hung up on the call when they told us the price."
One offers a high level of gameplay experience, no holds barred; the other does not.
Naturally, else it'd be a Fury X
The only thing separating the two is the artificial cap on TDP.
Most GPU briefings are conference calls with other editors. I stayed till the end; nothing really major. It was an overall quiet conference. There were no hardware questions at all at the end until I asked some hard-hitting technical questions that no one else in the conference asked. I was the only one talking about the way the clock speed works; no one else was concerned. The vibe I'm getting is that the hardware aspects are going to be downplayed at launch, with most media emphasizing the size, the space savings, and the power available in that form factor.
I think game developers are still going to have to be concerned with DX11 performance in their games for a very long time to come. The base of DX12 gamers and Win10 users isn't going to be that high for a while. Devs will need to ensure DX11 works well for a few years more at least, so I see the DX11 experience still being supported and optimized in coming games.
Exactly. This is what every kid screaming "DX12" in every post doesn't get: it's going to be years before developers are actually targeting DX12 and building games around it, rather than bolting a couple of extraneous DX12 features onto a DX11 game and calling it a "DX12" title. It's just not the magic saviour of gaming that some people assume.
Hell, the guy that helped AMD finalize Mantle isn't even shipping the first DX12 EA titles based on Frostbite until holiday 2016, and he's planning to co-ship them with Vulkan last he mentioned.
DX11 will continue to be targeted for years, in some cases even DX9. No matter how obnoxious Microsoft gets with trying to hit people over the head with near-forced Windows 10 installations, a massive pool of Win7 PCs will remain.
Developers for games on the PS4 and Xbox One have already been targeting DX12-level hardware for the past couple of years. Those games should have DX12 paths, at least the newer games coming out next year. How big a difference it will make remains to be seen. Will AMD just cream Nvidia in performance? I doubt it (maybe in a few titles, but that is about it).
The fact that AMD has denied any up-front sampling shows they KNOW this card will be a dud. I have doubts that it will even get within 10% of the real Fury X, especially once thermal throttling kicks in.