Itching to move to 1060 from 290x

The Fury certainly won't address the power draw and heat output issues that 302efi wants to curtail.
Fury uses less power than a 290X that he currently has so it'll put less heat in his room too. http://www.anandtech.com/show/9421/the-amd-radeon-r9-fury-review-feat-sapphire-asus/17

Vs a 1070 it's about a 14W difference in games. https://www.techpowerup.com/reviews/MSI/GTX_1070_Quick_Silver_OC/27.html

It also looks like its days are over and will at best become a side grade with memory limitations.
A Fury is significantly faster than either a 290X or an 8GB 390X or 480 in plenty of games. It also does better with DX12 games too. That doesn't sound like a side grade in the least. Realistically if you're not gaming @ 4K you'll probably be OK with 4GB for a fair while yet with most games.
 
A Fury is significantly faster than either a 290X or an 8GB 390X or 480 in plenty of games. It also does better with DX12 games too. That doesn't sound like a side grade in the least. Realistically if you're not gaming @ 4K you'll probably be OK with 4GB for a fair while yet with most games.

Why make all the sacrifices from day 1 just to buy a Fury card that is obsolete? In Watch Dogs 2 with the full settings the Fury is already slower than a 480.

And significantly faster than a 290X? No, a 1070 is significantly faster.

[Watch Dogs 2 benchmark chart: wd2.png]


[Relative performance chart, 1920x1080: perfrel_1920_1080.png]


And that's before we even talk about having to sacrifice IQ because there is not enough VRAM.
 
Why make all the sacrifices from day 1 just to buy a Fury card that is obsolete?
Obsolete? It works quite well with DX12 and at 1080p and 1440p resolutions. And it'd be about 20% faster than his current card for very cheap if he sells his current card for a decent price. A 1070 will be about 23% faster than a FuryX but cost a lot more out of pocket (around $141 more), assuming he spends about $400 on one and sells his 290X for the same price.
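For what it's worth, here's the back-of-envelope math behind those numbers as a quick sketch; the prices are just the rough figures floated in this thread (~$259 for a Fury, ~$400 for a 1070, ~$180 resale on the 290X), not quotes from any store:

[CODE]
# Rough out-of-pocket comparison using the thread's assumed prices (illustrative only).
fury_price, gtx1070_price, sale_290x = 259, 400, 180

fury_oop = fury_price - sale_290x          # ~$79 out of pocket for a Fury
gtx1070_oop = gtx1070_price - sale_290x    # ~$220 out of pocket for a 1070

print(f"Fury out of pocket: ${fury_oop}")
print(f"1070 out of pocket: ${gtx1070_oop}")
print(f"Extra for the 1070: ${gtx1070_oop - fury_oop}")  # ~$141
[/CODE]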

In Watch Dogs 2 with the full settings the Fury is already slower than a 480.
Nope. A Fury would only be a little slower than the FuryX in that game, which favors NV GPUs BTW.

And significantly faster than a 290X? No, a 1070 is significantly faster.
About 20% faster than a 290X is significant. You'll notice it.

And that's before we even talk about having to sacrifice IQ because there is not enough VRAM.
Most games will be shader/GPU limited and not VRAM limited with a Fury for a long time. Stop with the FUD.
 
I know the 1070 is the "upgrade". I definitely see your point in not paying for something that's pretty much the same just for a couple better features.

My tower is on the right side (underneath) of my desk and it gets really uncomfortable sitting there for any extended period of time. The heat just radiates up all around the desk. Sucks when you're trying to have a nice late night Diablo gaming session and you end up sweating in a 68-degree house... Terrible.

If the only thing bothering you about your current card is the heat under the desk a small fan blowing past your case should remove the heat from underneath the desk and would cost you a hell of a lot less than a new video card.

EDIT: DAMMIT, beaten by post #33.
 
What about something like this?

http://www.newegg.com/Product/Product.aspx?Item=9SIA1K64893710

It's way cheaper than a side-grade and might afford you some more life out of your card (both in terms of performance and longevity), as well as being quieter.

Quick edit: As for the actual heat output, lowering the operating temp of a CPU/GPU by 10°C will lower its power draw by ~3% on average, so it actually will end up outputting less heat. I'm just not sure the difference will be appreciable.
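If you want to ballpark that, here's a minimal sketch of the rule of thumb; the ~3% per 10°C figure is just the average mentioned above, and the 290W load figure is only an illustrative guess for a 290X-class card:

[CODE]
# Estimate heat-output savings from running a GPU cooler, using the
# rough "~3% less power per 10 C" rule of thumb (illustrative numbers only).
def estimated_power(base_watts, temp_drop_c, pct_per_10c=3.0):
    """Estimated power draw after lowering the operating temperature."""
    return base_watts * (1 - (pct_per_10c / 100.0) * (temp_drop_c / 10.0))

# Example: a ~290 W card under load, dropped by 20 C with better cooling:
print(round(estimated_power(290, 20)))  # ~273 W, i.e. only ~17 W less heat
[/CODE]

So the card runs cooler and quieter, but the room doesn't get much less heat dumped into it.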

I've looked at the Acceleros a few times in the past. The issue is I have a Noctua D15 and I can't use any card/cooler that has a cooler on the back; there is almost no room.

I can't move my GFX to any other slot since my first pci-e is the only 16x, the rest only run at 8x
 
If the only thing bothering you about your current card is the heat under the desk a small fan blowing past your case should remove the heat from underneath the desk and would cost you a hell of a lot less than a new video card.

EDIT: DAMMIT, beaten by post #33.

Due to space limitations the tower has to stay where it is; there's no other option for that. Also, having a full-tower Antec "quiet" P183 doesn't help with where I can put it.
 
I've looked at the Acceleros a few times in the past. The issue is I have a Noctua D15 and I can't use any card/cooler that has a cooler on the back; there is almost no room.

I can't move my GFX to any other slot since my first pci-e is the only 16x, the rest only run at 8x

Ahh yes, it isn't a very space friendly solution.
 
Most games will be shader/GPU limited and not VRAM limited with a Fury for a long time. Stop with the FUD.

You sure about that? The last couple of Assassin's Creed games and Rise of the Tomb Raider were all capable of maxing out 4GB of memory, as I recall. I didn't have a fast enough GPU for the memory to be the limiting factor at the time, but there are games that can use more than 4GB, and as time moves forward, this is likely to be more common, and not less.
 
If you can afford to save $350-400+ (really more like $400+, the $350-ish 1070s are deals you have to wait on and tend to go fast) you can afford to save $600.

Actually it's more like 39% for the 290. And you have a 290X so in your case it'd be more like 35%.

It's a nice boost but you'll have the upgrade itch within 6 months to a year if you buy a 1070. And then you have to figure the price of a 1070 plus a new mid/high-end card into your budget.

I went from an OCed 290 to a 1070 EVGA FTW D and it feels like a 50% increase. It's a bigger upgrade than I expected.
 
You sure about that? The last couple of Assassin's Creed games and Rise of the Tomb Raider were all capable of maxing out 4GB of memory, as I recall. I didn't have a fast enough GPU for the memory to be the limiting factor at the time, but there are games that can use more than 4GB, and as time moves forward, this is likely to be more common, and not less.

WTF, where'd you get that post? I never wrote that in this thread.
 
You sure about that? The last couple of Assassin's Creed games and Rise of the Tomb Raider were all capable of maxing out 4GB of memory, as I recall.
Are 2 games a good indicator for most games though? Now or in the future? DX12 is also supposed to be more VRAM efficient over time according to some of the guys over at B3D too. Given that a 4GB Fury, FuryX, or a 290/X still does well in most games I don't see this as a huge problem.

Very interesting read on the subject. https://forum.beyond3d.com/threads/will-gpus-with-4gb-vram-age-poorly.58233/
 
I went from an OCed 290 to a 1070 EVGA FTW D and it feels like a 50% increase. It's a bigger upgrade than I expected.
It might feel that way for you but the numbers from in game benches tell a different story. 'Feel' is pretty subjective unfortunately.
 
It might feel that way for you but the numbers from in game benches tell a different story. 'Feel' is pretty subjective unfortunately.

That's not entirely true, feel is the most important thing.

150fps is awesome benchmark-wise... unless you look at frame times and realize it's micro stuttering. So you could measure "feeling" in that respect.

But from empirical evidence, I have certainly doubled my fps across all games, if not more. I run at 1440p so it may have been a bigger help to me than someone at 1080p.

But as someone with first-hand experience in the subject, I say go 1070 from 290. Worst case scenario, return it.
 
You're trying to give specific performance numbers though. You can't do that based on feeling.

When you can't quantify something is when it's OK to try and go off perception or feeling alone, but that isn't the case here. Performance and micro stuttering are both very measurable these days and quality reviews are only a Google search away.
 
Fury uses less power than a 290X that he currently has so it'll put less heat in his room too. http://www.anandtech.com/show/9421/the-amd-radeon-r9-fury-review-feat-sapphire-asus/17

Vs a 1070 it's about a 14W difference in games. https://www.techpowerup.com/reviews/MSI/GTX_1070_Quick_Silver_OC/27.html


A Fury is significantly faster than either a 290X or an 8GB 390X or 480 in plenty of games. It also does better with DX12 games too. That doesn't sound like a side grade in the least. Realistically if you're not gaming @ 4K you'll probably be OK with 4GB for a fair while yet with most games.




You're shilling an obsolete GPU that won't come anywhere near the performance of a 1070 and is merely marginally faster than a 290X. Makes as much sense as spending money on a sidegrade to a 1060: none.

Even compared to a FuryX, the 1070 has the clear advantage, with the exception of certain DX12 titles that place them within single-digit frame rates of one another.

http://www.anandtech.com/bench/product/1720?vs=1731
 
Are 2 games a good indicator for most games though? Now or in the future? DX12 is also supposed to be more VRAM efficient over time according to some of the guys over at B3D too. Given that a 4GB Fury, FuryX, or a 290/X still does well in most games I don't see this as a huge problem.

Very interesting read on the subject. https://forum.beyond3d.com/threads/will-gpus-with-4gb-vram-age-poorly.58233/
That's actually three games (Assassin's Creed Syndicate and Unity being very similar, but distinct), and that's about half of the AAA titles I played in the last 18 months, so yes, I say that's a good indicator of performance now and going forward, at least for me.

When I played all of those, I was using a 290X, and it was possible with that card, at 1080P, to come up with an arrangement of settings that caused performance to tank, especially if one set the texture detail settings to whatever the highest was. It made an especially big difference in Rise of the Tomb Raider to go from "high" to "ultra" or whatever the highest was, on the texture detail slider. I guess I don't know whether the difference was a result of not bringing enough VRAM or FLOPs, but it was there, and the game seemed to think it was VRAM related. It went back to 60ish FPS if I backed that slider off a notch.

Edit: The limits seem to be much higher with my GTX 1080. I haven't revisited Rise of the Tomb Raider, but it shrugs off the Assassin's Creed titles with no apparent fuss at all. I didn't really investigate much further than that, since the 1080 didn't seem to be struggling.
 
What about something like this?

http://www.newegg.com/Product/Product.aspx?Item=9SIA1K64893710

It's way cheaper than a side-grade and might afford you some more life out of your card (both in terms of performance and longevity), as well as being quieter.

Quick edit: As for the actual heat output, lowering the operating temp of a CPU/GPU by 10°C will lower its power draw by ~3% on average, so it actually will end up outputting less heat. I'm just not sure the difference will be appreciable.
I have that cooler on my 290X. I would not recommend it.

First, it makes the card take up probably five expansion slots. Three below, two above. Second, it comes with basically no provision at all for cooling the VRM FETs or memory, except for a HUGE heatsink that attaches to the back of the card. On the bright side, it cools the core super effectively and it's quieter than the stock fans that came with my Sapphire 290X.

Photo for reference. The T-shaped heatsink and ramsinks I had to buy separately.
[Photo of the installed cooler: IMG_0516.JPG]
 
I have that cooler on my 290X. I would not recommend it.

First, it makes the card take up probably five expansion slots. Three below, two above. Second, it comes with basically no provision at all for cooling the VRM FETs or memory, except for a HUGE heatsink that attaches to the back of the card. On the bright side, it cools the core super effectively and it's quieter than the stock fans that came with my Sapphire 290X.

Photo for reference. The T-shaped heatsink and ramsinks I had to buy separately.
/snip

That's kinda crappy given the cost of it :(.
 
What you need to do is buy time. What I mean is, I also jumped at the first R9 290 released and like others had the heat issues, but I sold the card when mining was the craze and made a profit. I putted around on a 7950 till AMD got their act together and made the 290X we all knew they could (Sapphire Tri-X 290X New Edition): 34°C, and 60°C while gaming, with factory 1020/1350 clocks for 345.6GB/s, and it cost me $265 brand new. VRM temps are mind-blowingly low for a 290X. It runs just a tick slower than the 390X but faster than the 390.

So there are 290Xs that don't fit the mold, and I have never had heat issues; it has been a dream of a card. And I can afford to wait much longer than you, until a used GTX 1080 may cost $250.

But no way in hell would I buy a 1060 as a replacement... a 1070 if they give it to me, lol... because the fps-to-price ratio ain't there yet.
 
Dang, if you could sell your 290X for $180ish, there are some deals on a 480 for $165, so you could break even and get your space heater problem fixed. Just my 2c though.
 
But no way in hell would I buy a 1060 as a replacement... a 1070 if they give it to me, lol... because the fps-to-price ratio ain't there yet.

Please point out any AMD GPU that matches the overall performance of the 1070.

$230 RX480 8GB vs a $400 1070: $170 more buys about 30-50% more performance (even in DX12) in just about every game in every reviewer portfolio.

http://www.anandtech.com/bench/product/1748?vs=1731

Until AMD releases a GPU that can actually compete with nVidia's upper-tier (1070) and flagship (1080) offerings (and hopefully Vega will do exactly that), the 1070's fps to price is money well spent if buying today, imo.

Of course, the option to ride it out and wait for Vega is a viable path, as long as the reasons for wanting to upgrade now can be repressed.
 
You're shilling an obsolete GPU that won't come anywhere near the performance of a 1070 and is merely marginally faster than a 290X.
I'm not shilling anything. I never said it got close to a 1070 and 20% vs his current card is not a marginal difference. I said it was great bang for the buck if he sold his current card and bought it. And that is a fact.

I say that's a good indicator of performance now and going forward, at least for me.
OK so 3 games but that still doesn't matter too much at all. There are plenty of games that never run well no matter what hardware you have. And I'm talking about in general, not just for you or me. You have to show that it'll be a problem for most if not all games over its practical usable life. 4GB of VRAM will become a problem eventually, just like having only 1 or 2GB did, but if it's 2-3 years out, who cares? You can't expect any card to play all games maxed out well at peak resolutions over that time period.
 
OK so 3 games but that still doesn't matter too much at all.

Are you daft? Of course it matters; those are the games I actually played with my 290X.

You'd really be ok with spending $550 (original MSRP of the 290X) on a graphics card that only works really well "in general" for the tasks you use it for?
 
Not trying to throw fire on this:
FYI here is a 4GB 480 deal for $165 https://slickdeals.net/f/9498892-po...-for-164-99?src=SiteSearchV2_SearchBarV2Algo1

I have seen the 8GB models in the $175-190 range after rebate.

If the GTX 1060 6GB is your bag then I have also seen these as low as $190 around Slickdeals too with the $25 Amex coupons.

In my experience the 980 Ti can be a furnace too, albeit a huge difference from a 290X or two; maybe the GTX 1070 is a little better (about 50W less than the 980 Ti), but really any current-gen 16nm or 14nm card is going to save you a bunch of heat output.

As with most tech (where there's competition), the longer you wait, the better the deals that will pop up, though.
 
I'm not shilling anything. I never said it got close to a 1070 and 20% vs his current card is not a marginal difference. I said it was great bang for the buck if he sold his current card and bought it. And that is a fact.

Another fact: The Fury line was borderline obsolete when it launched in June 2015...

4GB VRAM
275W TDP
No HDMI 2.0

A 20% increase of the dated performance of a 290X is *still* marginal. Even at 1080p, it's not enough to satisfy high eye candy settings with recent games nor prepare for future titles if the plan is to keep the 290X replacement for any measurable amount of time. What if 302efi needs/wants a new monitor and decides on a 1440p and/or a 75-144 Hz model in a year or two? I sure as heck wouldn't want to be stuck with the Fury's limited GPU hardware, and I don't wish that upon him, either.

The 1070 will provide 70-80% increases in DX11 and around 20-25% in DX12 vs his current 290X...all while drawing a hell of a lot less power and spitting out way less heat.

Until AMD brings actual competing tech to market that's performing above the competition's current-gen low and low-mid range offerings, I don't even understand why anyone would advise a sidegrade (1060 included) that won't have a long usable life (290X should indicate that OP likely doesn't upgrade often) and provide the chance for excruciatingly high buyer's remorse.

Keep arguing for the Fury, though...just know what you are really arguing for.
 
Please point out any AMD GPU that matches the overall performance of the 1070.

$230 RX480 8GB vs a $400 1070: $170 more buys about 30-50% more performance (even in DX12) in just about every game in every reviewer portfolio.

http://www.anandtech.com/bench/product/1748?vs=1731

Until AMD releases a GPU that can actually compete with nVidia's upper-tier (1070) and flagship (1080) offerings (and hopefully Vega will do exactly that), the 1070's fps to price is money well spent if buying today, imo.

Of course, the option to ride it out and wait for Vega is a viable path, as long as the reasons for wanting to upgrade now can be repressed.

You missed my point that it would cost me $400 to gain 30% to 50%, if that, as my 290X is a little faster than his. Now that AMD is releasing WattMan for Hawaii it may hold more performance, as a 1200MHz Hawaii is doable on my 6-phase board.
 
Of course it matters; those are the games I actually played with my 290X.
It's 3 games out of how many? That you played them doesn't matter.

You'd really be ok with spending $550 (original MSRP of the 290X) on a graphics card that only works really well "in general" for the tasks you use it for?
Did I recommend he buy a 290X or spend $550? Nope. Re-read the thread. I was talking about getting a Fury (which can be had for $259) and selling his current card (which is a 290X) for $180 which would give him 20% more performance for $79 out of pocket. It'd also use less power and put out less heat which he said were his real concerns. A 1070 will be faster but he'll have to spend more out of pocket too which I already pointed out as well earlier in a different post. If you're not going to read the thread, or even worse misread the thread and put words in my mouth and attribute to me things I never said, don't post.

Another fact: The Fury line was borderline obsolete when it launched in June 2015...4GB VRAM 275W TDP No HDMI 2.0
None of those things are either pressing needs or even indicators of being obsolete at all for playing PC games. You can create scenarios where they'd be an issue but that can be true of any video card.

A 20% increase of the dated performance of a 290X is *still* marginal.
20% isn't marginal. Words mean things and you can easily notice a 20% performance increase just by eyeballing in-game performance. It'd be an extra 12fps, for instance, if you were previously stuck at 60fps in a given game. Marginal would be like a 1% performance improvement, which is unnoticeable.

Even at 1080p, it's not enough to satisfy high eye candy settings with recent games nor prepare for future titles if the plan is to keep the 290X replacement for any measurable amount of time.
I have a 290 non-X and contrary to your belief it plays many games, even new ones, quite well. Even at 1440p, if you turn down some IQ settings the game will still look great; I'm hardly suffering here. Dishonored 2 was the game that gave me the most trouble recently but I was still able to finish and enjoy it just fine prior to the recent patches which improved performance greatly. As were many other people. You or I can easily create what-if scenarios why he'd want a 1070 or even a 1080 (if you read earlier in the thread, that was what I was recommending BTW) but realistically he'd probably be fine for another couple years either with his current card or a Fury.
 
It's 3 games out of how many? That you played them doesn't matter.


Did I recommend he buy a 290X or spend $550? Nope. Re-read the thread. I was talking about getting a Fury (which can be had for $259) and selling his current card (which is a 290X) for $180 which would give him 20% more performance for $79 out of pocket. It'd also use less power and put out less heat which he said were his real concerns. A 1070 will be faster but he'll have to spend more out of pocket too which I already pointed out as well earlier in a different post. If you're not going to read the thread, or even worse misread the thread and put words in my mouth and attribute to me things I never said, don't post.


None of those things are either pressing needs or even indicators of being obsolete at all for playing PC games. You can create scenarios where they'd be an issue but that can be true of any video card.


20% isn't marginal. Words mean things and you can easily notice a 20% performance increase just by eyeballing in-game performance. It'd be an extra 12fps, for instance, if you were previously stuck at 60fps in a given game. Marginal would be like a 1% performance improvement, which is unnoticeable.


I have a 290 non-X and contrary to your belief it plays many games, even new ones, quite well. Even at 1440p, if you turn down some IQ settings the game will still look great; I'm hardly suffering here. Dishonored 2 was the game that gave me the most trouble recently but I was still able to finish and enjoy it just fine prior to the recent patches which improved performance greatly. As were many other people. You or I can easily create what-if scenarios why he'd want a 1070 or even a 1080 (if you read earlier in the thread, that was what I was recommending BTW) but realistically he'd probably be fine for another couple years either with his current card or a Fury.

Turn down IQ settings because it's required? Buy a console, then. That would be the most cost-effective solution for following that logic.

You just said it: you played a recently released game (Dishonored 2) and it gave you a bit of trouble. Games are just going to get more and more demanding, so it makes no sense to sidegrade just for the sake of easing power draw and heat output. Might as well get a huge performance boost while getting those benefits.

If getting a Fury: in a couple of years, when it's time to upgrade because that Fury will be even more long in the tooth than it is today, what will its residual value be? The same as a 1070? Not a chance. ...and in those couple/few years, a future version of a 1060 or RX480 tier/level card (1160? 1260? RX580? RX680?) will likely be pushing the $400 mark that the 1070 is today, so the out-of-pocket delta for upgrading is going to be much larger with selling a used Fury than it is with selling a used 1070 to offset that future upgrade cost.

Yeah, a lot of "what ifs". But they absolutely need to be fairly contemplated and considered.
 
Turn down IQ settings because it's required? Buy a console, then. That would be the most cost-effective solution for following that logic.
When talking about PC gaming? Nope. And everyone has to turn down IQ settings at one point or another since virtually no one can afford to buy a new top-end card all the time or even to afford SLI/CF.

You just said it: you played a recently released game (Dishonored 2) and it gave you a bit of trouble.
It gave every card a bit of trouble and even a 1080 will struggle to maintain 60fps with it at 1440p and does lousy with it at 4K even with settings turned down. Your original point was about 4GB VRAM being overly limiting. And it was still playable and still looked quite good. And now they (Bethesda) patched it and guess what? It performs lots better. On all cards. That should be just about impossible if VRAM was the problem.

If getting a Fury: in a couple of years, when it's time to upgrade because that Fury will be even more long in the tooth than it is today, what will its residual value be? The same as a 1070? Not a chance
If the resale market stays the same as it is now? Probably $150-200. My 290 would probably still sell for around $110-150 on eBay and it's relatively as old as a Fury would be then. But then buying video cards based on resale value is kind of dumb since you can't actually predict where the market is going to be in 3 months, much less 2 years.

Yeah, a lot of "what ifs". But they absolutely need to be fairly contemplated and considered.
Any fair consideration shows your what-ifs to be mostly silly at best. If he or anyone wanted more performance than a Fury could offer, sure, a 1070 or a 1080 would be the way to go, but if you're on a budget 20% more performance for $79 more is a damn good deal and nothing you're saying is addressing that fact.
 
When talking about PC gaming? Nope. And everyone has to turn down IQ settings at one point or another since virtually no one can afford to buy a new top-end card all the time or even to afford SLI/CF.


It gave every card a bit of trouble and even a 1080 will struggle to maintain 60fps with it at 1440p and does lousy with it at 4K even with settings turned down. Your original point was about 4GB VRAM being overly limiting. And it was still playable and still looked quite good. And now they (Bethesda) patched it and guess what? It performs lots better. On all cards. That should be just about impossible if VRAM was the problem.


If the resale market stays the same as it is now? Probably $150-200. My 290 would probably still sell for around $110-150 on eBay and it's relatively as old as a Fury would be then. But then buying video cards based on resale value is kind of dumb since you can't actually predict where the market is going to be in 3 months, much less 2 years.


Any fair consideration shows your what-ifs to be mostly silly at best. If he or anyone wanted more performance than a Fury could offer, sure, a 1070 or a 1080 would be the way to go, but if you're on a budget 20% more performance for $79 more is a damn good deal and nothing you're saying is addressing that fact.
What you will learn quickly is that the anti-AMD brigade is persistent and commonly without rationale or context. I agree with you entirely. If the OP has the funds to do so then there is no reason not to go with a 1070 or 1080. But he stated that HE IS FINE WITH HIS CURRENT PERFORMANCE and ONLY HAS ISSUES WITH HEAT and NOISE. So a FuryX is the best solution as it is far quieter than any other GPU and costs next to nothing over selling his current card. The extra performance is not necessary by his CRITERIA but the reduced heat and noise IS.
 
What you will learn quickly is that the anti-AMD brigade is persistent and commonly without rationale or context. I agree with you entirely. If the OP has the funds to do so then there is no reason not to go with a 1070 or 1080. But he stated that HE IS FINE WITH HIS CURRENT PERFORMANCE and ONLY HAS ISSUES WITH HEAT and NOISE. So a FuryX is the best solution as it is far quieter than any other GPU and costs next to nothing over selling his current card. The extra performance is not necessary by his CRITERIA but the reduced heat and noise IS.

Anti-AMD? Ha!

You should spend some time researching both my comment history and hardware history posted in this forum instead of making assumptions...
 
What you will learn quickly is that the anti-AMD brigade is persistent and commonly without rationale or context. I agree with you entirely. If the OP has the funds to do so then there is no reason not to go with a 1070 or 1080. But he stated that HE IS FINE WITH HIS CURRENT PERFORMANCE and ONLY HAS ISSUES WITH HEAT and NOISE. So a FuryX is the best solution as it is far quieter than any other GPU and costs next to nothing over selling his current card. The extra performance is not necessary by his CRITERIA but the reduced heat and noise IS.

Besides scoring an own goal, you already excluded the Fury yourself in favour of the 1060/480. Priceless!
 
Honestly, I get enjoyment out of reading these threads. Then I go home, hop on my FX 8300 / 2 x R9 Fury at 4K setup and completely enjoy it. (Did I mention it is at 4K? :) ) There are some folks around here that are completely anti-AMD without a doubt. Not going to let them bother me while I get to enjoy what I purchased with my hard-earned dollars and know exactly what I am experiencing, not what they think I should be experiencing. :)

If the OP wanted to save some money, purchasing the Sapphire R9 Fury Nitro+ would definitely be it and be an upgrade as well. (I had an R9 290 unlocked to a 290X and have direct, real-life experience with that.) It uses a bit less power and is nowhere near as noisy at full load. Otherwise, I would say that sticking with the 290X would be best for now.

I even have an XFX R9 380 at work here and enjoy it very much too. By the way, has the OP posted what he is going to do? Perhaps I missed it.
 
Anti-AMD? Ha!

You should spend some time researching both my comment history and hardware history posted in this forum instead of making assumptions...
I was posting in general. But sorry, your posts in here do not follow the OP's criteria.
 
I was posting in general. But sorry, your posts in here do not follow the OP's criteria.

And the point I've been making to 302efi is why sidegrade to a 1060 just for the sake of lowering power draw and heat output? Why not also get a sizeable performance increase while attaining lower power draw and heat output, if he has the monetary means to do so by getting a 1070?

And 302efi stated that he was going to see about getting a deal on a 1070.

And then all the ridiculous posts about settling for an obsolete Fury or weak RX480 started pouring in...

Might as well just say "stick with the 1060" since that was his original criteria in the first place, right?

Please...
 
Quick question guys... I found an almost-new R9 Nano for $185 locally... Should I jump on this??
 
And the point I've been making to 302efi is why sidegrade to a 1060 just for the sake of lowering power draw and heat output? Why not also get a sizeable performance increase while attaining lower power draw and heat output, if he has the monetary means to do so by getting a 1070?

And 302efi stated that he was going to see about getting a deal on a 1070.

And then all the ridiculous posts about settling for an obsolete Fury or weak RX480 started pouring in...

Might as well just say "stick with the 1060" since that was his original criteria in the first place, right?

Please...
First, that obsolete FuryX at times matches your advised 1070, so I'm not sure "obsolete" is a fair term; it sounds more anti-AMD-brigade. Second, since the 1060/480 are virtually equal in real-world terms, I am not getting the issue there either, other than being an Nvidia propagandist. Most if not all of the AMD/Nvidia recommendations are within the OP's criteria. The reason for leaning toward the FuryX? Cost! That is the only advantage over the 1070/1080, well, other than noise, which as I stated earlier is better than on any card mentioned to date.

So in conclusion, what Messyn191 mentioned is the best course and sound advice.
 