FuryX users = Dust in the wind on these forums

Unless the user were running ESXi or a VM farm on that system with ECC memory, the dual-core i3-6100 is a far better value.

Well I already answered this above and I said that I would leave the topic alone. I am 100% sure that buying a dual core in 2015 is a bad idea unless building a word processing machine or a basic laptop. Otherwise we would all be playing Battlefront on our dual core laptops. There is a reason that DX12 was created and dual cores are the antithesis of this ideology. Nice for mobile gaming though, which is where Intel is headed.
 
Well I already answered this above and I said that I would leave the topic alone. I am 100% sure that buying a dual core in 2015 is a bad idea. Otherwise we would all be playing Battlefront on our dual core laptops. There is a reason that DX12 was created and dual cores are the antithesis of this ideology. Nice for mobile gaming though, which is where Intel is headed.

The i3-6100 @ 3.7GHz has over twice the IPC per core that an FX-8350 @ 4GHz has, plus HyperThreading.
So basically, two cores on the i3 out-perform four cores of the FX-8350, while still being able to run four simultaneous threads.
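To put rough numbers on that claim, here is a minimal back-of-the-envelope sketch in Python; the 2.2x per-clock figure below is only an assumed stand-in for "over twice the IPC", not a benchmark result:

```python
# Back-of-the-envelope model: per-core throughput ~ IPC x clock speed.
# The 2.2x per-clock figure is only an assumed stand-in for the "over twice
# the IPC" claim above -- it is not a measured number.
i3_clock_ghz = 3.7
fx_clock_ghz = 4.0
ipc_ratio = 2.2  # assumed i3-6100 per-clock advantage over the FX-8350

fx_core = 1.0 * fx_clock_ghz        # FX-8350 per-core throughput (arbitrary units)
i3_core = ipc_ratio * i3_clock_ghz  # i3-6100 per-core throughput (same units)

print(f"one i3 core  ~ {i3_core / fx_core:.2f} FX cores")
print(f"two i3 cores ~ {2 * i3_core / fx_core:.2f} FX cores worth of throughput")
```

Plug in your own IPC estimate to see how sensitive the conclusion is; below roughly 2.16x at these clocks, four FX cores pull back ahead in aggregate throughput.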

That has nothing to do with "mobile gaming", I'm talking about desktop-grade CPUs.
I'm not sure you realize how out-classed AMD FX CPUs are right now compared to Intel's Skylake CPUs.
 
Well I already answered this above and I said that I would leave the topic alone. I am 100% sure that buying a dual core in 2015 is a bad idea unless building a word processing machine or a basic laptop. Otherwise we would all be playing Battlefront on our dual core laptops. There is a reason that DX12 was created and dual cores are the antithesis of this ideology. Nice for mobile gaming though, which is where Intel is headed.

You still don't get the point: even if the FX had 32 cores it would still get lower FPS than the i3-6100, which makes it a shitty processor. Most chips will hit 60 FPS anyway; getting 60 FPS is no big deal, it is about better value for the $ spent.
Spending more than a hundred bucks on a mobo to overclock a really old 8-core CPU is not a good thing for the consumer.
The only reason Intel hasn't made any significant improvements since the 2500K is that AMD made the FX, and Intel realized they can completely ignore AMD's existence and still sell a shit ton of CPUs because they are the only player left.
Most tech enthusiasts want to see big improvements per generation on both sides, and that will only be possible when there are two sides to choose from, but only one side is viable right now.
Zen with 2600K/3770K performance at a 10% lower price tag than Intel will get way more praise than Skylake or any of these 5% IPC refreshes after Sandy ever got.
 
Nvidia has dominated the flagship tier for the better part of a decade, and this forum is full of rich people scooping them up.
You can't "defend your viewpoint" here because the Fury X is borderline indefensible. When AMD starts making competitive flagship GPUs then I suppose you'll see a change in the wind here.

What a load of bullshit...
 
The i3-6100 @ 3.7GHz has over twice the IPC per core that an FX-8350 @ 4GHz has, plus HyperThreading.
So basically, two cores on the i3 out-perform four cores of the FX-8350, while still being able to run four simultaneous threads.

That has nothing to do with "mobile gaming", I'm talking about desktop-grade CPUs.

Well, OK, I'll jump back in since I was asked. It is a POS. I would never own one. I don't care what type of HyperThreading it is supposed to have. Magic fairy dust. It is still a dual core. That makes it useless for gaming if you like doing other things at the same time in a DX12 world where more cores are king.

If it is so great, then Intel is screwing over everyone who bought quads, hex-cores, and octa-cores. I have yet to see a successful Twitch streamer running a dual core. I see lots of 5820Ks and 5960Xs, though. The ones with quad-core Intel chips will tell you to donate/subscribe so that they can upgrade.

Before you toss out the "you can stream with a video card" line, all of those streams look like a LEGO block massacre happened. Blocks, blocks, and more blocks. Now, of course, not everyone is into this type of entertainment. I understand that too. But it doesn't change my mind as to what is useful for me.
 
I have a Fury X. It's warm enough at stock, doesn't overclock well, and there are no BIOS or pencil mods to perform, so I see little reason to discuss it. I like it for SFF builds and it is what it is. What are you going to do, talk about the mediocre OC and then how your pump fails from excess heat?

I like the card. But it is what it is. There's not much to tinker/mess with, which leaves little to talk about.
 
You still don't get the point: even if the FX had 32 cores it would still get lower FPS than the i3-6100, which makes it a shitty processor. Most chips will hit 60 FPS anyway; getting 60 FPS is no big deal, it is about better value for the $ spent.
Spending more than a hundred bucks on a mobo to overclock a really old 8-core CPU is not a good thing for the consumer.
The only reason Intel hasn't made any significant improvements since the 2500K is that AMD made the FX, and Intel realized they can completely ignore AMD's existence and still sell a shit ton of CPUs because they are the only player left.
Most tech enthusiasts want to see big improvements per generation on both sides, and that will only be possible when there are two sides to choose from, but only one side is viable right now.
Zen with 2600K/3770K performance at a 10% lower price tag than Intel will get way more praise than Skylake or any of these 5% IPC refreshes after Sandy ever got.

No, you don't get it. If I'm getting 60 FPS in all my games, then what will 70 FPS do for me? My monitor is 120Hz. Will an Intel chip get me to a 120 FPS minimum on max settings in all games? That's the break-even point where it matters for me.

I told you that the 990FX boards were on eBay for $60. A 990FX board is a 990FX board. They will all OC just fine. You don't need some special board to OC an FX chip. That's you adding words to what I said.
 
I have a Fury X. It's warm enough at stock, doesn't overclock well, and there are no BIOS or pencil mods to perform, so I see little reason to discuss it. I like it for SFF builds and it is what it is. What are you going to do, talk about the mediocre OC and then how your pump fails from excess heat?

I like the card. But it is what it is. There's not much to tinker/mess with, which leaves little to talk about.

Why buy it if you are resigned to the notion that the pump is going to fail due to excessive heat? Why not just get a case that's a couple of inches larger and a 980 Ti? If the Fury X fails like you say, then the 980 Ti not failing has to be a better value. If only for peace of mind, why buy something that you don't believe in?

Of course, I've been running AIO water cooling for years with no issues since the first Corsair units had a recall. But I can imagine that others have had failures, as nothing is perfect.
 
No, you don't get it. If I'm getting 60 FPS in all my games, then what will 70 FPS do for me? My monitor is 120Hz. Will an Intel chip get me to a 120 FPS minimum on max settings in all games? That's the break-even point where it matters for me.

I told you that the 990FX boards were on eBay for $60. A 990FX board is a 990FX board. They will all OC just fine. You don't need some special board to OC an FX chip. That's you adding words to what I said.

Your setup is useful for you, I never said it is not. But you recommending the same setup to someone who isn't doing what you're doing and is just playing video games would be the problem a lot of people have.
 
Why buy it if you are resigned to the notion that the pump is going to fail due to excessive heat? Why not just get a case that's a couple of inches larger and a 980 Ti? If the Fury X fails like you say, then the 980 Ti not failing has to be a better value. If only for peace of mind, why buy something that you don't believe in?

Of course, I've been running AIO water cooling for years with no issues since the first Corsair units had a recall. But I can imagine that others have had failures, as nothing is perfect.

I meant if you OC the hell out of it. At stock it's fine. I actually lowered the max temp 10C and it's still quiet.

Just ignore that part. As a tinkerer, there's nothing really exciting about the Fury X after BIOS modding/water cooling/hard modding my Titan X.
 
I meant if you OC the hell out of it. At stock it's fine. I actually lowered the max temp 10C and it's still quiet.

I slapped an H80i GTX on my R9 290 and I top out at 54C in non-gaming benchmarks, OC'd to the max. It was a fun cheap upgrade since my stock card has to be the worst R9 290 overclocker on the planet. How is the Fury X?
 
The answer is in your question.
Over the last 5+ years there have been more Intel Core i5/i7 threads than AMD threads. :eek: I'll let you figure out why. ;)
:)

Remember when almost all the CPU threads were about the Athlon 64 and Athlon 64 X2 a decade ago?
 
Well, OK, I'll jump back in since I was asked. It is a POS. I would never own one. I don't care what type of HyperThreading it is supposed to have. Magic fairy dust. It is still a dual core. That makes it useless for gaming if you like doing other things at the same time in a DX12 world where more cores are king.

If it is so great, then Intel is screwing over everyone who bought quads, hex-cores, and octa-cores. I have yet to see a successful Twitch streamer running a dual core. I see lots of 5820Ks and 5960Xs, though. The ones with quad-core Intel chips will tell you to donate/subscribe so that they can upgrade.

Before you toss out the "you can stream with a video card" line, all of those streams look like a LEGO block massacre happened. Blocks, blocks, and more blocks. Now, of course, not everyone is into this type of entertainment. I understand that too. But it doesn't change my mind as to what is useful for me.

No man, just no. Just because the FX performs better for your usage, that doesn't mean it's a better chip. You are wrong, and you have to understand that for a purely gaming machine your chip is far worse than anything Intel has released in the last three generations, at any price, which is made worse by the fact that it still gets stomped by the tiny "POS" chips, as you called the i3s.

DX12? DX11? IPC is still king, and believe it or not the i3 still outperforms FX-8 chips. Get over it, please, and understand that your chip is inferior for purely gaming purposes. Not everyone buys a gaming machine to waste time watching other people play video games on Twitch, much less to stream high-quality video to Twitch... people buy gaming machines for gaming. And so far, i3s are the kings of value for gaming machines. Just because your setup is good for you doesn't mean it will be the same for everyone else.

And since you always like to post a certain ancient 4K SLI review (doubtful and untrustworthy in my opinion =), I'll post a new one with DX12. So, DX12 high hopes for AMD FX users? Keep the dreams grounded, please.

DX11 vs DX12 @ TechSpot

[TechSpot DX11 vs DX12 CPU benchmark charts: CPU_01.png through CPU_06.png]
 
Have you seen my Ashes of the Singularity benchmarks? There is an entire thread with my processor running the game and getting consistently decent numbers. If that game ran badly, then I would have upgraded in a heartbeat. Also, my monitor is 120Hz. When Intel can get their processors to push 120 FPS all day on close to max settings, then wake me up for an upgrade.
 
I slapped an H80i GTX on my R9 290 and I top out at 54C in non-gaming benchmarks. How is the Fury X?

I lowered the max temp to 55C, but at stock the VRMs get 100C+ with 65C for a max IIRC. I don't want to know what happens OC'd. Before all the "but VRMs can get to that temperature" replies, remember the whole board is enclosed for some awful reason.
 
I slapped an H80i GTX on my R9 290 and I top out at 54C in non-gaming benchmarks, OC'd to the max. It was a fun cheap upgrade since my stock card has to be the worst R9 290 overclocker on the planet. How is the Fury X?

Did this increase the max OC you can get by a big amount? I am planning to cool mine too and want to know if it will be a minor stability change or a big one; right now very high overclocks (1100/1600) need 100% fan speed to be stable.
 
I lowered the max temp to 55C, but at stock the VRMs get 100C+ with 65C for a max IIRC. I don't want to know what happens OC'd. Before all the "but VRMs can get to that temperature" replies, remember the whole board is enclosed for some awful reason.

Would have been nice if AMD had released a Fury X without the water cooler. Then you could slap a full water block on it like this EK block, and hook it to something like the Swiftech AIO that is expandable.
 
Well ok I'll jump back in since asked. It is a POS. I would never own one. I don't care what type of Hyperthreading it is supposed to have. Magic fairy dust. It is still a dual core. That makes it useless for gaming if you like doing other things at the same time in a DX12 world where more cores is king.

Whoa there, the i3-6100 is definitely not a bad CPU, and core-for-core, it is over twice as fast as the FX-8350 @ 4GHz; I know this, because I own and use both CPUs and platforms.
There is no bias here towards either CPU, it just is what it is.

Yes, it is a dual-core, but with HyperThreading, it can run up to four simultaneous threads (two physical cores, four logical cores), and each logical core is still a good bit faster than each physical core on the FX-8350.
I get what you are saying for streaming using the additional cores, and for your case scenario, that makes sense.

For many others, however, who might just be gaming or rendering, but not both at once, the i3 is far superior, mainly because most games don't utilize more than four threads/cores, and in that area alone the i3-6100 outclasses the FX-8350.
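For anyone who wants to see the physical-versus-logical split on their own box, a quick sketch using Python's os module plus the third-party psutil package (assumed installed) looks like this:

```python
# Report logical vs. physical core counts for the machine this runs on.
# Requires psutil (pip install psutil); os.cpu_count() alone only gives logical cores.
import os
import psutil

logical = os.cpu_count()                    # threads the OS can schedule at once
physical = psutil.cpu_count(logical=False)  # actual physical cores

print(f"logical cores:  {logical}")
print(f"physical cores: {physical}")
# On an i3-6100 this would report 4 logical / 2 physical thanks to HyperThreading.
```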
 
Did this increase the max OC you can get by a big amount? I am planning to cool mine too and want to know if it will be a minor stability change or a big one; right now very high overclocks (1100/1600) need 100% fan speed to be stable.

1050 with 53% fan to 1125 with 20% fan. The fan just cools the VRMs and other stuff now. The Corsair unit's fans run at whatever speed. I pay little attention to it since my temps are 42C idle and 54C full load.
 
Whoa there, the i3-6100 is definitely not a bad CPU, and core-for-core, it is over twice as fast as the FX-8350 @ 4GHz; I know this, because I own and use both CPUs and platforms.
There is no bias here towards either CPU, it just is what it is.

Yes, it is a dual-core, but with HyperThreading, it can run up to four simultaneous threads (two physical cores, four logical cores), and each logical core is still a good bit faster than each physical core on the FX-8350.
I get what you are saying for streaming using the additional cores, and for your case scenario, that makes sense.

For many others, however, who might just be gaming or rendering, but not both at once, the i3 is far superior, mainly because most games don't utilize more than four threads/cores, and in that area alone the i3-6100 outclasses the FX-8350.

Here's my issue with that. Steam is always breaking the AMD and Nvidia hardware encoding when I'm playing a game on my gaming PC and streaming it to my living room PC. Sometimes it falls back to CPU encoding. I can tell, because if I leave my PC at 5GHz and it starts streaming, it turns my room into a hot box when it breaks. Later that day there is an update and all is well.

Do you really want a dual core encoding your game for your living room while it is also being tasked with playing it?
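For anyone who hasn't felt that difference, here is a rough sketch of CPU versus GPU H.264 encoding driven from Python via ffmpeg. This is not how Steam's in-home streaming works internally; it is just an analogous way to compare the load, and it assumes ffmpeg is on the PATH and was built with NVENC support for the GPU path (the file names are placeholders):

```python
# Compare CPU (libx264) vs. GPU (h264_nvenc) H.264 encoding load with ffmpeg.
# Watch CPU usage during the first call vs. the second -- that gap is what the
# post above is describing when Steam falls back to CPU encoding.
import subprocess

def encode(src: str, dst: str, use_gpu: bool) -> None:
    codec = "h264_nvenc" if use_gpu else "libx264"  # GPU vs. CPU encoder
    subprocess.run(
        ["ffmpeg", "-y", "-i", src, "-c:v", codec, dst],
        check=True,
    )

# encode("gameplay.mkv", "cpu_encode.mp4", use_gpu=False)  # hammers the CPU
# encode("gameplay.mkv", "gpu_encode.mp4", use_gpu=True)   # offloads to the video card
```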
 
1050 with 53% fan to 1125 with 20% fan. The fan just cools the VRMs and other stuff now. The Corsair unit's fans run at whatever speed. I pay little attention to it since my temps are 42C idle and 54C full load.

I got a GELID but haven't put it on yet; hopefully it proves worth the money. I was hesitant to spend even $50-70 on a cooler since my card was ~USD $160 (used), and I have beaten it up with 100% fan speed for fun overclocks since I got an extended warranty, so I was wondering if the better cooling will give me a nice bump in performance.
From stock to my max OC ever, I get more than a 12% boost in benches, which is really, really nice for me.
 
Here's my issue with that. Steam is always breaking the AMD and Nvidia hardware encoding when I'm playing a game on my gaming PC and streaming it to my living room PC. Sometimes it falls back to CPU encoding. I can tell, because if I leave my PC at 5GHz and it starts streaming, it turns my room into a hot box when it breaks. Later that day there is an update and all is well.

Do you really want a dual core encoding your game for your living room while it is also being tasked with playing it?

Well, it's kind of pointless just running around in circles and debating hypotheticals. Someone who owns both CPUs (cough... cough) needs to just do a straight-up gaming+encoding benchmark to see where both fall. Otherwise this will never end.
 
Here's my issue with that. Steam is always breaking the AMD and Nvidia hardware encoding when I'm playing a game on my gaming PC and streaming it to my living room PC. Sometimes it falls back to CPU encoding. I can tell, because if I leave my PC at 5GHz and it starts streaming, it turns my room into a hot box when it breaks. Later that day there is an update and all is well.

Do you really want a dual core encoding your game for your living room while it is also being tasked with playing it?

If Steam streaming is anything like Twitch streaming, my old-ass laptop ran GTA V fine off my PC when the connection was wired; over WiFi it stuttered occasionally. This was an i3 PC using Steam streaming. However, I thought my GPU was encoding it; I don't know for sure. And by fine I mean no stutters, and I used this to play games on the TV.
 
Remember when almost all the CPU threads were about the Athlon 64 and Athlon 64 X2 a decade ago?

Oh man, I miss those days, back when AMD was on top!
I held on to a lot of 939 systems for many, many years, and I can see that Intel's first-generation Core i 1366 platform is going to be the same thing for this decade as well.

Good memories and good times! :cool:
 
If Steam streaming is anything like Twitch streaming, my old-ass laptop ran GTA V fine off my PC when the connection was wired; over WiFi it stuttered occasionally. This was an i3 PC using Steam streaming. However, I thought my GPU was encoding it; I don't know for sure. And by fine I mean no stutters, and I used this to play games on the TV.

It breaks and will default to CPU. Been there done that since it hit the Beta channel long ago.
 
Here's my issue with that. Steam is always breaking the AMD and Nvidia hardware encoding when I'm playing a game on my gaming PC and streaming it to my living room PC. Sometimes it falls back to CPU encoding. I can tell, because if I leave my PC at 5GHz and it starts streaming, it turns my room into a hot box when it breaks. Later that day there is an update and all is well.

Do you really want a dual core encoding your game for your living room while it is also being tasked with playing it?

I've already said this twice, as others here have, but I will say it again.
That FX CPU is great for your usage-scenario, and if it works for you, then great, use it! :)

We are just saying that individuals who are not you, and who just want to game without streaming or doing anything else at the same time, will be better off with an i3-6100.
Also, for applications using four threads or fewer, the i3-6100 is the far superior CPU in terms of performance, TDP, and IPC per core/thread.

It is hard for us to recommend an FX CPU at this point for anyone who isn't running your exact scenario (gaming+streaming) on a budget, as the i3-6100 is a much better CPU for pure gaming (no streaming).
No one is trying to "dis" you or your setup, we are just saying it the way it is in terms of the technology being used.


One last thing: DDR3 is going EOL within the next few years, so it is also hard to recommend an FX platform with DDR3 when there is no re-use or upgrade path for that platform.
With the Core i Skylake platforms (1151) DDR4 can be used and can also be re-used in future builds for the foreseeable future, as no successor for DDR4 has been announced.
 
Oh man, I miss those days, back when AMD was on top!
I held on to a lot of 939 systems for many, many years, and I can see that Intel's first-generation Core i 1366 platform is going to be the same thing for this decade as well.

Good memories and good times! :cool:

I am still holding onto two old Socket 939 DFIs, one with an Opty 165 and the other with a 4400. The Ultra-D was SLI-modded by me way back when and still boots up when I need to pull some ancient file off it. Man, those were the fun days of overclocking for me. I still remember taking my Opty 146 to 3.2 on air, with the side off and a box fan on the side of the case, lol. Anyone else remember trying to get TCCD to 300 1:1? :D That took some serious tweaking in the BIOS, but damn was it fast paired with RAID, lol. OMG, when did I get this old :eek:
 
IMO they shot themselves in the foot by making it a card with a radiator. A huge swath of the potential customer base is immediately turned off.
 
I've already said this twice, as others here have, but I will say it again.
That FX CPU is great for your usage-scenario, and if it works for you, then great, use it! :)

We are just saying that individuals who are not you, and who just want to game without streaming or doing anything else at the same time, will be better off with an i3-6100.
Also, for applications using four threads or fewer, the i3-6100 is the far superior CPU in terms of performance, TDP, and IPC per core/thread.

It is hard for us to recommend an FX CPU at this point for anyone who isn't running your exact scenario (gaming+streaming) on a budget, as the i3-6100 is a much better CPU for pure gaming (no streaming).
No one is trying to "dis" you or your setup, we are just saying it the way it is in terms of the technology being used.


One last thing: DDR3 is going EOL within the next few years, so it is also hard to recommend an FX platform with DDR3 when there is no re-use or upgrade path for that platform.
With the Core i Skylake platforms (1151) DDR4 can be used and can also be re-used in future builds for the foreseeable future, as no successor for DDR4 has been announced.

A3Venom was surely trying earlier. Thus the discussion. Also I think it is hilarious that...

The guy with the "worst" CPU is the one who's a bona fide power user.

That is all. :D :D :D
 
But when would someone buy a Fury X? If it is considerably slower than an aftermarket 980 Ti, why do you expect other people to cater to your emotional attachment to a brand? You guys are, what, 12?
If you get a good deal and get it for cheap, that's fine, but for the same $ it sucks to buy something that is slower.
And if you want to talk about AMD hate, they deserve all the hate they get. Especially on CPUs: anyone who wants to discuss or defend the AMD FX line shouldn't be allowed to access a computer and should stick to a console/iPad Pro for their gaming and computing needs.

The 380/380X and 390/390X get a lot of praise everywhere because they are winners on many points; the Fury X ain't.

What if I play AotS? I'd have the same performance and a quieter card that runs cooler, with no fear of drivers shutting down my GPU fans and overheating my board or frying it.
 
A3Venom was surely trying earlier. Thus the discussion. Also I think it is hilarious that...

The guy with the "worst" CPU is the one who's a bona fide power user.

That is all. :D :D :D

I repeatedly said it works in your case, but for most users it is useless. And I never said I am a power user. I just open my PC and run a game, with the only other uses being YouTube, porn, and Netflix. The only thing I have a huge problem with is people recommending it to newbies and making them waste their $ on low-budget builds.

And I have made threads asking for upgrade advice to an i7 (Haswell) or i5 (Skylake), because I want to step into the multitasking domain and will buy a Skylake mobo and CPU this Boxing Day.
 
I am still holding onto two old Socket 939 DFIs, one with an Opty 165 and the other with a 4400. The Ultra-D was SLI-modded by me way back when and still boots up when I need to pull some ancient file off it. Man, those were the fun days of overclocking for me. I still remember taking my Opty 146 to 3.2 on air, with the side off and a box fan on the side of the case, lol. Anyone else remember trying to get TCCD to 300 1:1? :D That took some serious tweaking in the BIOS, but damn was it fast paired with RAID, lol. OMG, when did I get this old :eek:

I know, right!
Nice setups, those Opteron 165 CPUs and Toledo X2 CPUs with the extra L2 cache were great.

I really liked it when AMD went from Clawhammer (130nm and SSE2) to San Diego (90nm and SSE3); it was such an improvement and really made a difference with games and applications -- back when we were still using single-core x86.
I remember people arguing about the performance of a 939 3800+ X2 vs a 940 FX-57, since people didn't really understand SMP applications that well back then.

I also remember one of the first games to fully utilize SMP was Quake 4 in late 2005.
When you ran the game, you would just have to add the "-smp" flag to it, and both cores would be utilized (100% on the first core, and about 60% on the second -- good CPU scaling back then, haha). :D
 
A3Venom was surely trying earlier. Thus the discussion. Also I think it is hilarious that...

The guy with the "worst" CPU is the one who's a bona fide power user.

That is all. :D :D :D

lol, it isn't the "worst CPU", it's just dated compared to everything that is out now from Intel.
Like we said, it works for you, and I'm glad there is an 8-core CPU in that price range so you can get your work done and can do what you want to do without breaking your wallet in half. :p

If it makes you feel any better, I'm still using an FX-4100 with ECC memory in an ESXi box, and it's the same original FX system I built back in late 2011.
Bulldozed, haha! :D
 
I have a FX-8120 sitting around here. If my 9 y.o. great niece wasn't so smart mouthed I'd build her a PC around it. I even bought 16GB of ram for it a little before Black Friday. Her mouth and attitude disgusts me. Very bright young lady. Straight A student. I wouldn't bother trying to explain jack to her as she knows everything already.
 
I have a FX-8120 sitting around here. If my 9 y.o. great niece wasn't so smart mouthed I'd build her a PC around it. I even bought 16GB of ram for it a little before Black Friday. Her mouth and attitude disgusts me. Very bright young lady. Straight A student. I wouldn't bother trying to explain jack to her as she knows everything already.

She'd probably tell you she wants a Core i5 in it or nuthin' :-p
 
I have a FX-8120 sitting around here. If my 9 y.o. great niece wasn't so smart mouthed I'd build her a PC around it. I even bought 16GB of ram for it a little before Black Friday. Her mouth and attitude disgusts me. Very bright young lady. Straight A student. I wouldn't bother trying to explain jack to her as she knows everything already.

Gruncle Stan, is that you?! :eek:

MaG5yN8.png
 
I have a FX-8120 sitting around here. If my 9 y.o. great niece wasn't so smart mouthed I'd build her a PC around it. I even bought 16GB of ram for it a little before Black Friday. Her mouth and attitude disgusts me. Very bright young lady. Straight A student. I wouldn't bother trying to explain jack to her as she knows everything already.

Well, you are probably looking at a future CEO there. Despite it pissing me the hell off, people who grow up expecting the world to be delivered to them usually get it.
 
No man, just no. Just because the FX performs better for your usage, that doesn't mean it's a better chip. You are wrong, and you have to understand that for a purely gaming machine your chip is far worse than anything Intel has released in the last three generations, at any price, which is made worse by the fact that it still gets stomped by the tiny "POS" chips, as you called the i3s.

DX12? DX11? IPC is still king, and believe it or not the i3 still outperforms FX-8 chips. Get over it, please, and understand that your chip is inferior for purely gaming purposes. Not everyone buys a gaming machine to waste time watching other people play video games on Twitch, much less to stream high-quality video to Twitch... people buy gaming machines for gaming. And so far, i3s are the kings of value for gaming machines. Just because your setup is good for you doesn't mean it will be the same for everyone else.

And since you always like to post a certain ancient 4K SLI review (doubtful and untrustworthy in my opinion =), I'll post a new one with DX12. So, DX12 high hopes for AMD FX users? Keep the dreams grounded, please.

DX11 vs DX12 @ TechSpot

[TechSpot DX11 vs DX12 CPU benchmark charts: CPU_01.png through CPU_06.png]

Run low enough batch counts and it still wouldn't matter which API you are under;
go above 100K batches and see your i3's IPC shine. That is where your slow 8-core technology from 3 years ago would hardly be the limiting factor.


This thread isn't about CPUs, ffs.
This thread is an epic fail, as if users who buy hardware have one simple obligation: spam the crap out of this forum, forget about what they wanted to do with their new video card, and turn into forum zealots, because apparently that is what you do when you buy a new video card these days.
 
I repeatedly said it works in your case, but for most users it is useless. And I never said I am a power user. I just open my PC and run a game, with the only other uses being YouTube, porn, and Netflix. The only thing I have a huge problem with is people recommending it to newbies and making them waste their $ on low-budget builds.

And I have made threads asking for upgrade advice to an i7 (Haswell) or i5 (Skylake), because I want to step into the multitasking domain and will buy a Skylake mobo and CPU this Boxing Day.

TMI man, TMI :eek:
 