Are you going to abandon Intel for Ryzen?

How do you expect anyone to answer that question? Beyond the fact that its launch is pending, Ryzen is an unknown at this point. We don't have prices. We don't have performance data. Your question might as well read, verbatim, "Chevy is making a new car, will you buy it?"
 
I am super excited for Ryzen, really hoping it gets Intel and AMD competing at the high end again. A Slot A Athlon 700 was the *FIRST* CPU/motherboard I bought myself, when I was like 13. I ran AMD CPUs all the way to the Athlon 64s. I am excited about the prospect of a real CPU price war. Also really anticipating Vega.
I have an overclocked i7-6700K with a GTX 1070 in my gaming machine. I also have an FX-9590/Crosshair V Formula-Z/R9 Fury in my "workstation/2nd gaming PC." I am guessing I will keep the i7 for gaming, as an OC'd Skylake will probably still be faster for current games, especially since I play StarCraft 2 more often than all my other games combined.
So my workstation will most likely get upgraded to an 8c/16t ryzen part. Both machines are watercooled, so I will be pushing Ryzen to see what it can do :)
 
Can we stop with the 'Faildozer' nonsense? I have an FX-8320 and an i7-4930K. I can't tell the difference between the two in 1080p games, other than that one made my wallet a lot lighter. The FX series was a great value: 'good enough' IPC for single-threaded programs, a beast for multi-threaded workloads, and i7-level performance for less than the price of an i5.

Saying you can't tell the difference between unnamed 1080p games is not a valid argument. I can run Duke Nukem 3D at 1080p on an FX-series processor, get 1000 fps, and then say Bulldozer is totally awesome because it runs a 20-year-old game great. But an E1200 can do the same job just fine. It's like saying any two red cars are the same, never mind that one is a Ferrari and the other is a Renault.

Bulldozer was a disaster. If you have to rewrite parts of the thread scheduler to take "advantage" of the Bulldozer architecture (more like make it avoid putting two resource-intensive threads on the same module, to dodge severe performance degradation), then you have a problem. I have several Bulldozer-arch CPUs and they're useless for any sort of real workload, since you can only effectively use half of the CPU before you start to experience performance degradation from the CMT shared resources. Heat and power consumption are another issue: why get a crippled 125W+ part when a 65-95W Intel part can do the same thing properly without cranking out the heat?
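
To make the module issue concrete, here is roughly what the userland workaround looks like: a minimal sketch, assuming Linux and Python, and assuming the common numbering where logical CPUs 2n and 2n+1 share a module (verify with lscpu -e on your own box before trusting it).

Code:
# Run at most one heavy worker per Bulldozer module by pinning workers
# to every other logical CPU. os.sched_setaffinity is Linux-only, and
# the 2n/2n+1 module pairing is an assumption, not a guarantee.
import os
import multiprocessing as mp

def heavy_worker(cpu):
    os.sched_setaffinity(0, {cpu})   # pin this process to one logical CPU
    total = 0
    for i in range(10**8):           # stand-in for a resource-hungry task
        total += i * i

if __name__ == "__main__":
    one_per_module = range(0, os.cpu_count() or 8, 2)   # CPUs 0, 2, 4, 6
    workers = [mp.Process(target=heavy_worker, args=(c,)) for c in one_per_module]
    for w in workers:
        w.start()
    for w in workers:
        w.join()

Time the same workers pinned to CPUs 0/1 and 2/3 instead and the CMT sharing penalty shows up directly in the wall clock.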

I'm also rather annoyed that the POS ASUS X79 Deluxe I bought is dying already. My 5-year-old Gigabyte 990FXA-UD3 AM3+ board was 1/3 the price and is still going strong with a 4.8 GHz overclock on the FX-8320.

An ASUS motherboard not being reliable? Not surprising.
 
Personally, I am going to take a couple of days off from work on the day of Zen's release. I am going to do the 3 hour drive down to the Northeast Ohio Microcenter and enjoy that day completely. If the prices are not too high, I am going to get an upgrade for both my computers at the same time and enjoy the build at home. (Looking forward to the excitement.)
 
Personally, I am going to take a couple of days off from work on the day of Zen's release. I am going to do the 3 hour drive down to the Northeast Ohio Microcenter and enjoy that day completely. If the prices are not too high, I am going to get an upgrade for both my computers at the same time and enjoy the build at home. (Looking forward to the excitement.)

It sounds like you know what you are doing, but you might need to bring a friend. Microcenter has that one-CPU-per-person policy, and some stores enforce it while others don't. With this being a pretty big new CPU launch, I could see them being much more strict.
 
Personally, I am going to take a couple of days off from work on the day of Zen's release. I am going to do the 3 hour drive down to the Northeast Ohio Microcenter and enjoy that day completely. If the prices are not too high, I am going to get an upgrade for both my computers at the same time and enjoy the build at home. (Looking forward to the excitement.)

That gives me that much more appreciation for living 20 minutes away from both a MicroCenter and a Fry's Electronics store. While most of my shopping is online these days, it's nice to be able to walk into a computer superstore and just stare at the shiz like a kid. So much shiz! It won't be taken for granted again.
 
I had no idea that AMD was still making new CPUs. I thought they gave up after the Bulldozer fiasco.
 
It sounds like you know what you are doing, but you might need to bring a friend. Microcenter has that one-CPU-per-person policy, and some stores enforce it while others don't. With this being a pretty big new CPU launch, I could see them being much more strict.
Easy. Buy 1, go for lunch, come back for another.
Or do 1 for pickup, buy 1 in-store.
 
Can't believe I'm in the AMD CPU section, never mind contemplating buying an AMD processor, but here I am!!
Been running this same Intel system for 8 years (i7 920) and was finally going to do the upgrade this winter, to either a 5820K or a 6800K, and lo and behold, what's this... AMD coming out with a new CPU, and it actually might be decent for a change?!
You should have jumped on a cheap (~$50) X56x0 Westmere :)
Can we stop with the 'Faildozer' nonsense? I have an FX-8320 and an i7-4930K. I can't tell the difference between the two in 1080p games, other than that one made my wallet a lot lighter. The FX series was a great value: 'good enough' IPC for single-threaded programs, a beast for multi-threaded workloads, and i7-level performance for less than the price of an i5.
^ This. People whine, piss and moan about the FX, but it was pretty good value for money / had a lower price of entry. All my non-gaming rigs are running 8320s and those won't need an upgrade for a brief eternity.
 
I'm on an i7-2600K, non-OC'd.

The only reason I want to upgrade is that my mobo has always had flaky USB. If the performance is there, then maybe.
 
Bulldozer was a disaster. If you have to rewrite parts of the thread scheduler to take "advantage" of the Bulldozer architecture (more like make it avoid putting two resource-intensive threads on the same module, to dodge severe performance degradation), then you have a problem. I have several Bulldozer-arch CPUs and they're useless for any sort of real workload, since you can only effectively use half of the CPU before you start to experience performance degradation from the CMT shared resources.

Don't worry mate, it's called PEBKAC. I won't trivialize the discussion by saying "mine doesn't do that"; you could simply go to Amazon or Newegg reviews and see what professionals who use Vishera processors for a living say (since you replied to a post about Vishera). Or go to the OBS forum and read the experiences of gamers who stream. Example:

https://obsproject.com/forum/threads/hows-the-fx-8350-for-obs-streaming-1080p-while-gaming.2871/

And forgive me if 100% CPU usage isn't heavy enough for you.


You should have jumped on a cheap (~$50) X56x0 Westmere :)

^ This. People whine, piss and moan about the FX, but it was pretty good value for money / had a lower price of entry. All my non-gaming rigs are running 8320s and those won't need an upgrade for a brief eternity.

You can say that again.
 
Last edited:
My prediction going into 2018 is that Core i5 owners are going to be disappointed. HT makes a big difference, enough to pull your CPU ahead another year or two.

I think any Ryzen or Core i7 CPU will be fine and future-proof, and that Intel will be forced into HT for all CPUs in 2018 and beyond (Coffee Lake).


I own a 2500K, and I will probably buy the i7-7700K, OC it to whatever I can get stable with my Noctua C14S, and forget about it for the next few years. Knowing how DX12 scales, Ryzen isn't going to be any better for gaming, with a few exceptions in a few isolated cases (in the next 2-3 years), and then only a little bit. The i7's Hyper-Threading and faster clock speeds already insulate me from any extra "core" gains in games. AMD fanboys will disagree, but whatever.

Sandy Bridge is still good, but The Witcher 3 and BF1 were two of the first games that actually started to show "age" for Sandy Bridge, and that chip is 6 years old and STILL runs fine.

I think getting a Core i5 is stupid though; Hyper-Threading IS starting to make a difference.

Look, Ryzen will be very close to Kaby Lake. We have hit engineering limits with current silicon technology, and AMD was bound to catch up at some point. Intel isn't going to pull a rabbit out of its ass in 2018 and "destroy AMD" like some people suspect.

I think from here on out we are going to see the battle shift to core count and price, and profits will drop because of it.

In 3 or 4 years we will probably see some radical shifts and breakthroughs, but I think anybody buying a Kaby Lake or a Ryzen will be fine until the next breakthrough, in 3 years or so.

I think the only people who are going to really see a difference in 2018 and beyond are Core i5 owners, and then mostly in the "beyond" phase, meaning they will still be very happy.

For the most part, we are all bitching about a few dollars or percentage points anyway.
 
Don't worry mate, it's called PEBKAC. I won't trivialize the discussion by saying "mine doesn't do that"; you could simply go to Amazon or Newegg reviews and see what professionals who use Vishera processors for a living say (since you replied to a post about Vishera). Or go to the OBS forum and read the experiences of gamers who stream. Example:

https://obsproject.com/forum/threads/hows-the-fx-8350-for-obs-streaming-1080p-while-gaming.2871/

And forgive me if 100% CPU usage isn't heavy enough for you.

You can say that again.


Yes, you'll be able to find limited usage scenarios where the 8 cores of AMD's current FX chips will be of benefit, but for overall computing it is a complete and total fail.

A friend of mine (a software developer) is actually still running his old quad-core Phenom II, because he bought an FX-8350 and his compile times ROSE by 20% compared to the much lower-clocked Phenom II.

I used one as a virtualization server for a while, and it wasn't a bad platform for that, but on the desktop it is really only going to be useful for rendering/encoding-type professional workloads and will suck at everything else.

You get a choice between two brands. One brand performs great in a narrow slice of tasks that are well threaded, but stinks at everything else. The other performs well across the board, regardless of how well threaded the task is. Which one do you pick?

For me there is one number that will determine whether I buy a Ryzen chip: in the Cinebench R11.5 single-threaded test, can it beat a score of 1.92 at max overclock on water? If so, I am interested.

If not, it is slower per core than my 5+ year old i7-3930K at its max overclock of 4.8 GHz on water, and I won't touch it.

Multithreaded tests are irrelevant when it comes to predicting CPU performance. If it performs well per core, then you KNOW it will handle everything you can throw at it down the road. If it performs poorly per core, and only does comparatively well when you load up all of the cores, then you know you are going to have mixed results in the future: well-threaded apps may work well, but everything else will suck. I'm done making compromises.

I want anything I buy to work well with everything I might run, and that requires good per-core performance. I don't care about IPC or clock speed alone, just that it has high performance per core and at least 4 cores. Everything else is irrelevant to me.
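
To put numbers on that bar, here's the back-of-envelope math. Only the 1.92-at-4.8 GHz figures come from this post; the 4.0 GHz Ryzen clock is an assumed example, since real clocks aren't known yet.

Code:
# Per-core math for the bar above. The 3930K figures are from this post;
# the Ryzen clock is an assumed example for illustration.
score_3930k = 1.92          # Cinebench R11.5 single-threaded score
clock_3930k = 4.8           # GHz, max overclock on water

pts_per_ghz = score_3930k / clock_3930k          # ~0.40 pts/GHz
ryzen_clock = 4.0                                # GHz, assumed
needed_pts_per_ghz = score_3930k / ryzen_clock   # 0.48 pts/GHz

ipc_gap = needed_pts_per_ghz / pts_per_ghz - 1   # works out to 20%
print(f"3930K: {pts_per_ghz:.2f} pts/GHz at {clock_3930k} GHz")
print(f"A {ryzen_clock} GHz Ryzen needs {needed_pts_per_ghz:.2f} pts/GHz, "
      f"~{ipc_gap:.0%} more per-clock performance, to hit {score_3930k}")

In other words, the lower Ryzen clocks out at, the more IPC it has to bring to clear the same single-threaded score.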
 
Yes, you'll be able to find limited usage scenarios where the 8 cores of AMD's current FX chips will be of benefit, but for overall computing it is a complete and total fail.

A friend of mine (a software developer) is actually still running his old quad-core Phenom II, because he bought an FX-8350 and his compile times ROSE by 20% compared to the much lower-clocked Phenom II.

I used one as a virtualization server for a while, and it wasn't a bad platform for that, but on the desktop it is really only going to be useful for rendering/encoding-type professional workloads and will suck at everything else.

You get a choice between two brands. One brand performs great in a narrow slice of tasks that are well threaded, but stinks at everything else. The other performs well across the board, regardless of how well threaded the task is. Which one do you pick?

For me there is one number that will determine whether I buy a Ryzen chip: in the Cinebench R11.5 single-threaded test, can it beat a score of 1.92 at max overclock on water? If so, I am interested.

If not, it is slower per core than my 5+ year old i7-3930K at its max overclock of 4.8 GHz on water, and I won't touch it.

Multithreaded tests are irrelevant when it comes to predicting CPU performance. If it performs well per core, then you KNOW it will handle everything you can throw at it down the road. If it performs poorly per core, and only does comparatively well when you load up all of the cores, then you know you are going to have mixed results in the future: well-threaded apps may work well, but everything else will suck. I'm done making compromises.

I want anything I buy to work well with everything I might run, and that requires good per-core performance. I don't care about IPC or clock speed alone, just that it has high performance per core and at least 4 cores. Everything else is irrelevant to me.

I replied to a specific quote. I don't care about benchmarks; I do a lot of multitasking and use exactly the kind of very heavy applications (x264) that it was claimed the FX can't really handle. I am not a gamer, so I very much prefer the FX to the similarly priced i3, which has great IPC, but load it with 8 heavy threads (like in a streaming scenario) and it will start to gasp. I want anything I buy to run every application I have fast, without worrying about how many threads I run. The FX does that really well, and I rarely see a maxed-out core. I also enjoy a nimble desktop, and 8 threads help a lot with that.
 
I replied to a specific quote. I don't care about benchmarks; I do a lot of multitasking and use exactly the kind of very heavy applications (x264) that it was claimed the FX can't really handle. I am not a gamer, so I very much prefer the FX to the similarly priced i3, which has great IPC, but load it with 8 heavy threads (like in a streaming scenario) and it will start to gasp. I want anything I buy to run every application I have fast, without worrying about how many threads I run. The FX does that really well, and I rarely see a maxed-out core. I also enjoy a nimble desktop, and 8 threads help a lot with that.


Fair enough. So your workload is in the limited range where these things are good. Good for you. As for the "I rarely see a maxed-out core" comment, that has nothing to do with the CPU itself. Even if a core were maxed out, you would almost never see it as such in the performance tab of Task Manager unless you were loading up all 8 cores. The Windows scheduler moves threads around between cores so fast that, most of the time, even a single maxed-out thread on one core will look like an average of 12.5% across all eight cores, as the scheduler rapidly shifts the thread from core to core.

Chances are you are very much pinning those cores a lot more often than you think.
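
You can watch that smearing happen yourself. A quick sketch, in Python with the third-party psutil package (pip install psutil); nothing about it is specific to any one CPU:

Code:
# One spinning thread pegs exactly one logical CPU at any instant, but
# sampled per-core load shows it hopping/smearing as the OS migrates it.
import threading
import psutil

def spin():
    while True:       # burn one core's worth of CPU forever
        pass

threading.Thread(target=spin, daemon=True).start()

for _ in range(5):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    print(" ".join(f"{p:5.1f}" for p in per_core))  # watch the hot column move

Whether the hot column hops every sample or sticks to one core depends on the OS scheduler, but averaged graphs like Task Manager's blur it either way.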
 
Fair enough. So your workload is in the limited range where these things are good. Good for you. As for the "I rarely see a maxed-out core" comment, that has nothing to do with the CPU itself. Even if a core were maxed out, you would almost never see it as such in the performance tab of Task Manager unless you were loading up all 8 cores. The Windows scheduler moves threads around between cores so fast that, most of the time, even a single maxed-out thread on one core will look like an average of 12.5% across all eight cores, as the scheduler rapidly shifts the thread from core to core.

Chances are you are very much pinning those cores a lot more often than you think.

There is specific software that, if run in the background, can later tell you what maxed out a core. I use more utilities than the average user does, but I am not a gamer, like most of you are.
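
In the same spirit as those utilities, a rough background logger is only a few lines (Python with the third-party psutil package; the 90% cutoff is an arbitrary "core is maxed" threshold):

Code:
# Once a second, note the process eating the most CPU. psutil reports
# per-process CPU relative to one core, so >= 90 is roughly a maxed core.
import time
import psutil

for p in psutil.process_iter():
    try:
        p.cpu_percent(None)          # prime the per-process counters
    except (psutil.NoSuchProcess, psutil.AccessDenied):
        pass

while True:
    time.sleep(1.0)
    snapshot = []
    for p in psutil.process_iter(['name']):
        try:
            snapshot.append((p.cpu_percent(None), p.info['name']))
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            pass
    pct, name = max(snapshot, key=lambda t: t[0])
    if pct >= 90:                    # arbitrary "maxed a core" threshold
        print(time.strftime("%H:%M:%S"), name, f"{pct:.0f}%")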
 
I have an i5-4670K. I think if Ryzen is 30% faster at around $300, I'll jump in. Maybe a little more. The auto-OC feature is very exciting. Hopefully it's compatible with the H100i.
 
Most likely. I want to do a 4K build this year, as I put it off in 2015 and 2016... Hopefully some decent 40" screens will come out, as that is the main reason I want to move: more screen area, thus requiring a more grunty rig.

This current rig will be praised in all its glory and given a fantastic ceremony + elevation to a permanent wall install... ahem, relegated to NAS duty in the loft. Or, if I'm feeling nice, it'll be a 2nd-room gaming rig and NAS until I build a more power-efficient NAS.
 
1- post here what processor brand/model you have now?
I have an Intel i7-4770K that I have been patiently waiting to replace with something faster for almost 2 years, I think. I must say, guys, jumping from a 4770K to a 6900K seems just amazing to me.

2- and how soon will you do it and for what price do you think the ryzen cpu is worth?
Honestly? For me, in 5-6 months I'll have enough money to buy the mobo + CPU + memory.
I think $350 is reasonable, and maybe $50 more is not too much to ask; say $400 for the CPU.

you guys?

I have an i7 920.
I don't think I will leave Intel to go to AMD, but you never know. Maybe it will come out before I do my next build, and its performance will speak for itself, not marketing, speculation, and dreams. Seeing this thread made me look more into it.
I am rooting for AMD to do well, because where there's good competition there's better pricing and product innovation. I sadly feel like AMD has failed to push Intel for the last few years.
The last time I was excited for an upcoming AMD chip it was Bulldozer, and that did not end up a great processor; they choked it with limited resources.
Ryzen looks interesting, but we won't know how good it is until it comes out. I honestly don't know how much it will cost, since they can sell it for as much as people are willing to pay.
The magic question is: when is it coming out, and will they keep up with demand? If I wanted one, could I even procure one? Every time in the past that I showed interest in an AMD product, it was either out of stock or the price ballooned out of control. I am talking about the 4870X2 and other beastly GPU solutions.
 
Don't worry mate, it's called PEBKAC. You could simply go to Amazon or Newegg reviews and see what professionals who use Vishera processors for a living say (since you replied to a post about Vishera).

Yes, you definitely have a PEBKAC if you think that Amazon and Newegg reviews are written primarily by professionals. The majority are written by the average Joe, people who don't know what they're doing, or trolls. Sure, there are also professionals who write reviews, but who has the time to spend hours sifting through crap to find them? If you have to read those types of reviews to make a buying decision, you haven't done your research properly.

Or go to the OBS forum and read the experiences of gamers who stream. Example:

https://obsproject.com/forum/threads/hows-the-fx-8350-for-obs-streaming-1080p-while-gaming.2871/

A single use case where the 8350 does marginally better in a heavily threaded task is not representative of all real-world use cases. Your link also doesn't address the issues with CMT or power consumption that I mentioned. They even talk about the 2600K being able to do the same thing in that specific use case, and it's a year older with a 95W TDP versus the FX's 125W (the FX draws roughly 31.5% more).

Bulldozer CPUs are awful for single-threaded tasks; you can only effectively use half of the CPU before you run into performance degradation due to CMT. I run into this problem all the time with the AMD CPUs I have.

And forgive me if 100% CPU usage isn't heavy enough for you.

Do you even know what you're talking about? Games automatically grab the largest CPU time slice available, so you'll see 100% CPU/thread usage regardless of whether you're on a 486 or some 16-core Xeon. The same applies to other high-demand applications like encoders.
 
I have a 3770K and was looking into getting a 6850K, but I will wait for actual performance reviews of the Ryzen 8-core and 4-core CPUs to see how they compare. With what I've seen so far, I think I will bite the bullet and get the 8-core Ryzen, especially if that rumored price is true.
 
I was thinking by the end of 2017; I am still rocking a Phenom II X4. If Ryzen can even get close to current Intel performance, I will finally make the switch.
 
I'll get a Ryzen eventually. But I am more excited to see price drops; hopefully the FX 8-cores will drop too, so I can start building some cheap gaming systems for a small business I'm trying to get started.
 
I'll get a Ryzen eventually. But I am more excited to see price drops; hopefully the FX 8-cores will drop too, so I can start building some cheap gaming systems for a small business I'm trying to get started.
Aren't they pretty cheap already with the MicroCenter bundle prices?
 
Yes, you definitely have a PEBKAC if you think that Amazon and Newegg reviews are written primarily by professionals. The majority are written by the average Joe, people who don't know what they're doing, or trolls. Sure, there are also professionals who write reviews, but who has the time to spend hours sifting through crap to find them? If you have to read those types of reviews to make a buying decision, you haven't done your research properly.

I thought it would be rather obvious that I meant to read the reviews that come from professionals. They often state that they are professionals and what professional use they make of the chip. AFAIK there is no other way to know if someone is a professional; there is no "pro" tag on either Amazon or Newegg, but I guess I was asking too much mental work. If you don't trust them, then you are free to also ignore the people on this very forum. Do you want me to hold your hand? You don't want to find pro reviews, you ignore anyone on a forum whose opinion you don't like, but you "know" that what you think is true. OK. And then it's me who didn't do the research. Whatever, mate.

A single use case where the 8350 does marginally better in a heavily threaded task is not representative of all real-world use cases. Your link also doesn't address the issues with CMT or power consumption that I mentioned. They even talk about the 2600K being able to do the same thing in that specific use case, and it's a year older with a 95W TDP versus the FX's 125W (the FX draws roughly 31.5% more).

Bulldozer CPUs are awful for single-threaded tasks; you can only effectively use half of the CPU before you run into performance degradation due to CMT. I run into this problem all the time with the AMD CPUs I have.

What you said, that "they're useless for any sort of real workload, since you can only effectively use half of the CPU before you start to experience performance degradation from the CMT shared resources," is total BS, and Google is your friend. I will just leave this here, as to whether this is a real workload:

http://www.cpu-world.com/Compare/375/AMD_FX-Series_FX-8350_vs_Intel_Core_i7_i7-3770.html

BTW, how much were they costing back then? Ah, yes... totally useless... They are "useless for heavy workloads" because they suffer a scaling penalty in FPU-heavy applications. And that makes them useless. OK, mate.

Do you even know what you're talking about? Games automatically grab the largest CPU time slice available, so you'll see 100% CPU/thread usage regardless of whether you're on a 486 or some 16-core Xeon. The same applies to other high-demand applications like encoders.

Reading comprehension problems? Or are games the "real workload that uses half of the CPU"? Who talked about games? Read my reply again.

What you REALLY mean, or rather should say, is: "Intel is much better at games, for two or three times the price up until 2 years ago, and FX is trash because at 1/2 or 1/3 the money it was roughly on par or 20% behind on heavy multithreaded jobs."

You know, professionals most commonly use heavy multithreaded applications, not games. Whenever a game is heavily multithreaded and heavy on the CPU, the FX performs admirably so many years after its release. Google The Witcher 3, Assassin's Creed Unity, and other heavy titles that can use all 8 cores.

EDIT: And since you now talk about games, let me enlighten you about the fact that the FX was RARELY at 100% CPU load (contrary to what you think), because games were not multithreaded enough and not heavy enough.

This is a game, famous for running badly on the FX:

http://imgur.com/gqELQ4a

^ If you THINK that in this game the FX will show 100% CPU, you are OUT of your mind.

This is ANOTHER game:

http://imgur.com/8UJJijq

Spot the difference. OK, it has "severe degradation" since it runs 8 heavy threads, so it's a whole 5 FPS behind the 4760K in min fps and 0 behind in average. But hey, how much did the 4760K go for?

http://imgur.com/wcBftLo

Get it now? Total trash, you say? Which of the two games, according to you, is heavier? Trash for a gamer who doesn't care about money, maybe. Have you any idea of the price difference between Intel and AMD up to 2 years ago? Let me tell you something: outside the USA there are other continents too, where professionals don't go spend 500 EUR on a CPU alone when they can get the job done within ±20% of a 3770 at 1/3 of the price. That's more of a spoiled gamer's sport, mainly in the US. The FX is GOLD for people who run HEAVY applications on it. And by heavy, I mean loads close to 100%. If it's still not clear to you, there is nothing more I can do. You don't even know basic stuff about how the FX functions.
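
Since the whole argument is price versus performance, the value math is easy to lay out. The prices below are assumed, roughly circa-2013 street prices, and the 0.80 is the "~20% behind on heavy multithreaded jobs" figure from this thread, not a benchmark result:

Code:
# Rough perf-per-dollar sketch. Prices are assumptions for illustration.
fx_8350_price = 180.0        # USD, assumed
i7_3770_price = 310.0        # USD, assumed
fx_relative_perf = 0.80      # FX at ~80% of the 3770 in heavy MT work

fx_value = fx_relative_perf / fx_8350_price
i7_value = 1.0 / i7_3770_price
print(f"FX-8350 : {fx_value:.5f} perf/$")
print(f"i7-3770 : {i7_value:.5f} perf/$")
print(f"FX delivers ~{fx_value / i7_value - 1:.0%} more performance per dollar")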

EDIT 2: Take this as well. Since video encoding/rendering is only one of the heavy, real-world loads professionals run, I will also leave here photo editing and 3D rendering:

http://www.anandtech.com/show/6396/the-vishera-review-amd-fx8350-fx8320-fx6300-and-fx4300-tested/4

For the record, Photoshop is FPU-heavy and has always favoured Intel. However, as you can see, despite the "severe degradation," the 8350 is 3 seconds behind the 3770. Now, maybe in the USA your professionals count seconds; in Europe they don't. I can't even drink a sip of coffee in 3 seconds.

I mean, I am sorry, but that's what professionals here use for heavy, real-life, multithreaded jobs. If you think this is trash, you should really look up Intel and AMD prices from up to 2 years ago, and ponder the fact that even the lowest chip of the FX line (the 8300) was unlocked and freely overclockable.

Maybe in the USA your professionals do other heavy stuff; please feel free to elaborate.
 
Addendum:

This is about the FX that will supposedly wreck your power bill:

[embedded power-consumption comparison, no longer available]

I mean, you Intel guys don't mind paying a few hundred dollars more for the CPU, and you mind about the "nickels"?
 
I have no reason to upgrade my 4770K @ 4.4 GHz. The only reason to upgrade is for the new features of a newer platform. The performance is more than plenty for me, and honestly for most people too. Unless you actually need the extra cores for work, it's pointless. I'll probably hold on to it for a couple more years before I ponder a CPU upgrade.
 
I have no reason to upgrade my 4770K @ 4.4 GHz. The only reason to upgrade is for the new features of a newer platform. The performance is more than plenty for me, and honestly for most people too. Unless you actually need the extra cores for work, it's pointless. I'll probably hold on to it for a couple more years before I ponder a CPU upgrade.

Unless you can get a CPU that is better at stock and can still overclock.

I am in the same boat as you. I have a 4790K and a 4960X, and both still do what I need, but Zen is very intriguing.
 
Nah, I think I'm fine with my 6700k at 4.5.
Don't see myself upgrading my CPU before 2019/2020.
 
Nah, I think I'm fine with my 6700k at 4.5.
Don't see myself upgrading my CPU before 2019/2020.


Yeah, I am still on my 2011 3930K. I feel like I'll be upgrading soon (more for features and temps than absolute performance, as this thing still rocks at 4.8 GHz), but whatever I get next, I feel, will be with me for quite a while. If I can get as many years out of it as I got out of my 3930K, we are talking 2022 :eek:

I thought we'd all be flying spaceships, and buying augs by 2022 :p
 
Stop. It's my choice of OS and none of your concern.

I cannot do my work efficiently on anything other than Windows NT 5.x. Trust me, I have tried newer Windows versions. I used Windows Vista at launch alongside XP and Win2k, and while it wasn't as horrible as the newer versions, it was still a royal pain in the ass. Not only does it slow my work down, it also constantly infuriates me with settings moved to places they aren't supposed to be, GUI changes, bugs, not to mention the bloat. So I used Vista in 2008-2009 and went back to XP after a few months. I tried using Windows 8 when I got my new PC in 2013, and again went back to XP after a few months (thank God Ivy Bridge still has XP chipset drivers). I also got a Haswell i5 laptop, and I had to put up with Windows 7 until I figured out a way to install the Intel Iris GPU driver on XP. It was basically bugging out after install: the screen would just flicker and act the same way as my 2008 laptop with a defective GeForce 8400M did... anyway, I eventually found out it worked fine at a 50 Hz refresh rate, so I am using it now and again. Thank God Intel actually bothered to leak a beta version of the chipset and iGPU drivers for XP embedded systems; they stopped providing those entirely after Haswell... I did, however, have to get a USB wifi adapter, since the internal one was NVMe/M.2 and I couldn't just throw in a mPCIe card as I would normally do. No biggie; the internal adapter was the Intel Wireless-N 7260, which is a defective adapter model (constant packet drops, I'd rather use dial-up than that! They actually recalled them, AFAIK).

I have also used various *nix versions over the years... UnixWare and Solaris, I guess? I also used Slackware in the 1990s. I am a fan of proprietary UNIX; I was really hoping the OpenSolaris project would go somewhere when the source was released to the public in '08 or whenever, but unfortunately Sun/Oracle, or whatever they fancy themselves as now, simply decided to dump the source online and stop development and support. I guess I appreciate them finally putting proprietary UNIX out into the open-source world; it makes me sad it didn't gain the amount of interest I was expecting. In any case, I haven't used *nix on the desktop for a long time, and I don't really think that's what it's for these days. Great for servers, not so much for desktop use. Simply put, unix/linux still doesn't have as high-quality a software base as I have come to expect on Win9x and NT since the '90s.

I could literally list you 1000 reasons detailing every single possible thing I hate about NT 6.x-10, but I don't have time now; it's a topic for a different thread and you probably won't care anyway. People who are happy with newer versions of Windows, in my opinion, either don't know better (kids these days) or simply aren't accustomed to using the PC in the power-user manner that I do. I refuse to accept the notion that NT 6.x makes anything easier to do; from my experience it's purely the opposite, and while making things harder it also uses more resources. I am simply astonished to see a system using 2GB of RAM idling on the desktop. :O... After install, Win2k uses about 50MB, XP x86 90MB, and XP x64 about 300MB. What a waste of resources.



No, but I have experience reverse engineering botnet binaries for research purposes (so ironically, the answer is yes). I am well experienced in cyber security. I don't infect my systems unless that is specifically what I need to do. People who visit pr0n sites with Java enabled are probably not the people I would suggest run an XP-based system, but if you are careful with your browsing habits and you have a functional firewall, there is very little chance that an XP system will get infected. 90% of vulnerabilities require social engineering (and thus the blame is on the user). In any case, modern malware is developed primarily for the platform that is going to generate the bulk of the infections, in other words NT6+ (and these days Android as well). In fact, when studying botnets and their rootkits, I've constantly had to install them on Windows 7 based systems, since they require WinAPI dependencies and .NET versions not found on XP.

So to conclude, I have about 15 functional, working PCs in my house at this moment (as I said in the first post, I've got about 20 CPUs lying around), of which I have decided to keep around 7 that I use on a somewhat regular basis. Of these systems, 100% are running NT 5.x; most have 32-bit XP, but on newer systems I opt for XP x64 for better rendering/compiling/encoding performance. I have had enough of testing modern systems, and I have come to my final decision, which is that I am not going to "upgrade" to anything other than NT 5.x on the desktop, barring God sending a miracle in the form of M$ actually picking up the Win2k3 codebase for a new OS (not holding my breath there). I am okay with having to compile new software myself for XP compatibility, and even with reversing newer WinAPI libraries to work on XP (I have done that already). I have to admit I don't have as good a knowledge of drivers; I've only written virtual/software drivers for Windows, but I guess I'm gonna have to learn my way around them in OllyDbg... I would much rather do this than use NT6+. Long-term, it will actually save me time going this route, believe it or not, not to mention save my nerves and dignity.

Thus, if I am to upgrade my hardware, it had better have some XP compatibility. Chipset drivers would be a miracle, but I am not really expecting that anymore. What I am expecting from X370 boards is third-party chips for LAN and audio, since most IC manufacturers still make drivers for XP (Realtek, Atheros, VIA, thank you!). (On Skylake we have some manufacturers doing this, and others using Intel-based LAN solutions, which, on top of being shit, don't have XP drivers, so I need to use expansion cards for that, and I already populate almost all my slots.) If that is not the case, then I will stick with my current systems for as long as I have to, which means pretty much until someone comes up with something faster that supports XP (this could even be a non-x86 CPU that is faster in x86 emulation than the fastest hardware x86 chip, who knows).


It doesn't matter how good you are with security. If you are running a system with open, known exploits, it is only a matter of time. Hubris along the lines of "I know what I am doing, unlike those idiots over there" will only hasten the problem.

This is why we can't have nice things. By doing this, you are part of what's irresponsibly ruining the internet for everyone. Just update your goddamned obsolete apps that won't run on modern platforms already. No excuses.

If it were up to me, every OS would have a built in feature that irreversibly disables the network stack the second it goes EOL.
 
Nah, I think I'm fine with my 6700k at 4.5.
Don't see myself upgrading my CPU before 2019/2020.
No problem, since Intel won't really have a new uArch until then anyway.
Judging from the longevity of my old i7 920 D0, I went ahead and maxed out my Z87 rig at 4770K launch.
 
If it were up to me, every OS would have a built in feature that irreversibly disables the network stack the second it goes EOL.

You give Microsoft/Apple/Google that kind of power and they will EOL consumer desktops after 1 year and force you to buy a new one every year.

Overall I do agree; you need to be an adult and upgrade.
 
P.S. Didn't IBM just switch to Apple because, overall, their Wintel PCs were much more costly to maintain over time?

If you think Windows PCs are expensive to maintain, you don't want to know about Apple.

There is no alternative unless you want to decrease productivity and increase cost.

IBM doesn't like Microsoft products because Microsoft beat the IBM dinosaur into the ground: OS/2, Notes, etc.

Mac sales are pretty much in collapse, with a 15-20% YoY decline.
 
I know if the price is good I wouldn't mind trying it out for a second machine.
 
I thought it would be rather obvious that I meant to read the reviews that come from professionals. They often state that they are professionals and what professional use they make of the chip. AFAIK there is no other way to know if someone is a professional; there is no "pro" tag on either Amazon or Newegg, but I guess I was asking too much mental work. If you don't trust them, then you are free to also ignore the people on this very forum. Do you want me to hold your hand? You don't want to find pro reviews, you ignore anyone on a forum whose opinion you don't like, but you "know" that what you think is true. OK. And then it's me who didn't do the research. Whatever, mate.

It seems you're in such a blind rage that you're spinning hilarious straw man fallacies. The last time I checked, Amazon and Newegg are not Hardforum.

What you said, that "they're useless for any sort of real workload, since you can only effectively use half of the CPU before you start to experience performance degradation from the CMT shared resources," is total BS, and Google is your friend. I will just leave this here, as to whether this is a real workload:

http://www.cpu-world.com/Compare/375/AMD_FX-Series_FX-8350_vs_Intel_Core_i7_i7-3770.html

Are you not able to comprehend words properly, or something? Or is your blind rage stopping you? I'm talking about one thing and you're going off on topics that were never part of the original discussion.

What you REALLY mean, or rather should say, is: "Intel is much better at games, for two or three times the price up until 2 years ago, and FX is trash because at 1/2 or 1/3 the money it was roughly on par or 20% behind on heavy multithreaded jobs."

No, that's what you want to see; it was never said.

You know, professionals most commonly use heavy multithreaded applications, not games. Whenever a game is heavily multithreaded and heavy on the CPU, the FX performs admirably so many years after its release. Google The Witcher 3, Assassin's Creed Unity, and other heavy titles that can use all 8 cores.

You know, if you had comprehended what I posted, you wouldn't blindly assume I was talking solely about game workloads.

EDIT: And since you now talk about games, let me enlighten you about the fact that the FX was RARELY at 100% CPU load (contrary to what you think), because games were not multithreaded enough and not heavy enough.

Now I talk about games? Do you even remember what you posted just a page ago about games? It was in response to something you posted. And for clarification, when I said "CPU/thread" I meant a single core on a CPU, or a thread. Pretty much every demanding game made in the last 20 years is going to use 100% of a CPU core/thread on said CPU.

Get it now? Total trash you say? Which one of the 2 games, according to you is heavier? Trash for a gamer that doesn't care about money, maybe. Have you any idea about the price difference between Intel and AMD up to 2 years ago? Let me tell you something. Outside USA, there are other continents too, where professionists don't go to spend 500 EUR for a CPU alone, when they can get the job of an 3770 +-20% at 1/3 of a price? That's more of a spoiled gamers sport mainly in the US. The FX is GOLD for people that can run HEAVY applications on them. And by heavy, i mean loads close to 100%. IF it's still not clear to you, there is nothing more i can do. You don't even know basic stuff about FX functioning.

We get it: you're a hardcore AMD fanboy who's currently throwing a tantrum because someone, somewhere on the internet insulted AMD. You can keep your rose-tinted glasses on if they make you feel any better.

If you want to be disrespectful, put words in my mouth, and use inane straw man fallacies, then it's not worth talking with you any longer.

Addendum:
I mean, you Intel guys don't mind paying a few hundred dollars more for the CPU, and you mind about the "nickels"?

What's even the point of this? I said the power usage and heat output are higher, which is correct, and your source confirms it. What it doesn't talk about, however (and I've yet to see anyone talk about it), is the cost of removing the associated heat with HVAC or window-unit A/C. I can run 4 Intel quad systems in a 99 sq ft room and a 5000 BTU window unit has no problem keeping it at 75F, cycling on a couple of times an hour. On the other hand, I can run ONE FX-8370 and the poor window unit has to run nearly constantly to keep it at 80F.

The old Phenom II quads had the same problem with heat output. I used to hate dealing with my buddy's X4 955 at my house because his rig would crank out so much heat. I eventually put him in another room, which would get ridiculously hot.
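
For anyone who wants to sanity-check the heat side of this, the conversion is straightforward (1 W = 3.412 BTU/hr; the TDPs below are published package figures, and an overclocked FX's real draw can sit well above its TDP):

Code:
# CPU package power ends up as room heat that the A/C has to remove.
BTU_PER_WATT = 3.412
WINDOW_UNIT_BTU_HR = 5000          # the window unit mentioned above

for name, watts in [("FX-8370 (125 W TDP)", 125),
                    ("Haswell i7 quad (84 W TDP)", 84)]:
    btu_hr = watts * BTU_PER_WATT
    share = btu_hr / WINDOW_UNIT_BTU_HR
    print(f"{name}: {btu_hr:.0f} BTU/hr, {share:.0%} of the window unit")

By TDP alone neither chip is a huge slice of a 5000 BTU/hr unit, which suggests sustained overclocked draw (plus the rest of the system) is what does the real work in that room.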
 