
Status
Not open for further replies.
I had an i5-4670K, essentially the same CPU, but it wasn't good enough for PCVR at my Quest 2's resolution. It was constantly at 100% on all cores, even with the highly optimized Half-Life: Alyx, so I jumped to a 10700K and now have a 12700K because I wanted at least PCIe 4.0 for future GPU upgrades. VR at 4K-ish resolutions is very hard on your GPU, and VR is so cool when done right (high enough FPS/Hz/resolution, wireless streaming). If I hadn't gone VR, I'd probably still have my 4670K; it worked fine for everything else with my GTX 980 at 1080p.
 
Hi Kiriakos-GR,

Why are you asking the reason behind people's upgrades and then saying this topic is not for those who upgraded for <insert most likely reasons here>?

I had a 4770k; I upgraded because it didn't keep up with things like video converting, neural-network workloads, handling VMs, hosting game servers, the million tabs I leave open, and my games. It could do any one of those things, but I'm a heavy multi-tasker, and before pushing things onto my servers I host them on my main PC for a while so I can iterate and tune things quickly and easily. It is also easier for me to diagnose performance issues/bottlenecks of a specific workload on my main PC than on my servers, since I don't have to keep uptime on my PC and can close everything but the thing I'm trying to optimize.

I think for gaming, a Haswell-era CPU with Hyper-Threading at 4 GHz would still be fine most of the time. Especially if all you have is something like a 1070-1080 Ti/2060-2080/3060 and a 60 Hz 1080p, maybe 1440p, screen. Once you're wanting things like high refresh rates, higher resolutions, or the ability to multi-task more than just a couple of Firefox/Chrome tabs + a game... that is when I think one should look at upgrading.

I think most people upgrade because they want their gaming experience to be the best that it can be within their budget. That being said, I'm sure a lot of people upgrade because time is money, and any time wasted waiting on a PC to load all their VSTs, batch-normalize 10k lines of speech, or render a video is a loss that can be reduced by upgrading their hardware. Upgrading just because something is new and shiny is something only degenerates like PC overclocking enthusiasts do.

Also who doesn't have 3 screens at this point? One is just claustrophobic, two is minimum imo.

In summary, I don't understand what question you are really trying to ask here, so I rambled instead. Are you asking why YOU should upgrade from a 4790k? Are you asking IF you should upgrade from a Core 2 Quad system to a Haswell-era CPU? Are you asking if a 4790k is a good buy today? Or just asking for others' experiences when they upgraded from a 4790k to something newer? But then why limit it to people who only upgraded for 'regular use' (regular gaming?) reasons? Isn't upgrading for high-frame-rate gaming a regular use case? Or to drive multiple screens for a more immersive gaming experience? Or to make money (through work or crypto)?

Closure:
I need some, I won't lie. Please enlighten me.
 
It's a hobby. I'm sure I could get by with a 4770k/4790k, etc. But I upgrade and buy new things because I want to. Just like I'm sure that I could still play golf with my old clubs from years ago. I still buy new ones because I think it would help me improve. I had a 5950x system, and I parted it out and moved to a 12900k just to try it. There was nothing that I do that the 5950x couldn't handle that required me to upgrade. I just wanted to play around with something new.

My father still has a Haswell-E 5960X with 32GB RAM, and when I watch the way his computer works compared to mine, it looks like he's moving in slow motion. Incremental upgrades aren't necessarily readily apparent until you go back a few generations and watch how well (or not well) the previous generations work.
 
My 4790k, running at 4.6 GHz for three years, finally burned to the ground about a year ago. I decided to build a 5600X rig but kept the old rig as a backup and plugged in a 3.0 GHz i5. No i7, since 4790k's are expensive to buy used on eBay: $130-$200+ for a six-year-old CPU. Still a great processor, just not for modern gaming.
 
I had a 4770k (OCed to 4.9) that I finally had trouble with in a couple of games: The Witcher 3 and Assassin's Creed Unity. The framerate would drop in scenes with lots of AI models (in-town scenes), and that's when I realized I needed more CPU, so I went to a 5820k. Huge improvement in fps.
 
The 4770k I had was still pretty capable, but it had more dips into low framerates in more demanding games, even at 1080p, and especially when I upgraded video cards. I went to a 10900k, and it's no longer stuttery, with much more stable low framerates. The 4770k is a beast, though, to have the staying power that it did.
 
It's like history repeating itself: the Q9650 and Battlefield V. A single underground city map causes excessive load, while most other maps are playable.
The vast majority of people will never blame the software developer of that map; they will simply pay more for a better CPU.
In other words, the poor work of a single software developer impacts other people's wallets, and he will never be punished for his own mistakes.

So let me get this straight... You expect a software developer to optimize their software for a (at the time of Battlefield V's release) 10 year old CPU?

I'm all for running old hardware, but I don't blame the software developer when their software doesn't run in an ideal way on my 10 year old CPU. Nor do I expect "punishment" for them. You're being ridiculous.
 
You did not get it straight. But I am not going to correct people with wrong thoughts; they are millions and I am just one.

Well, do yourself a favor then and communicate better, especially if you don't feel like correcting people's understanding of what you're after IN YOUR OWN THREAD.

You clearly said that it was poor work by the software developer that impacted your wallet and that he should be punished. If you don't mean that, choose different words.
 
I still haven't replaced mine. The most demanding game I'm currently playing is Borderlands 3 (and my GTX 1060 is the limit there).

At the time of purchase, my 2500k had died on me, and it was either fumble through first-gen Zen issues or reuse my DDR3-1600 with a closeout 4790k at my local Microcenter.

I just want the 5700G to fall below the price of my 4790k (and 32 GB of 3200 RAM to hit under $100), and I'll bite.
 
I skipped that generation. I upgraded from 2600k because it was finally time to do so, and what I had was aging across the board. I didn’t get more cores at the time (6700k), but a significant boost in performance and capability. Later jumped that box to 10700K.

General purpose tasks can be handled by just about anything. This is hardforum though- we tend to push the boundaries. I have a Threadripper and others because it finally got to the point I could do crazy things on a desktop system for a relatively sane price. That being said, I do not really understand the purpose of this thread. What precisely are you looking for? Use cases for modern systems? Justification for an upgrade? Nostalgia for an aging platform and justification for staying on it?
 
Do yourself a favor and don't go off topic.
You voluntarily delivered five lines of feedback anonymously.
For as long as you are anonymous, you are not entitled to receive further explanations.

You're missing the point. If you didn't communicate properly (it does not appear that English is your primary language), how do you expect to get responses that match up to what you're looking for? Attacking the messenger is just silly from someone with 78 posts. That wins you a free trip to the ignore list where you won't get answers at all.
 
I went 4770->5775c->1700-> 9900k.

More/faster cores (for compiling/video) is essentially what made me go in that direction. But I haven’t needed more than the 9900k provides so I haven’t upgraded yet. My server is still a 4790 (non k)
 
I went 4770->5775c->1700-> 9900k.

More/faster cores (for compiling/video) is essentially what made me go in that direction. But I haven’t needed more than the 9900k provides so I haven’t upgraded yet. My server is still a 4790 (non k)

The 5775c always interested me, but it wasn't easy to come by upon release.
 
The 5775c always interested me, but it wasn't easy to come by upon release.
That was during the time I "tuned out" of the compute world for a bit. Looking back - that would have been fun and weird.
 
It's like history repeating itself: the Q9650 and Battlefield V. A single underground city map causes excessive load, while most other maps are playable.
The vast majority of people will never blame the software developer of that map; they will simply pay more for a better CPU.
In other words, the poor work of a single software developer impacts other people's wallets, and he will never be punished for his own mistakes.
They won't blame the software developer because you're on a 13 year old CPU that is well past both EOL date and expected use-by date. That's like expecting someone in 2015 to release a game that works well on an Athlon 64 x2 3800+ - or someone in 2005 releasing a game designed for early Pentium processors. Eventually you move on.

Everything has a life span. Various tasks won't care (office is office, web is generally the web), but others (3d games) probably will.

So no, no one blames the software developer - that platform is old and dead. It's not poor work, it's simple understanding of the industry and that things advance. I retired the last Core2 era system in 2014 - the world moved on.

Even the guy you quoted - Haswell vs Skylake (or Zen, or heaven forbid any Skylake variant) is a significant change. Sure, at 4k you might be GPU bound enough to not notice - but some games need CPU power too, and they just don't keep up anymore.
 
The 5775c always interested me, but it wasn't easy to come by upon release.
It was a nice concept, but it was not a great overclocker, and these days standard memory is so fast that it beats the integrated eDRAM.
 
I haven't visited [H] in a number of years. I sure hope this thread does not reflect the attitude of the general community.
 
Can you elaborate more on this specific Intel concept?
The i7-5775C with 6 MB cache statistically appears 12% faster than the 4th-gen i7-4770 with 8 MB cache, and even the 4770K cannot compete and win at stock clocks.
All three chips were designed to use DDR3, so statistically, again, the i7-5775C appears to make better use of DDR3.
He’s talking about the L4 cache example in general. It proved relatively pointless. Compared to any modern CPU, it was rapidly outpaced by improvements in system RAM for the use cases envisioned.
 
Can you elaborate more on this specific Intel concept?
The i7-5775C with 6 MB cache statistically appears 12% faster than the 4th-gen i7-4770 with 8 MB cache, and even the 4770K cannot compete and win at stock clocks.
All three chips were designed to use DDR3, so statistically, again, the i7-5775C appears to make better use of DDR3.
The 5775c has 128 MB of high-speed memory dedicated to the embedded GPU, but it has a "special trick" that allows this memory to be used as a pseudo L4 cache for the main cache when it is not being used for graphics.

High speed in those days, however, meant about 40-50 GB/s (100 GB/s bidirectionally), which is about the same speed as good DDR4.
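The "about the same speed as good DDR4" point checks out with a back-of-the-envelope peak-bandwidth calculation. A minimal sketch, using nominal JEDEC transfer rates (the helper name is just for illustration, and these are theoretical peaks, not measured throughput):

```python
# Peak DDR bandwidth = transfer rate (MT/s) x bus width (bytes) x channels.
# A standard desktop DDR channel is 64 bits wide = 8 bytes per transfer.
def ddr_peak_gbs(mt_per_s, channels=2, bus_bytes=8):
    """Nominal peak bandwidth in GB/s (illustrative helper, not a real API)."""
    return mt_per_s * bus_bytes * channels / 1000

print(ddr_peak_gbs(1600))  # dual-channel DDR3-1600: 25.6 GB/s
print(ddr_peak_gbs(3200))  # dual-channel DDR4-3200: 51.2 GB/s
```

So the Broadwell eDRAM's roughly 50 GB/s per direction sat comfortably above the DDR3 the platform shipped with, but dual-channel DDR4-3200 already matches it on paper.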

https://www.anandtech.com/show/1619...ective-review-in-2020-is-edram-still-worth-it

The Skylake "Skull Canyon" SKUs allowed a better implementation of this. Unfortunately, it didn't become mainstream in desktop (S-series) processors from that point on.
 
I haven't visited [H] in a number of years. I sure hope this thread does not reflect the attitude of the general community.

The answer is, maybe. I dunno. Perhaps.

Anyway, after conferring with my tech-priest (burn the heretics) and consulting the great Ark of Blue Central Processors along with reading the chicken bones of the great Oracle at Delphi 2.0 the reason I upgraded to something new was: more coarz.

That, and speed. Not that I don't still use the 4790k, because it actually is still a very good gaming CPU considering it's 5-6 years old. The FX-8350 ain't too shabby these days either, which is somewhat surprising. It only really started to have major problems with newer games in 2020-21.

Five lines and fin.
 
And so, what I missed by having my back turned on hardware upgrades for a decade is the addition of eDRAM on the i7-5775C, which helped the chip boost throughput on PCIe transfers.
eh, not really - unless you get a cache hit.
Long story short, the i7-5775C translates to +20 to +30 fps vs. the i7-4790K.
at 1080P minimum settings, about half that for 1440P minimum settings (or integrated GPU). It was a meaningless improvement with any real resolution or quality levels. There's a reason the technology didn't go anywhere.
The Anandtech link was a very good find; both CPUs were tested on nothing less than a Z97 motherboard, which is great because no one can say the i7-4790K was restricted by the use of an older Z87.
Who cares if Intel lost its footing and did not develop eDRAM according to the original plan.
The bottom line is that consumers can gain some extra fps with their current GPU and delay their next GPU upgrade.
Not really? Play at real settings and the improvement is minimal - couple FPS. Any GPU upgrade will be significantly higher. It's a CPU - it's (relatively) meaningless for gaming unless you're doing competitive ultra-FPS/refresh levels. Intel abandoned eDRAM because they didn't want to release it in the ~first~ place. Broadwell was supposed to be a low-power architecture (and Xeon architecture) only - not for socketed release. They were pushed to, and did - but didn't sell many for this platform line at all (they did sell a ton of Xeon and HEDT though, and plenty of low-power laptop chips). You'd do better buying a used 7XXX series X processor now and X299 than anything from that era - and it's almost the same price (Someone in FS/FT has a board+8c combo up for $300). Aside from that, this entire platform is ancient - I mean if you have a spare board and DDR3, and need an upgrade... sure I guess?
NVIDIA all this time was making money by selling 15 fps on top, from the GTX 1060 to the 1070, from the 1080 to the RTX 2060.
A few fps on top from one product to the next: this has become the merchandise by which all the large corporations suck consumers' blood.
Most people weren't doing that upgrade (1060-1070, or 1080 to 2060), they were coming from older generations. And of course the corporations want to sell things - that's their ~job~.
 
man.. WTF did I just read in this post.. lol

Ok.. you don't like to upgrade... oh well.. some people do.. some people like to support the hardware industry and watch Intel/AMD put out new products and advance technology..

Example scenario: this is like the Amish dude next door. He plows his garden with a mule and plow. I take my Honda-driven roto-tiller out and I'm done in a few hours. He's smelling shit for weeks. He asks me why I upgraded from a mule to that Honda thingamajig, and tells me his mule still works just fine.

lmao
 
Mate, this is no old friends' reunion, and no one asked for your analysis of anything.
I am now 100% convinced that only 6% of people with brain cells, those who earned certifications through education, are capable of correctly interpreting this specific AnandTech editor's words = article.

But this AnandTech editor is no better than most others: willing to talk all day long while saying just a few things of importance.
On the Intel product page for the i7-5775C, among the other specifications, the most significant one shines alone.

Now feel free to delete your message, because it is off topic and full of inaccurate remarks.
What I am going to buy in the end is my business.
Dude, I don't have a dog in this fight. But why are you so hostile? You ask people why they upgraded and then turn around and berate them for their answers. Why would anyone want to respond to you?
 
Mate, this is no old friends' reunion, and no one asked for your analysis of anything.
Yes, you did. In your first post. That's how this works.
I am now 100% convinced that only 6% of people with brain cells, those who earned certifications through education, are capable of correctly interpreting this specific AnandTech editor's words = article.
I'm quite capable of reading the article, and even looked at the alternative graphs for other settings levels.
Example:
[attached benchmark chart]

1080P. 10FPS difference between it and the max. 2 FPS between it and a 6700K - which is very possibly going to be cheaper to buy used, and on a much more modern platform - again, unless you have old kit lying around you're trying to use. But more importantly, at 1080P - 10FPS from the prior processors you mentioned, which is relatively meaningless at this level.

Heck - just go find a cheap Z490 board and a 10600k:
[attached benchmark chart]


You've never managed to explain what it is you're trying to accomplish. Is the 5775 faster than the 4790? Sure. Is it worth the upgrade? I'd argue no - and that it's not worth what it costs to build a new system used either, since you can easily scoop up the alternatives that are faster for the same price, or cheaper.

Again - what are you trying to ask or do here?
But this AnandTech editor is no better than most others: willing to talk all day long while saying just a few things of importance.
On the Intel product page for the i7-5775C, among the other specifications, the most significant one shines alone.

Now feel free to delete your message, because it is off topic and full of inaccurate remarks.
What I am going to buy in the end is my business.
It is. I'm pointing out that it's a waste of money. You can get a heck of a lot more horsepower going with something else - a decent Skylake box will run rings around Broadwell and cost less, since a lot more of them were produced and used 6700Ks are cheap as heck. Picking up a Coffee Lake box will cost not a lot more, be fully modern, and be way faster. If you really want HEDT, look for X99 with a 6800K or the high-end Broadwell-E parts, instead of this. Or buy a used X299 box and something from the older generations.

Again - what are you trying to accomplish? We're trying to help and understand, but this is all old kit and really not worth the cost anymore unless you either really want to build something from this generation, or have most of the parts sitting there and are trying to pick between two CPUs to buy. But I can't tell what you're trying to do.
 
A real pity, too, because I was enjoying people discussing their upgrades and reasoning...
...and I'm an AMD guy.
 