Intel's Tick-Tock is dead; now it's Process-Architecture-Optimization.

It's also a reflection of Moore's Law. That two-year process update cycle has been going for decades.

We are now officially slipping from 2 to 3 years.
 
It's also a reflection of Moore's Law. That two-year process update cycle has been going for decades.

We are now officially slipping from 2 to 3 years.

If AMD's Zen becomes a thing worth buying, it will become the Process-Bribe-Architecture-Optimization cycle.
 
Good for consumers, because they get more bang for their buck and, theoretically, a longer upgrade path. Bad for mobo companies, because they like to have you upgrade to new chipsets/sockets as often as possible. Let's face it: the latest and greatest is only maybe 25% better than Sandy Bridge.
 
Good for consumers, because they get more bang for their buck and, theoretically, a longer upgrade path. Bad for mobo companies, because they like to have you upgrade to new chipsets/sockets as often as possible. Let's face it: the latest and greatest is only maybe 25% better than Sandy Bridge.

In some applications, Skylake can be 40% faster than Sandy Bridge. Gaming is the one area where Skylake doesn't offer a whole lot over Sandy Bridge outside of some benchmark score improvements, but that's because GPU limits are far more impactful than CPU limits. The platform offers a lot more than that in terms of value, as far as I'm concerned, and is the real reason to ditch Sandy Bridge for Skylake.
 
Good for consumers, because they get more bang for their buck and, theoretically, a longer upgrade path. Bad for mobo companies, because they like to have you upgrade to new chipsets/sockets as often as possible. Let's face it: the latest and greatest is only maybe 25% better than Sandy Bridge.

The main reason to upgrade is basically for new motherboard features like USB 3.1 and PCI-E 3.0. Not to mention DDR4. If you don't need either of those, there's almost no point. I've been running a Sandy Bridge and an Ivy Bridge system for about 5 years, and I'm still happy with those systems. I haven't needed to touch them except to use compressed air to clean them out. At the rate Intel is going, I won't need to upgrade anything unless I get a 4K monitor, and I have no intention of getting one as my desk can't accommodate a 27"+ monitor anyway.
 
That's like idiots blaming Obama for the weather.


...oh yeah, almost forgot

Thanks Obama!

Spare me the revisionist history. Intel moved mountains with Core 2 and Sandy. Now they're content to play in their own sandbox. AMD isn't even allowed to play.
 
Spare me the revisionist history. Intel moved mountains with Core 2 and Sandy. Now they're content to play in their own sandbox. AMD isn't even allowed to play.

So I suppose you're going to show me proof that Intel is getting left behind by Samsung/GloFo 14nm or TSMC 16nm on big chips? Oh wait, those aren't even shipping until later this year!

I'd like to see you do better than Skylake on performance at 5 W. Surprisingly peppy for a core that can be passively cooled, especially compared to its predecessor on the exact same process node :D

Review: Asus’ excellent midrange laptop gets much better with Skylake
 
So I suppose you're going to show me proof that Intel is getting left behind by Samsung/GloFo 14nm or TSMC 16nm on big chips? Oh wait, those aren't even shipping until later this year!

Who said Intel was behind? Try to stay on topic.
 
Who said Intel was behind? Try to stay on topic.

You were saying Intel is content to play in their own sandbox, implying they're not pushing performance forward. I posted a counterpoint. You really should address it if you want to have an argument here.

And don't give me the old "frequencies haven't grown" bitch fest - they're not growing beyond 4 GHz for anyone (if you want to stay under 200 W).

We can't avoid the laws of physics. The difference between today and ten years ago is, we actually know what those limits are :D
 
You were saying Intel is content to play in their own sandbox, implying they're not pushing performance forward. I posted a counterpoint. You really should address it if you want to have an argument here.
I read that as "Intel left everyone behind so far that they now have a performance/node gap big enough not to worry about others catching up while exploring their options."
 
I read that as "Intel left everyone behind so far that they now have a performance/node gap big enough not to worry about others catching up while exploring their options."

It's killing me that I can't find the thread (search only seems to go back a few years now), but in the original Conroe speculation thread, prior to release and benchmarks, a supposed Intel insider here was crowing that Core 2 would bury AMD for good and that their intention was to never let AMD back in the race again. To have that kind of confidence, you must have the resources to move mountains at will. Lo and behold, it was true. Now we won't even get tick-tock. Physics. Pfft.
 
Lo and behold, it was true. Now we won't even get tick tock. Physics. Pfft.

Which is why I posted my summary comparison of Intel's 14nm process versus everyone else. You can't have a tick without a process node shrink. Why do you keep ignoring this simple fact?

Intel is only changing the tick-tock system because the tick is getting harder. You can't deny that this tick is just as hard for other companies, or else we would have had Pascal out last fucking year!

That plus falling revenues is even more reason for Intel to slow its lavish spending on expensive new fabs. Since the cheap people on this forum ALREADY complain about the prices Intel currently charges, they can only justify throwing so much money at the problem. Because guess what? Nobody else wants to pay more for chips either!

They just want to sit here whining at a behemoth they can't change. Whining is free I guess, so whatever makes you feel better?
 
The main reason to upgrade is basically for new motherboard features like USB 3.1 and PCI-E 3.0. Not to mention DDR4. If you don't need either of those, there's almost no point. I've been running a Sandy Bridge and an Ivy Bridge system for about 5 years, and I'm still happy with those systems. I haven't needed to touch them except to use compressed air to clean them out. At the rate Intel is going, I won't need to upgrade anything unless I get a 4K monitor, and I have no intention of getting one as my desk can't accommodate a 27"+ monitor anyway.

I recently upgraded not because my system was slow, but simply because it was getting old. After almost 5 years of 24/7 use, it was probably close to the time for components like the mobo, power supply, hard drive, etc. to start failing. Toss in a much-needed video card upgrade, and I figured I might as well just go all out and do a full upgrade.
 
I recently upgraded not because my system was slow, but simply because it was getting old. After almost 5 years of 24/7 use, it was probably close to the time for components like the mobo, power supply, hard drive, etc. to start failing. Toss in a much-needed video card upgrade, and I figured I might as well just go all out and do a full upgrade.
Yeah, I've rarely had a hard drive last longer than 3 years. Optical drives aren't great about that either. I have to say, though, that I rarely see motherboards fail unless they have cheap capacitors or something. Several computers of mine are over 10 years old and refuse to die. As long as I replace the drives and PSUs, it seems like they last forever.

But I know what you mean. Just because you probably could replace it piecemeal and keep your computer working for 10 years doesn't mean you really want to. You're just so used to upgrading that sticking with anything for that long feels like "making do," even if you won't get much of a performance boost out of a full upgrade. Right?
 