Windows 9, 128-bit ?

JoseJones

Will Windows 9 be 128-bit or not? Some websites discussing Windows 9 are also talking about the possibility of 128-bit. I certainly do remember the 32/64-bit fiasco. Have we already maxed out 64-bit? Will going to 128-bit be another fiasco? Are we ready for 128-bit?

Microsoft Working on 128-bit Windows

What are the pros and cons of a 128-bit Operating System?
 
lol wut?

x86 does not have a 128-bit CPU core, as in it does not have 128-bit general-purpose registers or instruction encoding that allows working with data of that size in regular integer instructions. And it's not likely to move to 128-bit any time soon either.
 
Con - there are no 128-bit CPUs, so there won't be a 128-bit OS from MS.
 
pro: bigger hard drives supported, more RAM, etc.
cons: No drivers, barely even at 64-bit now, and drives/RAM aren't pushing the addressable space of 64-bit OSes yet afaik. Also, I don't know of any x86-based 128-bit architectures available yet.

I'd be surprised if Windows 9 dropped 32-bit support, let alone introduced a 128-bit version.
 
 
pro: bigger hard drives supported, more RAM, etc.
cons: No drivers, barely even at 64-bit now, and drives/RAM aren't pushing the addressable space of 64-bit OSes yet afaik. Also, I don't know of any x86-based 128-bit architectures available yet.

Maybe if you (were in a position to) sign an NDA with Intel, you might find out about their plans for 128-bit-wide CPUs. AMD led the x86 architecture through the transition to 64-bit, but I doubt they have the moxie to lead on 128-bit.

I'd be surprised if Windows 9 dropped 32-bit support, let alone introduced a 128-bit version.

Win 9 has to "fix" Win 8 without further p---ing off MS customers. But win 10? Who knows?
 
lol

128 bit. Not even close to filling up 64bit space. Even in the high end realm. Gonna be a while people.

That being said, I am still waiting for the console bit wars to resume since they seem to be running out of shit to peddle.

256bit Blast Processing!!!
4096bit color!!!
ZOMG TEH AWESOMEZ!!!!
 
The title & OP post reads like the progression of old school consoles. 8-bit, 16-bit, 32-bit, 64-bit :rolleyes:
 
Itanium. That's been dead since, what, when it was released?

LOL! When I worked at Sun Microsystems in the late 90s, Scott McNealy referred to that chip as the "Itanic.":D :D :D He was right on that one, but, but, but he (or at least all of Sun) missed the significance of Linux+low-cost commodity x86 hardware.:( :( :(
 
It's taken over a decade since the first x64-capable hardware rolled out for us to get to the point that 32-bit operating systems are no longer offered, so there's absolutely no way 128-bit is anywhere near ready. What has already been said is true as well: a big reason for moving to the x64 architecture was removing the RAM and HDD limitations we were hitting with 32-bit. x64 allows for a theoretical RAM limit of 16 exabytes, that is, 16 billion gigabytes; we are quite a ways off from being able to reach a single petabyte in any type of consumer storage.
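If anyone doubts the 16-billion-gigabytes figure, it's a one-liner to check (Python, purely back-of-envelope):

```python
# 64-bit address space (2**64 bytes) expressed in gigabytes (2**30 bytes each).
gb = 2 ** 64 // 2 ** 30
print(f"{gb:,} GB")  # 17,179,869,184 GB -- roughly 16 billion gigabytes, i.e. 16 EB
```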
 
Maybe in 25 years or so when we have OS's that are basically artificial intelligences.
 
32 bit lasted about 10-15 years before 64 bit came out. 64 bit has been out a little over 10 years now.

I can see Windows 10 having a 128bit x86 or ARM version if Intel or AMD starts making CPU's for it.
 
Shouldn't Microsoft focus on getting 64-bit established first so they can be up to speed with all the other OSes?
 
They'll definitely be going 128-bit before 2020.

Shouldn't Microsoft focus on getting 64-bit established first so they can be up to speed with all the other OSes?
Shouldn't Microsoft focus on getting the start button and menu established first so they can quit goofing around and do some real improvements with their OS?

:D
 
32 bit lasted about 10-15 years before 64 bit came out. 64 bit has been out a little over 10 years now.

I can see Windows 10 having a 128bit x86 or ARM version if Intel or AMD starts making CPU's for it.

No. You can't predict when 128-bit addressing will arrive by extrapolating from the time between 32-bit and 64-bit. It's not linear like that.

The scale goes kilobytes, megabytes, gigabytes, terabytes, petabytes, exabytes, zettabytes, yottabytes. 16-bit allows for 64 kilobytes of RAM, 32-bit allows for 4 gigabytes of RAM, and 64-bit allows for 16 exabytes of RAM.

There is not even a name for the level of -bytes that 128-bit allows; you would need to go another four steps up the scale past yottabytes (?bytes, ??bytes, ???bytes, ????bytes), and then it would be about 340 of those ????bytes.

Going from 16-bit to 32-bit jumped up 2 steps of the scale. Going from 32-bit to 64-bit jumped up 3 steps of the scale. Going from 64-bit to 128-bit would jump up 6 steps of the scale.

And we have only just begun to scratch the surface of using 64-bit addressing. Windows 7 64-bit is artificially limited (for convenience's sake) to 192GB, which would be six 32GB sticks of RAM. Windows 8 is limited to 512GB; the largest RAM sticks available are 64GB, so unless you have a motherboard with eight RAM slots, you won't be able to hit that limit. 128GB sticks are not available yet and are probably being saved for DDR4.

16 exabytes would require 268,435,456 64GB sticks of RAM. I'm not sure whether or not there are 16 exabytes of RAM in existence right now.
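For anyone who wants to check the arithmetic in this post, here's a quick Python sketch (just back-of-envelope math, nothing official):

```python
# Sanity-checking the scale arithmetic: how many steps up the
# KB/MB/GB/... scale each pointer width reaches.
PREFIXES = ["bytes", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"]

def human(n_bits):
    """Express 2**n_bits bytes using the largest binary prefix that fits."""
    size = 2 ** n_bits
    step = 0
    while size >= 1024 and step < len(PREFIXES) - 1:
        size //= 1024
        step += 1
    return size, PREFIXES[step], step

for bits in (16, 32, 64):
    size, unit, steps = human(bits)
    print(f"{bits}-bit: {size} {unit}")
# 16-bit: 64 KB
# 32-bit: 4 GB
# 64-bit: 16 EB

# 128-bit runs off the end of the named prefixes entirely:
# 2**128 bytes is 2**48 (about 281 trillion) yottabytes.
print(2 ** 128 // 1024 ** 8)  # 281474976710656

# 64GB sticks of RAM needed to fill the full 64-bit space:
print(2 ** 64 // (64 * 2 ** 30))  # 268435456
```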
 
16 exabytes would require 268,435,456 64GB sticks of RAM. I'm not sure whether or not there are 16 exabytes of RAM in existence right now.

If there are you probably will find most of them in a Google server.
 
16 exabytes would require 268,435,456 64GB sticks of RAM. I'm not sure whether or not there are 16 exabytes of RAM in existence right now.

That's just wild, I never knew that the 64-bit limit was that freaking high. Also, I didn't know that 64GB sticks of RAM were produced!? Are they just for servers or what? Must be, because most mobos only support 32-64GB. Thanks. Stay [H]
 
No. You can't predict when 128-bit addressing will arrive by extrapolating from the time between 32-bit and 64-bit. It's not linear like that.
The amount of time going from 16 bit to 32 bit was longer than 32 bit to 64 bit. The trend shows 128 bit is coming sooner rather than later.

I think we've already established that companies upgrading to 64 bit was not just about memory. Apple upgraded to 64 bit with zero intent on using over 4GB of ram on their devices.
 
The amount of time going from 16 bit to 32 bit was longer than 32 bit to 64 bit. The trend shows 128 bit is coming sooner rather than later.

I think we've already established that companies upgrading to 64 bit was not just about memory. Apple upgraded to 64 bit with zero intent on using over 4GB of ram on their devices.


The jump from 16-bit applications to fully fledged 32-bit was fairly fast, though, unlike 64-bit processing, where 10 years later we still struggle to find dedicated 64-bit software that isn't hacked together. We won't know for a long time if there will be any noticeable improvements (besides more memory usage) with 64-bit computing because software developers are so slow to adapt.

It didn't help that Microsoft screwed up multiple operating systems (Vista was supposed to be the last 32-bit version), which slowed adoption rates.

For most uses, 64-bit computing will be sufficient for a long time to come. There has been greater-than-64-bit computing in specialized markets for a long time now; it isn't something new. They could create 1024-bit chips if they wanted to and the market were profitable. Generally speaking, I don't think 128-bit is on the horizon for general consumers, and this idea of a 128-bit Windows 9 was disproved a while ago from what I remember.
 
The jump from 16-bit applications to fully fledged 32-bit was fairly fast, though, unlike 64-bit processing, where 10 years later we still struggle to find dedicated 64-bit software that isn't hacked together. We won't know for a long time if there will be any noticeable improvements (besides more memory usage) with 64-bit computing because software developers are so slow to adapt.

It didn't help that Microsoft screwed up multiple operating systems (Vista was supposed to be the last 32-bit version), which slowed adoption rates.

For most uses, 64-bit computing will be sufficient for a long time to come. There has been greater-than-64-bit computing in specialized markets for a long time now; it isn't something new. They could create 1024-bit chips if they wanted to and the market were profitable. Generally speaking, I don't think 128-bit is on the horizon for general consumers, and this idea of a 128-bit Windows 9 was disproved a while ago from what I remember.

It isn't just Microsoft at fault. Lots of businesses have hardware or software that won't run on anything greater than 32-bit and has become unmaintained, and the white-collar types see it as cheaper to keep running old Windows OSes and their old software/hardware than to license a more modern compatible product.



Hell, one of the last places I worked at used a computer-controlled dimming system for the lighting. The dimmers lost their programming due to some ham-handed maintenance. We went to reprogram the dimmer brain and pulled out the manual, which started off with:

"Requires an IBM compatible PC with DOS 2.0 or higher"

It also needed a parallel port interface. And not even XP could fake DOS well enough to read the 5.25" floppy that the EXE to program the dimmers came on. IT LOVED us that day, having to dig through their morgue of Win98-and-prior-era hardware. That dimmer system is STILL there now, 3 years later, with no plans whatsoever of replacing it.
 
What's the possibility of Intel's SkyLake being 128-bit?

I say there is nearly a zero chance of a 128-bit general consumer CPU in my lifetime, and I am 42 now. I mean, 2^64 is a very big number; it will be a long time before RAM modules get anywhere near that big. That is not to say we don't have, and won't have, SIMD instructions and registers that are 512 bits wide. Generally, when we talk about a 64-bit CPU, we are talking about the registers that store the addresses.
 
It isn't just Microsoft at fault. Lots of businesses have hardware or software that won't run on anything greater than 32-bit and has become unmaintained, and the white-collar types see it as cheaper to keep running old Windows OSes and their old software/hardware than to license a more modern compatible product.



Hell, one of the last places I worked at used a computer-controlled dimming system for the lighting. The dimmers lost their programming due to some ham-handed maintenance. We went to reprogram the dimmer brain and pulled out the manual, which started off with:

"Requires an IBM compatible PC with DOS 2.0 or higher"

It also needed a parallel port interface. And not even XP could fake DOS well enough to read the 5.25" floppy that the EXE to program the dimmers came on. IT LOVED us that day, having to dig through their morgue of Win98-and-prior-era hardware. That dimmer system is STILL there now, 3 years later, with no plans whatsoever of replacing it.


The gym I go to, the exercise machines run some version of "Caldera DOS." If one of the machines hangs, I just pull the power cord and then watch the reboot. and watch and watch ... I guess it doesn't take much computing resource to connect an exercise machine and its touchscreen control panel to the server. :)
 
8-bit 8080 was introduced in 1974
... 4 years ...
16-bit 8086 was introduced in 1978
... 7 years ...
32-bit 80386 was introduced in 1985
... 18 years ...
64-bit Athlon 64 was introduced in 2003

...and even today, most applications run by consumers remain 32-bit. 64-bit OSes certainly have some additional benefits, but it doesn't make a difference to most consumers.

64-bit integer execution cores and 64-bit addressing are not limitations we'll be running into soon. Widening SIMD units are a trend (see AVX-512, 512 bits wide), but the core doesn't have the same problem. While the 48-bit physical memory addressing used in x64 code is "limited" to 256 TB, that's orders of magnitude larger than what even large-memory CPUs directly host today (up to 32GB/64GB RAM); you could double memory size *every year* and not exceed 48-bit addressing for a dozen years. Realistically, 48-bit memory addressing should be suitable for up to a couple of decades, even if x64 eventually gets a bump to full 64-bit physical addressing (and possibly a larger virtual address space).

There is no imperative for 128-bit Windows any time soon, and certainly not by 2020. :p It wouldn't address any shortcomings, and the level of code bloat (instruction encoding and/or operand size and addressing) could actually make it slower than 64-bit code for most applications.
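The "double memory every year" remark above is easy to verify with a few lines of Python (a toy calculation, assuming a 64 GB starting point):

```python
# Starting from a 64 GB system, count annual doublings until 48-bit
# physical addressing (2**48 bytes = 256 TB) is exhausted.
size = 64 * 2 ** 30   # 64 GB
limit = 2 ** 48       # 256 TB
doublings = 0
while size < limit:
    size *= 2
    doublings += 1
print(doublings)  # 12 -- about a dozen years of annual doubling, as claimed
```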
 
I have another perspective on this. Back when I was a young teenager in the 1980s, I distinctly remember wondering "What would you do with an entire megabyte of RAM?" after seeing an ad for a 1 megabyte RAM card for the Apple II+. It was very, very expensive.

We are now at the point where one might wonder "What would you do with an entire terabyte of RAM?" So it's taken us 30 years to go from megabyte to gigabyte to terabyte. If you wanted to go linearly up the -byte scale for RAM usage over time, we should be thinking "What would you do with an entire exabyte of RAM?" in the 2040s, and 64-bit would still be good enough.
 
But we're nowhere near 1TB memory sizes directly per microprocessor. You may want to extend that 30 year figure, by oh say another decade or longer. :p

The limitation isn't just cost and the channels to connect that much memory, but also transistor density and power on the manufacturing side. Optimistically, it would take at least 8 years of flawlessly compounding bleeding-edge manufacturing to deliver (affordable, practical) devices where 1TB DRAM-type memory sizes would be realistic for the highest end. And that's still a factor of 2^8 below what 48-bit direct memory addressing allows (which could be met with another 12 years of flawlessly compounding bleeding-edge manufacturing...). If you have any idea how this is going to happen, I'm sure every single semiconductor manufacturer in the world would love to hire you. :)
 
The gym I go to, the exercise machines run some version of "Caldera DOS." If one of the machines hangs, I just pull the power cord and then watch the reboot. and watch and watch ... I guess it doesn't take much computing resource to connect an exercise machine and its touchscreen control panel to the server. :)

I have a microwave.
 
But we're nowhere near 1TB memory sizes directly per microprocessor. You may want to extend that 30 year figure, by oh say another decade or longer.

Exactly my point! All the OS writers that went to 64-bit did so with the intention that they shouldn't have to upgrade the architecture again for a human lifetime. Having so recently gone through that mess twice in short order with 16-bit and 32-bit, nobody - not OS writers, not app writers, not corporate customers - wants to do that again, so they have all put it to bed by going up to 64-bit, instead of half-assing it again with 36-bit or 48-bit.
 
128 bit... lolwut. Few servers support RAM in the terabyte range, and none more than 8TB unless I missed something really big. Don't forget you can address 16 million terabytes with 64-bit. Hard drives, or big hard drive arrays, will maybe be a problem; however, we are in the single-digit-petabyte range for a full rack of servers. I don't think anyone is addressing that as a single block device, and even if they do, you can still go several orders of magnitude higher. 128-bit is very, very far off.
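The "16 million terabytes" figure checks out, for what it's worth (a quick Python check, nothing fancy):

```python
# 2**64 bytes expressed in terabytes (2**40 bytes per TB).
tb = 2 ** 64 // 2 ** 40
print(f"{tb:,} TB")  # 16,777,216 TB -- i.e. about 16 million terabytes
```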
 
Don't forget you can address 16 million terabytes with 64 bit. Hard drives

I think that chx has identified the core issue for enterprise server apps, particularly database-based apps. Such apps gain a lot of performance if searches can be done entirely in RAM. OK, if you accept that point, then how many online apps are so performance-sensitive that the company will spend the bucks for all that RAM? Over time, probably more and more. But, to chx's point, the fundamental architecture doesn't have to change for a long, long time.

What's interesting here is that when this discussion first started, I was hoping that we would see the start of the transition to x128 with the release of Win 9. Now I realize that MS and others would be wasting time and money doing any work on x128 in the near term.

Now, if Intel wants to bring out x128 CPUs, then maybe there might be some performance advantage when operating in "x64 emulation mode." But I'm a complete ignoramus when it comes to CPU architectures, so I may be full of "it" on this point.
 
The only technical reason for 128-bit would be the fact that a 128-bit CPU would have 128-bit registers and instructions. But wait, that is what the SSE instruction set is for. Move along, nothing to see here :).
 
32 bit lasted about 10-15 years before 64 bit came out. 64 bit has been out a little over 10 years now.

Longer than that. 32-bit x86 processors were released with the 386 in 1985. It took 10 years before 32-bit was the majority of the software used. Using that metric, we are right on track OS-wise: it's been about 10 years, and most OSes from here on out will only support 64-bit. We still have a ways to go before 128-bit becomes necessary.

I wouldn't doubt that MS is at least doing some R&D for when the time comes, but that doesn't mean that it's going into a consumer product real soon. It's just doing the prep work for when the time comes.
 
I could understand the OP's post if this were spread out over 10 pages on an ad-ridden spamfest they call a blog.

Completely pointless questions like "Have we already maxed out 64-bit?" "Are we ready for 128-bit?"

WTF is this bullshit?
 