Future Intel PCs Will Have No Wires

Megalith

Intel believes that technology such as WiGig will usher in an age of ubiquitous wireless computing. There is also the implication of wireless power, but I don’t see that happening anytime soon.

Garrison is certain that the PC has a future, but the wired-up version we know and love does not. Kirk Skaugen, Senior Vice President and General Manager of the Client Computing Group, has been talking about Intel's "cut the wires" strategy for a while. Garrison noted that enterprise customers have never been faster at embracing new technologies, and that Windows 10 helps the transition. This is the first time that a move to thin and light business notebooks is happening at the same time as in the consumer market.
 
Well, my solar powered pocket calculator is wireless.

Maybe they foresee their chips being that power efficient in the future. Take something like an Intel Compute Stick + a few generations + increased solar panel efficiency.

Gaming PCs not included.
 
They can pry my wired Ethernet from my cold dead hands.

WiFi is for laptops and mobile devices. Heck, I even plug in a laptop if I am going to be in the same place for a while.

No wireless networking technology can be as reliable as a wired network.
 
Why is it that I think competitive gamers will still use wired interfaces?
 
Wonder if they have done the radiation exposure tests in a test office? Sure, 1 router + 1 laptop + 1 cell phone seems safe. But what about 120 PCs, 120 keyboards, 120 mice, 120 cell phones, probably 600+ Bluetooth gizmos, and this doesn't include wireless charging. Plus however many APs it takes to have enough bandwidth. Oh, and don't forget printers, scanners and such. This would be 8-10 hours of exposure per day. Where I last worked before retirement had several floors with 100+ folks, and more PCs than employees after accounting for special-use computers.

Start adding in IoT crap and at some point we are going to exceed safe exposure limits.
 
Wonder if they have done the radiation exposure tests in a test office? Sure, 1 router + 1 laptop + 1 cell phone seems safe. But what about 120 PCs, 120 keyboards, 120 mice, 120 cell phones, probably 600+ Bluetooth gizmos, and this doesn't include wireless charging. Plus however many APs it takes to have enough bandwidth. Oh, and don't forget printers, scanners and such. This would be 8-10 hours of exposure per day. Where I last worked before retirement had several floors with 100+ folks, and more PCs than employees after accounting for special-use computers.

Start adding in IoT crap and at some point we are going to exceed safe exposure limits.

I don't think there's any conclusive evidence yet showing a relationship between these wireless EM waves and human health. You can't do any of those studies until you have a well-established link first. Those are factors that may affect such a link, but at the moment we don't even know if there is one.
 
Wireless power already exists. I'm not talking about the chargers where you lay the device on a specific pad. The kind where you can move around within reasonable proximity and still charge devices is just around the corner.
 
Think about data centers too.

And, exactly how are they going to MANAGE large wireless networks like this?

WiFi is generally a free-for-all. What's more, where are they going to get the sheer spectrum they'd need to diversify the network enough to cover hundreds or thousands of machines in a building?
And, as noted, how noisy is such an environment going to be in terms of ambient EM saturation?

Yet again, someone with their head up their ass proclaiming "Da Fyoochoor!"

I'll file this with flying cars, teleportation and free energy.
 
Wireless power already exists. I'm not talking about the chargers where you lay the device on a specific pad. The kind where you can move around within reasonable proximity and still charge devices is just around the corner.
Yeah, it's only about 100 years old. I guess that shows how big a setback screwing over Nikola Tesla was.
I too would be worried if we go that 'wireless'... Should we instead be getting into fewer wires, more fiber optics and things like that?
 
Wireless mice and keyboards were a big thing about a decade ago. I had both because it was future technology, yet I dropped that nonsense and now I use all wired.

Also, the NSA and other government intelligence agencies around the world would welcome this.
 
This reminds me of the paperless office.

In my birth country a lot of government offices are pretty much paperless / use digital signatures.
It was a huge culture shock moving to the States and hearing that people still used fax and checks, or that paper needed to be signed and initialed in multiple places. I always thought it was a joke in movies.

The U.S.A., from my tiny experience, just seems unable to progress very fast in implementing technology in everyday life.


Back on topic.
I'm staying with cables for anything that is not mobile as long as possible. It is still a lot more stable in performance.
 
Wardriving won't be about hijacking someone's visible SSID wireless network anymore if this comes to full adoption without proper precautions...
 
Radio frequencies != ionizing radiation.

No, but at high enough power levels you can still kill/cook things with non-ionizing radiation from things like radar dishes.

While we have a generally noisy EM spectrum, we aren't running around in an environment that's thoroughly saturated at any given level. We don't have any clue what prolonged exposure to that sort of environment might do.
 
In my birth country a lot of government offices are pretty much paperless / use digital signatures.
It was a huge culture shock moving to the States and hearing that people still used fax and checks, or that paper needed to be signed and initialed in multiple places. I always thought it was a joke in movies.

The U.S.A., from my tiny experience, just seems unable to progress very fast in implementing technology in everyday life.


Back on topic.
I'm staying with cables for anything that is not mobile as long as possible. It is still a lot more stable in performance.

I live and work in the U.S., and I haven't used a fax since the late '90s.

Most of the time if I need to sign something remotely, it's either an electronic signature or a print, sign, scan, and email type thing.

A lot of places DO use digital signatures, but the reason it is slow to catch on is a legal one. Companies are legally responsible for proving that the digital signature systems are effective and not prone to mistakes, forgeries or other problems. This can mean rather significant software validation that can take years and cost in the tens of millions of dollars for a large complex system.

In many cases for small or medium firms (or for less well organized larger firms) it simply is more cost effective to do things the old way.

Personally, I work in the medical device field, which means any digital signatures have to conform to, and be validated against, 21 CFR Part 11, which can be rather challenging.
 
I was expecting something about wireless PCs, not some marketing dork talking about Windows 10.
 
This reminds me of the paperless office.

I don't know where you work, but my office is pretty close. Maybe I print one thing every month or so. I work for a small software company and we genuinely do pretty much everything on the computer. We don't even have a fax machine anymore, although our customers don't necessarily realize that because we subscribe to a service that converts faxes into emails and vice versa. We really don't get many faxes anyway.

I'm sure there are plenty of 1970s dinosaur offices around, but it's not all of them.
 
Maybe explore wireless traces (no need for fragile physical traces).
 
In my birth country a lot of government offices are pretty much paperless / use digital signatures.
It was a huge culture shock moving to the States and hearing that people still used fax and checks, or that paper needed to be signed and initialed in multiple places. I always thought it was a joke in movies.

The U.S.A., from my tiny experience, just seems unable to progress very fast in implementing technology in everyday life.


Back on topic.
I'm staying with cables for anything that is not mobile as long as possible. It is still a lot more stable in performance.

If my past few employers weren't legally required to keep paper records for almost everything, they wouldn't. It costs money to print all that crap and then store it securely. It's not that people don't want to progress; it's that they can't, since some government agency run by some 70-year-old codger insists paper is the only way to keep records and still has fantasies of field agents sifting through boxes and boxes of paper in a warehouse if the need to track anything down ever arises... all while ignoring that paper documents can be just as phony as digital documents (actually more so, since there are definitely better means of verifying the validity of digital documents than of crap printed on the office laser printer 1 to 10 years ago).
 
This reminds me of the paperless office.

I've been trying, but it's hard to change old processes.
We probably eliminated 75% of the paper over the past 8 years, so we are getting there.
We still have a few fax machines, but they don't get nearly as much use any more.
Almost all incoming faxes are routed into email, so it's rare anyone deals with a paper fax, unless they print it out.

As for making everything wireless, no way.
Why would I want to replace fast gigabit wired connections with a slow shared connection?
Wireless is ok for conference rooms and phones, but no way would I want everyone trying to do their work across the wireless. All I need is for a couple people to be copying a large VM to completely clog the wireless network.

And how would everything be powered? I'm already busy enough dealing with battery replacements and other problems for the few managers with enough pull to get a wireless keyboard/mouse.
 
I've been trying, but it's hard to change old processes.
We probably eliminated 75% of the paper over the past 8 years, so we are getting there.
We still have a few fax machines, but they don't get nearly as much use any more.
Almost all incoming faxes are routed into email, so it's rare anyone deals with a paper fax, unless they print it out.

As for making everything wireless, no way.
Why would I want to replace fast gigabit wired connections with a slow shared connection?
Wireless is ok for conference rooms and phones, but no way would I want everyone trying to do their work across the wireless. All I need is for a couple people to be copying a large VM to completely clog the wireless network.

And how would everything be powered? I'm already busy enough dealing with battery replacements and other problems for the few managers with enough pull to get a wireless keyboard/mouse.

Plus, security.

Imagine someone walking in with a mobile device designed to sniff and log wireless traffic.
 
Isn't wired network connectivity much more secure than wireless? Given security concerns, I especially don't see companies moving to wireless. Wireless access points for every PC can also be expensive. I could be wrong.
 
Security is huge, yes.

But so is performance, uptime and reliability. Wireless technologies are awful in all these regards. Add in anything that is latency dependent, and it gets even worse.

Every new wireless release promises that wireless will finally get better, but most of the time that's just exaggerated marketing numbers.

I wrote this a little while ago (and have typed varieties of it over the years) and I still stand by it:

Zarathustra[H];1041803468 said:
Personally I go to great lengths to avoid WiFi use unless I ABSOLUTELY have to.

This means, if it's not a laptop, phone or tablet that needs to move around to function as designed, it gets wired Ethernet. I don't care how inconvenient it is, or how many hours I have to spend poking holes in walls, snaking wires, or installing raceways along floorboards.

WiFi for me will only be used when absolutely necessary.


It is ridiculous that people refuse to run wires even in circumstances where it is easy. As soon as you introduce WiFi you introduce more complexity, and thus potential problems. The golden rule of engineering is the KISS principle: Keep It Simple, Stupid.

With wired Ethernet, you plug in a wire, it auto-negotiates, and you are done. Unless the wire gets damaged, you never have to worry again.

With WiFi, you have multiple devices on different frequencies and channels all fighting for the narrow 2.4 and 5GHz space, leading to risks of interference and thus poor signal, even over short distances. You have walls with differing levels of resistance to the 2.4 and 5GHz spectrum, leading to dead areas in your home. You have 802.11 a, b, g, n, ac, etc.; some are backwards compatible, some are not. You have different levels of security (open, WEP, WPA, WPA2, some with a choice of TKIP or AES; some devices are compatible, others not). Simply a ton of stuff can go wrong in the auto-negotiation and authentication steps. And once connected, at any time interference (a neighbor uses an old microwave or a 2.4GHz cordless phone, another neighbor buys a new, more powerful wireless router, etc.) can leave you with reduced signal strength, sometimes to the point where you are kicked off altogether, but usually only to the point where your signal is degraded.

And that's before we even talk about the added latency, lower performance, and the security risk of exposing a WiFi router to anyone who can download a WiFi hacking tool off the internet.



Let's talk about the experience on a typical 802.11b/g/n router or WiFi access point.

Your router is labeled 300mbps. Sounds good, right?

Except this assumes you connect with a device that has two antennas, and that both of them auto-negotiate to the wider 40MHz bands.

If you live in a hut in the middle of nowhere and only have a few devices, this may actually happen, but if, like most Americans, you live in an urban or suburban setting, the band is probably too congested. In all likelihood you are only going to connect with one antenna (or your device may just have one antenna), and that one antenna will use a narrow 20MHz band for less speed.

If your signal strength is perfect, this means you are connecting at 65Mbps instead of 300Mbps. But wait, are you behind a wall, or is the 2.4GHz band congested in your area? Then step that down a few notches, maybe to 52Mbps or 39Mbps.

Do you ever use traffic that goes in both directions at the same time? Pretty much everything does: game servers, torrents, and even regular TCP traffic, since it sends confirmation packets, so even a straight download causes upstream traffic.

Guess what? Unlike wired Ethernet, WiFi is half duplex, meaning it can only send traffic in one direction at a time (wired Ethernet is full duplex: down and up at a gigabit all day long, in both directions).

This means you just went from an effective bandwidth of 39Mbit/s down to half that, 19.5Mbit/s.

What's that? WiFi protocol overhead? Oh, lose another 40% or so. You are down to about 11.7Mbit/s of effective bandwidth, which is ~1.5MB/s.
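If you want to play with the numbers yourself, here's the same back-of-the-envelope math as a quick Python sketch. The halving for half duplex and the ~40% overhead figure are the rough assumptions from above, not measured values:

```python
# Rough estimate of usable WiFi throughput from the negotiated link rate.
# Assumptions (illustrative, not measured): bidirectional traffic halves
# the rate because WiFi is half duplex, and protocol overhead eats ~40%.

def effective_throughput_mbps(link_rate_mbps, bidirectional=True, overhead=0.40):
    """Estimate effective throughput (Mbit/s) from the PHY link rate."""
    rate = link_rate_mbps
    if bidirectional:
        rate /= 2            # half duplex: up and down share the air time
    rate *= (1 - overhead)   # MAC/protocol overhead
    return rate

# A degraded 802.11n link that stepped down to 39 Mbit/s:
mbps = effective_throughput_mbps(39)
print(f"{mbps:.1f} Mbit/s ~= {mbps / 8:.1f} MB/s")  # 11.7 Mbit/s ~= 1.5 MB/s
```

Plug in your router's sticker number and whatever link rate your OS actually reports, and you'll see why the box's marketing figure never shows up in a file transfer.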

Your pings are probably 50ms higher than they would have been using wired Ethernet, too.

And you are only one microwave oven's use away from losing your signal.

Compare this to gigabit Ethernet: 1000Mbit up and 1000Mbit down at the same time, latencies in fractions of a millisecond, completely reliable (unless you do a poor job of cable management and run over the cord with your office chair too many times in a row). You can get data transfer rates as high as 120MB/s (if you use the right protocols; SMB generally doesn't go above ~87MB/s or so).

It is just amazing to me how much more people value even very slight convenience over "Doing it Right".

Now the above is - of course - a little bit of a worst case for WiFi. Back when I used a consumer router in a congested neighborhood I could actually get between 4 and 8MB/s data transfer rates, and you can limit the problems by going with a prosumer / entry-level enterprise unit like a Ubiquiti UniFi access point instead of a consumer WiFi router, but it is still going to be stupidly slow with stupidly high latency compared to gigabit Ethernet, even if the box has really big bandwidth numbers on it.

Those numbers are for marketing only and don't mean anything in the real world. They may only be achievable in an RF isolated lab, with two devices testing a few feet from each other (and even then you still suffer from protocol overhead and half duplex, so the best case is ~ a quarter to a third of the labeled bandwidth). In the rest of the world, the planets would have to align for you to get anything even close.

Yeah, no thanks, I'll be using wired Ethernet for as long as I am able. I even ran an optical 10Gbit cable from my Desktop to my NAS server in the basement recently.

The smart choice is, use WiFi only when you absolutely have to. For everything else, wire it.
 