“Super” Wi-Fi May Finally Be Coming Your Way

Megalith

If you live in the countryside, you may soon be getting the privilege of trying out Super Wi-Fi. Unlike traditional Wi-Fi, this variant uses the TV band, which means it can travel longer distances and easily go past walls or other obstacles. City dwellers are probably out of luck, though, as such areas are swamped with television signals—introducing Super Wi-Fi to these environments would likely interfere with broadcasts.

What gives Super Wi-Fi such great potential is that it’s transmitted over the same portion of the airwaves that are used by television broadcasters. Compared to regular Wi-Fi or even most cellular transmissions, signals sent in the TV band can travel much longer distances. They can go through walls, trees and other barriers that can thwart other types of signals. And because the spectrum is regulated and largely reserved for television signals, Super Wi-Fi transmissions don’t have to contend with interference from random devices like microwaves or cordless phones, as do signals in other wireless bands.
 
So kind of the FCC to give us a tiny sliver of the useful airwave band for wireless internet. It's not like connecting everybody in the country to the internet has massive economic benefits or anything...
 
This is probably a bad idea.
TV (channels 2-13) occupies 54 to 216 MHz. The general rule of thumb is the lower the frequency, the farther it travels, but the less bandwidth is available.
LTE starts at around 450 MHz but goes up to 5 GHz, with the majority of bands starting at 700 MHz and going to around 2.5 GHz.
LTE is fairly fast, but cell connections get away with having tons of repeaters (cell towers) and low power (10 watts per tower) compared to TV stations, which have a maximum output of 500 kilowatts.
It'll be interesting to see the details and check how much effective bandwidth and how many simultaneous users these solutions really support.
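To put rough numbers on that frequency/distance tradeoff, free-space path loss shows how much more easily a TV-band signal covers a given link than higher bands (a quick sketch in Python; the 5 km distance and the three band choices are just illustrative):

```python
import math

def fspl_db(distance_km: float, freq_mhz: float) -> float:
    """Free-space path loss in dB for a distance in km and frequency in MHz."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

# Same 5 km link across three bands: TV band, low-band LTE, 2.4 GHz Wi-Fi
for f_mhz in (200, 700, 2400):
    print(f"{f_mhz:>5} MHz over 5 km: {fspl_db(5, f_mhz):.1f} dB path loss")
```

The roughly 21 dB gap between 200 MHz and 2400 MHz means the TV-band signal arrives over a hundred times stronger in free space, before even counting its better wall and foliage penetration.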
 
I assume this will be a one-way receive signal? Kinda like satellite back in the day. Satellite down, dial-up up.
 
This is probably a bad idea.
TV (channels 2-13) occupies 54 to 216 MHz. The general rule of thumb is the lower the frequency, the farther it travels, but the less bandwidth is available.
LTE starts at around 450 MHz but goes up to 5 GHz, with the majority of bands starting at 700 MHz and going to around 2.5 GHz.
LTE is fairly fast, but cell connections get away with having tons of repeaters (cell towers) and low power (10 watts per tower) compared to TV stations, which have a maximum output of 500 kilowatts.
It'll be interesting to see the details and check how much effective bandwidth and how many simultaneous users these solutions really support.

I'm more interested in how they are going to handle the uplink. You are not going to have a 10-watt or 500 kW setup on your house.
 
This is probably a bad idea.
TV (channels 2-13) occupies 54 to 216 MHz. The general rule of thumb is the lower the frequency, the farther it travels, but the less bandwidth is available.
LTE starts at around 450 MHz but goes up to 5 GHz, with the majority of bands starting at 700 MHz and going to around 2.5 GHz.
LTE is fairly fast, but cell connections get away with having tons of repeaters (cell towers) and low power (10 watts per tower) compared to TV stations, which have a maximum output of 500 kilowatts.
It'll be interesting to see the details and check how much effective bandwidth and how many simultaneous users these solutions really support.
Bad idea because it would be slow? The article states it won't be as fast as LTE but faster than dial-up. Unfortunately, the article doesn't give any concrete numbers. If what the article says is true, that someone is able to stream video, then this is a great solution for rural people.


I assume this will be a one-way receive signal? Kinda like satellite back in the day. Satellite down, dial-up up.
Satellite "back in the day," just as today, is two-way. Any kind of internet connection is always two-way. Think about it: how would a connection know what content to send you? It doesn't send the entirety of the internet to you at all times. Satellite is an OK solution, but last I checked we hadn't figured out how to deal with the latency involved in sending microwave signals up to a satellite in space and back down; physics and whatnot.
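That latency floor is fixed by orbital geometry, which is why no amount of engineering gets geostationary round trips much under half a second (a back-of-the-envelope sketch; real-world figures add modem and network overhead on top):

```python
C_KM_S = 299_792.458   # speed of light in vacuum, km/s
GEO_ALT_KM = 35_786    # geostationary altitude above the equator, km

# A request and its response each cross the ground-satellite path twice,
# so the signal covers four legs at minimum.
round_trip_ms = 4 * GEO_ALT_KM / C_KM_S * 1000
print(f"Minimum GEO round-trip latency: {round_trip_ms:.0f} ms")  # about 477 ms
```

Nearly half a second before a single byte of payload moves, which is why satellite feels sluggish for gaming and interactive use no matter how high the throughput gets.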


I'm more interested in how they are going to handle the uplink. You are not going to have a 10-watt or 500 kW setup on your house.
When I worked as a satellite internet installer, I don't recall what the actual output power of the transceiver was, but there was a sticker warning not to stand in front of the dish, lol. Looking online, there are refurbished HughesNet LNBs with an output of 2 watts. I wouldn't imagine they'd need even that much to hit a terrestrial station. Then again, I guess it would depend on whether there's a directional or omnidirectional antenna on the customer's end.
 
Bad idea because it would be slow? The article states it won't be as fast as LTE but faster than dial-up. Unfortunately, the article doesn't give any concrete numbers. If what the article says is true, that someone is able to stream video, then this is a great solution for rural people.
It's probably a bad idea for multiple reasons. For starters, it would be expensive. Specialized radio equipment can get pricey; even in the amateur radio market, transceivers can go for upwards of $2,000.
It wouldn't have the speed of Wi-Fi. It wouldn't even be close.
For long distances, you'll probably have to set up a directional antenna to communicate. Cell phones and Wi-Fi don't need special antennas because of the limited distances involved.
Practically everything is faster than dial-up. That's a really low bar.

I get the whole last-mile cost problem, and how it's prohibitively expensive in rural areas. Honestly, it would probably make more sense to subsidize 4G cell towers along power lines in rural areas.
 
So Super Wi-Fi sounds great as long as you don't live in a city... which more than half the U.S. population does, so there's a limited market for starters. The FCC might not even allow it in the areas that could use it. I can't see this taking off, at least not in the U.S.

Chalk up another one to the "great on paper, bad in reality" ideas.
 
I'm willing to bet that the purpose of using this part of the spectrum isn't to make a single tower that covers an area with a 50+ mile radius. It would essentially be set up the same way it is now for most WISPs, but with a change in equipment to handle the different spectrum. However, this would mean you don't need absolute line of sight between the house and the tower. Being able to better penetrate obstructions like trees means more people could be reached with fewer problems. It also means people farther away could be reached because the signal travels farther, but we're not talking about the massive towers used for over-the-air TV stations.

As an example, let's say the current max distance for a WISP is 1 mile (I know it's farther because I use a WISP and I'm over a mile away) and requires clear line of sight. Using the TV band spectrum this could be increased to maybe 3 miles with less reliance on unobstructed line of sight. This would be a rather large increase in coverage and you would not need massive amounts of power to do it.
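Using the hypothetical 1-mile vs. 3-mile figures above, the coverage gain follows from the area scaling with the square of the radius:

```python
import math

# Hypothetical WISP ranges from the example above (miles)
old_radius_mi, new_radius_mi = 1.0, 3.0

old_area = math.pi * old_radius_mi ** 2
new_area = math.pi * new_radius_mi ** 2
print(f"Coverage area: {old_area:.1f} -> {new_area:.1f} sq mi, "
      f"a {new_area / old_area:.0f}x increase")
```

Tripling the radius reaches nine times the area from the same tower, which is where the economics for a rural WISP would actually improve.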
 
It's probably a bad idea for multiple reasons. For starters, it would be expensive. Specialized radio equipment can get pricey; even in the amateur radio market, transceivers can go for upwards of $2,000.
It wouldn't have the speed of Wi-Fi. It wouldn't even be close.
For long distances, you'll probably have to set up a directional antenna to communicate. Cell phones and Wi-Fi don't need special antennas because of the limited distances involved.
Practically everything is faster than dial-up. That's a really low bar.

I get the whole last-mile cost problem, and how it's prohibitively expensive in rural areas. Honestly, it would probably make more sense to subsidize 4G cell towers along power lines in rural areas.

I guess you don't know much about current WISPs. At least on the consumer end, directional antennas are already used for upstream back to the tower. I also don't see how the costs for equipment would increase massively. We're talking about using a different part of the spectrum, not some new type of spectrum. Keep in mind we're talking about a spectrum which has been in use for decades already. I highly doubt there would be massive cost increases for anything.
 
The big drawback to current radio and microwave internet is that you need line of sight to the tower. Maximum transmit power without a license varies by frequency band, but in most cases is pretty low (1-2 W; some bands are even lower). But current point-to-point radio can travel upwards of 50 miles with line of sight at that power, because you are aimed at a very specific point in a tight beam, not transmitting in a broad pattern or omnidirectionally like a cell phone antenna does.
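That tight-beam advantage can be expressed as EIRP, transmit power plus antenna gain. The gain figures below are typical ballpark assumptions, not specs for any particular product:

```python
import math

def watts_to_dbm(watts: float) -> float:
    """Convert transmit power in watts to dBm."""
    return 10 * math.log10(watts * 1000)

tx_dbm = watts_to_dbm(1.0)   # a 1 W radio is 30 dBm
dish_gain_dbi = 24           # assumed gain of a small point-to-point dish
omni_gain_dbi = 2            # assumed gain of a phone-style omni antenna

print(f"EIRP with dish: {tx_dbm + dish_gain_dbi:.0f} dBm")
print(f"EIRP with omni: {tx_dbm + omni_gain_dbi:.0f} dBm")
```

The 22 dB difference works out to a factor of roughly 150 in effective radiated power toward the target, which is how a 1 W link can stretch tens of miles while an omnidirectional phone antenna can't.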

Having line of sight may not be a big deal in states like Nebraska or Indiana, where it's pretty damned flat all over the place. Get into the hills, though, or stuck behind a treeline, and you're back to being screwed.

The tower will probably have a license and a high power transmitter. Your rooftop ~probably~ will not though.

Cell towers are installed much more densely than TV/radio towers because you need two-way communication, and you can't pack a big, powerful antenna into a cell phone.

I have radio internet right now. It's OK. My work pays for it; it's $200/mo for 3 Mb service. I get that speed in off-hours. During peak times, or if there is high wind, there are times I can't even stream Pandora. Latency isn't bad (around 100 ms), but it varies a lot, and the signal drops in and out a good deal if there is any weather. The only good thing about it is that it isn't metered or capped, and it's symmetrical up/down. In order to get line of sight, I had to put up a 40' tower to clear a hill line. If I buy a new game on the PS4 or Steam, it takes days to download. Large Windows 10 updates pretty much stop everything for a day or two. Netflix is pretty much only an option after 10pm (and nothing in HD). But it's the best option I have available for where I choose to live.

My cellular LTE is a hell of a lot faster, but it's also capped at 4 GB/month (with a $10/GB overage fee). I use that for gaming if my radio is acting up, but if Windows decides it wants to update (and because it connects via USB and not Wi-Fi, I can't mark it as a metered connection), or god forbid I leave it connected overnight and Steam starts running patches, I've pretty much blown my entire cap right there. Honestly, given the state of the current cellular infrastructure, I don't see why they don't expand on this. I was pretty excited when I saw companies bringing back "unlimited" data plans, but it still isn't something you could use for daily household use.

Apart from those two options, the only other options out here are WildBlue/HughesNet or dial-up. I can't even imagine gigabit service; at work we have 25 Mb Comcast, and that seems orders of magnitude better than what I get at home. I take my PS4 in there sometimes to plug in and download games, and what would take me days at home takes a couple of hours there.
 
I'm more interested in how they are going to handle the uplink. You are not going to have a 10-watt or 500 kW setup on your house.

:D You're at [H]. You're challenging several people right now. They are probably already on the roof installing the antenna and wiring up the amplifier. :D
 
:D You're at [H]. You're challenging several people right now. They are probably already on the roof installing the antenna and wiring up the amplifier. :D
10 watts is simple stuff. Amateur radio typically does 50 watts on 144 MHz and 35 watts on 440 MHz. Standard TV (channels 2-13) is 54-216 MHz. The general rule of thumb is the higher the frequency, the harder it is to send a powerful signal (the bigger the amplifier needed).
The 50 watts I quoted on 144 MHz is usually done by mobile radios, the ones installed in cars/trucks.
You are right, though: they will probably run the base station at around 50-100 watts and have the endpoints (the homes) use 2-5 watt transceivers. Again, this all comes down to balancing how far you want to be able to communicate against how many people will potentially be stepping on each other.
 
Satellite "back in the day," just as today, is two-way. Any kind of internet connection is always two-way. Think about it: how would a connection know what content to send you? It doesn't send the entirety of the internet to you at all times.

Satellite back in the day was receive-only, and you transmitted over a dial-up modem. Bi-directional satellite for consumer internet connections didn't start to show up until the late '90s/early 2000s.

https://en.wikipedia.org/wiki/Satellite_Internet_access#One-way_receive.2C_with_terrestrial_transmit
 
Did they study what these signals do to humans when they travel through our bodies?
I invested in a good router and still get dead spots/weak signal in the house. I just wish there were no dead spots and fast signal everywhere in the house.
 
10 watts is simple stuff. Amateur radio typically does 50 watts on 144 MHz and 35 watts on 440 MHz. Standard TV (channels 2-13) is 54-216 MHz. The general rule of thumb is the higher the frequency, the harder it is to send a powerful signal (the bigger the amplifier needed).
The 50 watts I quoted on 144 MHz is usually done by mobile radios, the ones installed in cars/trucks.
You are right, though: they will probably run the base station at around 50-100 watts and have the endpoints (the homes) use 2-5 watt transceivers. Again, this all comes down to balancing how far you want to be able to communicate against how many people will potentially be stepping on each other.

Yeah, I really need to get into the high-frequency stuff. I'm on 144/440 now. I would love to be able to use the low-power, high-frequency stuff later (I need to update my license).
 
Did they study what these signals do to humans when they travel through our bodies?
I invested in a good router and still get dead spots/weak signal in the house. I just wish there were no dead spots and fast signal everywhere in the house.

Water pipes and ductwork act as grounding shields against Wi-Fi/mobile radios. You will never penetrate them; the best thing to do is install extenders. I have found that mounting the router up high gives me a good distribution of signal in the house (ranch-style home).

Also, 5 GHz is going to suck unless your connecting device can physically see the router, as drywall can reflect the signal.
 
Water pipes and ductwork act as grounding shields against Wi-Fi/mobile radios. You will never penetrate them; the best thing to do is install extenders. I have found that mounting the router up high gives me a good distribution of signal in the house (ranch-style home).

Also, 5 GHz is going to suck unless your connecting device can physically see the router, as drywall can reflect the signal.
If you have access to the house/building and you don't rent, it's much easier to just wire up some Cat 6, get a 10-20 port switch, and call it a day.
If you need wireless, once the house is wired, just throw in access points in the rooms that get shitty coverage. As long as you use the same network name, wireless devices should automatically switch from one access point to another without having to reconnect.
 
If you have access to the house/building and you don't rent, it's much easier to just wire up some Cat 6, get a 10-20 port switch, and call it a day.
If you need wireless, once the house is wired, just throw in access points in the rooms that get shitty coverage. As long as you use the same network name, wireless devices should automatically switch from one access point to another without having to reconnect.

Oh, I fully agree. I have everything wired to a switch; the phones and some Amazon Echos are the only things that run on wireless.

Just can't compete with wired performance and reliability.
 