Streaming quality questions

duh FooL

Not sure if this would be the right place, but figured it's one of the closest.

I'm trying to track down some high bandwidth usage for a family.
They received a notice from the cable company about coming close to their 1TB cap, along with an upsell to buy more bandwidth.

Cox shows 50% being streaming, which I'm trying to focus on, since it's about 8-10 hours of watching.
(Yep, there's a couch potato living in the household).
I'm just helping out, so going by what's reported to me.

I know I can set the streaming quality for Netflix, and I'm assuming other services allow the same thing.

My question is:
Do the streaming players detect the display's capability and request the lower of the display resolution and the quality setting, or will they just stream at whatever quality is in the configuration?

e.g.
Netflix parameter is set to stream UHD (4K).
Player on media stick is plugged into a 1080p TV.
Will the bandwidth used be:
UHD (4K) bandwidth and then downscaled to 1080p by the player/TV?
or
FHD (1080p) bandwidth (the Netflix servers send at most a 1080p picture)?

I believe streaming services might downgrade due to connection quality, so let's assume there's no connection quality issue (since that would reduce bandwidth).
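
To make the difference concrete, here's a quick back-of-the-envelope sketch of the data used per hour under each behavior (the bitrates are just my assumptions for illustration, not Netflix's published numbers):

```python
# Rough per-hour data usage under each behavior; bitrates are assumptions.
UHD_MBPS = 15.0   # assumed 4K stream bitrate
FHD_MBPS = 5.0    # assumed 1080p stream bitrate

def gb_per_hour(mbps):
    # megabits/sec * 3600 sec/hr / 8 bits per byte / 1000 MB per GB
    return mbps * 3600 / 8 / 1000

print(f"4K stream sent, downscaled by the player: {gb_per_hour(UHD_MBPS):.2f} GB/hr")
print(f"Server caps the stream at 1080p:          {gb_per_hour(FHD_MBPS):.2f} GB/hr")
# ~6.75 GB/hr vs ~2.25 GB/hr -- a big difference over 8-10 hours a day
```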
 
Usually the software will pick the quality that matches the screen, so on a 1080p screen it streams at 1080p. It typically does this by default, and you can override those settings.
Without knowing the exact hardware you can't know for sure, and you also don't know whether they've changed the settings.
 
The hardware includes:

Roku box
Roku stick
I believe there's also a built-in Netflix app on a smart TV.

On Roku, they run Peacock, Disney+, Paramount+, Hulu, and Netflix apps.

I'm fairly sure I understand how bandwidth is managed on a computer or tablet, where, as you said, the software can control it.
It's the dedicated streaming boxes that I'm not sure about.
On the Netflix web page I can set the quality per profile, but I don't remember seeing anything to manage it on a per-device basis.

So far, in about 11 days, the usage report shows 450GB used, with about 230GB of it attributed to streaming.
(I have to track down the other usage at some point too.)
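
As a rough sanity check on that number (assuming it's a single stream running roughly 8-10 hours a day):

```python
# Back-of-the-envelope check on the Cox-reported streaming usage.
streaming_gb = 230       # reported streaming usage so far
days = 11
hours_per_day = 9        # middle of the reported 8-10 hour range (assumption)

gb_per_day = streaming_gb / days              # ~20.9 GB/day
gb_per_hour = gb_per_day / hours_per_day      # ~2.3 GB/hr
avg_mbps = gb_per_hour * 1000 * 8 / 3600      # ~5.2 Mbps average bitrate

print(f"{gb_per_day:.1f} GB/day, {gb_per_hour:.1f} GB/hr, ~{avg_mbps:.1f} Mbps")
```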
 
8-10 hours isn't that much time. So they must be pulling down really high-bitrate streams.

It will be tougher to manage with that many different pieces of streaming hardware, but each device should have its own local settings, possibly per profile.
It may be easiest to buy Roku sticks that aren't 4K capable. I did that for my Mom, who streams a certain show all day, every day. When they want 4K for something specific, they use the TV's built-in app.
 

What do you mean that 8-10 hours isn't much time?
(side note: Growing up, my parents thought 2 hours of TV was too much)

Using Netflix's bandwidth number of 1 Mbps for SD, I get:
1 Mbps * 3600 sec * 8 hrs = 28,800 Mb = 28.8 Gb.
Seems close to what I'm seeing.

Cox also has a bandwidth calculator.
It doesn't really define what it considers HD video, but plugging in 8 hours per day shows about 24GB of usage per day.

I'm really surprised that SD streaming would already take up that much data.

However, AT&T's articles seem to go against what I've calculated (and what Cox shows), so I may be missing something in my math.
https://www.att.com/support/pages/data-calculator/
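
For reference, here's the same arithmetic spelled out for a few quality levels, keeping bits and bytes separate (the bitrates are rough assumptions, not exact figures):

```python
# Bitrate (megabits/sec) -> data used, remembering that 8 bits = 1 byte.
ASSUMED_MBPS = {"SD": 1, "HD (1080p)": 5, "UHD (4K)": 15}   # assumed bitrates
HOURS_PER_DAY = 8

for label, mbps in ASSUMED_MBPS.items():
    megabits = mbps * 3600 * HOURS_PER_DAY        # total megabits per day
    gigabytes = megabits / 8 / 1000               # convert to gigabytes
    print(f"{label}: {megabits:,.0f} Mb = {gigabytes:.1f} GB over {HOURS_PER_DAY} hours")

# SD: 28,800 Mb = 3.6 GB over 8 hours
# HD (1080p): 144,000 Mb = 18.0 GB over 8 hours
# UHD (4K): 432,000 Mb = 54.0 GB over 8 hours
```

Going by those numbers, the roughly 21GB/day being reported looks more like an HD-bitrate stream than SD.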
 
The mention of a couch potato should be a clue that it's more than 10 hours a month for the whole household.

If not that, the parents thinking 2 hours of TV was too much should be a big enough clue that it's daily usage.
 
Nothing about the original post suggests to me that we are trying to break down per-day usage.

The newest post does directly say that, yes.
 