I hope this doesn't fall under the "Support" category. I'm not asking for help; I just want some feedback on the topic of router/modem connections. The guy who installed my HughesNet service told me that if I connected the modem to the TV using the Ethernet cable it would drain my data, so he set up the HughesNet app on my laptop and kept everything wireless. The Ethernet cable was cast aside.

Since then I've done some research online and found articles that say direct Ethernet connections are a little faster and more reliable, and use only a tiny bit more data than wireless connections, if any. I've also come across articles advising disconnecting the cable between the TV and modem in order to conserve data. I'd like to know anyone's opinion and/or personal experience regarding this matter. I'm afraid to run an experiment myself in case the Ethernet cable does eat data and throws me the middle finger as it watches my plan data go to 0 GB.
The TV will only use data on its own if it's set to update or refresh its apps and such automatically, so you should check to ensure that whatever apps you use on the TV aren't set to use data automatically. It's not dependent on Ethernet vs. Wi-Fi. Also, the one that might actually use more data in practice is the Wi-Fi connection, but pretty much only if the Wi-Fi connection is weak, as it may need to resend data packets to get all of the data through. If the Wi-Fi connection is strong, you'll use the same amount of data either way.
An Ethernet cable can transmit data faster than most Wi-Fi, and therefore more data during a given time, but that's only its capability, not necessarily what it would actually do in practice.
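To picture the weak-Wi-Fi point, here's a minimal sketch with made-up numbers (the 10% resend figure is purely hypothetical): a shaky wireless link has to resend some frames to deliver the same payload, while a strong Wi-Fi or Ethernet connection delivers it with essentially no extra traffic.

```python
# Sketch with hypothetical numbers: a weak Wi-Fi link resends some frames,
# so slightly more traffic crosses the air to deliver the same payload.

def traffic_to_deliver(payload_gb: float, resend_fraction: float) -> float:
    """Total traffic needed to deliver payload_gb if resend_fraction of it is resent."""
    return payload_gb * (1 + resend_fraction)

payload = 1.0  # hypothetical 1 GB of streaming/app-update data
print(f"Ethernet / strong Wi-Fi : {traffic_to_deliver(payload, 0.00):.2f} GB sent")
print(f"Weak Wi-Fi (10% resends): {traffic_to_deliver(payload, 0.10):.2f} GB sent")
```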
Technically, an Ethernet cable is always going to provide the widest bandwidth, and therefore the greatest throughput, which will always seem faster. Wi-Fi depends on the protocol used: 802.11ac using multiple 20 MHz channels (up to 80 MHz wide) will approximate Ethernet throughput, whereas 802.11a, b, or g using a single channel will not.
Data is data and neither uses more than the other; however, a wider/faster path will always use data *faster*, so people tend to think it's using more data.
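To make the "data is data" point concrete, here's a minimal sketch with hypothetical link speeds and a hypothetical 2 GB download: the plan is debited the same 2 GB over either connection; only the transfer time changes with the link speed.

```python
# Sketch with hypothetical numbers: the same download moves the same number
# of bytes over Ethernet or Wi-Fi; only the transfer time differs.

def transfer_time_seconds(size_gb: float, link_mbps: float) -> float:
    """Time to move size_gb gigabytes over a link running at link_mbps megabits/s."""
    size_megabits = size_gb * 8 * 1000  # 1 GB ~= 8,000 megabits (decimal units)
    return size_megabits / link_mbps

MOVIE_GB = 2.0  # hypothetical 2 GB download
for name, mbps in [("Ethernet (100 Mbps)", 100), ("Wi-Fi (25 Mbps)", 25)]:
    minutes = transfer_time_seconds(MOVIE_GB, mbps) / 60
    # Data counted against the plan is identical; only the duration differs.
    print(f"{name}: {MOVIE_GB} GB used, ~{minutes:.1f} minutes to transfer")
```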
Oh...That makes sense. I appreciate your response. I'm trying to get more educated on all this stuff so I don't make dumb decisions.
GabeU,
You misunderstood me! I was not being sarcastic or mocking you and maratsade. It honestly tickled me to read y'all's different methods of responding to certain threads. I agree with your approach that the customer isn't ALWAYS right. I'm on your side so take it easy, Killer. Thank you for helping me understand more about my question.
We love you too, Lacy dear. Good luck to ya.
Thank you!