Some years ago, an EMT vehicle backed over and destroyed my Hughes dish. The next morning I called my trusty tech company, and the first thing he asked me was whether the coax was damaged, and next whether I had cut the power to the modem.
I didn't discover that the coax was broken until the next morning, and I hadn't powered down the modem in all that time. He said that with all the rain we'd had that night, the modem was probably fried, so he would make sure the tech he sent brought a new modem with him, and he told me to unplug the modem at once, even though it was probably too late by then.
He was right, the modem was fried.
I knew that coax was designed to carry extremely weak signals, and I didn't think there was enough power available to cause any damage. Looking back now, I think the transmit signal to the dish probably had enough power to do the damage, and the water on the broken end of the coax shorted the transmit circuit of the modem. At least, that's my theory.
I suspect that this may be true for the newer generation modems also. I'm going to play it safe from now on, and if anything ever happens to my dish again, I'm going to immediately cut the power to the modem.
PREJUDICE, n. A vagrant opinion without visible means of support.
The Devil's Dictionary
Yes, there is DC power supplied by the modem going through the coax to power the radio on the dish. In certain situations the modem can be damaged if that power is shorted out. There are protections built in, but they don't always work.
That's also one reason we say to power down the modem at the wall receptacle, not at the plug on the back of the modem.
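To put some rough numbers on why a shorted coax run can stress the modem, here's a quick sketch. The supply voltage, load current, and cable loop resistance below are illustrative assumptions, not Hughes specifications:

```python
# Rough sketch of the DC power-over-coax situation described above.
# All three numbers are ASSUMED illustrative values, not Hughes specs.
supply_v = 29.0          # assumed modem-to-radio supply voltage
run_ft = 100             # assumed cable run length
loop_r_per_ft = 0.012    # assumed loop resistance (center + shield), ohms/ft

loop_r = run_ft * loop_r_per_ft   # resistance the modem sees on a dead short

# Normal operation, radio drawing an assumed 0.5 A:
normal_i = 0.5
print(f"normal operation: {normal_i * loop_r:.2f} V lost in the cable")

# Far end shorted (cut coax plus rain water): the only thing limiting
# current is the cable resistance itself, which is why the modem's
# built-in protection has to act -- and why it sometimes can't.
short_i = supply_v / loop_r
print(f"dead short: {short_i:.1f} A, {supply_v * short_i:.0f} W demanded from the modem")
```

Even with generous assumptions, a dead short asks the modem's power feed for orders of magnitude more current than the radio ever draws, which fits the "fried modem" outcome described above.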
Companies like HughesNet, Viasat/Exede, etc. should use LMR400 as a baseline, with N-type connectors, and certainly no dielectric grease. Techs often overlook (or don't understand) how dielectric grease will impact impedance in the connector. We rely on the internal dielectrics in the cables to maintain a matched impedance and minimize signal loss and transmit power gain... yet we make it commonplace to disturb impedances and increase the loss/gain while using a (sure, it's solid copper core) cable that has its own inherent signal loss issues regardless of impedance matching.
It is truly a shameful practice in the RF world. If HN can't pinpoint the problems with my system, I will be building an LMR400 cable from the dish to the modem, removing dielectric grease, and changing all connectors to N-type to see what happens. I guarantee my speed will increase and my transmit power will decrease. There are a lot of factors at play that should be customizable in these systems, but alas, it is a cookie-cutter product that is meant to work for Joe in the inner city with a 20 ft cable run and Susie in the farmland with a 100 ft run.
You are going to have a much bigger problem trying to use 50 ohm LMR400 in your 75 ohm HN system than you will with some dielectric grease in your F connectors. Also keep in mind that the cabling between the modem and the dish is not carrying RF. The radio in the system is located at the antenna feedhorn and is integrated with the LNB. Signal quality as measured by the HT2000W is determined at the dish itself.
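For anyone curious how big the 50-ohm-cable-in-a-75-ohm-system mismatch actually is, the standard transmission-line formulas give a quick answer. The impedances are the ones discussed above; the formulas are textbook results, not anything Hughes-specific:

```python
# Sketch: the mismatch created at each junction when 50-ohm LMR400
# is inserted into a 75-ohm system. Standard transmission-line math.
import math

z_system = 75.0   # 75-ohm F-connector world the HN hardware expects
z_cable = 50.0    # LMR400 characteristic impedance

# Reflection coefficient at each 50/75-ohm transition
gamma = (z_cable - z_system) / (z_cable + z_system)

vswr = (1 + abs(gamma)) / (1 - abs(gamma))
return_loss_db = -20 * math.log10(abs(gamma))
mismatch_loss_db = -10 * math.log10(1 - gamma**2)

print(f"|gamma| = {abs(gamma):.2f}, VSWR = {vswr:.2f}:1")
print(f"return loss = {return_loss_db:.1f} dB")
print(f"mismatch loss = {mismatch_loss_db:.2f} dB per transition")
```

The raw mismatch loss per transition is small (a fraction of a dB), but a 1.5:1 VSWR means real reflected energy bouncing between the two mismatched junctions, which is the kind of problem the post above is warning about.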
"Also keep in mind that the cabling between the modem and the dish is not carrying RF. The radio in the system is located at the antenna feedhorn and is integrated with the LNB. Signal quality as measured by the HT2000W is determined at the dish itself."
Very interesting, but I'm somewhat confused (as I often am). Does this mean that the coax is carrying digital data as well as the DC power for the radio, and the radio then converts it to RF? If so, wouldn't a piece of Cat 5 cable work just as well?
I'm asking this because some years ago I had Mainstreet broadband, which used a small square antenna hooked to a modem with Cat 5. I almost cried when they went out of business: $28 a month with no added taxes or fees, blazing fast, and with unlimited data.
So, I've done some more research on the HT2000W and its corresponding ODUs and found the following:
-I assumed, since HN reps and installation documents routinely call the ODU a "radio" that all of the system modulation and demodulation was being performed outdoors. Apparently that's not the case, and HN is using incorrect terminology.
-It appears that the system modulation is performed in the IDU according to its datasheet: https://www.hughes.com/sites/hughes.com/files/2017-04/HT2000_H56163_HR.pdf
-Since the cable between the IDU and ODU is RG6, obviously Ka band RF (25+ GHz) is not travelling across that cable. The ODU is downconverting to some IF. Hughes won't tell us what that IF is. https://community.hughesnet.com/t5/Tech-Support/Need-to-know-IF-frequency-used-between-antenna-and-m...
-Modern microwave systems often do have all of the RF components located in the ODU, yet still use coax to connect to the IDU, even though it's just digital data flowing between the two. Especially in residential satellite, installers are used to working with coax, and there would be additional costs and time associated with training them to work with outdoor twisted-pair cabling. Also, coax connector ends are (in general) easier to waterproof than RJ-series connectors are.
So, bottom line, yes losses in the IF cable matter, but a slight impedance mismatch due to dielectric grease in a connector is not going to ruin the system margin. Here's the official doc from HN on IF cable lengths, which is telling: http://www.satelliteguys.us/xen/attachments/fsb_080202_01a_spaceway_cables-pdf.36329/
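To give a sense of scale for the IF cable losses, here's a back-of-the-envelope comparison for the two run lengths mentioned earlier in the thread. Since Hughes doesn't publish the actual IF, the 1 GHz figure and the per-100-ft attenuation numbers below are assumptions (ballpark published values for each cable type), not measured or official data:

```python
# Back-of-the-envelope IF cable loss comparison.
# ASSUMPTIONS: ~1 GHz IF, and approximate published attenuation
# figures in dB per 100 ft at that frequency.
atten_db_per_100ft = {
    "RG6": 6.1,      # typical solid-copper RG6 near 1 GHz (assumed)
    "LMR400": 4.1,   # LMR400 datasheet ballpark near 1 GHz (assumed)
}

# Short urban run vs long rural run, per the earlier post
for run_ft in (20, 100):
    for cable, a in atten_db_per_100ft.items():
        loss_db = a * run_ft / 100
        print(f"{cable:7s} {run_ft:3d} ft: {loss_db:.1f} dB")
```

Under these assumptions, swapping cable types on a long run saves a couple of dB, while a greased connector costs a small fraction of a dB, which is the point being made above about what does and doesn't eat into system margin.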