The signal strength reported by the HT2000W is typically 95 to 100. Of course, during bad weather it drops.
If the signal strength is above 60, the system works. When it drops below 60 (say, to 50), I lose internet connectivity even though the System Status page still indicates it is working (it is not!)
Is it normal to lose the internet when the signal drops below 60 even though the modem says all is OK?
Woody - KZ4AK
The SQF (Signal Quality Factor) number does not seem to impact everyone in the same way, as it depends on a number of things. I read someplace that if your SQF is poor relative to others on your beam, then your encoding changes to add more redundancy so your link stays robust relative to theirs. A typical value for SQF varies widely depending on location: my highest SQF in the last 24 hours was only 91, and 64 at the lowest. During that time everything seemed normal. Often my SQF has to drop to the mid-60s before the encoding seems to step down to a lower gear. Usually I can still do HTTP/web traffic down to about 31, and basic data is exchanged with the ground station down to about 15.
Maybe other folks that share your beam can give their experience with SQF to compare. Also, perhaps someone in the know can say whether SQF can be high yet still involve a lot of resend traffic, due to signals reflecting back to the dish from nearby objects/trees, so that a higher SQF is required to get traffic back and forth?
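As a rough illustration of the "step down a gear" idea above: as SQF falls, the modem presumably switches to more redundant (slower but more robust) coding. This is purely a sketch; the thresholds and mode names below are guesses pulled from the numbers mentioned in this thread, NOT HughesNet's actual values.

```python
# Sketch of adaptive coding step-down as SQF falls. All thresholds and
# mode names are illustrative guesses based on this thread, not
# HughesNet's real values.

def coding_mode(sqf: int) -> str:
    """Map an SQF reading to a notional coding mode."""
    if sqf >= 90:
        return "normal"        # full speed, plenty of margin
    if sqf >= 65:
        return "stepped-down"  # extra redundancy, reduced throughput
    if sqf >= 31:
        return "minimal"       # basic HTTP/web traffic still works
    if sqf >= 15:
        return "control-only"  # only basic data to/from the ground station
    return "lost"              # no usable link

for reading in (95, 70, 40, 20, 10):
    print(reading, "->", coding_mode(reading))
```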
Mine's about the same as @MrBuster's: typically mid-90s, but often upwards of ~110. When storms hit it starts getting wobbly once it drops to between 60 and 70, and finally quits when it gets down to 31 or below. The upper number may partially have to do with your location within the spot beam's reception area (dead center being best), as well as the takeoff angle and distance to the satellite on the equator. The theory being that the greater the distance and the lower the takeoff angle, the more potential there is for attenuating moisture and other particles in the atmosphere to disrupt the signal.
My understanding is that SQF is actually a linear representation of the inverse of a logarithmic bit error rate. That may partially explain the wide variation in upper values. So for all intents and purposes, anything above 90(ish) should be fine given redundant forward error correction techniques. Similarly, the lower cutoff of 31 is likely the maximum BER threshold the system can tolerate.
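If SQF really is a linear re-scaling of a logarithmic bit error rate, a toy version of that mapping would look something like this. The constants are invented purely for illustration; the actual formula is not public.

```python
import math

# Toy mapping from bit error rate (BER) to an SQF-like score, assuming
# SQF is linear in -log10(BER). The scale factor 13.3 is made up so
# that BER = 1e-9 lands near SQF 120 and the ~31 cutoff corresponds to
# a BER around 5e-3. The real constants are not public.

def sqf_from_ber(ber: float, scale: float = 13.3) -> float:
    return scale * -math.log10(ber)

# Under these made-up constants, each factor-of-10 improvement in BER
# adds a fixed ~13.3 points of SQF, which would explain why the upper
# values swing widely while anything above ~90 is equally "fine" once
# forward error correction mops up the rest.
```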
These are really good answers. The short version is that as long as you are over 31 SQF, you're technically "operational". The number depends on a variety of factors; some examples:
As MrBuster and MarkJFine mentioned, redundancy is built in for when there is a lot of 'noise' over the air for whatever reason. We attach backup data just in case something isn't delivered the first time. Like a delivery person carrying an extra slice of pizza in case he gets hungry on the way... 🙂
We also factor that into overhead, so if your system has to add in that 'make up' data, we do not count it as additional data usage.
The perception of losing your internet connection when your SQF dips is likely subjective (if your devices say you're connected to Wi-Fi but there is no internet connection, that is a different story). You may time out trying to reach somewhere data-intensive, or struggle to reach a site if you have many devices online (not necessarily browsing) at the same time. Consider that when the SQF drops, your terminal and radio have to do some extra lifting: roughly the difference between how fast you can carry a 7 oz box of tissues vs. a 10 lb ream of paper from one place to another. Spread that over all the devices in the house, during peak hours, to a popular website, and you'll probably feel like you just can't get there at all.
Hope I was able to add to this great discussion!