Signal Degrade? Why did the switchover not occur? - ITU-T G.806 (reference)
In this article, let us understand the criteria for service switchover in an OTN network. There are two ways of characterizing signal quality in terms of signal degradation:
- Poisson distribution
- Burst distribution
The Poisson distribution is a discrete probability distribution used to model the occurrence of rare, independent events in a fixed interval of time. The burst distribution, on the other hand, is not a formal probability distribution but rather a concept describing events that cluster closely together in time, creating bursts of activity.
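To make the contrast concrete, here is a minimal Python sketch that generates one monitoring interval's worth of blocks under each error model. The function names (`poisson_errors`, `burst_errors`) and their parameters are illustrative assumptions, not taken from any standard.

```python
import random

def poisson_errors(n_blocks, p):
    """Poisson-like model: each block errs independently with probability p."""
    return [random.random() < p for _ in range(n_blocks)]

def burst_errors(n_blocks, p_start, burst_len):
    """Burst model: one error event corrupts burst_len consecutive blocks."""
    errs = [False] * n_blocks
    i = 0
    while i < n_blocks:
        if random.random() < p_start:
            # a single event wipes out a whole run of blocks
            for j in range(i, min(i + burst_len, n_blocks)):
                errs[j] = True
            i += burst_len
        else:
            i += 1
    return errs
```

Both models can produce the same total count of errored blocks in an interval, but the burst model concentrates them in runs, which is exactly the pattern block-based monitoring needs to catch.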
In the context of the ITU-T standards, either of these can be used; however, I would prefer the burst distribution, because it lets me identify burst errors if and when they occur.
The degraded signal defect (dDEG) shall be declared if DEGM consecutive bad intervals are detected, where an interval is the 1-second period used for performance monitoring. An interval is declared bad if the percentage of detected errored blocks in that interval, or the number of errored blocks in that interval, is >= the Degraded Threshold (DEGTHR). DEGM is provisionable in the range 2-10; when DEGTHR is based on a number of errored blocks, it shall be in the range 0 < DEGTHR <= number of blocks in the interval.
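The declaration logic above is easy to express as a small state machine. Below is a minimal Python sketch; the class name `DegradeMonitor` and its interface are my own invention, as G.806 only specifies the behaviour. DEGTHR is modelled here as an absolute errored-block count, and dDEG is cleared again after DEGM consecutive good intervals, mirroring the clearing condition in G.806.

```python
class DegradeMonitor:
    """dDEG detection following the G.806-style rule described above.

    Declare dDEG after DEGM consecutive bad 1-second intervals;
    clear it after DEGM consecutive good intervals."""

    def __init__(self, degm, degthr):
        assert 2 <= degm <= 10, "DEGM is provisionable in the range 2..10"
        self.degm = degm        # consecutive-interval count
        self.degthr = degthr    # errored-block threshold per interval
        self.bad_run = 0
        self.good_run = 0
        self.ddeg = False

    def on_interval(self, errored_blocks):
        """Feed the errored-block count of one 1-second interval."""
        if errored_blocks >= self.degthr:
            self.bad_run += 1
            self.good_run = 0
        else:
            self.good_run += 1
            self.bad_run = 0
        if not self.ddeg and self.bad_run >= self.degm:
            self.ddeg = True    # DEGM consecutive bad intervals: declare
        elif self.ddeg and self.good_run >= self.degm:
            self.ddeg = False   # DEGM consecutive good intervals: clear
        return self.ddeg
```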
Now let us understand this via an example. Suppose we have an ODU0, whose 1-second monitoring interval contains roughly 10,000 blocks. If the degraded threshold is defined as 15% of the blocks in the interval, an interval is declared bad once 1,500 or more of its blocks are errored. With DEGM set to 2, two consecutive bad intervals will cause dDEG to be declared and protection switching to be triggered.
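Running the hypothetical `DegradeMonitor` sketch above with these ODU0 numbers shows the declaration point:

```python
blocks_per_interval = 10_000                # ~10 000 blocks per 1-second interval
degthr = int(0.15 * blocks_per_interval)    # 15% threshold -> 1500 blocks
mon = DegradeMonitor(degm=2, degthr=degthr)

print(mon.on_interval(1600))   # 1st bad interval -> False (run of 1)
print(mon.on_interval(1520))   # 2nd bad interval -> True, dDEG declared
```

Only at the second consecutive bad interval does dDEG come up; a single noisy second, or bad intervals separated by good ones, never triggers switching. That is often the answer to "why did the switchover not occur": the errors were present, just not in DEGM consecutive bad intervals.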