Hello Pix,
We have implemented DL DTX in 2 BSCs in our network, but the overall result was an increase in DL Quality handovers. Some BTSs were heavily impacted, others less so.
The parameters related to DL DTX are at BSC level, not cell level.
What is the cause of that degradation? Is there some tuning to overcome it? Any recommendations?
Please check 3GPP TS 05.08, section 8.3, i.e. Radio Link Measurements and Aspects of DTX. Apparently this increase in DL Quality handovers is due to relatively inaccurate RXQUAL measurements by the Mobile Station, since fewer frames are transmitted in DL because of DTX.
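The effect 2112 describes can be illustrated numerically: in DTX mode the MS measures RXQUAL over the SUB set (roughly 12 frames per SACCH period, SACCH plus SID) instead of the FULL set (around 100 frames), so the BER estimate behind RXQUAL is built on far fewer bits and scatters much more. A minimal Monte Carlo sketch, assuming illustrative frame and bit counts (not taken from this thread):

```python
import random

random.seed(0)

# RXQUAL class upper BER bounds in %, following the 05.08 mapping
RXQUAL_BOUNDS = [0.2, 0.4, 0.8, 1.6, 3.2, 6.4, 12.8]

def ber_to_rxqual(ber_percent):
    for q, bound in enumerate(RXQUAL_BOUNDS):
        if ber_percent < bound:
            return q
    return 7

def measured_rxqual(true_ber, n_frames, bits_per_frame=114):
    # Estimate BER from a finite bit sample, then quantise to RXQUAL
    n_bits = n_frames * bits_per_frame
    errors = sum(1 for _ in range(n_bits) if random.random() < true_ber)
    return ber_to_rxqual(100.0 * errors / n_bits)

def rxqual_variance(true_ber, n_frames, trials=400):
    vals = [measured_rxqual(true_ber, n_frames) for _ in range(trials)]
    mean = sum(vals) / trials
    return sum((v - mean) ** 2 for v in vals) / trials

# FULL set: ~100 frames per SACCH period; SUB set (DTX): ~12 frames
var_full = rxqual_variance(true_ber=0.01, n_frames=100)
var_sub = rxqual_variance(true_ber=0.01, n_frames=12)
print(var_full, var_sub)  # SUB-set RXQUAL scatters far more
```

With a true BER of 1% (solidly RXQUAL 3), the FULL-set estimate almost always lands on class 3, while the SUB-set estimate frequently spills into neighbouring classes, which is exactly what can trigger spurious DL Quality handovers.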
Thanks Pix and 2112 for the clarifications, but I have a question.
If it is a matter of inaccurate measurements reported by the MS, there is a non-DTX parameter (to apply a weight) for Quality HO in non-DTX mode, ranging from 1 to 3.
So if the DL quality HO decision is taken, for example, on 6 consecutive measurements reported to the BTS, and 3 of them are DTX (weight = 1) while the 3 others are real measurements with weight 2 (we have set the value to 2), then the decision is driven mainly by the non-DTX SACCH periods, since their weight is 2. The decision is therefore more logical.
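The weighting scheme described above can be sketched as follows (the function name and parameter names are illustrative, not the vendor's actual parameters):

```python
def weighted_qual_average(samples, w_non_dtx=2, w_dtx=1):
    """samples: list of (rxqual, was_dtx) pairs over the averaging window."""
    weights = [w_dtx if dtx else w_non_dtx for _, dtx in samples]
    total = sum(w * q for w, (q, _) in zip(weights, samples))
    return total / sum(weights)

# 6 consecutive SACCH reports: 3 DTX periods (weight 1) reporting a
# pessimistic RXQUAL of 5, and 3 full-set periods (weight 2) reporting 2
window = [(5, True), (5, True), (5, True), (2, False), (2, False), (2, False)]
print(weighted_qual_average(window))  # 3.0 -- pulled toward the non-DTX value
```

An unweighted average of this window would be 3.5; the weighting pulls it to 3.0, i.e. toward the more reliable non-DTX reports, which is the intent of the parameter.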
But even with this parameter, there is no decrease in DL quality HOs.
Any reply??