When you talk about FR and HR, is this related only to AMR-HR and AMR-FR?
Did you notice any change in the utilisation of AMR vs. non-AMR codecs?
RXQUAL does not represent voice quality, and it is not related to the codec in use. You can change the codec, but that won't impact the BER; RXQUAL only represents the BER.
In other words, changing the codec does not modify the measured RxQual.
(but it does modify the FER!)
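The RXQUAL scale is just a quantization of the BER (the band edges come from 3GPP TS 45.008). A minimal sketch of the mapping in Python (the function name is mine):

```python
# RXQUAL band upper edges from 3GPP TS 45.008, BER as a fraction.
# Illustrative sketch, not an operator tool.
RXQUAL_BER_BANDS = [
    (0, 0.002),   # RXQUAL 0: BER < 0.2 %
    (1, 0.004),   # RXQUAL 1: 0.2 % .. 0.4 %
    (2, 0.008),
    (3, 0.016),
    (4, 0.032),
    (5, 0.064),
    (6, 0.128),
]

def ber_to_rxqual(ber: float) -> int:
    """Map a raw BER to the RXQUAL 0..7 scale (0 = best)."""
    for rxqual, upper in RXQUAL_BER_BANDS:
        if ber < upper:
            return rxqual
    return 7  # BER >= 12.8 %
```

Note that nothing codec-related appears anywhere in this mapping, which is the whole point: FER depends on the channel coding of the codec, RXQUAL does not.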
Regarding your problem, I suspect:
1/ higher timeslot usage -> higher frequency usage -> higher interference (indeed, before, HR was used more, so fewer timeslots were busy overall)
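To put rough numbers on point 1: two HR calls share one timeslot, while an FR call occupies a full timeslot. A hypothetical sketch (the helper and the traffic figures are illustrative, not taken from your network):

```python
import math

def busy_timeslots(calls: int, half_rate_share: float) -> int:
    """Timeslots needed to carry `calls`: an FR call occupies a full TS,
    two HR calls share one TS. Hypothetical helper for illustration."""
    hr_calls = round(calls * half_rate_share)
    fr_calls = calls - hr_calls
    return fr_calls + math.ceil(hr_calls / 2)

# With 40 calls: 80 % HR -> 8 FR TS + 16 TS for 32 HR calls = 24 busy TS;
# 20 % HR -> 32 FR TS + 4 TS for 8 HR calls = 36 busy TS.
```

More busy timeslots means more frequencies transmitting at any instant, hence more co-channel interference for everyone in the reuse pattern.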
2/ depending on your system capabilities: the HO Quality and HO Interference handovers are no longer triggered at the same RXQUAL threshold as before (in some systems, it is possible to tune different thresholds for AMR FR, AMR HR and non-AMR calls). By increasing the HO Quality threshold, you actually force users to stay in the "bad quality" cell, so the average RXQUAL in your network increases.
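The per-codec threshold effect in point 2 can be sketched as follows; the parameter names and values here are invented for illustration, the real ones are vendor-specific:

```python
# Hypothetical per-codec-type HO Quality thresholds on the RXQUAL scale
# (0 = best, 7 = worst). AMR-FR is robust, so its threshold is often set
# higher (more permissive) than for non-AMR calls.
HO_QUAL_THRESHOLD = {"AMR_FR": 5, "AMR_HR": 4, "NON_AMR": 4}

def triggers_quality_ho(codec_type: str, rxqual: int) -> bool:
    """A quality handover is triggered once RXQUAL reaches the threshold."""
    return rxqual >= HO_QUAL_THRESHOLD[codec_type]

# At RXQUAL 4, a non-AMR call hands over, but an AMR-FR call stays in the
# cell and keeps reporting bad RXQUAL samples into the network average.
```

So the network-wide RXQUAL statistic degrades even though each AMR-FR user may still perceive acceptable voice quality.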
3/ last possibility: DTX and power control (PC) are not activated for AMR-FR or non-AMR FR, but they are activated for HR.
I can’t think of anything else… After that, you might have to suspect the AMR implementation in the system itself, or in the MS.