
AMR speech codec

Viewing 9 posts - 1 through 9 (of 9 total)
  • Author
    Posts
  • #66856
    Smart
    Guest

    Hi,

    How can I perform AMR codec measurement and optimization?
    Which AMR speech codec modes are you using for AMR-FR and AMR-HR?

    Regards

    #66857
    Hauiu
    Guest

    We use 12.2, 10.2, 7.95 and 5.90 kbit/s for both AMR-FR and AMR-HR. AMR selects the appropriate bit rate according to the radio environment (interference); the FR/HR choice is related to resources at the BTS.

    #66858
    Hauiu
    Guest

    Codec mode   Bit rate (kbit/s)   Channel
    AMR_12.20    12.20               FR
    AMR_10.20    10.20               FR
    AMR_7.95      7.95               FR/HR
    AMR_7.40      7.40               FR/HR
    AMR_6.70      6.70               FR/HR
    AMR_5.90      5.90               FR/HR
    AMR_5.15      5.15               FR/HR
    AMR_4.75      4.75               FR/HR
    AMR_SID       1.80               FR/HR

    I was wrong in my first post: the 12.20 and 10.20 bit rates are used in FR only.
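The mode table above can be turned into a small link-adaptation sketch: the network picks the highest-rate mode the radio conditions allow. The C/I thresholds below are purely illustrative assumptions; real values are vendor- and network-specific.

```python
# Sketch of AMR link adaptation: pick the highest-rate codec mode whose
# C/I threshold is met. Thresholds are HYPOTHETICAL, for illustration only.

# (mode name, bit rate in kbit/s, allowed channels, minimum C/I in dB)
AMR_MODES = [
    ("AMR_12.20", 12.20, {"FR"},       16.0),
    ("AMR_10.20", 10.20, {"FR"},       14.0),
    ("AMR_7.95",   7.95, {"FR", "HR"}, 12.0),
    ("AMR_7.40",   7.40, {"FR", "HR"}, 10.0),
    ("AMR_6.70",   6.70, {"FR", "HR"},  8.0),
    ("AMR_5.90",   5.90, {"FR", "HR"},  6.0),
    ("AMR_5.15",   5.15, {"FR", "HR"},  4.0),
    ("AMR_4.75",   4.75, {"FR", "HR"},  0.0),
]

def select_mode(c_over_i_db, channel="FR"):
    """Return the highest-rate mode allowed on this channel whose
    (hypothetical) C/I threshold is satisfied."""
    for name, rate, channels, threshold in AMR_MODES:
        if channel in channels and c_over_i_db >= threshold:
            return name
    return "AMR_4.75"  # fall back to the most robust mode

print(select_mode(15.0, "FR"))  # good C/I on a full-rate channel
print(select_mode(5.0, "HR"))   # poor C/I on a half-rate channel
```

Note how the 12.20 and 10.20 modes are restricted to FR, matching the correction above.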

    #66859
    Smart
    Guest

    How did you decide what to use? Or are you using default parameters?

    #66860
    Hauiu
    Guest

    If your vendor's equipment supports all of them, you can use them all. If not, you will have to choose according to your radio environment: higher interference, lower bit rate.

    #66861
    Dell
    Guest

    Hi,
    when we use the 4.75 codec for both FR and HR, there is a higher percentage of bad frames than with 5.90. Does that mean that 5.90 is better? As far as I know, 4.75 is the more robust codec. And why such a high number of bad frames with 4.75? Are the frames bad because of 4.75, or is 4.75 used because of low C/I and many bad frames?
    Anyone explain please!
    Regards,
    Dell

    #66862
    Hauiu
    Guest

    Dell,
    If there are errors in a frame, they are fixed using the error correction bits. If the decoder fails to correct these errors, you have a bad frame. To lower the number of bad frames (i.e. lower the Frame Error Ratio), we increase the error correction bits, which lowers the bit rate.

    So at the 4.75 bit rate you should have fewer bad frames than at the 5.90 bit rate.
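Hauiu's metric can be made concrete. Frame Error Ratio is just bad frames over total frames; the counter values below are made up for the example.

```python
# Frame Error Ratio (FER) from frame counters - the metric discussed above.
# Counter values are invented for illustration.

def frame_error_ratio(bad_frames, total_frames):
    """FER = bad frames / total frames received."""
    if total_frames == 0:
        return 0.0
    return bad_frames / total_frames

# With more error-correction bits (a lower codec rate), the decoder fixes
# more errors, so bad_frames - and therefore FER - should drop, all else equal.
fer_5_90 = frame_error_ratio(bad_frames=120, total_frames=1000)  # 12%
fer_4_75 = frame_error_ratio(bad_frames=60, total_frames=1000)   # 6%
assert fer_4_75 < fer_5_90
```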

    #66863
    Dell
    Guest

    Hi Hauiu,
    thanks for the answer.
    In our network, in most cells 4.75 has almost 50-50 bad/good frames. 5.90 and the higher codecs have few bad frames. Does that mean something is wrong with the system? We are using ALU B10. I know the solution is to disable 4.75, but I wanted to know what can cause this.
    Thanks again and best regards,
    Dell

    #66864
    pix
    Guest

    dell,

    you should try to disable 4.75.

    then you might see that it is now the next codec (5.90) which has lots of bad frames.

    if that's the case, it means either that your C/I thresholds are wrong, or that there are some severely interfered zones where even the most robust codec can't cope.

    keep in mind that the most robust codec is the one used in the WORST areas of your network, so you would expect it to show the worst radio quality. Therefore, conduct a drive test in the worst cells, find out where the worst C/I values are localised, and do a voice quality assessment (in UL and in DL) to check whether the codec is unable to keep up with the interference. Try modifying the codecs to see the impact, etc.

    If the experiment shows that 5.90 has fewer bad frames than 4.75, then it might mean there is an issue with 4.75. But it can also mean something else 🙂
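The per-codec comparison pix suggests can be sketched as a small aggregation over drive-test samples. The sample records here are invented; a near-50% bad-frame rate on the most robust mode, as Dell reports, is the pattern to look for.

```python
# Sketch of the per-codec analysis: aggregate drive-test samples by codec
# mode and compute each mode's bad-frame percentage. Sample data is invented.
from collections import defaultdict

samples = [
    # (codec mode, frames measured, bad frames)
    ("AMR_5.90", 500, 25),
    ("AMR_4.75", 400, 190),
    ("AMR_5.90", 300, 12),
    ("AMR_4.75", 200, 105),
]

totals = defaultdict(lambda: [0, 0])  # mode -> [total frames, bad frames]
for mode, frames, bad in samples:
    totals[mode][0] += frames
    totals[mode][1] += bad

for mode, (frames, bad) in sorted(totals.items()):
    print(f"{mode}: {100.0 * bad / frames:.1f}% bad frames over {frames} frames")
```

If disabling 4.75 just shifts the high bad-frame percentage onto 5.90 in the same cells, that points at the interference itself (or the C/I thresholds) rather than the codec.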

    regards
    pix

  • The forum ‘Telecom Design’ is closed to new topics and replies.