4th August 2011 at 16:36
#66861
Guest
Hi,
when we use the 4.75 codec mode for both FR and HR, the percentage of bad frames is higher than with 5.90. Does that mean 5.90 is better? As far as I know, 4.75 is the more robust codec mode. And why is the number of bad frames so high with 4.75? Are the frames bad because of 4.75, or is 4.75 selected because of low C/I, which is also what causes the many bad frames?
Can anyone explain, please?
Regards,
Dell