I am working on a project about VoIP packets and related algorithms, and I'm stuck on a concept. On the receiver side, suppose an adaptive playout buffer algorithm is implemented: it takes each packet's network delay as input and outputs an updated playout delay. Can someone explain what this playout delay is that the algorithm outputs? Why is that delay different from the input delay? I just need a conceptual explanation.
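
For context, here is a sketch of the kind of algorithm I mean. I believe this is the standard exponentially weighted moving average (EWMA) estimator described in networking textbooks (Ramjee et al., also in Kurose & Ross); the class name, variable names, and parameter values below are my own illustrative choices, not from any specific implementation:

```python
class PlayoutEstimator:
    """EWMA playout-delay estimator (sketch; names/values are illustrative)."""

    def __init__(self, alpha=0.998, beta=4.0):
        self.alpha = alpha  # smoothing factor for the running averages
        self.beta = beta    # safety-margin multiplier on the delay variation
        self.d_hat = 0.0    # running estimate of the average network delay
        self.v_hat = 0.0    # running estimate of the delay deviation (jitter)

    def update(self, packet_delay):
        """Feed one packet's measured network delay; return the playout
        delay to apply (typically at the start of the next talk spurt)."""
        self.d_hat = self.alpha * self.d_hat + (1 - self.alpha) * packet_delay
        self.v_hat = (self.alpha * self.v_hat
                      + (1 - self.alpha) * abs(packet_delay - self.d_hat))
        # Playout delay = estimated average delay plus a jitter margin,
        # so that most late-arriving packets still make their deadline.
        return self.d_hat + self.beta * self.v_hat
```

So as I understand it, the output is not the delay of any one packet but a smoothed estimate plus a margin, and that difference is exactly what I'd like explained conceptually.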