Hello All,
I'm running Wireshark on a server that streams RTP packets to a client.
I see values for packet loss, mean jitter, and max delta.
Since Wireshark is running on the server, on what basis does it calculate
packet loss, jitter, and delta?
I understand the importance of these measurements when Wireshark runs
on the client, but it puzzles me to see values for these metrics
even on the server.
In the case of UDP, no ACK is sent even when a packet is lost,
so how does Wireshark calculate packet loss?
So what do these metrics (packet loss, jitter, delta) mean when
Wireshark runs on the server?
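
My current guess is that everything is derived passively from the RTP headers and the capture timestamps, roughly like the sketch below: loss from gaps in the sequence numbers and jitter from the RFC 3550 inter-arrival formula. The function and the tuple layout are just my own illustration, not Wireshark's code. Is that roughly what Wireshark does, and does it mean anything when the capture point is the sender itself?

    # Rough sketch (not Wireshark's code) of computing loss, jitter and
    # max delta purely from captured RTP packets, with no ACKs involved.
    # Each packet is assumed to be (arrival_time_secs, seq_num, rtp_timestamp);
    # clock_rate is the RTP payload clock, e.g. 8000 Hz for G.711.
    def analyze_rtp(packets, clock_rate=8000):
        packets = sorted(packets, key=lambda p: p[0])  # order by capture time

        # Loss inferred from sequence-number gaps (wraparound ignored here).
        expected = packets[-1][1] - packets[0][1] + 1
        lost = expected - len(packets)

        jitter = 0.0
        max_delta = 0.0
        prev = packets[0]
        for cur in packets[1:]:
            delta = cur[0] - prev[0]                   # inter-arrival time
            max_delta = max(max_delta, delta)

            # RFC 3550 inter-arrival jitter: difference between arrival
            # spacing and RTP timestamp spacing, smoothed with a 1/16 gain.
            transit_diff = delta - (cur[2] - prev[2]) / clock_rate
            jitter += (abs(transit_diff) - jitter) / 16.0
            prev = cur

        return {"lost": lost,
                "mean_jitter_ms": jitter * 1000,
                "max_delta_ms": max_delta * 1000}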