Hello, I am analyzing a large number of packets and I am interested in using the frame.time_delta_displayed filter, but I honestly don't understand what's happening...
I generate one UDP packet every 10 ms, and I want to see whether there are packets whose delta time exceeds a certain threshold (a minimal sketch of my sender is below, after the list). I attach a sample capture file; these are the steps I take:
1) I start with a total of 313 captured packets.
2) I set the first packet as the time reference.
3) I sort the displayed packets by delta time: the maximum delta time is 0.017989, and every other packet's delta time is below 0.014.
4) I set the filter frame.time_delta_displayed > 0.014, expecting to see a single packet.
5) Instead, I see roughly half of the total packets, with delta times different from the original ones (e.g. packet number 54: without any filter its delta time is 0.010407; with the filter on it becomes 0.021125). A sketch of how the deltas could be cross-checked outside Wireshark also follows the list.
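
For context, my sender is essentially a loop like this minimal Python sketch (the destination host, port and payload here are placeholders, not my real values):

# Minimal sketch of the sender: one small UDP datagram every 10 ms.
# Destination host/port and payload are placeholders.
import socket
import time

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
while True:
    sock.sendto(b"ping", ("192.168.1.10", 5000))
    time.sleep(0.010)  # ~10 ms; OS scheduling jitter can explain occasional larger deltas such as 0.017989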
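
In case it is useful, the frame-to-frame deltas can also be cross-checked outside Wireshark by reading the raw timestamps from the capture. This is only a rough sketch: it assumes a classic libpcap file with microsecond timestamps (magic 0xa1b2c3d4) and the attached filename all.pcap.

# Count frame-to-frame deltas above a threshold, straight from the pcap file.
import struct

THRESHOLD = 0.014

with open("all.pcap", "rb") as f:
    global_hdr = f.read(24)  # libpcap global header (24 bytes)
    magic = struct.unpack("<I", global_hdr[:4])[0]
    endian = "<" if magic == 0xA1B2C3D4 else ">"  # byte order of the record headers

    prev_ts = None
    frame_no = 0
    count = 0
    while True:
        rec_hdr = f.read(16)  # per-packet record header: ts_sec, ts_usec, incl_len, orig_len
        if len(rec_hdr) < 16:
            break
        ts_sec, ts_usec, incl_len, _orig_len = struct.unpack(endian + "IIII", rec_hdr)
        f.seek(incl_len, 1)  # skip the packet bytes; only the timestamps matter here
        frame_no += 1
        ts = ts_sec + ts_usec / 1e6
        if prev_ts is not None and ts - prev_ts > THRESHOLD:
            count += 1
            print("frame %d: delta %.6f" % (frame_no, ts - prev_ts))
        prev_ts = ts

print("%d frame(s) with delta > %s" % (count, THRESHOLD))

Given the sorted deltas described above, I would expect this to report exactly one frame (the 0.017989 one).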
Wireshark version is 1.0.5 (SVN Rev 26954).
I have probably misunderstood how the filter works... Can you please explain why this happens?
Thanks in advance!
PS: Just an additional off-topic question: is there a tool capable of building charts from captured packets?
Attachment:
all.pcap
Description: Binary data