# Calculate time difference between the capture of two packets

Hello,

I have a question about calculating the time difference between two packets captured using jnetpcap. My goal is to calculate the real-time throughput of a video stream between a sender and a receiver. The server sends the video frames and uses jnetpcap to capture all outgoing packets and save their timestamps. The client receives the video frames, displays them, and likewise captures all incoming packets and saves their timestamps. I would like to calculate the difference between the receive time and the send time of each video packet as accurately as possible. That gives me the actual time each packet spent "on the wire", which I can then combine with the packet size to calculate the throughput.

I already have the application working. The clocks on the server and the client are synchronized before the stream starts, and I'm correctly capturing 90% of the packets, but the calculated throughput is not close enough to the actual value. The packet size is fixed and known, and I use my own protocol to track frame numbers, so I'm fairly sure I'm matching the correct received/sent packet pairs when calculating the difference. The standard deviation of the measurements is low, so the calculated throughput values are fairly close to each other, just not close to the actual value. The only problem I can see right now is that the calculation of the time difference between the capture times is not accurate enough...

For the calculation I'm doing the following:

```java
long sendTimeInMillis = pcapPacketOut.getCaptureHeader().timestampInMillis();

long receiveTimeInMillis = pcapPacketIn.getCaptureHeader().timestampInMillis();

long timeOnTheWireMillis = receiveTimeInMillis - sendTimeInMillis;

long timeOnTheWireSeconds = timeOnTheWireMillis * 1000;

double throughput = (double) timeOnTheWireSeconds / packetSizeInBits;
```

So, after the long intro, here are my questions :)

1. Can you see something wrong with this logic? In particular, am I right to assume that `timeOnTheWireMillis = receiveTimeInMillis - sendTimeInMillis;` gives me the elapsed time in milliseconds between the send time and the receive time?

2. If the logic is correct, do you think it is accurate enough for throughput calculation? If yes, there might be something wrong with clock synchronization, although that looks fairly accurate from what I see right now.

3. Would using `pcapPacketOut.getCaptureHeader().timestampInMicros()` give a more accurate calculation?

4. Any other suggestions on how I could calculate the throughput more accurately in real time?

I would really appreciate any suggestions that you might have.

Thanks!

You may be comparing the wrong packets. It's hard to really tell from just a snippet.

Keep in mind that the timestamp is not a hardware timestamp but one applied by the pcap library. It can be quite a bit off in reality, although your low standard deviation suggests it's tolerable.

Also, in your throughput calculation, shouldn't it be the other way around: bits/time, not time/bits? Throughput should have units of data/time, not the inverse.
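To make the units concrete, here is a minimal sketch of the corrected calculation, reusing the variable names from the snippet above (the timestamp and packet-size values are made up for illustration). Note it also converts milliseconds to seconds by dividing by 1000, not multiplying:

```java
public class ThroughputCalc {
    public static void main(String[] args) {
        // Hypothetical capture timestamps in milliseconds, e.g. from
        // getCaptureHeader().timestampInMillis() on each side.
        long sendTimeInMillis = 1_000_000L;
        long receiveTimeInMillis = 1_000_040L; // captured 40 ms later

        long packetSizeInBits = 12_000L; // e.g. a 1500-byte packet

        long timeOnTheWireMillis = receiveTimeInMillis - sendTimeInMillis;

        // Divide (not multiply) by 1000 to convert ms -> s, and keep the
        // result as a double so small intervals don't truncate to zero.
        double timeOnTheWireSeconds = timeOnTheWireMillis / 1000.0;

        // Throughput is data/time: bits per second, not time per bit.
        double throughputBitsPerSecond = packetSizeInBits / timeOnTheWireSeconds;

        System.out.println(throughputBitsPerSecond); // 300000.0
    }
}
```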

Lastly, the method `timestampInMicros()` returns the timestamp in microseconds, but the resolution of the pcap capture is still milliseconds, so you will gain nothing. It's just a convenience method that converts the timestamp; it doesn't add any actual precision.
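A quick way to see why: if the underlying capture only has millisecond resolution, the microsecond value is just the same number scaled up by 1000, so the sub-millisecond digits are always zero. A sketch (plain arithmetic, not the jnetpcap API itself):

```java
public class MicrosDemo {
    public static void main(String[] args) {
        // Hypothetical capture time known only to millisecond resolution.
        long timestampInMillis = 1_234_567L;

        // Scaling to microseconds multiplies by 1000; the three
        // sub-millisecond digits are always zero, so no precision is gained.
        long timestampInMicros = timestampInMillis * 1000L;

        System.out.println(timestampInMicros % 1000); // 0
    }
}
```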