said by AnonymousPerson :
For a signal to go 4.35 miles and then come back, it should take 46.7 microseconds. 44 milliseconds is an increase in latency by a factor of 1000. Can processing a ping really create that much overhead?
That figure was for a video streaming application, not a ping. It includes all the delay in both the hardware and the application, not just the propagation time over the wire.
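For anyone who wants to check the arithmetic, here's a quick sketch. It assumes the signal travels at the vacuum speed of light (~186,282 miles/s); in real fiber or coax the signal moves at roughly two-thirds of that, so the actual propagation delay would be somewhat higher:

```python
# Sanity check on the round-trip propagation delay quoted above.
SPEED_OF_LIGHT_MI_S = 186_282          # miles per second, in vacuum
one_way_miles = 4.35                   # distance from the thread

round_trip_s = (2 * one_way_miles) / SPEED_OF_LIGHT_MI_S
print(f"round trip: {round_trip_s * 1e6:.1f} microseconds")  # ~46.7

observed_s = 0.044                     # the 44 ms figure from the thread
print(f"overhead factor: {observed_s / round_trip_s:.0f}")   # ~942
```

So the "factor of 1000" in the question is a round number; the exact ratio against vacuum-speed propagation is closer to 940, and smaller still once realistic signal speed in the medium is accounted for.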