The measurements will only be correct if the client's and the server's clocks are synchronized.
Right, so... that will never happen, unless both devices are under your control.
I could implement a sort of ping: the client sends a request, receives the reply, measures the elapsed time (using only its own clock), divides it by two, and gets something close to the latency. (It could repeat this 10 times and average the results. Also, the client should do this work in a separate thread, not in the main one.)
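The idea above can be sketched roughly like this. This is a minimal, self-contained example: it spins up a local UDP echo responder as a stand-in for the real server, and the "client" side averages ten round trips and halves the result. The function names and the UDP transport are my own choices for illustration, not anything from a particular library.

```python
import socket
import threading
import time

def echo_server(sock):
    # Stand-in for the remote peer: echo every datagram back to its sender.
    while True:
        try:
            data, addr = sock.recvfrom(64)
        except OSError:
            break  # socket was closed, stop serving
        sock.sendto(data, addr)

def estimate_latency(addr, samples=10):
    """Estimate one-way latency as half the average round-trip time.

    Only the client's clock is used, so no clock synchronization is needed.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(1.0)
    total = 0.0
    for _ in range(samples):
        start = time.perf_counter()
        sock.sendto(b"ping", addr)
        sock.recvfrom(64)                  # block until the echoed reply arrives
        total += time.perf_counter() - start
    sock.close()
    return total / samples / 2             # average RTT, halved

# Start the echo responder in a background thread, then probe it.
server_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server_sock.bind(("127.0.0.1", 0))         # OS picks a free port
addr = server_sock.getsockname()
threading.Thread(target=echo_server, args=(server_sock,), daemon=True).start()

latency = estimate_latency(addr)
print(f"estimated one-way latency: {latency * 1e6:.0f} microseconds")
server_sock.close()
```

In a real application the `estimate_latency` call itself would live in a background thread (as noted above), since each sample blocks on `recvfrom`.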
This, however, does not guarantee that the actual delay will match that average at the moment you need to use it.
In short, users of my servers will still have problems, and then... they will send me to hell; that is the logical consequence.
Thank you.