Measure network latency in React Native

What would be the best way to measure network latency in React Native?

E.g.

const startTime = Date.now();
const response = await fetch(url);
const endTime = Date.now();
const totalTimeInMs = endTime - startTime;

If I place the start and stop timers before and after the network call, as shown above, that might not reflect the true network latency: the JS thread might be busy with other work, so the code after the await only runs once the event loop / callback queue / task queue is free.

Hence I'm wondering if there is a better way to measure network latency?
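
For reference, a minimal sketch of the same measurement wrapped in a helper, assuming performance.now() is available in the React Native environment (it is monotonic, unlike Date.now()). It still has the limitation described above, because the stop timestamp is only taken once the JS thread runs the promise callback:

// Sketch only: times a fetch call. The elapsed time is an upper bound on
// the network latency, since it also includes any time the JS thread was
// busy before the promise callback could run.
async function timedFetch(url, options) {
  const start = performance.now();
  const response = await fetch(url, options);
  const elapsedMs = performance.now() - start;
  return { response, elapsedMs };
}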

There are 3 answers

Answered by Muhammad Numan

Solution 1:

Install the Axios library:

yarn add axios

With the timing interceptors below, response.latency will contain the total time in milliseconds.

Full Code

import React, { Component } from "react";
import { Text, StyleSheet, View } from "react-native";
import axios from "axios";

// Timing interceptors: stamp each request with a start time, then
// compute the elapsed time when the matching response arrives.
const axiosTiming = (instance) => {
  instance.interceptors.request.use((request) => {
    request.ts = Date.now();
    return request;
  });

  instance.interceptors.response.use((response) => {
    const timeInMs = `${Number(Date.now() - response.config.ts).toFixed()}ms`;
    response.latency = timeInMs;
    return response;
  });
};
axiosTiming(axios);

export default class App extends Component {
  // componentWillMount is deprecated; componentDidMount is the
  // recommended place to kick off network requests.
  componentDidMount() {
    axios.get("https://facebook.github.io/react-native/movies.json")
      .then(function (response) {
        console.log(response.latency); // e.g. "17ms"
      })
      .catch(function (error) {
        console.log(error);
      });
  }

  render() {
    return (
      <View>
        <Text> test </Text>
      </View>
    );
  }
}

Solution 2:

Use fetch. fetch does not expose timing attributes by default, so we time the call ourselves:

const start = new Date();
return fetch('https://facebook.github.io/react-native/movies.json')
  .then((response) => response.json())
  .then((responseJson) => {
    // Note: this measurement also includes the time spent parsing the JSON body.
    const timeTaken = new Date() - start;
    console.log(`${timeTaken}ms`);
    return responseJson.movies;
  });
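
If you want to exclude the JSON parsing time, a small variation (my own sketch, not part of the original answer) takes the timestamp as soon as the fetch promise resolves:

const start = Date.now();
return fetch('https://facebook.github.io/react-native/movies.json')
  .then((response) => {
    // The headers have arrived here, but the body has not been parsed yet,
    // so JSON parsing is excluded from the measurement.
    const timeTaken = Date.now() - start;
    console.log(`${timeTaken}ms`);
    return response.json();
  })
  .then((responseJson) => responseJson.movies);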

Answered by Tevon Strand-Brown

You can use react-native-debugger to get the familiar Network tab from web development in React Native!

https://github.com/jhen0409/react-native-debugger/blob/master/docs/network-inspect-of-chrome-devtools.md

Answered by Doddie

I found this question while looking for ways to measure just the request-response time in axios, but I thought it was interesting enough to offer an alternative answer to the core question.

If you really want to know the network latency, the technique used by the Precision Time Protocol (PTP) could serve as inspiration.

The concept
This drawing hopefully explains what I mean by "network latency":

     API Request             API Response
        |                          ^
        v                          |
  UI ---+--------------------------+-----------> Time
       A \                         ^ B
          \                       /
           \                     /
            \                   /
             v                 /
 Backend -----+---------------+-------------> Time
            a |               ^ b
              |               |
              +- processing --+
                    time

Where:
 - A is the time when the UI sends the request
 - a is the time when the backend receives the request
 - b is the time when the backend sends the response
 - B is the time when the UI receives the response

The time it takes from A->a is the network latency from UI->backend.
The time it takes from b->B is the network latency from backend->UI.

Each side of the request/response can record these timestamps and attach them to the respective request/response object; a minimal sketch follows below.
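
A sketch of this idea, assuming the backend echoes its own receive and send timestamps back in the JSON body (the serverReceived / serverSent field names are made up for illustration):

// Hypothetical client-side sketch of the A / a / b / B exchange.
// Assumes the backend replies with { serverReceived: a, serverSent: b },
// both in milliseconds on the backend's own clock.
async function estimateOneWayLatency(url) {
  const A = Date.now();                    // A: UI sends the request
  const response = await fetch(url);
  const B = Date.now();                    // B: UI receives the response
  const { serverReceived: a, serverSent: b } = await response.json();

  const uiTotal = B - A;                   // round trip as seen by the UI
  const processingTime = b - a;            // time spent inside the backend
  return (uiTotal - processingTime) / 2;   // rough one-way estimate
}

Because only the difference b - a is used, the backend's clock does not need to be synchronized with the device's clock.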

What you cannot do with this

  • You probably won't be able to precisely sync the clocks this way; there will be too much jitter.
  • You can't really tell the inbound and outbound latency apart either, as you have no way to know the relationship in time of A to a, or of B to b.

What you might be able to do with this
The total time seen in the UI (B - A), less the total time seen in the backend (b - a), gives the portion of the round trip spent on the network; halving it gives an estimate of the one-way network latency.

i.e. network_latency = ((B - A) - (b - a)) / 2
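
For example, if the UI measures B - A = 180 ms and the backend reports b - a = 120 ms, then roughly 60 ms of the round trip was spent on the network, i.e. about 30 ms each way.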

Averaged over enough samples, this might be good enough?

FWIW, you could just have the backend include its own "processing_time" in the response, and the UI could then store "A" in the context of the request and calculate "B - A" once a successful response comes back. The idea is the same, though; a sketch of this variant follows below.
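
A sketch of that simpler variant (the processing_time field name is an assumption; it reuses the request-interceptor trick from the first answer to remember "A" on the request config):

import axios from "axios";

// Hypothetical: the backend includes processing_time (in ms, i.e. b - a)
// in its JSON response; the client subtracts it from the round trip it observed.
axios.interceptors.request.use((request) => {
  request.ts = Date.now();                                    // "A"
  return request;
});

axios.interceptors.response.use((response) => {
  const uiTotal = Date.now() - response.config.ts;            // B - A
  const processingTime = (response.data && response.data.processing_time) || 0;
  response.networkLatency = (uiTotal - processingTime) / 2;   // rough one-way estimate
  return response;
});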