Sunday 30 July 2023

Latency Metrics for APIs

Overview:  More and more software is built on APIs, so we often need to know which of our APIs are the slowest and how important they are.  Monitoring latency is how we identify performance and performance issues.  We need to know the fastest, slowest, most used, and average time to complete, and you need to look at all of them to get a full picture.  Percentile latency metrics tell us what percentage of requests fall within a given response-time range.

For instance, if your API endpoint averages 1 second across all requests (10k) over an hour, that sounds okay.  But if the majority of requests, say 90%, return no data and complete quickly, the slowest 10% of requests could be averaging 5 seconds.  Percentile metrics exclude the slowest percentage of requests and show the faster performers, so in this scenario a P90 metric excludes the slowest 10% of requests.  The metric is referred to by its percentage, e.g. P90 for 90%.  I normally use 95%/P95, but it's becoming more common to use 99%/P99 or even P99.9.
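
To make this concrete, here is a minimal Python sketch using the standard-library statistics module.  The request counts and timings are made up to mirror the scenario above (90% of requests are fast, the slowest 10% take roughly 5 seconds), so the exact numbers are illustrative only.

import statistics

# Illustrative sample only: 10,000 requests over an hour, 90% complete
# quickly (~0.5s, e.g. no data to return) and the slowest 10% take ~5s.
latencies = [0.5] * 9000 + [5.0] * 1000

average = statistics.mean(latencies)            # ~0.95s, looks fine on its own
cuts = statistics.quantiles(latencies, n=100)   # 99 cut points: 1st..99th percentile
p50, p95, p99 = cuts[49], cuts[94], cuts[98]

print(f"Average: {average:.2f}s")  # ~1s average hides the slow tail
print(f"P50:     {p50:.2f}s")      # ~0.5s, the typical request
print(f"P95:     {p95:.2f}s")      # ~5s, the slow tail shows up here
print(f"P99:     {p99:.2f}s")      # ~5s

The average looks acceptable, but P95 and P99 make the 5-second tail obvious, which is why percentile metrics are worth monitoring alongside the average.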