
Your math is somewhat incorrect. First, the average page load time is only meaningful if your data follows a perfect bell curve. It never does. It's more likely to be log-normal, in which case a geometric mean is a better summary, but again, it's unlikely to be perfectly log-normal. It's likely to be double-humped (though you may not notice it), so the median, and really the entire distribution, is what you need to look at. You'll find that the median load time is typically lower than the arithmetic mean, while the 95th or 98th percentile is typically much higher.
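A minimal sketch of that point, using simulated (not real) load times drawn from an assumed log-normal distribution: the geometric mean tracks the median, while the arithmetic mean sits above it and the 95th percentile sits far above both.

```python
import math
import random

random.seed(42)

# Simulated page load times in ms, log-normal: ln(t) ~ N(7, 0.5).
# The parameters are illustrative, not from any real site.
times = sorted(random.lognormvariate(7, 0.5) for _ in range(10_000))

arith_mean = sum(times) / len(times)
geo_mean = math.exp(sum(math.log(t) for t in times) / len(times))
median = times[len(times) // 2]
p95 = times[int(0.95 * len(times))]

print(f"mean={arith_mean:.0f}ms geomean={geo_mean:.0f}ms "
      f"median={median:.0f}ms p95={p95:.0f}ms")
```

On a log-normal sample like this, median < arithmetic mean < p95, which is why a single "average" hides what most users, and especially your slowest users, actually experience.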

Secondly, you cannot simply divide by 70% to get the load time for 70% of your users, because that again assumes a very specific distribution (a linear, i.e. uniform, one, which doesn't exist for any site with more than a handful of hits). What you really need to measure is the "empty-cache" experience, which is different from the "first-visit" experience, and is harder to measure since it's hard (but not impossible) to tell when the user's cache is empty.
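To see why the divide-by-70% shortcut fails, here's a sketch on the same kind of simulated log-normal sample (assumed parameters, not real traffic): the empirical 70th percentile and mean/0.7 come out noticeably different, so you have to read percentiles off the actual distribution.

```python
import random

random.seed(1)

# Simulated page load times in ms, log-normal: ln(t) ~ N(7, 0.5).
times = sorted(random.lognormvariate(7, 0.5) for _ in range(10_000))

mean = sum(times) / len(times)
naive_p70 = mean / 0.7                      # the shortcut being criticised
actual_p70 = times[int(0.70 * len(times))]  # empirical 70th percentile

print(f"naive={naive_p70:.0f}ms actual={actual_p70:.0f}ms")
```

The shortcut only holds for a uniform distribution; on skewed data like load times it can be off by a large margin in either direction.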

Lastly, you're assuming a user drop-off rate instead of measuring it from your own data.

You should probably use a real RUM tool that shows you your entire distribution, but also shows you how users convert or bounce based on page load time. Looking at actual data can be surprising and enlightening (I've been looking at this kind of data for almost a decade and it still surprises me and forces me to change my assumptions).

My company (SOASTA) builds a RUM tool (mPulse), which you can use for free. Other companies like Pingdom, Neustar, Keynote, etc. also have RUM solutions, or you can use the open-source boomerang library (https://github.com/lognormal/boomerang/ disclaimer: I wrote this; BSD licensed) along with the open-source boomcatch server (https://github.com/nature/boomcatch).


