The average Google Fiber user can only get a 3.5 Meg stream from Netflix? Or is that the average across all customer streams to Google Fiber users? If the latter, this ranking means exactly nothing. Who's to say what the sample sizes are, or how the actual streaming content bitrates are distributed across those samples? This is fuzzy math at best and cannot support any useful conclusions.
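The per-user versus per-stream distinction matters more than it might seem. A quick sketch with invented numbers (none of this is Netflix's actual data) shows the two averages diverging on the same sample:

```python
# Hypothetical data: two users and the bitrates (Meg) of the streams
# they received. Numbers are invented purely for illustration.
streams = {
    "user_a": [5.0],            # one high-bitrate stream
    "user_b": [1.0, 1.0, 1.0],  # three low-bitrate streams
}

# Per-user average: average each user's streams first, then average users.
per_user = sum(sum(s) / len(s) for s in streams.values()) / len(streams)

# Per-stream average: lump every stream together, ignoring who got it.
all_streams = [b for user in streams.values() for b in user]
per_stream = sum(all_streams) / len(all_streams)

print(per_user)    # 3.0
print(per_stream)  # 2.0
```

Same data, two different "averages" -- so without knowing which one Netflix computed, the ranking tells you nothing.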
Alternatively, it says to me that Netflix really sucks at distributing its content. I have 24 Meg U-verse and can easily sustain 4- to 6-way multipoint video conferences at 3 Meg bitrates with little to no artifacting. I have run performance tests to various systems and can sustain 22.4 Meg of throughput. I've also tested over an IPsec VPN connection from Dallas to San Francisco and sustained 12-14 Meg of throughput during peak traffic periods through a fairly heavily loaded VPN gateway serving around 3,000 other users. An average 1.9 Meg streaming bitrate is pathetic on a connection that performs like mine.
If Netflix is trying to use these numbers to claim the average streaming experience for a customer on a given ISP is better or worse, they should be prepared to reveal how they rolled up the data. Is the unique client and stream sample size n greater than 1000 for each carrier? Is the collection methodology identical for each carrier? Were the stream stats grouped by similar encoding quality? I doubt the answer to the last question is yes; they likely averaged it all together. The results would be far more meaningful if they used a client and stream sample size greater than 1000 and categorized the results by the actual encoding bitrate rather than by an average across all content.
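A rough sketch of what categorizing by encoding bitrate would look like, using purely hypothetical sample data (the bitrates and counts are my own invention, not anything Netflix has published):

```python
from collections import defaultdict

# Hypothetical samples: (encoding_bitrate_meg, delivered_meg).
# Invented numbers for illustration only.
samples = [
    (1.5, 1.5), (1.5, 1.4), (1.5, 1.5),  # SD streams mostly hitting their cap
    (5.8, 3.1), (5.8, 2.9),              # HD streams falling well short
]

# The blended average lumps capped SD streams in with starved HD streams.
blended = sum(delivered for _, delivered in samples) / len(samples)

# Grouping by encoding bitrate shows where delivery actually degrades.
by_encoding = defaultdict(list)
for encoding, delivered in samples:
    by_encoding[encoding].append(delivered)

for encoding, rates in sorted(by_encoding.items()):
    avg = sum(rates) / len(rates)
    print(f"{encoding} Meg encodes: avg delivered {avg:.2f} Meg")
print(f"blended average: {blended:.2f} Meg")
```

The blended number comes out around 2.1 Meg, which looks uniformly mediocre, while the per-category breakdown shows the SD streams are fine and only the HD streams are being throttled down. That is exactly the information an all-content average throws away.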
There are way too many variables in play for them to perform any meaningful statistical analysis, unless they reveal their control set for the data.--
Scott, CCIE #14618 Routing & Switching