now 65GB. Unlike the days of the floppy disk and its fixed capacity of 1.44MB, most of today's data is delivered via the cloud. Cloud content like streaming video and music is the driving force behind this surge in data usage. And again, unlike the floppy disk, cloud data has no physical footprint, which makes its size difficult to visualize. The graphic below helps add some context to this incredible volume of data, so let's have a glance at what 65GB of data looks like:
What does this mean to CSPs?
By understanding how much data each type of service consumes — whether it's a movie, a new photo album, a series of eBooks, etc. — providers can make better decisions about how much data is required to meet the package commitments they make to subscribers. Combining this knowledge with subscriber-specific utilization behaviour, gathered through smarter network insights, lets operators improve quality of experience (QoE) and reduce the risk of bill shock by offering adaptable service packages backed by optimized network functions. These packages can be tailored to each subscriber, giving them the bandwidth they want, when and where they want it.
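As a rough illustration of that idea, the sketch below estimates a subscriber's monthly data budget from a mix of activities and flags a bill-shock risk when the estimate exceeds the plan allowance. The per-activity sizes, function names, and figures here are assumptions for illustration only, not measured values or any operator's actual method.

```python
# Hypothetical sketch: rough monthly data budget per subscriber,
# using assumed per-activity sizes (illustrative figures only).
ACTIVITY_GB = {
    "hd_movie": 3.0,      # assumed ~3 GB per HD movie
    "music_hour": 0.1,    # assumed ~100 MB per hour of streamed music
    "photo_album": 0.5,   # assumed ~500 MB per uploaded photo album
    "ebook": 0.003,       # assumed ~3 MB per eBook
}

def monthly_usage_gb(activity_counts):
    """Sum estimated GB for a month of activity, e.g. {'hd_movie': 8}."""
    return sum(ACTIVITY_GB[name] * count
               for name, count in activity_counts.items())

def bill_shock_risk(activity_counts, plan_gb):
    """Return (at_risk, estimated_gb): is usage likely to exceed the plan?"""
    used = monthly_usage_gb(activity_counts)
    return used > plan_gb, used

# A heavy streamer on a 30 GB plan:
at_risk, est = bill_shock_risk(
    {"hd_movie": 12, "music_hour": 40, "ebook": 10}, plan_gb=30)
```

In practice an operator would feed this kind of model with measured per-service volumes from the network rather than static estimates, but the shape of the decision — compare projected usage against the package — stays the same.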
How would this work?
If your subscriber is a Netflix addict, you can give them the option to prioritize bandwidth during their typical viewing period. That way, they won't have to worry about buffering or low picture quality when they're binge-watching their favourite series.
We all know that improving QoE is essential to retaining customers — now is the time to step back and leverage the Big Data picture to actually achieve it.