17 Dec 2021

Giving some perspective to the internet minute

Steve Pass of DataQube explains the consequences of our dependency on social media and the wider internet, and the impact it is having on data centres

IoT and digitisation have become so integral to everyday life that the total number of internet users globally has more than doubled over the last ten years. As of July 2021, the internet reached 65% of the global population, representing 5.17 billion people. Internet traffic, meanwhile, has grown 12-fold over the same period.

Combine these trends with the consequences of the pandemic, which has given rise to a digital explosion, and internet traffic is growing at an unprecedented rate, leaving a surging data footprint in its wake, with social media, video conferencing and TV binge-watching among the major contributors. Indeed, according to Statista, the total amount of data consumed globally so far this year is 79 zettabytes, a figure predicted to more than double by 2025.

The technologies at the heart of the world's digital activities, and the everyday services and applications we have come to depend upon, collectively churn out relentless quantities of user activity and associated data minute by minute. Such a short timeframe is of little consequence in human terms, but in the data centre world it's a different story altogether. To give our digital interactions some perspective, here's a snapshot of what happens in an internet minute (source: www.domo.com), and these figures only scratch the surface.

  • Amazon customers spend $283,000 

  • 12 million people send an iMessage 

  • 6 million people shop online 

  • Instacart users spend $67,000  

  • Slack users send 148,000 messages 

  • Microsoft Teams connects 100,000 users 

  • YouTube users stream 694,000 videos 

  • Facebook Live receives 44 million views 

  • Instagram users share 65,000 photos 

  • TikTok users watch 167 million videos 

Data generation is growing yet data centres are shrinking

The drive for the hyperscalers to re-evaluate their data handling processes and rethink their business models has also been turbocharged by the pandemic and our increased dependency on technology for business and leisure activities. Such is the pace of change that Gartner estimates 75% of enterprise-generated data will be created and processed outside centralised facilities by 2025. In tandem, the global market for edge data centres is expected to nearly triple to $13.5 billion over the same period.

Conventional data handling facilities are under enormous pressure to keep pace with the exponential growth in real-time data generation, and there is much more to meeting these seamless data handling requirements than simply expanding physically. As our internet usage increases, the hyperscalers must rethink their business models, because more data per minute does not translate into bigger premises; quite the opposite, in fact, because much of the data being generated must, given the nature of the service or application being powered, be handled at the edge of the network if positive user experiences are to be delivered and IoT-based applications are to perform optimally.

Take photo uploads, video streaming, or mobile gaming, for example: all this content is data-heavy and needs processing as close to the source as is practically possible to avoid jitter and/or latency issues. Additionally, visual data must be handled by HPC systems for efficiency and storage reasons, and these in turn require specialist infrastructure that can house more servers per rack to assure 24x7 operability. All this needs to be achieved whilst consuming less power overall, for sustainability reasons.

The race is on to move to the edge 

Facilities at the network edge robust enough to handle data at these levels are in short supply, and regular data centre providers are struggling to meet demand because of planning permission and building regulation barriers, capex shortfalls and physical location challenges. The hyperscalers need a viable means of moving to the edge quickly and cost-effectively, but thus far this has proven difficult. This could be about to change, however, thanks to a disruptive approach to edge computing being spearheaded by DataQube which, if successful, could change the face of the industry.

Built from the ground up by a team of industry experts who understand the impact edge computing and IoT are having on data analysis and storage methods, DataQube have developed a standalone system that can be installed at the edge of the network within a 6-month timeframe and for significantly less capex than alternative setups. Moreover, the system provides the infrastructure needed to accommodate HPC servers and associated 5G and cloud connectivity in a secure and sterile environment, even in locations unsuitable for conventional data centres.

The internet, along with the devices, applications and technologies it supports, is now integral to everyday living. This dependency, combined with advances in AI and machine learning and the universal rollout of 5G, will result in even more data being produced, not minute by minute but second by second, even nanosecond by nanosecond. And the handling of said data must be done at source, swiftly, efficiently and securely, lest invaluable and highly sensitive information fall into the wrong hands and end up on the dark web. Data centre infrastructure in its existing format is no longer up to the job.

Ends