The problem with big data
In this networked world, data has become the undisputed king. And there's a ton of it, so much so that the entire notion of big data has become a problem in and of itself.
More organizations are storing, processing, and extracting value from data of all forms and sizes. Systems that support large volumes of both structured and unstructured data will continue to rise. The market will demand platforms that help data custodians govern and secure big data while empowering end users to analyze that data. Herein lies the problem.
Gartner asserts that by 2022, 40% of all enterprises will combine big data and machine learning functionality to support and partially replace network monitoring, service desk and automation processes and tasks.
The truth is that big data is only as useful as its ability to solve complex problems and provide real answers that add value to our lives. But given the immense volume of data whizzing around, it's simply untenable for humans to consume and analyze it all. Nowhere is this problem more evident than in the data running over today's modern computer networks.
Fortunately, three technologies are converging to solve the big data conundrum: cloud computing, artificial intelligence, and sophisticated data analytics.
This is creating the perfect storm for companies looking to harness the power of big data to answer some of the most important questions that, up to now, have never been answered.
The focus on digital business has driven a fundamental shift in attitudes regarding how these technologies should be deployed and used, and how to prioritize the functionality being offered by the vendor community. Despite increasing demand for agility, many organizations lack a strategy for these technologies, leading to tool sprawl and overlapping functionality between constituent IT operations domain groups. Recent advances in analytics technology hold the key to making big data useful.
REAL ANALYTICS AND BIG DATA
Think of Netflix recommending movies you've never heard of based on historical analysis of usage, or Google's Waze application, which analyzes a huge number of different data sources to recommend the fastest route home. This is how real analytics should work.
By applying this model to the enterprise network, companies can now use big data to make REAL business decisions backed by hard data.
For businesses, a wealth of untapped data runs over their computer networks but has never really been used to inform business strategy or improve user productivity. While capturing the data is simple, making demonstrable use of it isn't.
Most companies don't have the time, expertise, or wherewithal to crunch all the data running over their networks to better understand exactly what is going on within their business, and when, where, and why.
A NON-NEGOTIABLE BUSINESS IMPERATIVE
In turn, automating the analysis and correlation of network data is becoming a non-negotiable imperative for enterprises looking to stay competitive. These digital communications infrastructures have become the backbone of business operations, costing businesses millions of dollars each year to operate and maintain.
And enterprises with complex, heterogeneous network environments no longer want to adopt siloed analytics solutions tied to just one network element data source.
Answers to their questions are buried in a host of sources across the network, all of which must be examined and compared to provide an accurate picture of how the network is performing from the end user's point of view, and what recommendations, if any, can be made to improve things.
While there are lots of products that take big data and help visualize it in pretty charts and graphs, this isn’t analytics. Real people still need to look at (analyze) and compare (correlate) all the information to understand what it all means.
The problem is, these people, typically IT staff (not business analysts or data scientists), have their own jobs and simply don't have the time. Meanwhile, data volumes are only increasing, making things exponentially worse. Getting in front of this problem has become vitally important for enterprises.
ENTER A NEW ERA OF ANALYTICS PLATFORMS
Fortunately, new data analytics platforms have been developed to solve this problem by automating the manual analysis of all this data and, once it has been analyzed, spitting out simple answers to questions that companies might not even know to ask, or solving problems they might not even know they have.
This represents a huge win and strategic advantage for organizations that want to streamline operations, grow the top line, and cut operational costs.
These new systems, such as those from upstarts like Nyansa, ExtraHop, and others, constantly examine every bit and byte running over enterprise networks to understand trends and identify anomalies, analysis that IT staff simply don't have the time to perform.
What's more, this seemingly disparate data, from a variety of different sources, is automatically analyzed and correlated across all parts of the network.
From wireless client devices to cloud-based applications, and from IP network services (e.g., AAA, DNS, DHCP) to wide-area broadband connections, these systems now figure out how network-attached users and devices are actually performing, what's working and what's not, and the when, where, and why of it all.
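To make that cross-source correlation concrete, here is a minimal sketch in Python of how per-client records from separate network services might be stitched into a single view. The data sources, field names, and 500 ms threshold are hypothetical illustrations, not the schema of any particular product:

```python
from collections import defaultdict

# Hypothetical event records from three separate data sources.
# In a real platform these would stream in from packet capture,
# syslog, and API integrations rather than being hard-coded.
dhcp_events = [
    {"mac": "aa:bb:cc:dd:ee:01", "ok": True,  "ms": 120},
    {"mac": "aa:bb:cc:dd:ee:02", "ok": False, "ms": 5000},
]
dns_events = [
    {"mac": "aa:bb:cc:dd:ee:01", "ok": True,  "ms": 35},
    {"mac": "aa:bb:cc:dd:ee:02", "ok": True,  "ms": 40},
]
wifi_events = [
    {"mac": "aa:bb:cc:dd:ee:01", "ok": True,  "ms": 15},
    {"mac": "aa:bb:cc:dd:ee:02", "ok": True,  "ms": 900},
]

def correlate(sources):
    """Merge per-client records from every source into one view."""
    timeline = defaultdict(dict)
    for name, events in sources.items():
        for event in events:
            timeline[event["mac"]][name] = event
    return timeline

# Walk the merged view and flag clients whose connection sequence
# (Wi-Fi association -> DHCP -> DNS) failed or slowed at any stage.
merged = correlate({"wifi": wifi_events, "dhcp": dhcp_events, "dns": dns_events})
for mac, stages in merged.items():
    slow = [s for s, e in stages.items() if not e["ok"] or e["ms"] > 500]
    if slow:
        print(f"{mac}: degraded at {', '.join(slow)}")
```

The point of the sketch is the join itself: once events from different silos are keyed to the same client, a failure that looks like "slow Wi-Fi" in one tool and "DHCP timeout" in another becomes a single, explainable incident.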
That said, organizations will continue to evaluate analytics platforms based on their ability to provide live direct connectivity to these disparate sources.
These new analytics service platforms leverage highly available compute, storage, and columnar (big data) databases in the cloud, combined with artificial intelligence, to quickly learn how best to interpret petabytes of network data and suggest recommendations to improve things, something that is impossible to do manually today.
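As a rough illustration of the "learning" half of that claim, here is a minimal sketch where a simple rolling-baseline z-score test stands in for the far more sophisticated models such platforms actually use. The metric, window size, and threshold are assumptions for the example:

```python
import statistics

def find_anomalies(samples, window=20, threshold=3.0):
    """Flag points that deviate sharply from a rolling baseline.

    A toy stand-in for a learned baseline: each sample is compared
    against the mean and standard deviation of the preceding
    `window` samples (a basic z-score test).
    """
    anomalies = []
    for i in range(window, len(samples)):
        history = samples[i - window:i]
        mean = statistics.mean(history)
        stdev = statistics.stdev(history) or 1e-9  # avoid divide-by-zero
        z = abs(samples[i] - mean) / stdev
        if z > threshold:
            anomalies.append((i, samples[i], round(z, 1)))
    return anomalies

# Example: per-minute DNS response times (ms) with one sudden spike.
latency = [30, 32, 29, 31, 30, 33, 28, 31, 30, 29,
           32, 30, 31, 29, 30, 33, 31, 30, 29, 32, 450]
print(find_anomalies(latency))  # flags the 450 ms spike at index 20
```

Run continuously over millions of metrics per minute, even a baseline this simple shows why the work belongs in cloud-scale infrastructure rather than on an engineer's laptop.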
All this has given rise to cloud-based network analytics SaaS offerings, delivering systems that most companies don't have the expertise to architect or the deep pockets to deploy and manage themselves. These services allow enterprises to answer questions that directly impact the bottom line.
“How are networked devices on the production floor performing?” “Are users able to connect to Office 365, and is it performing properly given all the network interdependencies?” “Is the call quality of clinician smartphones good?”
The vast majority of network problems today happen when users access the network. This is the area where most of the innovation and investment is required, yet few companies have really addressed it.
Today's modern enterprise access network is undergoing profound changes due to the explosion of wireless mobility, smartphones, IoT devices, and cloud-based applications. Finding and fixing problems in this new context simply can't be done without some form of big data analytics.
Ultimately, the problem with big data can be solved through the use of advanced analytics, machine learning and data correlation that leverages cloud-based computing.
As big data traversing networks only gets bigger, companies need to be quick to adopt new vendor-agnostic analytics platforms that support such functionality.
Enterprises must begin deploying such platforms to maximize the investments being made in their networks and to take advantage of untapped network data to improve business processes and the productivity of the users on them. Either that, or begin hiring data scientists. Good luck with that.