Loving Legacy and the 3 Vs of Big Data

By: Scott St. John

Very few organizations dare to even dream about a rip-and-replace approach, which stands in stark contrast to the vast number of technology providers that either promote this approach or whose solutions require it. Rip and replace is costly, time-consuming, and fraught with risk – and it can be a career killer for anyone daring enough to take it on.

But legacy, by its very nature, doesn’t keep up. In today’s climate of rapid change and obligatory transformation, service providers must be nimble to compete. New, dynamic service offerings are being launched at a blistering pace by operators around the world. This helps to ensure their future, but it creates a highly competitive environment in which providers must become, and stay, increasingly agile. But legacy ain’t agile.

This poses a unique conundrum, particularly for service providers who have just begun their digital transformation journeys and those that rely on legacy systems for mission-critical functions. Chances are you fall into one or both of these camps. So, do you take on the daunting challenge of ripping out legacy systems to put in newer but more costly ones, with all the added time and risk; do you continue the expensive practice of manually integrating legacy systems, which inflates costs, slows innovation, and impedes competitive agility; or is there another way?

The spider’s web of support systems

If you haven’t picked it up by now, I’m not a big fan of legacy; but I do understand its place. On several occasions, I’ve had the opportunity to view the architectural system diagrams of some of the leading service providers around the world. They’re mind-numbing: a virtual spider’s web of hundreds, if not thousands, of loosely connected, independent and dependent systems. The complexity is drastically compounded by the acquisition of service providers by other service providers, a persistent trend over the last few years. This not only underscores the impracticality, if not impossibility, of a rip-and-replace approach; it also stresses the critical importance of legacy systems.

Today, much of what is being defined as agile is driven by data, and much of this data is locked away in those same legacy systems. Tapping into these data sources opens the door to innovation, transformation, and much more. For example, access to this information can enable service providers to drive down cost through automation and improve customer experience management (CEM) by reducing the amount of time it takes to solve a customer’s problem. That suggests that cracking legacy may be both the barrier to, and the key to, successful transformation.

Consider the example of a Customer Service Representative (CSR). In a typical eight-hour shift, a CSR will spend approximately 10% of his or her time just accessing the various systems needed to help solve customer issues – and this is before it’s possible to address the actual root cause of the problem affecting the customer’s services. Not only does this create a significant impediment to a positive customer experience, particularly when customers have to engage with multiple CSRs who each have to access multiple systems, but it also bloats costs and increases Mean Time to Resolution (MTTR). While 10% may not seem like a lot, it becomes significant at 200 events per operational employee per day with a 4-hour MTTR, which equates to about $5,000 per day and nearly $2M per year in waste due primarily to basic inefficiency.
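The article gives the inputs but not the full arithmetic, so here is one back-of-envelope way the figures line up. It is a sketch, not the article’s own model; in particular, the fully loaded labor rate is an assumption, since the article doesn’t state one.

```python
# Back-of-envelope estimate of the cost of tool-hopping, using the
# figures cited in the article; the hourly labor rate is an assumption.
EVENTS_PER_DAY = 200     # events per operational employee per day (from article)
MTTR_HOURS = 4.0         # mean time to resolution per event (from article)
ACCESS_OVERHEAD = 0.10   # ~10% of time spent just accessing systems (from article)
LOADED_RATE_USD = 62.0   # assumed fully loaded labor cost per hour (not in article)

wasted_hours_per_day = EVENTS_PER_DAY * MTTR_HOURS * ACCESS_OVERHEAD  # 80 hours
waste_per_day = wasted_hours_per_day * LOADED_RATE_USD                # ~$4,960
waste_per_year = waste_per_day * 365                                  # ~$1.8M

print(f"Wasted hours/day: {wasted_hours_per_day:.0f}")
print(f"Waste/day:  ${waste_per_day:,.0f}")    # close to the article's ~$5,000
print(f"Waste/year: ${waste_per_year:,.0f}")   # close to the article's ~$2M
```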

In a recent interview, Anand Thummalapalli, Head of Product Management at gen-E, told Pipeline, “We have been talking to the top wireless and cable operators in the United States and they all have a similar need. They all express the same need for a consolidated console as the next evolution of single sign-on to quickly access the data produced by multiple systems and networks to better serve their customers.”

The 3 Vs of Big Data

Tucked away in these legacy systems is incredibly important and relevant data. The sheer volume of this data is massive. Networks, systems, users, and machines produce an endless stream of relevant information, and organizations themselves produce enormous amounts of collaboration data in email conversations and enterprise networking platforms. In some cases this data is being leveraged for fault and performance management; in others it is not, simply because there is no ability to contextualize the data and present it in a way that is useful to the various departments of the organization.

The velocity of data in many cases requires near-real-time processing just to keep up with the rate at which the data is produced. Access to the most recent, relevant information is the linchpin of agility. It’s not uncommon for a top-tier service provider today to take 25 million data measurements every five minutes. This data velocity only increases as smarter, more “chatty” devices and new use cases such as residential and industrial IoT (Internet of Things) continue to be brought online at an unprecedented pace. Tackling increasing velocity becomes even more daunting when it is compounded by volume.
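For a sense of scale, the article’s figure of 25 million measurements every five minutes works out to a sustained ingest rate in the tens of thousands of events per second:

```python
# Sustained ingest rate implied by the article's figure of
# 25 million measurements every five minutes.
MEASUREMENTS = 25_000_000
WINDOW_SECONDS = 5 * 60

per_second = MEASUREMENTS / WINDOW_SECONDS   # ~83,333 events/sec
per_day = MEASUREMENTS * (24 * 60 // 5)      # 288 windows/day -> 7.2B events/day

print(f"{per_second:,.0f} events/sec, {per_day:,} events/day")
```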

Each data source also produces its own unique variety of data. Some of the data is structured, some unstructured, and each contains specifically relevant and different pieces of information that need to be tied together. For example, the device and network information being generated is distinctly different from call data, CRM information, and email.
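In practice, “tying together” heterogeneous sources often means normalizing each record into a common event shape keyed on something shared, such as a customer or device identifier. The sketch below is purely illustrative; the field names are assumptions, not gen-E’s actual schema.

```python
# Illustrative only: normalize records from different sources into a
# common event shape keyed on a shared identifier. Field names are
# hypothetical, not gen-E's actual schema.
from datetime import datetime, timezone

def normalize_network_alarm(raw: dict) -> dict:
    """Structured telemetry from a network element."""
    return {
        "customer_id": raw["subscriber"],
        "timestamp": datetime.fromtimestamp(raw["epoch"], tz=timezone.utc),
        "source": "network",
        "detail": raw["alarm_text"],
    }

def normalize_crm_note(raw: dict) -> dict:
    """Free-text, unstructured note from a CRM system."""
    return {
        "customer_id": raw["account_number"],
        "timestamp": datetime.fromisoformat(raw["created_at"]),
        "source": "crm",
        "detail": raw["note"],
    }

# Stitch both streams into one timeline per customer.
events = [
    normalize_network_alarm({"subscriber": "C-1001", "epoch": 1_700_000_000,
                             "alarm_text": "CPE offline"}),
    normalize_crm_note({"account_number": "C-1001",
                        "created_at": "2023-11-14T22:15:00+00:00",
                        "note": "Customer reports no internet"}),
]
timeline = sorted(events, key=lambda e: (e["customer_id"], e["timestamp"]))
```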

All three factors – volume, variety, and velocity – must be addressed to provide actionable insights with which service providers can quickly solve customer issues, make better-informed decisions, and rapidly capitalize on new revenue opportunities.

A different approach: Big Data Playground

gen-E is no stranger to data mediation, and it has been working with top operators globally to solve this problem of aggregating data from legacy systems. Instead of removing legacy systems, gen-E has developed a consolidated console that taps into the rich data stored within them. It then contextualizes the data and models it in an intuitive user interface – opening access to critical data in a single system.

The gen-E solution includes many standards-based RESTful APIs to common platforms, allowing it to tap data from virtually any system. gen-E has also amassed over 2,000 key performance indicators (KPIs) and includes over 300 pre-defined KPI mappings out of the box today. The solution presents the data in an easy-to-use console based on five different personas, each designed to quickly surface digestible information relevant to a particular user’s unique needs. KPIs continue to be evaluated, added, and grouped into functional areas to populate these tailored personas for customer service, operations, finance, human resources, and more.
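The article doesn’t document gen-E’s actual interfaces, but the general pattern it describes – pulling metrics over REST and mapping them onto persona-scoped KPIs – might look something like the following. The endpoint, payload shape, and mappings here are all hypothetical.

```python
# Hypothetical sketch of the REST-pull-and-map pattern described in the
# article; the endpoint, payload, and KPI mappings are illustrative
# assumptions, not gen-E's actual interfaces.
import requests

# A few KPI mappings grouped by persona (the article cites 300+
# pre-defined mappings and five personas out of the box).
KPI_MAPPINGS = {
    "operations":       {"metric": "node_cpu_pct",        "kpi": "Network Health"},
    "customer_service": {"metric": "open_ticket_count",   "kpi": "Backlog"},
    "finance":          {"metric": "truck_rolls_per_day", "kpi": "Cost to Serve"},
}

def fetch_kpis(base_url: str, persona: str) -> dict:
    """Pull a raw metric over REST and label it with the persona's KPI."""
    mapping = KPI_MAPPINGS[persona]
    resp = requests.get(f"{base_url}/metrics",
                        params={"name": mapping["metric"]}, timeout=10)
    resp.raise_for_status()
    return {mapping["kpi"]: resp.json()["value"]}

# e.g. fetch_kpis("https://oss.example.com/api", "operations")
```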

gen-E has also simplified the visualization of data by including customizable drag-and-drop widgets that digest the data and present it in informative tickers, charts, graphs, cascading event windows, and more, making the information easy to consume. The ramp-up time is minimal, with online training available for those who need it.

“We want to make the information as easy to model and digest as possible,” Thummalapalli commented. “So we made it intuitive, drag-and-drop, and easier to use than Excel.  By having it all in one system, it provides immediate access to the most relevant information.”

gen-E is also using state-of-the-art data processing models for speed and scalability, and to reduce its dependency on third-party licenses. It is leveraging technology developed by some of the world’s largest social networks and packaging it in one efficient solution that has been benchmarked to write up to 2 million events per second on three inexpensive machines.
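Taken at face value, that benchmark leaves substantial headroom over the ingest rate cited earlier. A quick check, assuming the benchmark figure spreads evenly across the three machines (real clusters rarely scale perfectly linearly):

```python
# Headroom check: the benchmarked write rate versus the ingest rate
# implied earlier (25M measurements per 5 minutes). Assumes the
# benchmark figure scales evenly across the three machines.
BENCH_EVENTS_PER_SEC = 2_000_000
MACHINES = 3
INGEST_PER_SEC = 25_000_000 / (5 * 60)            # ~83,333 events/sec

per_node = BENCH_EVENTS_PER_SEC / MACHINES        # ~667k events/sec per node
headroom = BENCH_EVENTS_PER_SEC / INGEST_PER_SEC  # ~24x

print(f"~{per_node:,.0f} events/sec per node, ~{headroom:.0f}x headroom")
```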

gen-E has essentially created a scalable “Big Data Playground” whereby organizations can connect any legacy or future data source, contextualize the data by stitching it together in a consolidated platform, present useful information departmentally by leveraging categorical KPIs, and put simple-to-use widgets into the hands of end users to make the information highly relevant and immediately consumable. This is big.

Changing the data game

It may be too soon to tell how big an impact a solution like gen-E’s will have, but its launch marks a monumental shift in how data is, and can be, leveraged by organizations. It breaks the dependence on expensive third-party database licenses and empowers the organization and the end user to maximize the value of the data being generated across all data sources, new and old. It tames the volume, variety, and velocity of data; it negates the risk of a rip-and-replace approach by unlocking the data that has been locked away in legacy systems; and it enables service providers to continue to leverage essential systems as they become increasingly agile and innovative and fuel transformation.

Don’t hate your legacy; embrace it. It’s integral to agility, and it can now be an asset, not an obstacle, to your organization’s future success.

Unified Service Assurance from gen-E
Bringing Value to the Equation

gen-E has nearly two decades of experience helping some of the most recognizable brands in telecom and enterprise keep their networks running at peak performance and meeting their customers’ demands. Applying our industry-lauded Unified Service Assurance framework, we have helped our customers work through network upgrades, acquisitions, and system consolidations. Our goal is to work across organizational silos, allowing our technology to correlate events and use analytics to identify problems across the entire network faster, reduce MTTR, and reach the optimal state of network management.


