Mismatch in data intake versus analytic output
In this technology-driven age, data has become the lifeblood of organizations. Yet for all the data companies process, much of it still lies dormant simply because there is too much of it to handle. With data stockpiled in disparate silos and in varying formats, organizations struggle to understand what it all means and to make actionable decisions based on everything they could know.
Analyzing data also demands a significant amount of employee time. In our work with many organizations, we have found that more than half of an employee’s day can be spent on low-level data preparation tasks: copying and pasting across different systems and performing basic data cleansing just to convert data into a usable format for analysis. And as data analytics solutions become accessible to a broader range of employees, the result is even more data and even greater time demands.
Because data bottlenecks prevent companies from maximizing the resources they have, many organizations have pursued short-term fixes: centralizing analytics functions to regain control and moving certain activities offshore to reduce costs. Still, the problems linger and will likely only get worse without a lasting solution.
But options for resolving this issue are increasing. To correct the imbalance between data intake and analytic output, business intelligence and analytics tools need help leveraging greater amounts of quality data in real time. Introducing automation to open analytics platforms may be the only way for companies to keep up with today’s rapid data flow and finally use analytics tools to understand their priorities and make strategic adjustments.
Automation’s growing role in real-time analytics
Global payments companies have a responsibility to identify fraudulent activity and protect consumers. This requires real-time analysis across internal system data and external sources, including social media, at large scale. Leading payments technology companies can process more than 200 million transactions per day.
The only viable way to keep up with data and get ahead is with proactive alerts. Essentially, companies must remove the manual bottlenecks, such as data preparation and integration, that each play a small, isolated role in making data useful but together slow the entire pipeline. Introducing automation and complementary technologies like artificial intelligence can save a Global 2000 company in the range of $20 million to $40 million in annual costs.
This significant cost-cutting measure, which also frees up employee time spent on manually preparing data, can lead to significant operating and strategic gains. How do we know these technologies are ready to digitize manual processes? Because they have crossed the threshold from emerging trend to mainstream adoption. Process automation tools are already broadly adopted across many financial services institutions because the technology has proven cost-effective and delivers a quick return on investment.
Automation plays a critical role in reducing the impact and cost of fraud when consumers receive real-time alerts. Automating repetitive tasks also enables payment processors and affiliated banks to make strategic adjustments such as offering different rewards programs or products that can improve the overall bank-customer relationship.
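To make the idea of proactive alerting concrete, here is a minimal, rule-based sketch: each incoming transaction is screened automatically and alert reasons are emitted with no human in the loop. The Transaction fields, thresholds, and screen function are hypothetical illustrations, not any payment network’s actual fraud logic.

```python
# Minimal sketch of automated, rule-based transaction screening.
# All field names and thresholds are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Transaction:
    account_id: str
    amount: float
    country: str
    timestamp: datetime

HIGH_AMOUNT = 5_000.00   # hypothetical per-transaction amount threshold
HOME_COUNTRY = "US"      # hypothetical home market for the account

def screen(txn: Transaction) -> list[str]:
    """Return alert reasons for a single transaction (empty if clean)."""
    reasons = []
    if txn.amount >= HIGH_AMOUNT:
        reasons.append(f"amount {txn.amount:,.2f} exceeds threshold")
    if txn.country != HOME_COUNTRY:
        reasons.append(f"out-of-market transaction in {txn.country}")
    return reasons

if __name__ == "__main__":
    txn = Transaction("acct-42", 7200.00, "BR", datetime.now())
    for reason in screen(txn):
        # In production, this would push to a real-time alerting queue.
        print(f"ALERT {txn.account_id}: {reason}")
```

A production system would replace these static rules with continuously retrained models, but the operating principle is the same: the screening step runs on every transaction without manual intervention.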
Intelligent automation will recoup lost efficiencies
Typical data preparation has several pain points. Among them: cleansing raw data into a usable format, locating data from isolated email or local hard drive storage, combining data from dozens of internal systems, and building individual models that lack consistency across teams, departments, or regions. These functions alone occupy a significant part of the analytic role, yet deliver almost no value.
Automation can help relieve these pain points. It can extract information from unstructured sources and convert it into a usable format, a first step toward reducing the manual, repetitive work typically handled by higher-cost analysts or engineers. These tasks have historically relied on a lot of button-clicking, which makes them ideal candidates for scripted automation, as in the sketch below.
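As an illustration, the following sketch automates one such preparation task: converting a messy spreadsheet export into an analysis-ready table. The file name, column names, and cleaning rules are hypothetical assumptions, not any specific organization’s pipeline.

```python
# Minimal sketch: turn a messy export into an analysis-ready table.
# File name, columns, and cleaning rules are hypothetical.
import pandas as pd

def load_clean(path: str) -> pd.DataFrame:
    df = pd.read_csv(path)

    # Normalize column names, e.g. "Txn Amount " -> "txn_amount"
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]

    # Strip currency symbols and thousands separators, cast to numeric
    df["txn_amount"] = pd.to_numeric(
        df["txn_amount"].astype(str).str.replace(r"[$,]", "", regex=True),
        errors="coerce",
    )

    # Parse dates; drop rows the parser could not recover
    df["txn_date"] = pd.to_datetime(df["txn_date"], errors="coerce")
    return df.dropna(subset=["txn_amount", "txn_date"])

daily = load_clean("transactions_export.csv")
print(daily.groupby(daily["txn_date"].dt.date)["txn_amount"].sum())
```

A script like this replaces the copy-and-paste cycle described earlier and, once scheduled, runs identically every day across teams and regions.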
More sophisticated data management techniques emerge when automation is combined with complementary AI or machine learning technology. Consider optical character recognition (OCR), which converts documents, files, and images such as emails and bank documents into standardized text. This software typically ‘learns’ from continually refined data sets, improving its accuracy over time. Incorporating machine learning into automation advances overall capability by moving beyond simple record-and-playback scripts.
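As a concrete illustration, the sketch below uses the open-source Tesseract OCR engine through the pytesseract Python wrapper to convert a scanned document into machine-readable text. It assumes Tesseract is installed locally, and the file name is a hypothetical example; a production pipeline would add the feedback loop described above.

```python
# Minimal OCR sketch using the open-source Tesseract engine via the
# pytesseract wrapper. Assumes Tesseract is installed on the machine.
from PIL import Image
import pytesseract

def extract_text(image_path: str) -> str:
    """Convert a scanned document image into machine-readable text."""
    return pytesseract.image_to_string(Image.open(image_path))

# Hypothetical scanned bank document; downstream automation would
# parse fields (dates, amounts, account numbers) from this text.
text = extract_text("bank_statement_page1.png")
print(text[:500])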
Also consider how automation can effect changes to a broader organizational operating model. A comprehensive analytics platform will involve an open approach (e.g., open, standards-based APIs exchanging formats such as JSON) to work with all data sources and storage silos and natively integrate data. The resulting open platform, available across the organization, can capture more strategic activities through broader interaction with all of that data.
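As a sketch of what open, JSON-based integration can look like, the example below pulls records from a REST endpoint and flattens the nested JSON into a table for downstream analytics. The URL, parameters, and response shape are all hypothetical assumptions.

```python
# Minimal sketch of open-API integration: fetch JSON records and
# flatten them into tabular form. Endpoint and schema are assumed.
import requests
import pandas as pd

resp = requests.get(
    "https://api.example.com/v1/transactions",  # hypothetical endpoint
    params={"since": "2024-01-01"},
    timeout=30,
)
resp.raise_for_status()

# Flatten nested JSON (e.g., {"merchant": {"name": ...}}) into columns
records = resp.json()["data"]        # assumed response envelope
df = pd.json_normalize(records)
df.to_csv("transactions.csv", index=False)  # hand off to the platform
```

Because the exchange format is standard JSON, the same pattern works against any source system that exposes an open API, which is what lets the platform integrate silos natively rather than through one-off manual exports.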
The bottom line
The process of cleansing and consolidating data from multiple sources for decision making is not new; what is new is the vast number of employees equipped with data analytics tools and the speed at which data is produced and decisions are needed. These factors make it challenging to analyze data in a cost-effective way that doesn’t consume valuable employee time. For many organizations, simply applying standard automation to modern data analytics tools can yield material efficiency improvements of 30% to 40%.
Automation technologies have matured and are being validated through mainstream adoption across many financial services institutions. Adding this technology can be nonintrusive: it does not need to replace existing analytics solutions in order to alleviate the manual bottlenecks between the inflow of data and its analysis.
The right approach begins by identifying immediate areas for automation: areas characterized by highly repetitive data extraction and manipulation tasks. By productionizing these quick wins, the business case can be proven through cost rationalization, which helps align the organization on a future state for a more comprehensive, operating-model-driven solution. How far intelligent software is then applied to the decision-making process will differ for each organization. But given the safe assumption that the data backlog will only intensify, getting ahead of this curve offers both cost relief and analytics differentiation.