7 Key Reasons to Outsource Your Firm’s Data Aggregation
By Jennifer McMackin
The financial services industry is among the most data-intensive sectors in the global economy. In our experience working with buy-side investment managers as well as solution and outsourced service providers around the world, we know that a myriad of front-, middle- and back-office processes and workflows rely heavily on efficient access to high-quality data.
But acquiring, validating, normalizing and enriching data when and where it’s needed remains a key challenge – not only for the investment management community, which relies on that data to drive reconciliation, fee billing and other workflows, but also for platform vendors and service providers across the wealth management and capital markets spectrum.
Here are seven main reasons why organizations should outsource their data aggregation.
1. Resource Optimization
Data aggregation can be a highly manual and resource-intensive process, from setting up and maintaining links with disparate providers – each with a different method, format or timeline for delivering data – to normalization and validation. Some still require teams to scrape data sets manually, creating even more work and exposing the firm to various types of risk due to delays and a higher potential for errors. For most firms, this is not the best use of their time and resources.
2. Data Richness
A provider that specializes in data aggregation can provide rich data sets from a wide range of sources beyond custodians, including asset managers, prime brokers, retail brokers and hedge funds. In addition, the provider should be able to collect all the data points available from a custodian – not just a subset based on a particular reconciliation or operations platform, and not limited to a distinct set of fields delivered in a SWIFT message.
3. Data Quality
Data aggregation services help ensure high data quality by implementing controls and validation processes. Examples include monitoring data completeness and data and file types, and identifying missing data, day-over-day changes in file size, and changes in file specifications.
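As an illustration only – the check names, thresholds and field layout below are hypothetical, not any provider’s actual controls – the kinds of file-level validations described above might be sketched as:

```python
import os

def validate_daily_file(path, expected_columns, prior_size,
                        size_drift_tolerance=0.5):
    """Hypothetical file-level checks mirroring the controls described
    above: file presence, expected field layout, and day-over-day size
    drift. Returns a list of issues (empty if the file passes)."""
    issues = []

    # Completeness: the file must exist and be non-empty
    if not os.path.exists(path) or os.path.getsize(path) == 0:
        return ["file missing or empty"]

    # Specification check: the header must contain the expected fields
    with open(path) as f:
        header = f.readline().strip().split(",")
    missing = [c for c in expected_columns if c not in header]
    if missing:
        issues.append(f"missing fields: {missing}")

    # Size drift: flag large day-over-day changes in file size
    size = os.path.getsize(path)
    if prior_size and abs(size - prior_size) / prior_size > size_drift_tolerance:
        issues.append(f"file size changed by more than {size_drift_tolerance:.0%}")

    return issues
```

A real aggregation service layers many more controls on top of this, but the principle is the same: every inbound file is tested against what was expected before it flows into downstream systems.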
4. Data Timeliness
Data delays create inefficiencies across a firm’s mission-critical processes. Typically, data comes in at various times and in many different formats, and there is no easy way for internal staff to validate the data before it moves on to the next stage. A data aggregation provider ensures all source files are collected, each validated file is sent to the firm without delay, and any late-posted data is retrieved and sent to the firm as it becomes available.
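In minimal form – with hypothetical file names and schedules, and none of the retry or alerting logic a real service would carry – tracking expected versus received files and surfacing late arrivals might look like:

```python
from datetime import datetime, timedelta

def late_files(expected, received, now, grace=timedelta(hours=2)):
    """Return the names of expected files whose scheduled delivery time
    (plus a grace period) has passed without the file arriving."""
    return [name for name, due in expected.items()
            if name not in received and now > due + grace]

# Hypothetical delivery schedule for two custodial feeds
schedule = {
    "custodianA_positions": datetime(2024, 1, 2, 6, 0),
    "custodianB_positions": datetime(2024, 1, 2, 9, 0),
}
arrived = {"custodianB_positions"}
overdue = late_files(schedule, arrived, now=datetime(2024, 1, 2, 10, 0))
```

Here `custodianA_positions` would be flagged as overdue, prompting the provider to chase the source and retrieve the late-posted data on the client’s behalf.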
5. Data Ownership
When a firm outsources the data aggregation process, the relationship and connectivity with each data supplier are managed by the provider while the client maintains data ownership. This ensures the client maintains access to data sources online as well as owns their high-quality data for use across multiple processes and workflows – without manual or redundant effort.
6. Activity and Status Reporting
With access to a tracking and reporting dashboard that provides both a summary and detailed view of all the firm’s incoming and expected files and the provider’s response to source issues, teams have full transparency of data status at all times. As a result, they can anticipate and prepare for any necessary adjustments in downstream workflows.
7. Data Auditability
By constantly monitoring feeds and the audit status of month-end data, the data aggregation provider is able to deliver audited data and PDF statements as they become available. This saves the client yet another step involved in memorializing the data and ensures the safety and integrity of documented data records potentially needed for compliance and governance reviews or audits.
Adding Value Across Workflows
When firms are constrained from collecting high-quality data, all other systems and processes that rely on it will struggle – from portfolio management and accounting to reconciliation, compliance, and numerous other operations functions.
The process of managing data feeds, tracking down missing data, and adjusting systems and workflows to changing file formats requires ongoing attention from operations and technology staff. However, many firms lack the time and resources required to perform the task thoroughly, accurately and efficiently.
The most efficient and cost-effective way for firms and providers to obtain the timely, accurate data they need is to work with a proven, full-service data aggregation provider – one that offers full automation, high-quality data, Tier 1 data security, business continuity and dedicated client support.
# # #
Jennifer McMackin is the senior vice president and lead for Electra Data, an operations service that helps investment managers overcome the challenges of aggregating, validating and enriching data to drive multiple post-trade workflows. She also oversees Electra Managed Services, a scalable and cost-effective reconciliation service that helps firms simplify workflows, increase team productivity, and repurpose staff toward higher-value work.