Data has become the driving force behind modern businesses. The more you have, the greater your perceived potential. As a Data Steward, which is more important – the Quantity or the Quality of your content? The answer may seem obvious, but you might be surprised how often these get misprioritized.
The Ask
Having worked for some of the biggest data aggregation services in FinTech, I can tell you how important a comprehensive data platform is for customers. The primary reason they hire data aggregators is to compile comprehensive data into a single source. They want to know:
- Can you get the data?
- Where will the data reside?
- How can we access the data?
- Is the data clean?
Oddly enough, the last question is portrayed as the least critical of the four, and Data Stewards often make false promises about their data cleansing capabilities.
The Problem
While the majority of retirement account data can be obtained from a small set of mega data providers, there is a plethora of data feeds for sub-sets of financial data. These lesser-known entities are riddled with inaccuracies and lack both format governance and delivery reliability. However, that doesn’t preclude clients from requesting them.
The Data Consumer’s goal is a central repository for all things data. Data Stewards desperately want to meet that goal. Their sales teams believe that promising something the competition can’t (access to unreliable data) will win the bid or make their service to existing customers that much more sticky. This is usually a short-term victory but ultimately damages the relationship and the reputation of the service.
Damage to Data Stewards
For Data Stewards with existing data schema and cleansing technology, these unreliable data feeds wreak havoc under the hood. Sales and Leadership assume the existing tech can massage and map the data in the same way as existing reliable sources, resulting in an under-funded project bid. When data managers are honest in their assessment of the work, Leadership warns that a client will not pay for a proper implementation of such a small sub-set of data – and they are right.
Schema expansion is expensive for the Data Steward and the Data Consumer. While the repository is expanded to house new sets of data, the expansion has a trickle-down effect of additional effort: newly designed front-end systems, reports, APIs, data mappings and technical specifications, to name just a few. Thus, the decision is often made to repurpose the current schema and map to existing front-end fields where the data is similar, but ultimately different. This compromise can be detrimental to your quality reputation.
Damage to Data Consumers
For Data Consumers, what once was a reliable repository has suddenly lost its luster. The existing cleansing tools aren’t able to remedy the extreme variances in the new data source’s formatting, resulting in garbage in, garbage out. Data that was forced into existing schema returns unexpected results.
Take, for example, retirement account balances, which are mostly reported as assets. If you map a held-away account balance such as a loan (a liability) into that schema, front-end systems may not be able to distinguish between the two classifications and will report them inaccurately. While this is a trivial example, and one that can be remedied if properly implemented, it is what can happen when shortcuts are taken for short-term gains. Much larger and more complicated data anomalies can occur.
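The asset-versus-liability problem above can be sketched in a few lines. This is a minimal illustration, not any vendor’s actual schema – the record layouts and field names are hypothetical:

```python
# Legacy, repurposed schema: every held-away balance lands in a single
# "balance" field with no classification, so a loan looks like an asset.
legacy_records = [
    {"account": "401k_plan", "balance": 250_000},
    {"account": "personal_loan", "balance": 40_000},  # actually a liability
]

# A front-end system summing the repurposed schema overstates net worth.
naive_net_worth = sum(r["balance"] for r in legacy_records)

# Expanded schema: an explicit classification field lets downstream systems
# apply the correct sign to liabilities.
expanded_records = [
    {"account": "401k_plan", "balance": 250_000, "classification": "asset"},
    {"account": "personal_loan", "balance": 40_000, "classification": "liability"},
]

correct_net_worth = sum(
    r["balance"] if r["classification"] == "asset" else -r["balance"]
    for r in expanded_records
)

print(naive_net_worth)    # 290000 – the loan is counted as an asset
print(correct_net_worth)  # 210000 – the loan is correctly subtracted
```

The classification field is exactly the kind of schema expansion the previous section describes: a small column on paper, but one that every report, API and front-end display downstream must learn to honor.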
Data Consumers expect perfection in reporting and UI systems. Some Data Stewards underestimate the gravity of misreporting account data. In fact, there will be times that the Steward will need to remind the Consumer of the importance of quality.
The Hard Truth
One of two things should happen when a Data Steward decides to increase their data Quantity by taking on additional feeds.
- Expand Schema, Adapt Services and Bill Fairly
- As a Data Steward, you’ve worked hard to build a robust and reliable service. Don’t let the confidence of your entire product be called into question by taking shortcuts to make the RFP response palatable for your prospective client. If the new data feed is that important to one client, decide if this is a strategic advantage for your business and if other clients would benefit from the data. Bill the client at a reasonable rate and pursue additional returns on the investment by expanding the service to other interested clients.
- Deny the Request
- If there is no strategic advantage in expanding your service to accommodate a new data feed and if the client is unwilling to shoulder the burden of the entire effort, confidently deny the request. A strong and talented Relationship Manager should be able to maintain a positive relationship without always bowing to client demands. They should be able to confidently articulate the value of the current service offering and how the unreliable data source would jeopardize the service.
GPC Group has Business Advisors with decades of experience in financial data aggregation. If you are looking to build, refine or expand your current service offering, schedule a complimentary consultation and let us put our expertise to work for you.