Data aggregation: Part art, part science

Deciphering the metadata

Data aggregation has never been more important for buy-side firms than it is today. Whether you are reconciling back to your counterparties, preparing performance reports for your clients, or considering how to meet existing and approaching industry regulations, post-trade data is at the heart of meeting these goals.

These evolving needs are pushing firms to rely on technology vendors to aggregate data for them. In addition to having scalable processes that can collect data from dozens or even hundreds of sources, vendors can also provide the key ingredient that often eludes financial services firms: data normalization. Firms that use their solutions providers wisely can take advantage of a vendor's ability to connect to a wide range of data sources while maintaining consistency in how the data is delivered to consumers.

A long-standing phrase in the data world is "garbage in, garbage out," and it is no less true when mapping account-level information from financial institutions. That is not to say that data from financial institutions is bad; rather, the saying is a reminder to ensure the data output you receive matches the requirements of your destination system and your business goals.

Any high-quality interface vendor insists on having these certainties:

  1. A Direct Relationship - with the custodian or broker, in order to get access to data in its native format directly from the source. Initial communication with the source is equally important; if the requirement involves data for reconciliation purposes, the directive to the data source is to ensure it can provide, at a minimum, daily positions, transactions and cash balances on a T+1 basis.
  2. Accurate Data Mapping - This is part art and part science. For SS&C Advent accounting systems, the goal is to transform the entire range of files from any data source's native format into files compatible with those systems. An important step involves converting transaction codes from the data source into standard SS&C Advent codes. Many large brokers and custodians have hundreds of native transaction codes that must be mapped reliably to a relatively small number of SS&C Advent-specific transaction codes. The same is true of security types. If this isn't done correctly, it can cause serious problems reconciling your accounts.
  3. Testing, Automation and Monitoring - Later phases of interface development should always involve rigorous testing, including simulated tests against the destination systems and with beta testers, especially for interfaces built for scalability and broad use.
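In practice, the transaction-code mapping described in step 2 often boils down to a lookup table plus an exception path for unmapped codes, so that nothing is silently dropped before reconciliation. A minimal sketch in Python (all code values below are hypothetical examples, not actual SS&C Advent or custodian codes):

```python
# Sketch of mapping native custodian transaction codes to a small set of
# destination-system codes. Every code value here is a made-up example.

# A custodian may use hundreds of native codes; the destination accounting
# system accepts a much smaller standard set.
NATIVE_TO_STANDARD = {
    "BUYL": "by",   # buy long
    "SELL": "sl",   # sell
    "DIVC": "dv",   # cash dividend
    "INTR": "in",   # interest received
    "FEEM": "ex",   # management fee mapped to a generic expense code
}


def map_transaction(native_code: str) -> str:
    """Map one native transaction code to a standard code.

    Unmapped codes raise an error so they can be routed to an exception
    queue for review; passing them through unmapped would cause
    reconciliation breaks downstream.
    """
    code = native_code.strip().upper()
    if code not in NATIVE_TO_STANDARD:
        raise ValueError(f"Unmapped native code: {native_code!r}")
    return NATIVE_TO_STANDARD[code]


print(map_transaction("buyl"))  # prints "by"
```

The key design point is the explicit failure on unknown codes: when a counterparty adds or changes a native code, the vendor is alerted immediately rather than discovering the gap at reconciliation time.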

The quality and consistency of data delivered to buy-side firms through broker and custodian interfaces depend heavily on the quality of the data source and on the vendor's data mapping expertise. It is not unusual for counterparties to process certain information differently, to lack detail on certain activities, or to change their mapping, requiring a vendor to act quickly. Choosing the right data aggregation vendor is critical to achieving your data collection goals.

For more information visit SS&C Advent Data Solutions or contact us.

Posted in Traditional and Alternative Asset Management
