Problems have appeared in the hedge fund marketplace, and they are not about fund management but about data management. Consider Archeus Capital, a once-proud $3 billion hedge fund that apparently suffered from administrative inadequacies, both self-inflicted and via its hedge fund administrator (HFA). The firm's closure has drawn attention to the myriad pitfalls hedge funds can encounter that have nothing to do with their ability to generate returns.
Hedge funds and their administrators must improve their operations, both to avoid the massive clerical mistakes that can take a firm down and to quell growing investor discontent and the consequent outflow of funds. The realisation of the importance of proper, rigorous administration has paralleled the rise in popularity of the hedge fund market. This awareness mirrors the increasing scrutiny the institutional asset management industry faces from regulatory authorities and its clients. As that market has adjusted, largely through enhanced processes and the take-up of key technologies, so must hedge funds seeking to maintain and grow their share of the market.
Effective hedge fund administration typically involves full maintenance of multi-currency accounting records (including some partnership accounting), calculation of fund-related fees, net asset value calculations, reporting, tax return preparation, payment of distributions and dividends, and assistance with regulatory reporting. Administrators' duties have also grown to cover areas such as trade processing, security master maintenance, corporate action processing and collateral management.
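To make the arithmetic behind one of these duties concrete: a net asset value strike is, in principle, a simple calculation, and the operational difficulty lies almost entirely in the quality of the inputs. The following is a minimal sketch with hypothetical figures and field names, not any administrator's actual methodology:

```python
from decimal import Decimal

def nav_per_share(asset_values, liabilities, shares_outstanding):
    """Strike a net asset value per share/unit.

    asset_values: market values of each position (Decimal)
    liabilities: total accrued fees, payables, etc. (Decimal)
    shares_outstanding: fund shares/units in issue (Decimal)
    """
    net_assets = sum(asset_values, Decimal("0")) - liabilities
    return net_assets / shares_outstanding

# Example: $3.2m of positions less $200k of liabilities over
# 100,000 units gives a NAV per unit of 30.
print(nav_per_share(
    [Decimal("1500000"), Decimal("1700000")],
    Decimal("200000"),
    Decimal("100000"),
))
```

The calculation itself is trivial; what the rest of this article concerns is making sure the position values feeding it can be trusted.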
The duties entrusted to this segment of the market will surely continue to grow. However, every functional area an HFA takes on relies on effective, comprehensive data management. Without a strong data management practice, as in the case of Archeus, a fund risks outright extinction.
The important question to ask is: what are the components of a strong data management practice to support the burgeoning hedge fund marketplace?
Firstly, successful data management in this context must be programmatic in nature: a strong process with effective governance must be in place to achieve the operational goals the firm seeks. That process must extend beyond the hedge fund's four walls to the administrator, ensuring the same level of data management at this trusted partner. The administrator must apply to its own operations a level of discipline with its data that matches, and often exceeds, that of its client base, since it is frequently called on as the final arbiter when discrepancies and other problems arise.
Effective data management must also be comprehensive. There are various types of data a hedge fund and its HFA must manage: security reference data, corporate actions, and customer, counterparty and positional data, to name a few. A full view of the enterprise must be available, and without each of these pieces that view cannot be achieved.
Data management must focus on quality. Hedge funds regularly strike net asset values; they must accurately portray their positions in the market for the benefit of their investors. Often this activity is complicated by the assets being valued, derivatives for instance, which are challenging to price accurately. To promote quality, data must go through a rigorous process of cleansing, which includes normalisation, verification and, where necessary, enrichment. A composite record, constructed according to a preferred hierarchy of data providers, is then produced; it can be used with confidence to power internal hedge fund applications (trading, accounting, risk management) and to synchronise with the administrator. In this manner both administration and investment operations are improved.
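To illustrate the composite-record idea, the sketch below builds a "golden copy" by walking a preferred vendor hierarchy for each field. The vendor names, fields and rankings are hypothetical, and the real cleansing steps (normalisation, verification, enrichment) are assumed to have happened upstream:

```python
# Hypothetical golden-copy construction: for each field, the
# highest-ranked vendor that supplies a validated value wins, and the
# winning source is recorded alongside the value for provenance.

VENDOR_HIERARCHY = {
    "price":    ["vendor_a", "vendor_b", "vendor_c"],
    "coupon":   ["vendor_b", "vendor_a"],
    "maturity": ["vendor_b", "vendor_a", "vendor_c"],
}

def build_composite(records):
    """records maps vendor name -> cleansed record (a dict of fields)."""
    composite = {}
    for field, preferred_vendors in VENDOR_HIERARCHY.items():
        for vendor in preferred_vendors:
            value = records.get(vendor, {}).get(field)
            if value is not None:  # assumed verified/normalised upstream
                composite[field] = value
                composite[field + "_source"] = vendor
                break
    return composite

vendor_records = {
    "vendor_a": {"price": 101.25, "coupon": None, "maturity": "2031-06-15"},
    "vendor_b": {"price": 101.30, "coupon": 4.5},
}
print(build_composite(vendor_records))
# vendor_a wins on price, vendor_b on coupon; maturity falls back to
# vendor_a because vendor_b does not carry it.
```

Recording the winning source per field is a small design choice that pays off later: it feeds directly into the audit-trail requirement discussed below.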
Consistency is another contributor to data quality; without it, a fund cannot confidently provide the valuations and other portfolio data demanded by its client base on an ongoing and timely basis. It is for the sake of consistency that data management must be conducted on a centralised basis. It is neither necessary nor realistic for every area of the firm to consume exactly the same data record, or composite. However, if the composites being created do not come from a centralised source, the fund cannot attach any level of confidence to the output over time and has no baseline from which to proceed accurately.
Consistency and centralisation are only valuable to the extent that the data is timely. As data becomes available from vendors and legacy systems, the process defined by the hedge fund and its administrator must be optimised for timeliness; that is, it must process the data as it becomes available, in as near to real time as possible.
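As an assumed illustration of this point, the sketch below processes vendor records the moment they arrive on a queue rather than holding them for an end-of-day batch; the cleansing and publishing steps are placeholders:

```python
import queue
import threading

updates = queue.Queue()  # feed handlers enqueue vendor records on arrival

def cleanse(record):
    # Placeholder for normalisation, verification and enrichment.
    return {k: v for k, v in record.items() if v is not None}

def publish(record):
    print("published:", record)  # stand-in for downstream distribution

def worker():
    # Handle each record as soon as it lands, not in a nightly batch.
    while True:
        publish(cleanse(updates.get()))
        updates.task_done()

threading.Thread(target=worker, daemon=True).start()
updates.put({"id": "XS0123456789", "price": 101.25, "coupon": None})
updates.join()  # returns once everything queued so far is processed
```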
Centralisation of data brings value not only through ongoing operations. It is likely that the administrative difficulties Archeus faced will not soon be sorted out, if ever. Why? One reason is the lack of an audit trail. A fund optimised for data management can audit down to the record level. This gives regulators what they need, but it can also be used to market the fund more effectively: potential investors can be confident in the fund's ability to track its activities and provide a rationale for its investment approach.
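A record-level audit trail can be as simple as an append-only log of every change, capturing the value before and after, the source, and a timestamp. A hypothetical sketch:

```python
import datetime

audit_log = []  # append-only: never updated or deleted, only extended

def set_field(record, field, value, source):
    """Apply a change to a record and log it at the record level."""
    audit_log.append({
        "ts": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "record_id": record.get("id"),
        "field": field,
        "old": record.get(field),
        "new": value,
        "source": source,
    })
    record[field] = value

bond = {"id": "XS0123456789"}
set_field(bond, "price", 101.25, "vendor_a")
set_field(bond, "price", 101.30, "vendor_b")  # later revision, fully traceable
```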
Since records will be (or should be) maintained at both the hedge fund and its administrator, clear communication is also critical to effective data management. While the use of standards to simplify communication is valuable, the most crucial aspect of communication is that it is defined in the data management process (what is to be communicated, how often, and via what medium) and agreed to by both the hedge fund and the administrator.
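One way to make such an agreement operational is to capture it as data rather than leaving it to convention. The schedule below is purely illustrative; the deliverables, frequencies and media are invented:

```python
COMMUNICATION_SCHEDULE = [
    {"content": "position file",     "frequency": "daily",    "medium": "sftp"},
    {"content": "NAV pack",          "frequency": "monthly",  "medium": "secure email"},
    {"content": "breaks/exceptions", "frequency": "intraday", "medium": "api"},
]

def due_today(schedule, is_month_end):
    """Return the deliverables owed today under the agreed schedule."""
    return [item for item in schedule
            if item["frequency"] in ("daily", "intraday")
            or (item["frequency"] == "monthly" and is_month_end)]

print(due_today(COMMUNICATION_SCHEDULE, is_month_end=False))
```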
If a hedge fund has sought to maximise data quality through careful sourcing and construction of composite records, pursuing consistency and centralisation throughout, it will be in a position to benefit from improved investment decisions and significantly enhanced client service. As hedge funds continue to gain exposure to the broader, more mainstream investment marketplace, their ability to implement and refine processes around general administration will be as crucial to their continued survival as their investment return capabilities. Data management is the foundation of any successful attempt to sort out the problems faced today.
The role of HFAs in facilitating this process will continue to grow, and their data management practices will likely become the baseline for the entire industry. A good example of this new breed of HFA is OpHedge, a new entrant in the HFA marketplace. The firm's focus is firmly placed on customer service, and that focus is borne out by strong data management practices.
OpHedge has streamlined its reporting and data management process by creating, maintaining and publishing a standardised reference data repository for all financial instruments. The solution centralises securities master data from multiple data providers, cleanses it, then publishes it to OpHedge's internal securities and exotics trading systems. OpHedge can develop pricing solutions around vendor data to suit market hierarchies or tap into a 'securities of interest' list hosted on the real-time GoldenSource platform. OpHedge now has the flexibility to tailor data services cost-effectively for its clients.