Tackling the Transparency Challenge

Defining a methodology for valuing complex OTC instruments

A JOINT WHITE PAPER FROM CMA AND NUMERIX
Originally published in the March 2009 issue

"Financial innovation has also brought, inevitably, the challenge of complexity…Perhaps the best example of innovation is the over-the-counter (OTC) derivatives markets. These markets have grown tremendously; but the infrastructure has not kept up – and it must” Secretary Henry M. Paulson, Jr, formerly US Department of the Treasury.

Though the credit crisis appears still to be in its infancy, a clear change in the way institutions value and manage complex derivatives and structured products is currently underway. Not only has the credit crisis brought market inefficiencies to light, it has also forced institutions to evaluate, from an enterprise perspective, how they participate in complex markets – in terms of valuation, model validation, risk management and overall internal controls, as well as enterprise-wide pricing policies.

It is well understood that the valuation space is receiving far more attention from regulators, accountants and financial institutions as a result of the credit crisis and widespread mis-pricing. Pursuant to this, today’s institutions need to develop valuation methodologies and systems that are consistent, well-defined and repeatable – from front to back office. Clearly, FAS 133/157 and IAS 39 are key regulations and guidelines governing asset valuation, but many other regulatory authorities and trade associations are also weighing in on these issues today.

With the regulatory landscape changing, the new mandate for institutions is transparency and consistency. Institutions must invest in clearly defensible valuation policies. Whenever a firm marks a position on its books, it must be able to produce an auditable report that explains the source of the valuation. This can be either a mark-to-market using an executable quote, or a mark-to-model. The latter can come from a valuation service or from internal software systems. In both cases, the software model configuration and market data should be recorded at the valuation date.
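To make this concrete, the sketch below shows what such an auditable record might contain. It is a minimal illustration only: the field names and structure are assumptions, not a prescribed or vendor-specific format.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Any, Dict

@dataclass
class ValuationRecord:
    """Hypothetical audit-trail entry for a single mark (illustrative only)."""
    instrument_id: str
    valuation_date: date
    source: str                                   # "mark-to-market" or "mark-to-model"
    value: float
    model_name: str = ""                          # populated for mark-to-model valuations
    model_settings: Dict[str, Any] = field(default_factory=dict)
    market_data_snapshot: Dict[str, Any] = field(default_factory=dict)

# A mark-to-model entry captures the model configuration and the market data
# used at the valuation date, so the mark can be reproduced and audited later.
record = ValuationRecord(
    instrument_id="EXAMPLE-CDS-5Y",
    valuation_date=date(2008, 12, 8),
    source="mark-to-model",
    value=-1_250_000.0,
    model_name="standard_cds_pricer",
    model_settings={"recovery_rate": 0.40, "curve_interpolation": "log-linear"},
    market_data_snapshot={"cds_spread_bps": 2082, "discount_curve": "USD-20081208"},
)
```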

Best practices: implementing a successful pricing policy
In terms of best practices, institutions implementing successful pricing policies will need to introduce transparency into the valuation process and adhere to the following standards:

• Use robust, quality data
• Enforce consistency by enabling reports to be run the same way from one day to the next
• For illiquid and hard-to-value securities, demonstrate how a particular value was determined
• In the front, middle and back offices, all valuations need to be clearly auditable through examination of the pricing policy used
• Most importantly, the valuation methodology needs to be well-defined and repeatable

An ‘enterprise pricing policy’ – a codified and repeatable methodology for valuing specific instruments – creates an audit trail for how prices and valuations are derived, from front to back office.

Valuation challenges and solutions
One of the main challenges for institutions today is developing the support needed for increasingly complex OTC instruments, including structured credit, structured notes and hedges. The trend is for institutions to continue investing in flexible software systems for the valuation of derivatives and other sophisticated financial products. There are several key issues that these solutions need to address: in the front office, the development of standards between desks that support different asset classes, model validation for new products, and trade capture; and throughout the institution, connectivity between data and system resources.

The good news is that valuation systems have evolved in their support for complex financial instruments. The Numerix product offers a modern architecture that is inherently ‘future proof,’ in that it supports the flexibility to describe financial instruments with an ‘object-oriented approach’; this enables end users to handle new products without requiring an update of the software. Additionally, the pricing mathematics is separated from the instrument, allowing for even greater adaptability. Recent innovations include multi-threading to take advantage of multi-core CPUs and grid networks, as well as 64-bit architectures.
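The separation of the instrument description from the pricing mathematics can be illustrated with a minimal sketch. The class names and the trivial discounting model below are assumptions made purely for illustration; they do not reflect the Numerix API or any particular product design.

```python
from abc import ABC, abstractmethod

class Instrument(ABC):
    """Describes contractual terms only; knows nothing about how it is priced."""
    @abstractmethod
    def cashflow_description(self) -> dict: ...

class PricingModel(ABC):
    """Encapsulates the pricing mathematics; can be swapped independently."""
    @abstractmethod
    def price(self, instrument: Instrument, market_data: dict) -> float: ...

class FixedCouponNote(Instrument):
    def __init__(self, notional: float, coupon: float, maturity_years: int):
        self.notional = notional
        self.coupon = coupon
        self.maturity_years = maturity_years

    def cashflow_description(self) -> dict:
        return {"notional": self.notional, "coupon": self.coupon, "years": self.maturity_years}

class FlatDiscountingModel(PricingModel):
    def price(self, instrument: Instrument, market_data: dict) -> float:
        terms = instrument.cashflow_description()
        r = market_data["flat_rate"]
        coupons = sum(terms["notional"] * terms["coupon"] / (1 + r) ** t
                      for t in range(1, terms["years"] + 1))
        return coupons + terms["notional"] / (1 + r) ** terms["years"]

# A new product requires only a new Instrument subclass; existing models that
# understand its cashflow description can price it without software changes.
note = FixedCouponNote(notional=100.0, coupon=0.05, maturity_years=5)
print(FlatDiscountingModel().price(note, {"flat_rate": 0.04}))
```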

Selecting a pricing model
Institutions typically choose from several available pricing models, based on the ability of the model to ‘capture’ various risk factors that affect particular instruments. For example, some models are more adept at capturing the ‘smile effect’ than others; when a deal is particularly sensitive to the volatility smile, one of these models is needed. Other models are more proficient at capturing ‘fat tails,’ ‘correlation skews’ and ‘correlation dynamics,’ which are particularly relevant to credit. Often, there is a trade-off between accuracy and performance. It is always desirable to use the model with the highest performance; however, alternative models may be selected based on their ability to handle a wider range of market effects. Institutions continually back-test model performance and run ‘sensitivity analysis’ reports that can empirically demonstrate model effectiveness under different conditions and settings. In all cases, it is critical for institutions to use a ‘pricing policy’ database that can store model selection and settings. This database should be referenced for all valuation reports – enforcing consistency – and the choices should be saved along with the report – enforcing transparency.
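One possible shape for such a pricing policy store is sketched below. The instrument types, model names and settings are hypothetical placeholders; the point is only that model choices are resolved from a single store and recorded with each report.

```python
# Hypothetical pricing-policy store: maps an instrument type to the model and
# settings mandated for it, so every desk and every report prices the same
# instrument the same way.
PRICING_POLICY = {
    "vanilla_swaption":  {"model": "black", "settings": {"vol_surface": "ATM"}},
    "bermudan_swaption": {"model": "hull_white_1f", "settings": {"mean_reversion": 0.03}},
    "cms_spread_option": {"model": "libor_market_model", "settings": {"factors": 2}},
}

def resolve_policy(instrument_type: str) -> dict:
    """Look up the mandated model and settings; fail loudly if no policy exists."""
    try:
        return PRICING_POLICY[instrument_type]
    except KeyError:
        raise ValueError(f"No pricing policy defined for '{instrument_type}'")

# Consistency: the model is always resolved through the store.
# Transparency: the resolved choices are saved alongside the valuation report.
policy = resolve_policy("bermudan_swaption")
report = {"instrument_type": "bermudan_swaption", "policy_used": policy, "value": None}
```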

The New World: central clearing houses for OTC credit derivatives
The launch of central clearing houses/counterparties for OTC credit derivatives, as planned by a number of players in the market, will have a powerful impact on the valuations space – particularly with respect to price transparency. One example of an organisation that has promoted and advanced the concept of a central counterparty for OTC credit is the CME Group, the largest derivatives exchange in the world. Unlike clearing platforms that are dealer-led, CME is market-neutral – ready and willing to do business with institutions on both the sell and buy sides. As well as helping to reduce counterparty risk in the OTC credit market, central counterparties provide the CDS market with an offering that will foster a greater degree of transparency.

As this relates to pricing, an increase in transparency can be expected to enhance market-based price discovery, as the most accurate pricing data is required for mark-to-market. This will be a critical development as CDS pricing has historically been biased toward sell-side valuations, which often fail to accurately reflect true market values, as has been graphically and painfully illustrated by the numerous write-downs and re-valuations of credit derivative positions in sell-side institutions.

In contrast, CME can serve both the buy-side and the sell-side in a manner which allows valuations to be driven by traded market levels – and not by the indicative valuations of sell-side firms, many of whom have an economic interest in the valuations they provide to their clients. Companies currently offering dealer-based valuations will likely begin to feel some pressure, as the competition intensifies and the value and reliability of the dealer-based model are called into question.

The importance of data quality and consistency
The pressure for consistent use of pricing data is being felt between institutions, as well as within them. To avoid conflict and misunderstanding, institutions need to ensure that the data they see is the same data seen by their counterparties, risk managers, administrators and auditors; that can only happen if everyone is using the same data and the same management system. Firms that would previously have developed proprietary, in-house valuation models are turning to specialist providers to help them with accurate and timely valuations – particularly if their portfolios contain many types of complex assets, since the time, cost and expertise required to develop and test these systems can be prohibitive, particularly for smaller firms.

Institutions need consistently reliable and accurate data to feed valuation systems. In the hedge fund world in particular, we see that managers are constantly improving their client reporting and transparency. This has led to portfolio valuations becoming more frequent – weekly or even daily in some cases. Therefore, valuation providers need to ensure their service is agile enough to allow fund managers to provide accurate and timely valuations to their investors, regardless of their size or the types of assets they are using.

Due diligence on your data provider should include an examination of both the source of the data and the processes used to ‘clean’ or monitor the data quality on an ongoing basis. In contrast to many OTC market data providers, CMA sources its credit data from a unique buy-side consensus model based on observed, tradable quotes, as opposed to dealer markets. This not only has the advantages of eliminating dealer biases and reflecting true market values, but also ensures the data is as fresh as possible, so clients can be better informed and respond faster to changing market conditions. Data quality monitoring will vary across provider firms, but employing a combination of automated and manual cleaning processes will help eliminate outliers, quickly identify suspect data, correct typos or identify new entities that are not yet recognised by the system. Because so many trading, risk and accounting systems rely on the best possible data, nearly 25% of CMA’s workforce is focused on ongoing data quality control.
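Purely as an illustration of the kind of automated check such a cleaning process might include – this is a generic median-absolute-deviation filter, not a description of CMA’s actual methodology – a suspect-quote flag could look like this:

```python
import statistics

def flag_suspect_quotes(quotes_bps, threshold=3.0):
    """Flag contributed quotes that sit far from the consensus, using a simple
    median-absolute-deviation test. Flagged quotes would go to manual review
    rather than being discarded automatically."""
    median = statistics.median(quotes_bps)
    mad = statistics.median(abs(q - median) for q in quotes_bps) or 1e-9
    return [q for q in quotes_bps if abs(q - median) / mad > threshold]

# Example: one contributed CDS quote is clearly out of line with the rest.
contributed = [2075, 2082, 2090, 2078, 3500]
print(flag_suspect_quotes(contributed))  # -> [3500]
```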

AN EXAMPLE OF CDS VALUATIONS
In the OTC markets, valuations are often calculated based on the mid price of a derivative – the average of the bid and offer. This approach is often used because data providers are unable to offer reliable bid and offer values as well as a mid. CMA is unique in providing buy-side consensus-based bid and offer prices, as well as mid values, for CDS. In the example below we look at how the choice of mid or bid can drastically affect the valuations process:

On December 8th, 2008, the CDS for Kazcommerzbank had a mid value of 2082 bps. The bid was 1932 bps and the offer 2232 bps. If you bought protection on Kazcommerzbank as the market closed at the end of the day at 1932-2232, then you would have paid 2232 bps for the CDS. When you come to mark your book minutes later, the level you choose to value against is all-important. In this case, if you used the mid for valuation you would mark the position at 2082 bps. However, if you chose to enter an offsetting position by selling protection, you could only expect to receive 1932 bps. Thus the value of your position is drastically overstated by marking to the mid: in this case you would be overvaluing your position by 150 bps. An investor with a wide variety of CDS positions may encounter unexpected portfolio devaluation if they consistently use the mid price to value their open positions, for the simple reason that they will not be able to offload the CDS into the market at the price they expect to receive.
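The arithmetic behind this example can be sketched directly (figures taken from the text above; converting the spread difference into a money P&L would additionally require the position’s notional and risky duration, which are omitted here):

```python
bid_bps, mid_bps, offer_bps = 1932, 2082, 2232

entry_level = offer_bps      # protection was bought at the offer
mark_at_mid = mid_bps        # common practice: mark the position at the mid
exit_level = bid_bps         # but an offsetting sale of protection hits the bid

overstatement_bps = mark_at_mid - exit_level
print(overstatement_bps)     # 150 bps: the amount by which a mid mark overstates the position
```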

CMA pioneers ways to boost the effectiveness of OTC credit market professionals. CMA’s real-time pricing services and data are used by investment professionals in over 130 leading investment banks, hedge funds, and asset managers worldwide to improve trading performance and to provide valuable information not only for the front office, but also for risk, finance and research groups. CMA DataVision is available to more than 260,000 users around the world via the Bloomberg Professional service.

Numerix is the leading analytics institution providing cross-asset solutions for structuring, pre-trade price discovery, trade capture, valuation and portfolio management of derivatives and structured products. Numerix is the only analytics company to support derivatives and structured products across all major asset classes, including credit, equity, fixed income, inflation, commodities, hybrids and foreign exchange/cross-currency. Delivered on an integrated framework with all the fundamental building blocks needed to create and build unlimited structures, the sophisticated layered architecture allows for the construction and valuation of the most complex exotic deals. Given the flexibility with which our analytics can be deployed, over 45 partners, including most of the world’s largest STP systems, offer Numerix analytics inside their systems.