Workflow vs. Performance

Making the most of your technological tools

Kirsti Suutari and Charles Fiori, Reuters
Originally published in the November 2006 issue

The ability of hedge funds to deliver market-beating returns for their investors is dependent on many factors. At the head of the list are trading skills, market knowledge, the willingness to assume risk, and the ability of a trader to do all of this just a bit better than the competition. In the end, however, the brilliance of the market strategy and the execution skill of the trader won’t make any difference if the quality of the data inputs and the execution tools aren’t up to scratch. Indeed, a suitable analogy would be employing a master carpenter to build one’s home whilst only allowing him access to poor quality tools.

No matter what your position within the market – whether you are a broker, fund manager, or trader – when you look to create high returns the process is largely the same: ideas are spawned and tested; decisions are made; orders are generated, executed and settled; and records are kept to ensure regulatory compliance, transparency, and for performance measurement purposes.

Traditionally this process involved a highly manual component, but increasingly these functions are being migrated from people to machines, and in many cases are even being outsourced. This migration imposes even more stringent performance burdens. Whether the process is manual or automated, several themes pervade it: the quality of information, as measured by completeness, breadth, depth, and accuracy; latency, or speed; and performance, chiefly where automation is concerned.

Open any trade publication and expect to be inundated by marketing messages from a parade of data and technology suppliers advertising their versions of the “better mousetrap” which will simplify processes, improve performance, increase speeds, and even butter your toast.

Chances are many of these products actually do what is claimed on the box, but rarely do they provide a comprehensive answer to all of a fund’s workflow needs. In the end, it is up to the fund manager to determine which pieces are required and to interconnect and integrate them into the enterprise’s operations. When a fund manager’s expertise lies in investment ideas rather than C++ programming, the process can be extremely daunting.

One must ask, then, is it possible to pull together a performing set of tools to meet workflow requirements without enormous IT and market data budgets?

Perhaps the issue would be best illustrated with an example. One of the more advanced tools a statistical arbitrage hedge fund will use is an automated trading platform where alpha-generating investment strategies are composed, translated into a computer program, and allowed to generate trades without human intervention.

This can be very lucrative if properly done. The translation of human processes into a computerized implementation by its nature places technology at the core of the process: it changes the requirements for accuracy and speed, since a machine cannot exercise judgment as to what is “good” or “bad”, and a machine’s speed and capacity far exceed that of a human being. Furthermore, it means that the various components that are needed to compose a system like this must act in concert, in the same manner as do musicians in an orchestra.
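
To make this concrete, here is a minimal sketch in Python of the loop such a platform runs – streaming data in, a model signal out, an order generated without human intervention. The strategy, symbols and sizes are entirely illustrative assumptions, not any vendor’s product.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Tick:
    symbol: str
    bid: float
    ask: float

@dataclass
class Order:
    symbol: str
    side: str          # "BUY" or "SELL"
    quantity: int

class MeanReversionStrategy:
    # Toy model: trade when the mid-price strays from its recent average.
    def __init__(self, symbol: str, window: int = 20, threshold: float = 0.01):
        self.symbol = symbol
        self.window = window
        self.threshold = threshold
        self.history: List[float] = []

    def on_tick(self, tick: Tick) -> Optional[Order]:
        mid = (tick.bid + tick.ask) / 2.0
        self.history.append(mid)
        if len(self.history) < self.window:
            return None                                   # not enough data yet
        average = sum(self.history[-self.window:]) / self.window
        if mid < average * (1 - self.threshold):
            return Order(self.symbol, "BUY", 100)
        if mid > average * (1 + self.threshold):
            return Order(self.symbol, "SELL", 100)
        return None

def run(strategy, ticks):
    # In a real platform this loop would be wired to a feed handler and an
    # execution gateway; the print stands in for routing the order to market.
    for tick in ticks:
        order = strategy.on_tick(tick)
        if order is not None:
            print("route to execution:", order)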

“Just as one would prepare for a long vacation, be sure to plan the entire ‘data & technology trip’ from beginning to end”

The shopping list needed to pull an automated trading system together will include the information inputs, or data, and the technology that uses the data to generate decisions – generally what to trade, how much, where, and when – and to act on those decisions.

The data elements

The data elements are many: fast or low latency streaming data to monitor current markets; cross-asset content for hedging strategies or handling currency differences; extensive tick history for research, back testing and predictive analysis; corporate actions data for normalizing and maintaining clean security master files; and hopefully some leading edge inputs, such as automated or machine readable news.

The latter represents the cutting edge for investment models since news text has historically been interpreted only by humans. The information that news contains is relevant to market behaviour, and is of great value to an automated trading process if its content can be rendered readable by a computer.
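
A hedged sketch of the idea – the fields and scores below are hypothetical, not a description of any particular machine-readable news product – shows how a scored news item, arriving as structured data, could be mapped to a directional signal.

news_item = {
    "timestamp": "2006-11-01T13:42:05.120Z",
    "symbol": "ABC.N",
    "topics": ["EARNINGS"],
    "sentiment": -0.7,   # -1.0 (very negative) .. +1.0 (very positive)
    "novelty": 0.9,      # how new the story is relative to earlier items
}

def news_signal(item, min_novelty=0.5, threshold=0.5):
    # Map a scored news item to a simple directional signal, or no action.
    if item["novelty"] < min_novelty:
        return None                          # a repeat of an earlier story
    if item["sentiment"] >= threshold:
        return ("BUY", item["symbol"])
    if item["sentiment"] <= -threshold:
        return ("SELL", item["symbol"])
    return None

print(news_signal(news_item))                # ('SELL', 'ABC.N')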

The technology elements are also many: applications to build, test, optimise and run models; others to warehouse the historical data for instant retrieval; engines to analyse the historical and current streaming data to generate decision inputs for the models; and still others to make sure the resultant orders are executed in the market. As a list, it appears to require significant technical expertise to choose, assemble, own and operate all the items. An informed shopper, however, will understand ways to reduce the potential technical overhead.

Let’s start with the data. No matter how great the skill of a particular trader, he or she will have a difficult time reaching the fund’s return goals without access to a full set of data sources that are broad, deep, reliable, fast, and of very high quality.

Further, it is equally important that each of the data components is compatible with the others, to avoid introducing unnecessary complexity and impairing performance. If one source – say, history – is indexed by CUSIP, and another – say, streaming low-latency market data – is requested by exchange symbol, then yet another system must be devised whereby both can be requested by the same identifier. It is much easier if both natively understand the same naming conventions and observe the same field structures, so that requesting the “bid price” retrieves the same content from each service.
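
A small sketch of such a mapping layer, with made-up identifiers and field names, illustrates the point: one internal name resolves to whatever key and field each service natively understands, so both can be queried the same way.

symbology = {
    "ACME_CORP": {"cusip": "000123456", "exchange_symbol": "ACME.N"},
}
field_map = {
    "history_service":   {"bid_price": "BID_CLOSE"},
    "streaming_service": {"bid_price": "BID"},
}

def request(service, internal_id, field, fetch):
    # Translate the internal identifier and field name into the keys the
    # chosen service natively understands, then delegate to its fetch call.
    keys = symbology[internal_id]
    native_id = keys["cusip"] if service == "history_service" else keys["exchange_symbol"]
    native_field = field_map[service][field]
    return fetch(native_id, native_field)

# Both calls ask for "bid_price" by the same internal name:
#   request("streaming_service", "ACME_CORP", "bid_price", stream_fetch)
#   request("history_service",   "ACME_CORP", "bid_price", history_fetch)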

From an operations standpoint, there are a few obligations that can work to a fund’s detriment if ignored. For example, exchanges increasingly compete for liquidity, and in doing so generate regular changes to their outputs that must be managed by streaming feed handlers. Further, market data delivery rates are escalating at a pace that puts heavy emphasis on capacity management: moving up to more or bigger computers becomes necessary, and requires the associated technical expertise. Ignore these factors at your performance peril. Managing them, however, may not require an entire IT department. Consider a managed service: the more fully these intricacies can be handled by your supplier, the less you need to be concerned with managing them yourself.

Besides the data needed to make informed trading decisions, expected developments in the regulatory environment will also require a hedge fund to access reliable data to assure transparency of the fund’s activities and to support a robust compliance function within the fund’s parent company. To the extent that existing components can serve compliance requirements – for instance, audit trail, storage, and retrievability – they will also reduce the need for additional investment in technology and implementation resources.

The big picture approach

Let’s discuss the various considerations a hedge fund needs to address when evaluating data vendors, sources, and suppliers. It does in fact matter whether a fund emphasizes domestic or international equities, fixed income, commodities, or a specific market; yet there will always be a large cross-asset component to fund management. This creates the need for a big-picture, holistic approach to the management of data usage, including oversight of markets, sources, and data quality.

Hedge funds – and indeed most consumers of data – tend to keep even the basics of data management at arm’s length, neglecting the ongoing analysis needed to determine whether the data they receive matches what the business requires. Data management should be as integral a part of a fund’s workflow as the trading system.

Of the components required to conduct a proper, detailed analysis of enterprise-wide data usage, it could be argued that the most neglected is data quality. Attention to data quality will prove a valuable use of time and resources, as it allows the fund to focus on creating returns. Second-guessing decisions because of suspected bad data generates costs that, while difficult to measure exactly, are prohibitive and painful to absorb.

This becomes an even more critical consideration if that same data will ultimately be used for compliance and transparency monitoring. Hedge funds will rightly complain to a vendor or data provider when faced with bad data – wide or single-sided markets, missing data or spikes, timeliness and latency issues – but such complaints tend to be driven event by event rather than by an honest look at longer-term trends.

For a fund to properly analyze its data usage, it first needs to understand what it is doing with the data it purchases. Amazingly, funds often pay for data they do not use, and also fail to fully use the data they do purchase. Data is not free of charge. Paying only for the data your fund requires, and using the data you do purchase more efficiently, is a simple and quick way to cut costs and improve the fund’s bottom line.

This is less difficult than one might imagine. Successful and respected data providers make it part of their ethos to act as business partners with the customers who buy their data products. For a fund manager to truly reap the full benefit of their data, an active, regular two-way dialogue between the customer and the data provider is needed. This dialogue enables the data provider to fully understand the customer’s need for and use of data content, and the requisite latency, thereby increasing the likelihood that the fund manager is getting the data – and only the data – it needs to achieve alpha and enterprise efficiencies. The customer will not only gain immediate savings from lower data consumption costs, but will also achieve long-term salutary effects as a result of active data management.

Data must be reliable and must match the fund’s needs, movements and tendencies. It should therefore be closely monitored, and all the data sources should, to the maximum degree possible, work together harmoniously – in concert within a single application, or arriving together in the same format. Considering this in advance of making your product choices will have an impact on your ongoing IT requirements.

Technology

Now, let’s look at technology. Many offerings are available in the market today – some complementary, some overlapping, some directly competitive – and at varying levels of sophistication.

One may provide both data warehousing and streaming analytics, whereas another may provide only one or the other. One modelling application may simply generate a trade signal and hand it off to a trade execution application, whereas another may also translate that signal into an order and prepare it for launch into the market. Which one(s) you choose will depend on your workflow requirements, and the extent to which the applications meet your needs off the shelf will have an effect on the in-house expertise required to do any level of customisation. But beware of the functionality fallacy.

Doing the right things is not, in and of itself, enough. It is equally important to do the right things right – that is, to do them well. Technology is the engine; data is the fuel. Low-octane fuel in a high-performance engine is the equivalent of our master carpenter using poor tools to build your home – you will get exactly the outcome those inputs deserve. So technology that works is good; technology that works with some of the data inputs is even better. And technology that you know works well with all of the data before you purchase either one is best of all, especially where IT investment is concerned.

Few technologies will stand alone. They must interface with other technologies to do their jobs. A data warehouse must be easily queried for relevant content. An automated trading model must have instant access to historical queries and streaming content.

But many of these components will not easily talk to each other off the shelf. In programming terms, one may speak Cantonese and the other Danish. The more applications you run, the greater the danger of needing the entire United Nations interpreter team.
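
One common answer is a thin adapter layer that gives the rest of the system a single dialect to speak. The sketch below is purely illustrative – both vendor interfaces are invented – but it shows the shape of the translation.

from abc import ABC, abstractmethod

class QuoteFeed(ABC):
    # The single interface the trading applications are written against.
    @abstractmethod
    def subscribe(self, symbol, on_quote): ...

class VendorAFeed(QuoteFeed):
    def __init__(self, client):
        self.client = client          # hypothetical vendor A client object
    def subscribe(self, symbol, on_quote):
        # vendor A registers a callback per instrument
        self.client.add_listener(symbol, callback=on_quote)

class VendorBFeed(QuoteFeed):
    def __init__(self, session):
        self.session = session        # hypothetical vendor B session object
    def subscribe(self, symbol, on_quote):
        # vendor B hands back a subscription handle instead
        handle = self.session.open_stream(symbol)
        handle.on_update(on_quote)

# Each additional vendor means one more adapter, written once, rather than
# vendor-specific logic scattered through every application.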

On the other hand, some applications work well off the shelf with some others, and with existing data sources. It is important to have the answers to the right questions up front.

How many APIs will you need to work with? How many data structures will you need to accommodate? Are there data loaders already available in your application for the data service? Do they all have compatible capacity and throughput or might you expect queuing? And, do they have enough overhead to accommodate pending data update rates on the existing hardware? Or will you need to invest in new boxes within a year?
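
The capacity questions lend themselves to a back-of-envelope check. The figures below are invented for illustration; the arithmetic is the point – once peak update rates exceed sustained processing capacity, queues build and latency grows until the backlog drains.

peak_updates_per_sec = 40_000        # assumed peak feed rate today
handler_capacity_per_sec = 50_000    # assumed sustained processing rate
expected_growth = 1.5                # assume peak rates rise 50% within a year

headroom_today = handler_capacity_per_sec / peak_updates_per_sec
load_next_year = (peak_updates_per_sec * expected_growth) / handler_capacity_per_sec

print(f"headroom today: {headroom_today:.2f}x")              # 1.25x – no queuing yet
print(f"load vs. capacity next year: {load_next_year:.2f}")  # 1.20 – queues will build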

Ease of integration and a simple, clean, and flexible configuration will reduce the chance of inter-process glitches. Good technical due diligence and advice up front will mean savings in implementation schedules and ongoing operating and maintenance requirements.

So is it possible to run a high performing technical operation without a serious in-house IT investment?

Most definitely. As important as making sure that each of your investments is a solid one is ensuring that your IT investments work well together, especially off the shelf. After all, if you don’t need to customize it, you won’t need to maintain it – the suppliers will, saving you time, technical resources, and complexity.

In closing, as is often the case in life, in fund workflows the whole is definitely greater than the sum of the parts, no matter how good the individual parts may be on their own. Just as one would prepare for a long vacation, be sure to plan out the entire ‘data and technology trip’ from beginning to end and do not forget to adequately prepare for the many wonderful side trips which will make the experience that much more enjoyable and beneficial.

While IT resources are critical to hedge fund management, with proper planning, they should never take priority over your investment skill.

Kirsti Suutari is Global Business Manager, Algorithmic Trading, at Reuters Enterprise Information, and Charles Fiori, SVP, Global Product Business Owner, Reuters DataScope Mutual Funds