There is a good story in Advanced Trading this week about the challenges of applying yesterday’s risk management solutions to today’s market requirements. The whole article is worth reading, but you can cut to the chase and just read the summary:
There are three specific data pitfalls that can obscure risk analysis at the portfolio manager and risk officer levels, according to Adam Sussman, director of research at TABB Group:
1. Out-of-Sync: The frequency of risk data updates lags behind fast-moving markets. Similarly, the time horizon of the analysis can be misaligned with the investment objective of the portfolio.
2. Opacity: Unfamiliarity with the model behind the analytics puts people at greater risk of making bad decisions.
3. Rigidity: By looking at the same data in the same way, funds are more likely to be negatively impacted by one another. Similarly, approaching risk from a too narrow or rigid viewpoint can obscure vital changes to the risk of a portfolio.
Clearly, after the unending string of “oh @$^%#” moments that the banks have had in the past few months, it is no surprise that risk management has risen to the top of the IT project list. Most risk management strategies were built with an eye on “the trees” (i.e., the exposure of a given trader or trading group, or to a certain asset class), but they didn’t let analysts see “the forest” until everything was consolidated at the end of the trading day. Today, trading occurs across global markets and any number of asset classes. Dramatically higher trade volumes and increasing volatility have made real-time risk as essential today as the overnight book was in the recent past.
Few firms can say they have a real-time picture of their global risk, but virtually everyone would like one! Many firms have been forced to leave trade data out of the real-time loop altogether in favor of faster trade execution. There is a street-wide cry for a way to “bring risk management in-line!” so firms can leave the high-risk model of post-processing trade data behind — but of course without sacrificing any other aspect of trading performance.
What I find interesting is that this brings risk management system technology requirements closer to market data requirements than ever before: emphasis on low latency, high volumes of data, and so on. There are added complications, for sure:
If only there were an infrastructure that could process market data level volumes with MQ-style guaranteed delivery, could do so inline with message delivery instead of requiring a hop on and off the bus, could be easily integrated with data grids, could route trade data based on complex content rules, and added only microseconds of overhead. Impossible! You’d need some kind of magic box to do all that, right? 😉
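To make the “complex content rules” part of that wishlist concrete, here is a minimal sketch of content-based routing over trade messages. This is plain illustrative Python, not any particular vendor’s API; the topic names, fields, and thresholds are all invented for the example.

```python
# Hypothetical sketch: content-based routing of trade messages.
# Each rule pairs a predicate over message content with a destination
# topic; a trade is fanned out to every topic whose rule matches.
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class Trade:
    symbol: str
    asset_class: str   # e.g. "equity", "fx", "credit"
    notional: float
    desk: str

Rule = Tuple[Callable[[Trade], bool], str]

RULES: List[Rule] = [
    (lambda t: t.notional > 10_000_000, "risk.large-exposure"),
    (lambda t: t.asset_class == "credit", "risk.credit-desk"),
    (lambda t: t.desk == "prop", "risk.prop-trading"),
]

def route(trade: Trade) -> List[str]:
    """Return every topic whose content rule matches the trade."""
    return [topic for predicate, topic in RULES if predicate(trade)]

trade = Trade(symbol="XYZ", asset_class="credit",
              notional=25_000_000, desk="prop")
print(route(trade))
# → ['risk.large-exposure', 'risk.credit-desk', 'risk.prop-trading']
```

The point of pushing this evaluation into the messaging layer — rather than a hop off the bus to an application and back on — is that the routing decision adds no extra network round trip to the delivery path.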
Based on the conversations we’re having, firms are rethinking risk management from the ground up. It’s long overdue, and I guarantee that the risk management systems in use two years from now will look very different from today’s.