How and why have firms changed the way they measure the cost of processing a trade?
Triance: The practice of transaction cost analysis has undergone a transformation in recent years. Traditionally, measurement focused on the so-called frictional costs of a transaction. The idea was to let buy-side organisations understand the cost, on a given platform, of buying the assets they were seeking to trade.
For example, if by executing a trade you lost a few bps compared to the price you were expecting to trade at, or if there was a time delay between the order and the execution during which the price moved against you, you calculated the cost as the difference between your expected and actual prices.
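The arithmetic described above can be sketched in a few lines. This is an illustrative calculation only – the function name and figures are not drawn from any real platform:

```python
def slippage_bps(expected_price: float, executed_price: float) -> float:
    """Frictional cost of a buy order, in basis points: the difference
    between the executed and expected price, relative to the expected
    price. A positive result means the price moved against the buyer."""
    return (executed_price - expected_price) / expected_price * 10_000

# Illustrative buy order: expected to trade at 100.00, filled at 100.05
cost = slippage_bps(100.00, 100.05)
print(f"{cost:.1f} bps")  # prints "5.0 bps"
```

The same measure, with the sign flipped, applies to a sell order filled below the expected price.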
But this provided a narrow picture of overall cost, because there are other costs associated with a trade than those arising from the execution platform. An accurate measure includes all elements in the life of the trade, from execution through to reconciliation in the books and records supporting the final P&L ledger.
Measuring each component in the lifecycle means taking into account the cost of the several infrastructures that are involved in processing the trade, from beginning to end, including those associated with the CCP and the CSD. By including all of these in the final cost analysis, today’s measurement tools are more granular and comprehensive, and so provide a more accurate estimate of overall costs.
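The end-to-end measurement described above amounts to summing a cost per lifecycle component rather than pricing the execution leg alone. A minimal sketch, with component names and figures that are purely illustrative:

```python
# End-to-end trade cost: sum each lifecycle component, not just execution.
# All figures below are illustrative bps-equivalents, not real charges.
cost_components = {
    "execution": 2.5,         # frictional cost at the venue
    "clearing (CCP)": 0.8,    # CCP fees and margin funding
    "settlement (CSD)": 0.4,  # CSD charges
    "reconciliation": 0.3,    # books-and-records / P&L ledger support
}

total = sum(cost_components.values())
print(f"end-to-end cost: {total:.1f} bps")  # prints "end-to-end cost: 4.0 bps"
```

Measured this way, the execution leg is only part of the picture; the post-trade components can be a material share of the total.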
Cuthbertson: This is important because, when you consider trade processing from an outsourcing perspective, many players on both the buy side and the sell side were failing to appreciate fully the importance of taking this end-to-end view of what it costs to process a trade.
Some of this was because it was hard to obtain the information and some of this was because they weren’t thinking about the trade in those terms and so were not asking for it.
Today, as the number of execution venues and CCPs has grown, participants have been forced to consider the options that are available and to ask the question of whether or not doing it all themselves is the best way.
Particularly on the sell side, the historical belief was that firms should look after the entire lifecycle internally; they used their own back office because, they believed, it provided them with a competitive advantage, or because they had few other options.
For many, that thinking has now changed: the back office is seen as a cost centre, a function whose servicing has become commoditised and which firms are typically looking to outsource. That is where we come in.
What is behind the growing scrutiny on costs?
Triance: Following any financial crisis, when trading volumes drop, brokers have to work hard to find ways to improve their cost-income ratios and maintain revenues. Then new regulations began to be formulated, which increased this focus – Basel III, for example, changed capital adequacy requirements for banks.
So many banks had to either grow their revenues significantly – a big ask given that trading conditions had become more difficult and volumes were, at the least, more erratic – or reduce their cost base – something they could control.
Now, as banks and other players have sought better to understand this cost base they have realised that owning the total lifecycle may not be the best way to keep costs down. Remember that a lot of the motivation to retain processing within the firm was based on fear, following the fabled corporate action failures of the 1990s and 2000s.
But now firms have realised that data management, as well as the measurement of risk associated with it, has moved on. A rule of thumb offered by many consultants is that nine out of ten financial institutions are considering more outsourcing or agree that it adds value and reduces costs.
Cuthbertson: The focus now is on optimisation: there is a realisation that a firm may not be optimising a process if it insists on doing everything itself and that other firms can bring to bear superior systems, higher skill levels and economies of scale to produce more cost effective solutions.
People often think that this approach is new, but it’s not. When a firm uses a subcustodian it is effectively doing the same thing – outsourcing the process to a firm that understands the local market and can deliver settlement in a timely manner, because of the scale and expertise it has in a given country.
Triance: By the same token, much of what we do that doesn’t fall under the traditional ‘outsourcing’ heading, operates through some of the same principles. As an account operator, for example, we are saving the costs associated with putting headcount on the ground while allowing clients to keep their own accounts, with the look and feel of having their own custody operation.
Consider third-party clearing, too, which is one step further. Here clients relinquish their clearing obligations and we support this additional level of processing. A key benefit here is that they are no longer required to commit the regulatory capital required by local rules to support a clearing business in that country.
Explain how shared processing has emerged as part of this greater scrutiny on the cost-based side of the business.
Triance: In the first case, participants have decided to mutualise a function by building a solution with several peers that can support shared processing. Again they have realised that there is no competitive advantage in keeping this in house and that, on this commoditised type of function, they will cooperate rather than compete.
There are several areas where this works and they are all functions where firms do not win business by how well they perform them.
For example, during my years on the trading desk I don’t remember anyone ever ringing me up saying “that was a fantastic Euroclear settlement you just did for us on that last trade – here is some more business”. I would say that nearly three quarters of the processes that support the trade lifecycle are now of this commoditised type. One example is the KYC utility that a number of providers have built through Markit, a mutualised common platform co-owned by a number of peer firms.
Cuthbertson: But shared processing is not the only area of innovation we are seeing around costs. A second concerns the streamlining or integration of a number of different technologies that have been assembled as companies have merged or been taken over.
Each challenge is different here; following acquisitions the parent firm must understand how best to reduce the duplication created by the existence of several different technology systems.
Another example concerns the traditional outsourcing question. Here, clients are dividing increasingly into those who follow the traditional path of a global lift and drop of large-scale functions and those following a more tactical approach.
This means focusing on the areas where they feel the most pain and deciding no longer to own the parts of the process that are causing that pain. Much of our work today is focused on this second category.
Is the offshoring era over?
Cuthbertson: I don’t think so. But at some stage in the not too distant future it will reach capacity. Obviously, there was a big run to offshoring in India, which led to cost inflation as the demand for skilled people drove up what firms had to pay them.
So we saw organisations look to new sites – at HSBC we have centres in other popular offshoring locations – in order to find alternative pools of skilled people, with the same technology, to make processing more efficient.
But eventually you may run out of places to go, and the only places where you can still get the cost savings will be those where other risks – political or legal ones, for example – are too high for it to make sense.
Triance: So that is the supply-side limitation. Then there is a demand-side limitation: eventually you will have offshored everything that you can, and you are left with your core competency, where you eke out your competitive advantage, which you have to keep in house. In the US, the Department of National Statistics believes that in the next few years these twin forces will mean the offshoring business may hit capacity.
How are you changing the way you receive and record data and how close are we to one-touch processing?
Triance: Traditionally in this space we have seen a high degree of duplication. A front-office system recorded the static data. In a second, middle-office system, which supported the risk analysis, this data had to be recreated.
Then finally in the back office system the data had to be entered a third time to support the books and records. There are additional layers introduced by a further chain leading to an individual client who may be in a different jurisdiction.
HSBC and our entities will typically store data on the same transactions between four and six times, between the onshore business through which the trade has been made all the way back to the record for, say, a client at the private bank. Now, as the amount of data that the industry is producing increases, the costs associated with recording and maintaining these data systems increase too. On Google alone there are now 3.5 billion searches per day; the direction of travel when it comes to data is clear.
Cuthbertson: So, now there is a move to create common repositories for data and securities. People are realising that they don’t need to record something multiple times and that three sets of data, for example, can be replaced by a single central repository. The job turns out to be harder than it sounds.
For example, it may be important that different legal entities or business lines see the information presented in a different way, so a single system must be able to produce different views on the same data. And increasingly data privacy rules constrain where data can be kept and accessed and how it should be protected, which adds complexity and cost.
But the principle is that if we can find a way to reduce the number of times you record and process data, then you will reduce the number of times you need to move it between systems, and both of these will reduce the cost and optimise the process.
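The "record once, view many ways" idea described above can be sketched as a single repository from which per-business views are derived on demand rather than duplicated. All names, fields and figures here are illustrative, not any firm's actual data model:

```python
# One central repository of trade records; each business line derives
# the view it needs instead of keeping its own copy of the data.
trades = [
    {"id": "T1", "isin": "GB00XXXXXXXX", "qty": 100, "price": 101.5},
    {"id": "T2", "isin": "US00YYYYYYYY", "qty": -50, "price": 99.2},
]

def front_office_view(repo):
    """The trading desk's view: positions and prices."""
    return [{"id": t["id"], "qty": t["qty"], "price": t["price"]} for t in repo]

def back_office_view(repo):
    """The books-and-records view: cash movements for settlement
    (negative cash for a buy, positive for a sell)."""
    return [{"id": t["id"], "cash": round(-t["qty"] * t["price"], 2)} for t in repo]

print(front_office_view(trades))
print(back_office_view(trades))
```

Because every view is computed from the same records, a correction is made once in the repository and flows through to every consumer – the point made above about replacing three sets of data with one. The remaining work, as noted, is presentation and access control per entity, which this sketch does not attempt.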