ICE algorithm to be applied to LIBOR problem

08 April, 2016

Rigging LIBOR was one of the grubbier scandals uncovered by the financial crisis. It was possible because the London Interbank Offered Rate (LIBOR) relied on the expert judgement, and so the honesty, of the contributor banks, rather than on transaction data. That meant that, if banks wanted to fudge the numbers – to hide an inability to borrow at an affordable rate, say, which is what happened during the credit crunch – they could. The Treasury Select Committee published its report on LIBOR in August 2012, and it outlines a number of unsavoury practices.

Four years later, Intercontinental Exchange (ICE) has announced [PDF] that ICE LIBOR will give transaction data “the greatest possible role” in calculating the average cost of unsecured funding in the London interbank market. The aim is to move to an algorithm that eventually removes the need for expert bank judgement in setting LIBOR, even in market circumstances where transaction data is limited, as in the credit crunch of 2008. The new system will also look beyond the interbank lending market to include the short-term bank funding rates levied by central banks and non-bank financial institutions.
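ICE has not published the algorithm itself, but the description above suggests a waterfall: use interbank transaction data where it is deep enough, and widen the pool to other short-term funding sources when it is not. A minimal sketch of that idea in Python follows – the Transaction type, the MIN_VOLUME threshold and the fallback rule are all illustrative assumptions, not ICE's specification:

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    rate: float    # annualised borrowing rate, e.g. 0.0062 for 0.62 per cent
    volume: float  # notional size of the trade

# Hypothetical threshold: the notional volume below which interbank data
# alone is considered too thin. ICE's real eligibility rules are not public.
MIN_VOLUME = 100_000_000

def benchmark_rate(interbank, wider_funding):
    """Volume-weighted average rate, widening to non-interbank short-term
    funding data (central banks, non-bank financial institutions) when
    interbank volume is too thin. A sketch of the waterfall idea only."""
    pool = list(interbank)
    if sum(t.volume for t in pool) < MIN_VOLUME:
        pool += wider_funding  # thin market: widen the data set
    total = sum(t.volume for t in pool)
    if total == 0:
        raise ValueError("no eligible transactions at any level")
    return sum(t.rate * t.volume for t in pool) / total

# On a thin day, the fallback pool keeps the rate transaction-based:
thin_day = [Transaction(0.0062, 40_000_000)]
extra = [Transaction(0.0068, 90_000_000)]
print(benchmark_rate(thin_day, extra))  # ~0.0066
```

The design point is that expert judgement only drops out if the fallback levels are broad enough that some pool of real transactions is always available, even in a 2008-style market.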

This should not only mean that LIBOR is less open to being rigged, it should also encourage more banks to take part in setting it. Encouragement is needed because, as things stand, being on the LIBOR panel brings regulatory and legal risk. (It is reported, for example, that some banks would have liked to exit the panel post-crisis, but were ‘encouraged’ to stay put by the regulator.) Using only transaction data, it is hoped, will reduce perceived conflicts of interest, incentivising more submissions and setting up a “virtuous circle” of ever deeper and broader market information.

A feasibility study on the plans is now underway and, assuming a positive outcome, ICE says that there will be talks with the FCA on a regulatory green light for the planned algorithm, processes and controls in Q2 and Q3 this year. The proposed timeline for banks to move to the real-time transmission of transaction data to ICE Benchmark Administration (IBA) is H2 2016, and IBA expects to have “full centralised responsibility” for setting LIBOR in 2017.

LIBOR was first set up in 1970 to give banks funding themselves in the short-term interbank market a benchmark for setting longer-term rates to clients, and was standardised in the 1980s. Initially, it was based on the rate at which the submitting bank believed a “prime bank” would be offered deposits in the market. That changed in 1998 to the rate at which the panel bank itself could borrow funds. The final number is a trimmed mean of the submissions: the top and bottom quartiles are excluded so that the figure is not skewed by outliers. Oversight was limited.
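That trimmed-mean calculation is simple enough to show directly. A minimal sketch in Python, with a hypothetical 16-bank panel (the figures are illustrative, not real submissions):

```python
def libor_fix(submissions):
    """Interquartile ('trimmed') mean: sort the panel's submitted rates,
    discard the top and bottom quartiles, and average what remains."""
    rates = sorted(submissions)
    k = len(rates) // 4                     # size of each discarded quartile
    trimmed = rates[k:len(rates) - k] if k else rates
    return sum(trimmed) / len(trimmed)

# A hypothetical 16-bank panel: the 4 highest and 4 lowest submissions are
# dropped, and the middle 8 are averaged.
panel = [0.55, 0.57, 0.58, 0.58, 0.59, 0.60, 0.60, 0.61,
         0.61, 0.62, 0.62, 0.63, 0.64, 0.66, 0.70, 0.75]
print(round(libor_fix(panel), 4))  # 0.61
```

Note that trimming only blunts a single outlier: if several panel banks shade their submissions together, the trimmed mean still moves, which is why the design ultimately relied on the honesty of the contributors.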

Arguably, while the London interbank market remained small, and banks could police each other, outside control was not needed. Now, however, LIBOR is, as ICE points out, “the primary benchmark for short-term interest rates globally; it underpins more than US$350 trillion in outstanding contracts and much of the world's financial system”. LIBOR also gives central banks insight into the health of global banking, which means that regulators are likely to want to keep a firm finger on its pulse.

Ouida Taaffe is the editor of Financial World magazine.