
Insights by Third Bridge (Sponsored)

MiFID II: The Main Challenges for the Sell-Side

The long-awaited implementation of the Markets in Financial Instruments Directive, or MiFID II, on the 3rd of January will alter the face of research. The primary aim of the regulation is to increase overall transparency and efficiency and, by the end of this year, the research market will be forced to change drastically. Joshua Maxey, Managing Director of Third Bridge, explains what the main challenges will be for the sell-side and how the buy-side’s choice of research providers will reshape the market.


Insights by Third Bridge (Sponsored)

Investment Research in 2018: The Big Cleanup Post-MiFID II


“A catalyst”. “Disruptive”. “Disorientating”.

Whatever you think of MiFID II, it will change the research market dramatically. And one of the main issues is that the industry has to adapt quickly. If it doesn’t, research firms will risk losing clients or even shutting down.

Who’s Got a Target on Their Back

The buy-side will need to look at the different types of research they’re receiving and decide whether it’s useful, whether they need it, and how much they’re willing to pay for it. At this point, for example, it will be extremely difficult for new entrants, as the buy-side is not willing to commit to anything new until they understand who they have on their books and what the added value is.

In the long term, MiFID II will create a level playing field between independent research providers and the sell-side. In the short term, however, the industry will have to work through a messy transition. And with larger investment firms expected to cut their research budgets by 30-50%, a massive clean-up operation – in terms of who’s on whose books and for what – is coming.

One could intuitively say that the top five sell-side banks in a dense and fragmented market are safe. However, the rules are a bit different when it comes to research.

Indeed, players 6 through to 20 and beyond will experience the most turbulence, from pivoting their research strategies to closing their research divisions completely.

At the same time, as we know, research has never been a moneymaker for banks; it was more a support mechanism for corporate finance. Which means that some of the bulge-bracket investment banks could potentially decide to give up their research departments altogether, if the corporate finance teams stop sponsoring corporate research and more analyst talent drains away.

The Big Squeeze

For smaller asset management firms, it also becomes more expensive to continue to receive high-quality research as they lack sufficient scale to get the right level of “research purchasing power”. They need to find more money to pay for research, which is hard when their fees are being squeezed already.

With the boom of passive fund management and somewhat lacklustre performance of actively managed funds, one question arises: why would an investor pay the 2% management fee for actively managed funds, when passive funds charge 0.1%?

With active management fees already high, the additional burden of the cost of research cannot easily be passed on to the client, and thus has to be absorbed by the firms, which will need to find a way to survive while caught in this race to the bottom.
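To make the squeeze concrete, here is a back-of-the-envelope sketch, using purely illustrative figures rather than any firm’s actual numbers, of how a fixed research bill eats into fee revenue as management fees compress.

```python
# Back-of-the-envelope sketch of the margin squeeze described above.
# All figures are illustrative assumptions, not actual industry data.

aum = 1_000_000_000          # assets under management: $1bn (assumed)
research_budget = 3_000_000  # hypothetical annual research spend

# If fee pressure pushes the active fee towards passive levels,
# the same research bill consumes a far larger slice of revenue.
for fee in (0.02, 0.01, 0.005):
    revenue = aum * fee
    share = research_budget / revenue
    print(f"At a {fee:.2%} management fee, research absorbs {share:.1%} of fee revenue")
```

On these assumed numbers, halving the fee doubles the share of revenue the research bill consumes, which is the essence of the squeeze.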

For buy-side firms that choose to scrimp on expensive, high-quality research, the saving may come at the cost of performance, especially in an environment where investors are desperate for alpha.

So the other option is for funds to accept a hit to their own margins. Squeezed from all sides, they face a catch-22 that will likely drag on throughout 2018 and beyond, or at least until pricing models become more normalised.

The Right Price

In order for the buy-side to feel comfortable having a number of providers on their panels, the pricing models have to make sense.

At the moment, subscription models prevail, and they will come in a huge variety of forms. Research houses will be offering subscription options centred around the following themes:

  • Asset class: equity, fixed income or macro
  • Sector: industry verticals, large cap v. small cap
  • Region: based on client location or research coverage
  • Usage: “all you can eat”, per unit/seat pricing in bundles or single units
  • Product: analyst access, written collateral, corporate access, conference attendance

Research providers will likely bundle these different parameters or offer bespoke pricing based on usage. For example, one firm can offer a subscription for one million dollars a year that will probably give the client waterfront coverage. Another can say: for five analysts on the buy-side, we will charge $20,000 and give them access to Asia consumer research. A third combination may be: we will give you access to twenty of our analyst calls a year, for $100,000.
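As a rough illustration of how such bundles might be represented internally, the sketch below models the pricing themes above as a simple data structure. The field names and the three example bundles are hypothetical, drawn only from the scenarios just described, not from any provider’s actual price list.

```python
# A minimal sketch of how the subscription parameters listed above could be
# modelled. Field names and example bundles are hypothetical illustrations.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ResearchBundle:
    name: str
    annual_price_usd: int
    asset_classes: list[str] = field(default_factory=list)   # equity, fixed income, macro
    sectors: list[str] = field(default_factory=list)          # industry verticals, cap size
    regions: list[str] = field(default_factory=list)          # client location / coverage
    seats: Optional[int] = None                               # usage-based per-seat pricing
    analyst_calls_per_year: Optional[int] = None              # product-based component

bundles = [
    ResearchBundle("Waterfront coverage", 1_000_000,
                   asset_classes=["equity", "fixed income", "macro"]),
    ResearchBundle("Asia consumer, 5 seats", 20_000,
                   sectors=["consumer"], regions=["Asia"], seats=5),
    ResearchBundle("Analyst access", 100_000, analyst_calls_per_year=20),
]

for b in bundles:
    print(f"{b.name}: ${b.annual_price_usd:,} per year")
```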

Come 2019, whoever is left standing will also be able to prove beyond any doubt that they are offering real value to their clients.


Insights by Third Bridge (Sponsored)

The World in the Era of MiFID II – Here’s What the Future Holds


It’s the perfect storm. The mad rush to get ready for MiFID II has only set the stage for what’s coming. MiFID II has not claimed its pound of flesh yet. But it has the next 12 months to do so. By the end of 2018, when the dust settles, we will know who’s left standing.

The first thing to happen? The great unbundling. Or, in plain terms, clients will have to pay for what they get and, at the same time, research providers will have to up their game and focus on relevant information that clients are willing to pay for.

Then, the changes in the industry will follow.

Who Will Pay for What

A client will want two key things from their research houses: access to management and actionable insights. And now, because they will actually be obliged to pay for the product and the product alone, these two aspects will have to work together to build up the intrinsic value of the research delivered.

At the same time, a company’s attribution of value to research may well be directly reflected in its net performance. The lower the budget, the less valuable the information obtained.

Large asset managers have, at any given time, up to 200 research providers on their books. These firms will – in theory – provide them with a plethora of information to make investment decisions.

Putting together the pieces of the puzzle is not easy. Research providers that can deliver the right type of information, enabling clients to move quickly on their decisions, will thrive. And everyone has to move and adapt pretty quickly, as clients are set to cut research budgets by a third, or even by half in some cases. The number of providers on their panels is set to go down by similar percentages.

The New Playground

One thing that will keep the wheels of the industry turning, for now, is the “me too” effect. Before clients start cutting the number of their providers, they will first worry about what coverage their competitors have so as not to miss out on any potential alpha.

However, the main problem has been that the quality and the end value of the research have been limited – many providers would bundle together anything from interim updates to company reports and analysts’ opinions, and hope something sticks. It’s been the industry’s worst-kept secret that a lot of junk is being provided to the client. Research was in some cases seen as a means to an end – for example, keeping the line open for other transactions and other types of business relationships between parties.

Now, MiFID II pushes for clarity on who’s paying for what, and regulators want to make sure that the money changing hands is actually meant for the purposes stated and that the research is used for investment decisions, not as an extra to “grease” other mechanisms. So all actors in this play will have to make significant changes.

Research providers will have to actually sift through the data and create specific packages for specific clients, and they will also have to follow some cardinal rules. The first one is transparent pricing: this client is paying for that service.

The second requirement is MiFID compliance at all times, particularly for those firms opting to pay using a research payment account (RPA). They will need to keep track of research consumption and maintain an audit trail, which will help them avoid regulatory conundrums when it comes to reporting. The rules are especially strict for those using an RPA to pay for their research; for those opting out of the model – which is the majority, from what we’ve seen – compliance is much easier to achieve.
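For firms that do use an RPA, the audit trail can be as simple as an append-only log of research consumption. The sketch below is a minimal, hypothetical illustration of such a record; the field names are assumptions, not a format prescribed by MiFID II.

```python
# Minimal sketch of an RPA research-consumption audit trail as an append-only
# CSV log. Field names and example values are hypothetical illustrations.
import csv
from datetime import datetime, timezone

FIELDS = ["timestamp", "client", "provider", "item", "charge_usd", "rpa_account"]

def log_consumption(path, client, provider, item, charge_usd, rpa_account):
    """Append one research-consumption event to the audit trail."""
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:  # new file: write the header row first
            writer.writeheader()
        writer.writerow({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "client": client,
            "provider": provider,
            "item": item,
            "charge_usd": charge_usd,
            "rpa_account": rpa_account,
        })

# Example: record a single piece of consumed research against an RPA.
log_consumption("rpa_audit.csv", "Fund A", "Provider X",
                "Asia consumer sector report", 1_500, "RPA-001")
```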

And MiFID II comes at a good time: the perceived value of sell-side research has been in steady decline for almost two decades – since Eliot Spitzer shook bankers and analysts alike, forcing a Chinese wall between the two sides. Still, everyone carried on paying for different reasons: corporate access, liquidity, trade execution (particularly in fixed income) and the somewhat rare, but good, trade idea. The relationship angle is also very important. However, the good old days of trading immediately actionable information with one’s favourite broker as part of the old “quid pro quo” are done.

However, quality might still continue to decline on the sell-side research front. As banks get squeezed on fees, they will find it more difficult to hold onto their experienced analysts. While there is always new, young and hungry talent in the market, those who have been through the economic cycles and have seen the ups and downs of the last few decades are in short supply. And those are the people who can interpret the data for clients, know where to find the relevant information, and have the top-tier connections to get input from current and former management, analysts or industry experts in order to move the needle for their clients.

These very valuable individuals will probably leave to start up their own shops, or will join independent research houses – that’s something we have been on the lookout for – and they will be helping the research houses that truly add value for their clients to move forward and thrive.

The Future

With all the commotion in the market, the industry is bound to see a turbulent time. In the short to medium term, consolidation will occur amongst the small to medium players.

At the same time, the big investment banks might decide that trying to make research profitable is a losing game. This translates into top players potentially closing down their research departments altogether. Which means, in brief, that nobody is safe until they can find a way to provide something critical to their clients.

This is where independent research houses, such as Third Bridge, can have an edge, considering their freedom of movement and access to great talent.

Players will also have to find the right balance between quantity and type of research provided on the one side and pricing mechanisms on the other.

The name of the game will be: bespoke; targeted; relevant; specialised. Basically, providing something that others cannot.


Insights by Third Bridge (Sponsored)

Quant Funds Are Data Hungry: Where Will They Get the Numbers from?


“Rubbish in, rubbish out” is an old adage in equity research. Whether an analyst or fund manager’s financial modelling is basic or intricate, the forecasts and conclusions that come out at the end can only be as good as the data or assumptions that are input at the beginning.

Analysts have always been in a constant battle to find, and verify, the right inputs into their financial models to reasonably predict the future performance of a company or asset. And it is common for the best financial analysts, and models, to get things very wrong because they are being fed irrelevant or bad assumptions.

The need for relevant inputs and accurate data has been fundamental to the work of equity analysts and fund managers since the days of modelling future financial performance on an Excel spreadsheet. But the requirement for quality data is becoming even more central to their work as they move into quantitative or systematic investment strategies, which augment or replace human judgement with data, algorithms and machine learning to manage funds.

Data Hungry

Quantitative investment is gaining increasing importance in the fund management industry, with funds run by systematic strategies making up close to 20% of total hedge fund assets at the end of 2016. And this figure ignores the increasing number of hedge funds using ‘quantamental’ techniques, which add quant analysis and big data inputs to support traditional fund management teams in their search for investment ideas.

These quant and ‘quantamental’ funds, which have benefited from the significant growth in cloud storage capacity and Artificial Intelligence (AI) programming to crunch through enormous datasets, invariably boast proprietary models and different methodologies to analyse inputs. Most of these models are ‘black box’ or secretive, as individual fund managers try to maintain their edge over the competition.

But what they all have in common is a need for data. And as with the old method of an analyst tinkering with his Excel spreadsheet model, the “rubbish in, rubbish out” theory is still crucial. The data needs to be relevant and accurate for the machines to get to the right conclusions – and there needs to be lots of it.

Trawling Through the Data

There are numerous examples of how fund managers are applying big data analysis – from using AI programmes to trawl through collated credit card statements to predict quarterly Netflix subscribers, to designing software that captures satellite imagery to count the number of cranes in Guangzhou and produce accurate forecasts of Chinese housing supply.

And there are massive amounts of data out there. According to US quantitative fund manager Two Sigma, the world produces a billion gigabytes of data per hour. And much of it is new – IBM estimates that 90% of the data in existence today is less than two years old.

Easy and cheap access to cloud storage and clever programming can easily handle the mass of data nowadays, and these tools are becoming prevalent in the fund management industry. But the real trick is how to work out what data is relevant and helpful for fund managers to get to the right financial conclusions. Essentially, there is little point counting the Guangzhou cranes if the satellite cannot see that a third of them are in fact idle.
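The crane example can be made concrete with a toy sketch: the raw count only becomes a usable housing-supply signal once idle cranes are filtered out. The figures below are invented purely for illustration.

```python
# Illustrative sketch of the point above: raw counts are only useful if the
# pipeline can separate signal from noise. The crane data here is invented.

cranes = [
    {"site": "Guangzhou-North", "count": 120, "share_idle": 0.35},
    {"site": "Guangzhou-East",  "count": 80,  "share_idle": 0.10},
    {"site": "Guangzhou-South", "count": 45,  "share_idle": 0.60},
]

raw_total = sum(c["count"] for c in cranes)
active_total = sum(round(c["count"] * (1 - c["share_idle"])) for c in cranes)

print(f"Raw crane count:    {raw_total}")
print(f"Active crane count: {active_total}")
# The naive raw count overstates construction activity; only the
# active-crane figure is a meaningful input to a housing-supply forecast.
```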

Too Many Words

So where will the quality data come from? Fund managers have traditionally relied upon investment banks and brokers to provide them with research and information to help them build their models and make investment decisions. But these organisations have large legacy research departments that mostly provide a wordy narrative. While they do build earnings, balance sheet and industry sales forecast models, which can be amalgamated and fed into a fund manager’s quant models, much of their output is in the form of text, opinion and soft variables, such as evaluating a company’s management teams.

Data analytics companies that service the fund management industry have sprung up and can provide vast data sets of real-time information across numerous industries and markets. But in its raw form, much of this data is available to all and gives no genuine information edge. Back to the Guangzhou cranes: there is no advantage in counting them if everyone can do it.

The real challenge, therefore, is how to make the data intelligent and achieve the highest signal-to-noise ratio. London-based Winton Asset Management, which manages over $30bn in pure quant funds, spends much of its time “cleaning” the vast datasets it receives in order to extract value. Multi-asset fund manager GAM, which runs systematic funds with its Cambridge-based Cantab team of data scientists, overlays alternative data sets, such as weather patterns, with financial data to identify assets that can outperform the markets.
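As a toy illustration of that overlay idea (and emphatically not Winton’s or GAM’s actual methodology), the sketch below pairs an invented weather-anomaly series with asset returns and uses a simple correlation as a crude relevance check.

```python
# Toy sketch: overlay an alternative data series with financial data and test
# whether it carries any signal. All numbers are invented for illustration.
from statistics import correlation  # available in Python 3.10+

weather_anomaly = [1.2, -0.4, 0.8, 2.1, -1.0, 0.3, 1.7, -0.6]  # e.g. temperature vs. seasonal norm
asset_returns   = [0.6, -0.1, 0.4, 1.1, -0.7, 0.2, 0.9, -0.3]  # daily % returns of some asset

rho = correlation(weather_anomaly, asset_returns)
print(f"Correlation between weather anomaly and returns: {rho:.2f}")

# A strong absolute correlation argues for keeping the data set in the model;
# a value near zero suggests it is mostly noise and can be dropped.
```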

New generation research firms, which have emerged as challengers to the investment banks’ research model, may be in a better position to provide the intelligent and clean data that quant funds increasingly need. These firms, who collectively analyse a multitude of companies, industries and markets, are unencumbered by large investment bank research structures that rely on the wordy narrative.

They distribute a streamlined, data-led primary research product and are probably best placed to develop methods to turn soft variables into data. They also have the financial market experience, and know-how, to both choose what data is relevant and help interpret the results. They have the knowledge of, and contacts with, Guangzhou’s local construction companies to be able to contextualise the crane data and add a qualitative layer to the AI-generated data.

Blending Human and Machine

As quant and quantamental investment strategies evolve, the real value for the fund management industry will be in devising methods to sift through the vast data that is available and blend it with the soft variables that are an everyday part of investment decision-making.

While data is increasingly ubiquitous and available to all, fund managers and their research service providers will need to focus on which data serves their analytical needs and enables them to gain an edge over the competition. What matters is how they frame the data requests, how they verify the quality of the data and how they quantify the relevance of individual data sets to their investment processes. And in this process, traditional human investment expertise is likely to remain crucial. The data analysis can only be as effective as the human know-how that sets its parameters.

Both investment banks and new generation research firms have analysts and market professionals with years of experience in evaluating market trends and the instincts to spot mispriced assets. These very human qualitative skills are likely to be decisive in enabling fund managers to work out what data they really need to look at, and how to interpret it once it comes out of the other end of the machine.

