Unfair Deals and Unbundled Costs - Will Market Data Fees Stop Rising?

by David Kimberley
  • A mix of technological advances and market competition has raised costs drastically over the past five years.

Rising market data costs continue to be a source of frustration for institutional investors. With a number of market participants reporting a near three-fold increase in their market data costs over the past five years alone, many in the industry are calling for greater transparency.

This was made apparent last December, when 24 banks and asset managers, including Morgan Stanley and UBS, asked the Securities and Exchange Commission (SEC) to review the market data fees charged by exchange operators.

That request was followed by a petition to the SEC from two hedge fund trade groups. In the petition, sent on August 22nd, the Managed Funds Association and the Alternative Investment Management Association asked the US regulator to review market data costs and force exchanges to be more transparent about their fee structures.

Unbundling market data

There appear to be a number of reasons for the rise in market data fees over the past few years. Foremost among them - at least according to the hedge funds - is the unbundling of fees. In their petition to the SEC, the two hedge fund groups drew an analogy between market data fees and the price of a hamburger.

Whereas several years ago a hamburger might have cost $20, now each part of it is priced individually. That might mean the meat costs $15, the bread $7, and the cheese, lettuce and tomato $1 each - leaving you with a hamburger that costs $25.

The same logic, according to the two hedge fund groups, applies to market data. Exchange operators have sliced apart different data sets. Were one to purchase all of those now-segmented pieces of data in order to get the original whole, the price would be much higher than it was previously.

Speaking to Finance Magnates, Dan Marcus, Chief Executive Officer of spot FX trading platform ParFX, noted that unbundling was a major cause of increasing market data costs.

“Over the past decade, the costs and fees associated with market data have skyrocketed,” Marcus told Finance Magnates. “Existing packages are often unbundled and sold separately to improve profit margins – further increasing costs for trading institutions.”

Dirty deals, low latency and market data

Unbundling may be part of the reason for rising data fees, but so too are deals between exchanges and market participants. Larger organisations are now cutting deals with exchanges, promising to trade a set amount on a given venue in return for better market data pricing.


Dan Marcus, CEO, ParFX

That has left smaller institutions out in the cold. As they cannot promise to trade large volumes, exchanges are less willing to provide them with lower pricing. That gives their bigger competitors an edge that they cannot match.

Parallel to this growth in deal-making have been improvements in technology.

“The past decade has seen a rise in disruptive trading strategies that place speed at the heart of their success,” Marcus noted. “They require a low-latency ecosystem of connectivity, technology and market data – which is expensive.”

Not only has high-speed technology made market data more expensive, it has also created a seemingly never-ending battle for faster market data. Kevin Rodgers, formerly Deutsche Bank’s Global Head of FX, perhaps described it best as a “code war in the kingdom of microseconds”: firms are willing to pay more to get data faster. That, according to Marcus, is pushing up prices for everybody.

“Platform providers recognised that institutions utilising these types of trading strategies are prepared and willing to buy faster and faster market data to feed their strategies and gain an advantage over their peers,” said Marcus. “The consequence of this is a rising tide of market data costs for everyone. Smaller institutions are forced to try to keep up with their larger competitors or risk falling by the wayside.”

An end in sight?

Can this situation be halted? Probably not entirely, in this author’s opinion. So long as faster market data provides a competitive edge, and institutions are willing to pay a few extra pennies for it, there is nothing pushing anyone to change.

Ditto the deals between market participants and exchanges - though these look more susceptible to regulatory pressure than the demand for faster data does.

Unbundling looks the most susceptible to change. If enough institutions can pressure exchanges into being more transparent, the latter will be forced to reveal the logic and means behind their unbundling of data fees. Were that to happen, and if market participants see that they are being short-changed, they may get to pay for the whole, cheaper hamburger again.

