CHAPTER 9
Market Volatility in the Age of Fintech

  —Why does a risk control bot have a reputation as a terminator?
  —It terminates trading bots.

Minimizing volatility is important to investment managers focused on capital preservation. After all, lower volatility helps protect capital and improves the key portfolio performance metric, the Sharpe ratio. This ratio, the average annualized return (in excess of the risk-free rate) divided by annualized volatility, is generally considered attractive in the 1.8 range and above. The higher the Sharpe ratio, the better, and the sky is the limit: some high-frequency trading funds produce Sharpe ratios as high as 20. Even very small positive returns can produce large Sharpe ratios that attract investors, but only if the volatility of the portfolio is tiny.
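To make the arithmetic concrete, here is a minimal Python sketch of the Sharpe ratio calculation from daily returns, annualized with the conventional 252 trading days. The return series, risk-free rate, and function name are illustrative, not from the text:

```python
import numpy as np

def sharpe_ratio(daily_returns, risk_free_annual=0.0, periods_per_year=252):
    """Annualized Sharpe ratio: annualized excess return over annualized volatility."""
    excess = np.asarray(daily_returns) - risk_free_annual / periods_per_year
    ann_return = excess.mean() * periods_per_year
    ann_vol = excess.std(ddof=1) * np.sqrt(periods_per_year)
    return ann_return / ann_vol

# Tiny positive daily returns with tiny volatility yield a large Sharpe ratio,
# the profile typical of the high-frequency trading funds described above.
rng = np.random.default_rng(0)
steady = rng.normal(0.0004, 0.0005, 252)  # ~10% annual return, under 1% annual vol
choppy = rng.normal(0.0004, 0.0100, 252)  # same mean return, 20x the volatility
print(sharpe_ratio(steady), sharpe_ratio(choppy))  # the steady series scores far higher
```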

Volatility can also be considered a stand‐alone phenomenon, something most investors seek to limit. Indeed, investors often talk about minimizing their “exposure” to the markets for individual asset classes, sectors, and across financial instruments in their portfolios (correlation exposure or dispersion exposure). Such volatility‐related exposure management is topical across the entire fintech spectrum. Cross‐border payment companies need to limit their exposure to exchange rate fluctuations. Real‐time insurance outfits need to engage volatility stabilizers to protect their assets against the same shocks as those that may affect their clients.

[Cartoon: Market Volatility in the Age of Fintech.]

Traditional businesses also need to worry about their volatility exposure in the age of rapid “you snooze, you lose” dynamics. Flash crashes, news, and social media, among other issues, affect portfolios in real time, and old‐fashioned hedging techniques and passive diversification are not always sufficient to protect portfolio managers from instant problems. Tracking real‐time risk can also help thousands of businesses reduce their regulatory capital requirements by upgrading their risk tier. Banks, in particular, can significantly improve their balance sheets if they take care of volatility in their holdings in real‐time and near‐real‐time space. Loan portfolio managers and even the Fed can substantially benefit from a firm grip on intraday volatility. Valuations of corporate bonds and their private‐sector equivalents are subject to marketwide, industrywide, and company‐cohort‐related fluctuations. Entities like the Federal Reserve lend banks money securitized by their equity holdings, which are directly affected by real‐time risk. As this chapter shows, even the longest‐term buy‐and‐hold portfolio managers gain from understanding real‐time risk in their portfolios. After all, real‐time risks directly translate into long‐term risk, which makes its way into risk premia, portfolio inter‐security correlations, and subsequent diversification solutions.

Most investors try to minimize volatility, if not in absolute terms, then relative to the returns they earn. Minimizing volatility is a challenging task. In a nutshell, it requires three steps:

  1. Identify the conditions that result in high volatility.
  2. Correctly predict when the conditions identified in step 1 are about to occur in the near future.
  3. Select an action for managing this volatility. The appropriate actions may include trimming riskier portfolio holdings, or counterbalancing the offending instruments with offsetting or protective financial instruments, such as futures or options.

Much of the research of the 1980s and 1990s focused on managing volatility with a single approach for all market conditions. By designing all-weather securities that could be bought once and held passively until expiration (as is the case with futures and options), the new breed of financial engineers advanced portfolio frontiers and generated much excitement. For example, collar options protected against too much downward as well as upward movement in portfolio value. Other exotic derivatives helped manage changes in interest rates, oil prices, and even the weather in Florida, mitigating the resulting swings in orange juice prices.
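The collar mechanics are simple to sketch. Below is a minimal example, assuming a long stock position combined with a purchased put and a written call; the strikes and prices are hypothetical:

```python
def collar_value_at_expiry(stock_price, put_strike, call_strike):
    """Value of one share hedged with a collar (long put, short call) at expiry.

    The long put floors the downside at put_strike; the short call caps the
    upside at call_strike, bounding movement in either direction as described above.
    """
    put_payoff = max(put_strike - stock_price, 0.0)     # long protective put
    call_payoff = -max(stock_price - call_strike, 0.0)  # short covered call
    return stock_price + put_payoff + call_payoff

# A position collared between 90 and 110, regardless of where the stock ends up:
for s in (70, 90, 100, 110, 130):
    print(s, collar_value_at_expiry(s, put_strike=90, call_strike=110))
```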

With time, many of the all-weather securities proved to be imperfect, failing to protect their owners' portfolios when it mattered most, typically in extreme market conditions. Does anyone remember Enron? Many smaller investment firms toppled like dominoes in the market crises of the early 2000s, and seemingly unending lawsuits followed. Those who bought exotic securities sued the issuers and underwriters of options for misleading them about one aspect of the package or another. The bottom line is that all-weather strategies did not hold water against extreme events.

For a while, people tried to address the issue by deploying theories about extreme events and modeling the unlikely black swans. However, the modeling proved too difficult and unrealistic, and the extreme events too rare and unusual to be forecasted with ease. New solutions had to be found and deployed.

Enter premium data, which came into play in the late 2000s. As a complete surprise to some, premium or "uncommon" data sources emerged carrying previously unthinkable information. Highly granular and real-time information found its way onto sophisticated portfolio managers' desktops: from data about cars shipped from factories to dealerships, collected by companies like ThinkNum.com, to the composition of orders placed online for particular products from ReturnPath.com, to the number of high-frequency traders operating in the markets from AbleMarkets.com.

This front‐line information, collected and channeled directly from the source to portfolio managers, has been made possible by the evolution in technology. The Internet has enabled software programmers to scan previously disorganized data sources to draw value through synthesizing information into meaningful inferences, directly predictive of near‐term market conditions. The technological advances in computing themselves have enabled firms to deliver the data in a super‐fast (often, real‐time), reliable, and, above all, inexpensive fashion.

How does this uncommon data help to manage portfolio volatility? Acquiring the data is only the first step in the process. Next comes the understanding of how and how far in advance the particular data are able to detect the onset of certain market conditions, and the most appropriate financial engineering approaches are selected for each market condition. Then, when the reliability of prediction is firmly established, the streaming data are deployed to raise warnings about one market condition or another, and the previously chosen volatility mitigation method is quickly deployed to safeguard one's portfolio.

Welcome to the new data‐driven world. Whether data‐enhanced portfolio management is revolutionary or evolutionary, it is no longer optional, but mandatory for sound portfolio management.

TOO MUCH DATA, TOO LITTLE TIME—WELCOME, PREDICTIVE ANALYTICS

Most investors would like to understand the drivers of performance of their portfolios in order to incrementally improve their allocation decisions. One way to identify what is truly behind their portfolios' returns is to compare the performance of portfolios vis‐à‐vis benchmarks.

In the past, available benchmarks were few and far between, easily manageable in one Excel spreadsheet. In the last decade, the number of data sources a responsible portfolio manager needs to take into account has multiplied dramatically, requiring databases and dedicated staff to manage the data and analyze it in a reasonable time. An alternative to buying and processing more and more data for traditional regression and attribution models is to delegate the data aggregation and storage powers to a new industry: predictive analytics.

Predictive analytics comprise advanced big data models. They allow scientists to create algorithms capable of answering questions far more complex than traditional regressions or segmentation frameworks. These algorithms cut through the nontrivial task of making sense of mountains of fundamental data by providing an index that answers a question. Advances in data science allow real‐time processing of a massive, previously unthinkable, amount of data. For instance, many investors and traders charged with execution want to avoid trading when aggressive high‐frequency algorithms are present. Companies like AbleMarkets now tease out aggressive HFT from streaming data and show in real time when to speed up trading and when to slow it down. Such technology, unthinkable just a decade ago, is now not only feasible but available to all to use.

Predictive analytics transform data into insight. Portfolio managers use performance attribution to explain why a portfolio's performance differed from the benchmark. This review of historical trading attempts to distinguish which of the two factors of portfolio performance, superior stock selection or superior market timing, is the source of the portfolio's overall performance.
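As a stylized illustration of that decomposition, here is a simplified Brinson-style attribution sketch; the segment names, weights, and returns are hypothetical, and actual attribution methodologies vary from shop to shop:

```python
def brinson_attribution(weights_p, weights_b, returns_p, returns_b):
    """Split active return per segment into timing and selection effects.

    allocation (timing): over/underweighting a segment relative to the benchmark
    selection: picking better securities within a segment than the benchmark did
    """
    report = {}
    for seg in weights_b:
        allocation = (weights_p[seg] - weights_b[seg]) * returns_b[seg]
        selection = weights_b[seg] * (returns_p[seg] - returns_b[seg])
        report[seg] = {"allocation": round(allocation, 4),
                       "selection": round(selection, 4)}
    return report

# Hypothetical two-sector portfolio versus its benchmark:
wp = {"tech": 0.7, "energy": 0.3}     # portfolio weights
wb = {"tech": 0.5, "energy": 0.5}     # benchmark weights
rp = {"tech": 0.12, "energy": -0.02}  # portfolio segment returns
rb = {"tech": 0.10, "energy": 0.01}   # benchmark segment returns
print(brinson_attribution(wp, wb, rp, rb))
```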

However, the past is not necessarily an indicator of the future. Looking at a static benchmark does not necessarily capture the full picture of what is happening in the markets. AbleMarkets takes the approach of continuously reviewing the markets for indications that there is an opportunity to trade, whether because of changes in sentiment or because of some aspect of a market's microstructure that is highly predictive of either price movement or changes in volatility. The analysis generates indexes that help guide trading decisions.

Research on volatility is well‐positioned for fintech advances. At the time this book was written, the equity space alone boasted 9,000+ underlying financial instruments, each with 100+ open options contracts with considerable movement in some options' order books. As a representative statistic, the size of a binary file summarizing option activity in the US markets for just one exchange and one trading day often exceeds 100 GB of computer storage space. To put this number into perspective, consider that sophisticated Apple laptops come with just 250 GB of hard‐drive storage.

A risk-management platform called ICEBERG, designed by Professor Marco Avellaneda of NYU and approved for risk capital calculations by the SEC and OCC, takes data for the entire universe of financial instruments and processes it at lightning speed to derive correct volatility curves in a matter of minutes. This operation still takes most contemporary practitioners days, even months, using previous methods. The resulting data can be used to successfully predict volatility over medium-term horizons, allowing portfolio managers, risk managers, and execution traders to hedge their positions in a scientific, cost-effective way.

What is so different about Avellaneda's methodology? In part, it is intelligent data sampling, discussed elsewhere in this book. Utilizing the latest digital imaging techniques, Avellaneda is able to reduce computational times to their minima while still preserving the integrity of inferences. Advanced mathematics in the form of principal component analysis (PCA) and, finally, Monte Carlo simulation comes in later. Taken altogether, and deployed in a cloud, the method reduces week-long processes to mere seconds, all while delivering higher accuracy and predictability.
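The precise ICEBERG pipeline is proprietary, but the PCA step can be illustrated on simulated data: compress daily changes in an implied-volatility curve into a handful of dominant factors, then reconstruct the curve from those factors alone. Everything below is simulated; this is a sketch of the general technique, not Avellaneda's implementation:

```python
import numpy as np

# Simulated history: 500 days x 30 strikes of implied-volatility changes,
# driven by a parallel "level" factor and a "slope" factor plus noise.
rng = np.random.default_rng(1)
level = rng.normal(0, 0.010, (500, 1))
slope = rng.normal(0, 0.004, (500, 1)) * np.linspace(-1, 1, 30)
vol_changes = level + slope + rng.normal(0, 0.001, (500, 30))

# PCA via eigendecomposition of the covariance matrix (eigenvalues ascending).
cov = np.cov(vol_changes, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
top = eigvecs[:, -2:]                 # the two dominant factors

# Compress each day's 30-point curve into 2 factor loadings and reconstruct.
scores = vol_changes @ top
reconstructed = scores @ top.T
err = np.abs(vol_changes - reconstructed).mean()
print(f"mean reconstruction error: {err:.5f}")  # tiny: two factors suffice
```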

WANT TO LESSEN VOLATILITY OF FINANCIAL MARKETS? EXPRESS YOUR THOUGHTS ONLINE!

In the last year or so, stock prices have been moving drastically up and down, a phenomenon known as market volatility. The latest research from AbleMarkets shows that investors can help reduce intraday volatility by collectively expressing their opinions about a stock's imminent direction on social media. By speaking up online, investors appear to speed up the formation of market consensus and the resulting price adjustment, minimizing price volatility in the process.

Social media continuously updates our collective knowledge of financial markets. As discussed in Chapter 8, investors posting online about their thoughts and experiences with a particular publicly traded firm may encourage others to consider investing in the stock of that company. The latest research shows that the aggregate volume of social media commentary may make an impact not just on the price but also on the volatility of a particular stock. This section summarizes the results of analysis using the AbleMarkets Social Media Index, an aggregate of Internet mentions of companies on a variety of platforms. The index, running since 2009, deploys sophisticated custom-built technology to continuously poll a vast proprietary universe of social media sites. Twitter is expressly not part of that universe of websites.

As the analysis shows, overall, increased social media activity related to a particular company during a 24‐hour period on a given trading day leads to higher stock volatility on that day. However, more social media activity during market hours results in lower volatility of discussed stocks. This phenomenon may be a product of social media's efficiency in distributing news and the news' subsequent incorporation into stock prices. Increasing social media discussion about a particular company seems to result in the consensus being reached faster. Social media discussions from 4 PM ET until 4 AM ET, however, are highly predictive of the next trading day's volatility.
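A minimal sketch of how the overnight relationship could be tested with ordinary least squares; the series below are simulated stand-ins for the proprietary AbleMarkets index and realized volatility, not actual data:

```python
import numpy as np

rng = np.random.default_rng(2)
days = 250
overnight_posts = rng.poisson(100, days).astype(float)  # 4 PM-4 AM mention counts
# Simulated next-day realized volatility, positively tied to overnight chatter:
next_day_vol = 0.008 + 0.00002 * overnight_posts + rng.normal(0, 0.001, days)

# OLS regression of next-day volatility on overnight social media activity.
X = np.column_stack([np.ones(days), overnight_posts])
beta, *_ = np.linalg.lstsq(X, next_day_vol, rcond=None)
print(f"intercept={beta[0]:.5f}, slope={beta[1]:.7f}")
# A significantly positive slope would support the claim that overnight
# discussion levels predict the following trading day's volatility.
```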

Research shows that when people discuss their beliefs online, they make their previously private information public, stabilizing the markets. The phenomenon is consistent with academic theories of finance. One of the theories, the efficient market hypothesis, was first posited by Eugene Fama of University of Chicago in the 1970s, long before the Internet existed as we know it today. According to Fama's thinking, if everyone's knowledge and beliefs were available for everyone else to see, prices would reach steady levels almost immediately following any news.

Can one individual's contributions to social media really calm down the markets? As with all social media, strength is in the numbers—the more that people decide to share their thoughts, the faster the market will reach a consensus of the appropriate price level.

MARKET MICROSTRUCTURE IS THE NEW FACTOR IN PORTFOLIO OPTIMIZATION

Understanding market microstructure is traditionally thought to aid execution traders and market makers, the two types of intraday financial practitioners continuously interfacing with the markets. The market's microstructure is not usually considered to be a variable in long-term portfolio optimization. However, the latest research shows that long-only managers ignore market microstructure effects at the expense of their clients' portfolios.

Adding market microstructure as a factor into the methodology of rebalancing a portfolio improves the Sharpe ratio for any long‐only portfolio.

As discussed in Chapter 3, in broad terms, the science of market microstructure examines in minute detail the evolution of the orders and matches that occur in exchanges' limit order books. A limit order book is a central marketplace for any given financial instrument: a stock, futures contract, bond, foreign exchange rate, or an option. Limit order books have proven to be highly effective tools for matching buy and sell orders in finance. The order matching process that occurs in each limit order book is somewhat similar to the matching of produce and customers at a grocery store (a toy code sketch follows this list):

  1. The grocers desiring fixed prices place their merchandise on shelves of the store. In financial lingo, the fixed‐priced wait‐for‐customer merchandise displays are known as limit orders.
  2. The customers who want merchandise right away and at the best available price pick the merchandise off the shelves. The right‐away customers use what in finance is known as market orders to accomplish their purpose.
  3. Exchanges ring up (match) the orders, making the transaction official.
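Continuing the grocery-store analogy, here is a toy Python sketch of how an exchange matches an incoming market order against resting limit orders under price-time priority; all orders are hypothetical:

```python
import heapq

class ToySellBook:
    """Sell side of a limit order book: the best (lowest) ask matches first."""
    def __init__(self):
        self.asks = []   # min-heap of (price, arrival_seq, size)
        self.seq = 0     # arrival order breaks ties at the same price

    def add_limit_sell(self, price, size):
        """A grocer shelves merchandise at a fixed price (a limit order)."""
        heapq.heappush(self.asks, (price, self.seq, size))
        self.seq += 1

    def market_buy(self, size):
        """A customer takes merchandise off the shelves (a market order)."""
        fills = []
        while size > 0 and self.asks:
            price, seq, avail = heapq.heappop(self.asks)
            take = min(size, avail)
            fills.append((price, take))
            if avail > take:   # partially consumed limit order keeps resting
                heapq.heappush(self.asks, (price, seq, avail - take))
            size -= take
        return fills           # the exchange "rings up" these matches

book = ToySellBook()
book.add_limit_sell(100.02, 300)
book.add_limit_sell(100.01, 200)
print(book.market_buy(400))    # [(100.01, 200), (100.02, 200)]
```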

Market microstructure deals with all aspects of order shelving (who gets the best display space?), customer arrivals (when do most customers arrive? which customers have the biggest budgets?), and similar issues. In the process, market microstructure incorporates topics like high-frequency trading and runaway algorithms.

Most market microstructure activity is sticky. Typically, dynamics persist from one day to the next, even though variability can be significant in the very short term. As a result, phenomena such as volatility and risks associated with microstructure lend themselves well to extrapolation into the near future, on the scale of days and weeks.

Market microstructure does little to predict long-term returns, but it works well in predicting intraday risks. Long-term portfolio management concerns itself with increasing returns of investments while minimizing risks. By accounting for market microstructure risk, long-only portfolio managers can reduce volatility and significantly improve the performance of their portfolios. For example, by adjusting the relative weights in their portfolios by the proportion of aggressive HFTs present in the markets, portfolio managers can optimize portfolio performance.

In addition to the risks associated with HFT, understanding market microstructure can help predict flash crashes days ahead, minimize slippage when placing trades, and, of course, predict short‐term price movements in the markets. All of these features help improve portfolio performance even for the largest‐scale pension funds and hedge funds.

Why hasn't market microstructure become important sooner? First, the data required for market microstructure analysis used to be scarce. Few organizations archived tick‐by‐tick data beyond the 21‐day time frame mandated by the regulators. Second, computing was too slow and too expensive to make market microstructure analyses economical. Third, transaction costs used to be hundreds of times higher just a decade ago, making gains from market microstructure comparatively negligible. Today, in the markets with razor‐thin profits, every penny and even basis point (1 percent of 1 percent) count.

Of course, market microstructure analysis is not trivial and requires an extensive understanding of the issues underlying modern market dynamics. Also, markets do not stand still. Innovations in order routing, exchange and other trading venue configurations, and pre-trade and post-trade risk analytics affect market microstructure and many associated models. Staying on top of market innovation is far from a simple job. However, incorporating market microstructure analytics into financial decisions is no longer an option but a requirement for sound portfolio management.

YES, YOU CAN PREDICT T + 1 VOLATILITY

One can predict volatility at least one day ahead (on a T + 1 basis) by examining the microstructure elements that have been shown to make financial instruments prone to extreme volatility. Three mutually uncorrelated dimensions of market microstructure include:

  1. Market “jitteriness” for a single financial instrument—a.k.a., market runs in technical lingo
  2. Propensity for downward volatility for a single financial instrument
  3. Likelihood of a market‐wide pandemic

Both the market jitteriness dimension and the market's intrinsic volatility predict T + 1 volatility. Taken together, the two dimensions deliver insight on whether specific equities are prone to crashes.

For instance, AbleMarkets' marketwide crash predictor can be based on aggregating market microstructure-driven volatility parameters into a marketwide indicator. The index delivers considerable predictive power for intraday crashes, also known as flash crashes. Table 9.1 shows performance of the T + 1 index values versus the realized daily (Low – Open)/Open return for the SPDR S&P 500 ETF (NYSE: SPY).

The market microstructure effects captured by this index build up gradually leading up to a crash. As a result, it detects flash crash conditions up to several days in advance, and allows clients to liquidate large positions ahead of the crash, if desired.

Table 9.1 shows the index's out-of-sample T + 1 performance for the second half of 2014. As Table 9.1 illustrates, the index is highly sensitive to impending crashes, and produced an especially high crash warning (0.84) one day ahead of the infamous crash of October 15, 2014.

Table 9.1 AbleMarkets Flash Crash Index, Predictability of T+1 Downward Volatility

AbleMarkets EFCI threshold   Mean SPY (Low–Open)/Open   Std Dev SPY (Low–Open)/Open   Count
>0.05                        –0.0043                    0.0041                          155
>0.25                        –0.0044                    0.0041                          151
>0.5                         –0.0050                    0.0046                           61
>0.6                         –0.0059                    0.0052                           27
>0.7                         –0.0110                    0.0052                            5
>0.8                         –0.0174                    N/A                               1

The index can be used to trade volatility, hedge exposure ahead of marketwide crashes, and improve intraday execution. Here is a summary of use cases:

  1. Trading volatility. When indexes predict high volatility the following trading day, a trader may choose to buy a straddle on the underlying asset. Conversely, when indexes predict low volatility, a trader may choose to buy an option butterfly on the underlying asset. (Payoff sketches for both structures follow this list.)
  2. Hedging market exposure. When AbleMarkets predicts a high probability of a marketwide flash crash the following trading day, a portfolio manager may choose to hedge his portfolio's exposure to the broader markets by, for example, buying out‐of‐the‐money put options on the S&P 500 and its ETFs.
  3. Improving intraday execution. When indexes predict high volatility in an individual asset, an execution trader may choose to use a more passive execution algorithm to capture better prices via limit orders. When the AbleMarkets marketwide index predicts a marketwide crash, an execution trader with a mandate to sell an instrument may choose to speed up his execution algo to avoid selling into a potential crash later in the day. This nets about 30 percent per annum extra out of sample in AbleMarkets studies.
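For use case 1, the payoffs of the two option structures can be sketched in a few lines. A straddle pays off when the price moves far from the strike, while a butterfly pays off when the price stays near the middle strike; the strikes below are hypothetical, and premiums are ignored for clarity:

```python
def straddle_payoff(s, strike):
    """Long call + long put at the same strike: profits from large moves."""
    return max(s - strike, 0.0) + max(strike - s, 0.0)

def butterfly_payoff(s, k_low, k_mid, k_high):
    """Long one call at k_low, short two at k_mid, long one at k_high:
    payoff peaks when the price expires near k_mid, i.e., low volatility."""
    call = lambda k: max(s - k, 0.0)
    return call(k_low) - 2.0 * call(k_mid) + call(k_high)

for s in (90, 100, 110):
    print(s, straddle_payoff(s, 100), butterfly_payoff(s, 95, 100, 105))
# The straddle pays 10 at 90 or 110 and 0 at 100; the butterfly pays 5 at 100
# and 0 at the wings, mirroring the high- and low-volatility forecasts.
```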

MARKET MICROSTRUCTURE AS A FACTOR? YOU BET

As most portfolio managers are well aware, the valuation of financial instruments no longer depends on single-factor models, such as Graham-Dodd NPV valuation. While aspiring Warren Buffetts of the world may have an occasional win, the markets have grown too complex to depend on simple cash-flow projections. In equities, for example, decade-old studies have shown that when the market as a whole goes up, some stocks may go up in line with Graham-Dodd valuation forecasts, while others may go down. However, when markets drop, many investors choose to cut their losses by liquidating their holdings, forcing stocks with both good and bad financial projections down the market drain.

How can portfolio managers navigate such treacherous waters? A popular approach, a trend on the Street for years, is to layer screens or signals from disparate well-performing valuation techniques in order to develop precise, well-timed signals about financial instruments' impending moves.

As an investor, you feel it, you know it: Some stocks tend to have more intraday volatility than other stocks. Some stocks are specifically more prone to flash crashes than others. Some stocks have higher aggressive high‐frequency trading (HFT) participation than other stocks. At this point in financial innovation, no savvy portfolio manager can afford to ignore intraday risk, and, instead, must make it an integral part of his portfolio selection model.

Why do intraday dynamics need to enter portfolio selection models? Can't portfolio managers simply ride out the intraday ups and downs in their pursuit of longer‐term goals? The answer is yes, but at a considerable cost. As the latest AbleMarkets.com research indicates, aggressive HFT participation and flash crashes are “sticky” and change slowly from one month to the next, not to mention from one day to another. Understanding which securities are prone to flash crashes can help avoid unnecessary stop losses. Avoiding financial instruments with high aggressive HFT participation can save double‐digit percentage costs in execution.

Why do metrics like aggressive HFT and flash crash probabilities persist in given stocks? The answer can be found in modern market microstructure. The microstructure phenomena such as aggressive HFT participation are directly linked to the automation of financial markets. As trading becomes increasingly electronic, many financial market participants build proprietary computer programs to obtain a cutting edge in the markets. The programs are time‐consuming and costly to build, and successive iterations take months and even years. As a result, the intraday dynamics remain stable over long time horizons and may differ significantly from one security to the next.

Why would market participants choose to build and run programs for some financial instruments, but not others? The answer has three parts:

  1. Cost of historical data
  2. Cost of interpreting historical data
  3. Processing power restrictions

Building a profitable trading algorithm requires a considerable investment in highly granular, and, as a result, voluminous data. Data can be very expensive to buy and also to store. Major data companies sell historical data for tens and hundreds of thousands of dollars. Just one day of highly granular data containing individual orders from a major exchange takes up 5 GB of space—the data storage space that can hold more than 1,000 high‐resolution digital photos. Given the data costs, trading developers may choose to acquire data for only a selected set of securities, paving the way to persistent discrepancies in microstructure among various financial instruments.

Making sense of data is perhaps the most expensive part of the process. New data-based ideas are hard to come by, and capable data scientists are in high demand and command premium compensation. Even the seemingly basic tasks of retrieving the data and structuring it in a computer database can be an expensive process requiring numerous hours of computer programming. The disparity of data standards among various data providers and financial products makes even the basic task of parsing data files complex.

Even when the data are acquired, properly stored, opened, and turned into successful models, the majority of data-related costs may still lie ahead. To ascertain model performance, the model needs to run on at least one year (most often, two years) of historical data, requiring advanced computing power. Once the backtests are completed and the models are verified to work (80 percent of models will fail), the costs are just about to ramp up. In a trading "production" environment, vast volumes of streaming data need to be received, captured, processed, stored, and turned into trading communication. To capture the full range of data, one needs to invest not just in advanced computer processing power, or hardware, but also in advanced network architecture and data centers, as well as physical network equipment such as fast network switches and physical communication links like microwave networks.

As a result, once a working electronic trading approach has been developed and deployed, it can be extremely costly to change. As long as the systems remain profitable, they are typically left online. This, in turn, leads to great persistence in market microstructure in each individual financial instrument. The microstructure dynamics, however, may differ considerably from one financial instrument to the next.

For example, as AbleMarkets.com research shows, in the equities space, Google Inc. Class A stock (NASDAQ:GOOGL) had the highest participation of aggressive HFTs among all the S&P 500 stocks in 2014. Switching investment activity from GOOGL to GOOG, Class C shares of the same company, lowers aggressive HFT participation by 1 percent of volume, reducing trading costs. Similarly, avoiding securities with high flash crash risk can deliver considerable performance improvement. Understanding market microstructure risks is no longer a matter of curiosity, but that of sound portfolio management.

A myriad of solutions for dealing with aggressive HFTs have been proposed to date. Regulators have been called on to ban HFT from the markets altogether. New exchanges have sprung up with the aim of excluding HFTs from their ranks. None of these approaches, however, gives investors the ability to experience the upside of HFT while tuning out its downside.

In response, the AbleMarkets Aggressive HFT Index, which tracks aggressive HFT activity in real time and near real time, was developed. Iteratively researched over the past eight years, the computationally intensive methodology for estimating aggressive HFT participation examines every movement of market data and delivers estimates of the percentage of volume driven by HFT orders in any electronically traded financial instrument: equities, commodity futures, foreign exchange, and options. Armed with this information on HFT participation, traditional portfolio managers, low-frequency quant strategists, and regular investors can hedge their exposure to aggressive HFT.

The research shows that a simple daily procedure for adjusting the portfolio weights of various financial instruments comprising one's portfolio significantly increases portfolio returns and reduces portfolio volatility. The process can be remarkably simple: Increase the holdings of securities where the proportion of aggressive HFT has decreased, and vice versa.

How does HFT hedging work in practice? For long-only portfolios, the simplest aggressive HFT hedging strategy works as follows (a code sketch appears after the steps):

  1. Obtain the previous day's AbleMarkets Aggressive HFT Index for all the securities in your portfolio. We will denote this number as AHFT.
  2. Compute the existing percent allocation, also known as weight, of each security in your portfolio by value. For example, if the total holding of AAPL in your portfolio is $20,000 and the total dollar value of your portfolio is $100,000, then the weight of AAPL in your portfolio is 20 percent.
  3. Multiply each weight computed in step 2 by the factor (1/AHFT) for that security; that is, divide each weight by the security's AHFT.
  4. For all the securities in your portfolio, find the sum of the adjusted weights from step 3:

    SAHFT = w1/AHFT1 + w2/AHFT2 + ⋯ + wN/AHFTN, where N is the total number of securities in your portfolio and wi is the weight of security i from step 2.

  5. Divide each adjusted weight from step 3 by SAHFT obtained in step 4. The resulting weights sum to 1 and shift allocation away from securities with higher aggressive HFT participation.
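A minimal Python sketch of steps 1 through 5; the ticker symbols and index values are placeholders, and in practice AHFT would come from the AbleMarkets feed:

```python
def ahft_adjusted_weights(dollar_holdings, ahft):
    """Rescale portfolio weights by 1/AHFT and renormalize to sum to 1."""
    total = sum(dollar_holdings.values())
    weights = {s: v / total for s, v in dollar_holdings.items()}  # step 2
    adjusted = {s: weights[s] / ahft[s] for s in weights}         # step 3
    s_ahft = sum(adjusted.values())                               # step 4
    return {s: adjusted[s] / s_ahft for s in adjusted}            # step 5

# Hypothetical two-stock portfolio; AHFT is the previous day's index value.
holdings = {"AAPL": 20_000, "MSFT": 80_000}
ahft = {"AAPL": 0.40, "MSFT": 0.20}  # 40% vs. 20% aggressive HFT participation
print(ahft_adjusted_weights(holdings, ahft))
# The security with higher aggressive HFT participation ends up underweighted.
```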

The resulting strategy could not be simpler and delivers sizable returns for portfolios of any composition. As our recent research indicates, for the long‐only S&P 500, this simple strategy delivers about 1 percent extra unlevered return and 0.1 improvement in the Sharpe ratio. When the portfolio is sizable, such gains amount to a considerable increase in return. Furthermore, the strategy works in all markets: moving up or down, calm and volatile, making the Index an indispensable tool in every investor's portfolio.

Best of all, it allows investors to contain the impact of aggressive HFT on their portfolio with minimal disruption to their current way of doing business: no switching exchanges or adjusting for new regulations is required. How much is the peace of mind worth to you to contain the impact of aggressive HFT algos on your portfolio holdings?

CASE STUDY: IMPROVING EXECUTION IN CURRENCIES

Aggressive high‐frequency trading in equities generates plenty of press. However, aggressive HFT in currencies is also on the rise and deserves attention. This case study considers the impact of aggressive HFTs on short‐term price changes in GBP/USD, and how execution traders can utilize aggressive HFT data to improve timing of block execution.

The bursts of aggressive HFTs' market orders temporarily wipe out limit orders, resulting in three prominent outcomes:

  1. Increasing the bid‐ask bounce and, subsequently, realized volatility
  2. Increasing slippage via increases in the bid‐ask spread
  3. Bidding up short‐term execution prices

Tracking aggressive HFT activity allows execution traders to significantly improve their current execution methodology. Traders charged with acquiring a specific currency pair over a short horizon may dramatically improve their performance by detecting the onset of aggressive HFT buying activity, which usually leads to price increases, and speeding up execution in the short term. Similarly, execution traders commissioned to sell blocks of a given currency will find improvements in execution from executing larger slices of blocks at the first sign of a rise in aggressive HFT sellers.

The vanilla strategy executes the same size of trade once every minute. The enhanced strategy tracks a 20‐minute moving average of the difference between the aggressive HFT buyers and aggressive HFT sellers. When the 20‐minute moving average of the HFT buying activity exceeds HFT selling activity by at least 10 percent, the buying strategy buys three‐minutes' worth of the currency volume, and executes nothing in the following two minutes. From November 18, 2015, through November 27 alone, the enhanced strategy was triggered 256 times, on average buying at 0.3 pips lower than the benchmark (mid or ask) each time over the eight‐day period. The corresponding sell strategy performed even better, on average selling at 0.53 pips above the benchmark each trading period out of 294 instances during the eight days. Buying with the enhanced strategy reduced purchasing cost by nearly 80 pips over eight days, and selling with the enhanced strategy delivered prices nearly 175 pips higher than benchmark execution.
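A stylized sketch of the enhanced buying rule just described; the buyer and seller series are simulated stand-ins for the AbleMarkets aggressive HFT split, while the window, threshold, and pacing follow the text:

```python
import numpy as np

def enhanced_buy_schedule(hft_buy, hft_sell, minute_volume, window=20):
    """Per-minute buy sizes: on a >=10% buyer-over-seller imbalance in the
    20-minute moving average, buy 3 minutes' worth, then pause 2 minutes."""
    schedule = np.zeros(len(minute_volume))
    pause = 0
    for t in range(window, len(minute_volume)):
        if pause > 0:            # sitting out after a front-loaded purchase
            pause -= 1
            continue
        avg_buy = np.mean(hft_buy[t - window:t])
        avg_sell = np.mean(hft_sell[t - window:t])
        if avg_sell > 0 and avg_buy >= 1.10 * avg_sell:
            schedule[t] = 3 * minute_volume[t]  # buy ahead of rising prices
            pause = 2
        else:
            schedule[t] = minute_volume[t]      # vanilla: one minute's worth
    return schedule

rng = np.random.default_rng(3)
buy, sell = rng.uniform(0, 1, 480), rng.uniform(0, 1, 480)  # 8 hours of minutes
vol = np.full(480, 1.0)
print(enhanced_buy_schedule(buy, sell, vol)[:30])
```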

The results hold up for various currencies and extended time periods, making aggressive HFT data an indispensable item in an execution trader's tool bag. By its nature, aggressive HFT tracking can be used to enhance both low-touch and high-touch execution, delivering value to execution traders' clients and much-deserved profitability to execution traders themselves.

FOR LONGER‐TERM INVESTORS, INCORPORATE MICROSTRUCTURE INTO THE REBALANCING DECISION

Long-term buy-and-hold investors would have improved their 2015 portfolio allocation decisions by increasing allocations to stocks frequented by aggressive high-frequency traders (HFTs), according to AbleMarkets research. An illustration is a value-weighted $100 million buy-and-hold portfolio of S&P 500 stocks formed at the end of December 2014 and held through December 2015 without any activity. This portfolio would have generated $4.6 million, a 4.6 percent return. The same portfolio with holdings increased in proportion to aggressive HFT participation delivered $4.9 million, a $300,000 improvement.

The allocations were once again completely passive: the aggressive‐HFT‐adjusted weights were selected in December 2014, according to the average values of aggressive HFT reported by the AbleMarkets Aggressive HFT Index, and left unchanged for one year.

Aggressive HFT algorithms generate considerable volatility, as shown by many studies. Most of the volatility is a direct result of the aggressive HFTs' mode of execution, namely market orders. A concentrated stream of market orders erodes liquidity, widening the bid-ask spread and inducing the so-called bid-ask bounce, which increases volatility. More volatile stocks are riskier and, in the long term, command a higher risk premium to compensate long-term long-only investors for holding the risk. Hence, long-term long-only investors in stocks preferred by aggressive HFT reap the extra return reward.

What about portfolio risk? It turns out that the stocks favored by aggressive HFT strategies have higher stock-specific risk and lower market risk than their peers. Stock-specific, or idiosyncratic, risk is relatively easy to diversify away in a portfolio. At the same time, market risk, commonly known as beta, is often harder to diversify, and its relatively low occurrence among the stocks favored by aggressive HFT is welcomed by institutional portfolio managers. On average, for every 1 percent increase in aggressive HFT participation among the Russell 3000 stocks, the beta decreases by 0.4 percent.

Since all HFTs are profit-maximizing agents, and since it takes months, if not years, to build successful trading systems, HFTs naturally gravitate toward the stocks where their profitability is reliably higher. When an HFT strategy in a particular stock is profitable, it is likely a result of the stock's idiosyncratic propensity to respond predictably to news. Profitable HFT systems, once built, tend to hang around for a long time, as the operating costs of the systems are low. Even when profitability eventually begins to decline, it does so gradually and can keep the systems afloat for months. Therefore, long-term investors can be assured of the considerable longevity of their strategies, even though the underlying nature of the strategy is very short-term.

Buy‐and‐hold investors should enhance their core strategy by including a rebalancing adjustment based on aggressive HFT participation.

CONCLUSIONS

Traditional portfolio diversification in the age of ETFs does not work as well as when it was devised in the 1950s. Microstructure is a new frontier helping investment managers make quality decisions about their portfolio allocation strategies that result in minimal volatility and better overall performance.

END OF CHAPTER QUESTIONS

  1. What are the recent changes in market volatility?
  2. What are the causes of changing volatility characteristics?
  3. How can investors mitigate the changes in volatility profiles?
  4. How can long‐term investors benefit from market microstructure in their portfolio allocation?
  5. What are other uses of market microstructure, and how does it relate to volatility?