Traders Pocket Up to $256 Mln From Fed Leaks: Study

May 19th, 2014

According to a recent academic study, some investors may have pocketed between $14 mln and $256 mln in profits by getting early word of Fed policy changes between 1997 and 2013.

Study Background

The study, “Can information be locked up? Informed trading before macro-news announcements”, written by Gennaro Bernile, Jianfeng Hu and Yuehua Tang of Singapore Management University, revealed statistically significant price movements and buy-sell order imbalances that were consistent with the subsequent surprises in the Federal Reserve’s announcements of whether it was tightening or loosening monetary policy.

The unexpected market moves took place before and during the “lockup period” – the window in which journalists had information about the upcoming Fed policy announcements but before the statements were officially released.

The authors analyzed whether the E-mini S&P 500 futures contract, E-mini Nasdaq 100 futures, the SPDR S&P 500 ETF, and the PowerShares QQQ ETF tracking the Nasdaq 100 index moved before the FOMC release.

Robust Evidence

The researchers found “robust evidence” that significant order imbalances, which accurately predicted the post-release market reaction, tended to arise in the E-mini S&P 500 futures market between 10 and 20 minutes before the scheduled release of the FOMC’s policy statement.

In the 10 minutes prior to the release of the statement, E-minis rose on average 0.2% more on days when the announced policy decision was a surprise, compared with days when the decision was in line with the market consensus, the authors said.
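To make the measurement concrete, here is a minimal, purely illustrative Python sketch of the two quantities the study works with: a buy/sell order imbalance in the pre-release window, and the average pre-release return on surprise days versus non-surprise days.  The data, variable names, and the simple imbalance formula are ours, not the authors’; the paper’s actual methodology is considerably more involved.

```python
# Illustrative sketch only; not the study's actual data or methodology.

def order_imbalance(buy_volume: float, sell_volume: float) -> float:
    """Signed imbalance: +1 means all buyer-initiated, -1 all seller-initiated."""
    total = buy_volume + sell_volume
    return (buy_volume - sell_volume) / total if total else 0.0

def mean(values):
    return sum(values) / len(values)

# Hypothetical per-announcement records: the E-mini return (%) in the final
# 10 minutes before the release and whether the decision surprised the consensus.
announcements = [
    {"pre_release_return": 0.25, "surprise": True},
    {"pre_release_return": 0.02, "surprise": False},
    {"pre_release_return": 0.18, "surprise": True},
    {"pre_release_return": -0.01, "surprise": False},
]

surprise_days = [a["pre_release_return"] for a in announcements if a["surprise"]]
other_days = [a["pre_release_return"] for a in announcements if not a["surprise"]]

# The study's headline finding is a gap of roughly 0.2% between these two averages.
print("Average pre-release move on surprise days: %.2f%%" % mean(surprise_days))
print("Average pre-release move on other days:    %.2f%%" % mean(other_days))
print("Example order imbalance:", order_imbalance(buy_volume=1200, sell_volume=800))
```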

The study concluded that the profits from early access to Fed policy decisions, in cases where the announcements surprised the market, totaled roughly $14 million to $256 million.

Lock Up Process

The Federal Open Market Committee (FOMC) regularly releases statements announcing the Fed’s policy decisions, including where it is setting short-term interest rates, what its targets are for the money supply, and other actions to stimulate or rein in the economy, such as purchasing Treasury or mortgage-backed securities.

Historically, the Federal Reserve released a statement of the FOMC’s policy decision in the U.S. Treasury Department press room approximately 10 minutes before the official release time.  This gave journalists from accredited news organizations time to write their stories.  Once the embargo time was reached, journalists were allowed to submit their stories for publication.

Unfortunately, this process was not terribly strict.  While journalists promised to respect the embargo, their computer lines weren’t blocked, enabling them to communicate surreptitiously with the outside.

Fed Changes

The Fed’s process for releasing the FOMC policy statement came under scrutiny after trading in gold futures and exchange-traded funds linked to gold on the New York and Chicago commodity exchanges took place within one millisecond of the 2 p.m. ET embargo time for the FOMC release on September 18th, 2013.

This prompted the Fed, starting with its FOMC policy statement release on Oct. 30th, 2013, to tighten its lockup procedures.

Under new procedures, journalists gather in a room at Fed headquarters in Washington. They are forbidden to carry phones into the lockup, and lines connecting their computers to the Internet are blocked.

Journalists are given the FOMC statement 20 minutes before its release to the public, enabling them to prepare their stories. When the embargo time is reached, lines of communications are opened and journalists are allowed to transmit their stories.

Other Government Releases Clean

The Singapore Management University study found no evidence that investors were getting early access to other US government releases, including the Bureau of Labor Statistics’ monthly employment, wholesale inflation, and consumer inflation reports.  In addition, there didn’t appear to be any evidence that the Bureau of Economic Analysis’ quarterly GDP report was being leaked to certain investors.

 


The Future of “Big Data” and Other Musings

May 14th, 2014

Yesterday afternoon, at the Intelligent Trading Summit held in New York City, I moderated an insightful panel on the growing use of big data and data-driven analytics in the buy-side research process.  The panel of experts discussed a number of interesting topics, including what they see as the future for “big data” on Wall Street.

Big Data Panelists

Members of this panel included Harry Blount, CEO of DISCERN, Brian Lichtenberger, Principal & Co-Founder of 7Park Data, and David Kedmey, President of EidoSearch.

DISCERN is an institutional investment research firm which conducts statistical analysis of huge amounts of structured and unstructured data to identify unique investment signals.  The firm then overlays this data analysis with contextual insights generated by experienced sell-side analysts to identify potential investable opportunities.  DISCERN provides a wide range of unbundled services, including allowing buy-side clients to license its data, the predictive signals it has developed, or the extensive research it generates.

7Park Data is a boutique data and analysis firm that sources exclusive structured and unstructured data from third-party providers, cleanses and stores it utilizing big data technologies, and develops predictive analytics based on this data to generate investment insights.  7Park markets these data-driven research and information products on a subscription basis to institutional investors worldwide.

EidoSearch is a Financial Technology firm that applies pattern search technology to an extensive historical database of market data to generate predictive analytics and return projections for stocks, futures, currencies and other market indices.  EidoSearch quantifies the way investors are likely to respond today by studying their behavior when similar price patterns and market environments occurred in the past, providing a historical backdrop for every trade and investment decision.
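EidoSearch has not published its algorithms, but the general idea behind pattern-based projection can be sketched in a few lines: normalize the most recent price window, rank historical windows by their similarity to it, and treat the returns that followed the closest matches as a distribution of possible outcomes.  Everything below (window lengths, the distance metric, the random toy data) is our own illustrative choice, not EidoSearch’s method.

```python
import numpy as np

def normalize(window: np.ndarray) -> np.ndarray:
    # Rescale a price window to start at 1.0 so patterns are comparable across price levels.
    return window / window[0]

def pattern_projection(prices: np.ndarray, lookback: int = 20,
                       horizon: int = 5, top_k: int = 10) -> np.ndarray:
    """Forward returns that followed the historical windows most similar
    to the latest `lookback`-bar price pattern."""
    current = normalize(prices[-lookback:])
    matches = []
    for start in range(len(prices) - lookback - horizon):
        candidate = normalize(prices[start:start + lookback])
        distance = np.linalg.norm(candidate - current)
        fwd_return = prices[start + lookback + horizon - 1] / prices[start + lookback - 1] - 1
        matches.append((distance, fwd_return))
    matches.sort(key=lambda pair: pair[0])
    return np.array([fwd for _, fwd in matches[:top_k]])

# Toy usage on a simulated random-walk price series.
rng = np.random.default_rng(0)
prices = 100 * np.cumprod(1 + rng.normal(0, 0.01, 1000))
projections = pattern_projection(prices)
print("Median projected 5-bar return: %+.2f%%" % (100 * np.median(projections)))
print("Range of projected outcomes: %.2f%% to %.2f%%"
      % (100 * projections.min(), 100 * projections.max()))
```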

Shouldn’t It Be Called “Big Analytics”?

Warehousing huge volumes of data is, by itself, of little value to analysts or investors.  Instead, the real key to driving value from “big data” initiatives is the development of predictive analytics at scale.  When asked how their various solutions address this issue, Harry, David and Brian explained the following:

Harry explained that DISCERN sells signals as a service.  What this means is that they try to increase the signal-to-noise ratio, enabling them to discover weak signals in the structured and unstructured data they collect.  Consequently, the team at DISCERN is focused on developing analytics that can surface the patterns or signals pointing to possible investment opportunities for their clients.

David noted that their clients are under tremendous pressure to sort through vast amounts of time series data in order to make better investment decisions.  He explained that EidoSearch harnesses the power of pattern recognition technology, applies this to decades of market data, and generates a series of possible forecasts for the future of a security price based on how investors have reacted to similar securities in the past.  This enables investors to make better informed trading and investment decisions.

Brian commented that his real business at 7Park Data is delivering insights to clients which result from the analytics they develop.  However, he warned that you can’t generate investment ideas from all data sources.  That’s why 7Park spends considerable time and effort finding unique and exclusive datasets.  Once that is accomplished, they spend an equal amount of effort developing quantitative analytics that produce the predictive insights their clients are really looking for.

Big Data and the Human vs Computer Debate

Most traders and analysts have traditionally relied on human judgment to make investment or trading decisions.  Big data is pushing investors to rely more on computer generated signals.  When asked how they see users integrating these two approaches, our panelists replied in this manner.

Brian explained that their job at 7Park Data is not to replace human input to the research process, but rather to take on many of the mundane and time consuming tasks many buy-side analysts perform today.  He contends that this will give investors more time to make thoughtful investment decisions.  Ultimately, Brian noted that their team works with buy-side clients to conduct the analysis they are interested in.  Brian acknowledged that the only person they might be replacing in the investment research process is the sell-side analyst.

David commented that some clients could use EidoSearch to generate computer trading strategies.  However, he saw most clients using a tool like EidoSearch to conduct an analysis which would require human decisions and insights to make actual trading or investment decisions.  For example, David explained that clients might decide to eliminate certain time frames or securities from the historical analysis based on their own experience or insight.

Harry noted that the DISCERN platform is not just about generating investment ideas, but also about helping a client get answers to all the follow-on questions they are likely to ask as a result.  Harry felt the relationship between their platform and the client was symbiotic, with their system learning what matters to that client over time.  In other words, Harry felt that the DISCERN platform didn’t replace human input or insight, but required it to work most effectively.

The Future for Big Data on Wall Street

As is the case with most panel discussions, we ended the session by asking the panelists to share their visions of the future of the “big data” landscape for investors in 3 to 5 years.

Brian commented that in the next few years, a small number of the largest mutual funds and hedge funds will have built their own in-house “big data” programs which would serve internal users such as analysts and portfolio managers.  He expects these groups will act much like quantitative investors, consuming unique data sources and generating their own proprietary analytics and signals.  However, he expects the rest of the market will look to external providers like 7Park or DISCERN to help them conduct analysis on big datasets to generate unique investment ideas.

David felt that the big data trend will likely have a significant impact on a number of major vendors to the buy-side, but the one player he thought would be impacted most would be the market data providers.  He explained that these vendors would need to switch from providing just data, to providing a platform where clients could generate real unique insights.  As a result, David thought that market data vendors would either have to develop a “big data” strategy or else they might lose relevance in the marketplace.

Harry’s vision was probably the most expansive.  He explained that in the future, corporations or institutional investors who did not have a big data strategy in place would be at a severe competitive disadvantage.  He expects that firms with a working big data program will be able to identify and take advantage of new trends more quickly, whereas firms without such a program will be forced to react to developments – a fact that could lead some of them to go out of business altogether.


FCA Bans Commission Payments for Corporate Access

May 8th, 2014

The UK Financial Conduct Authority (FCA) released its updated rules on payments for investment research, eliminating payments for corporate access with client commissions.  Moreover, the rules go into effect in less than a month, reflecting the FCA’s view that the rules are simply a clarification, and that investment managers should have been following them all along.

The full text of Policy Statement (PS) 14/7 can be found at http://www.fca.org.uk/your-fca/documents/policy-statements/ps14-07.

Corporate Access

The FCA is implacable in its aversion to corporate access as a form of research. The FCA received 62 comment letters and met with investment managers, investment banks and corporate issuers prior to issuing its final rules.  In the end, the FCA did not make major changes to the proposed rules.

Some respondents argued that the definition of corporate access was too broad.  Nevertheless, the FCA held to its definition, which covers any service of arranging or bringing about contact between an investment manager and an issuer or potential issuer.

The FCA also pointedly addressed the issue of bundling corporate access with other services, such as conferences or analyst input.  Investment managers will need to explicitly disaggregate the research portion and make a ‘fair assessment’ of the charge to pass on to their customers.

The FCA will also scrutinize situations where corporate access is deemed a ‘free good’.  Effectively, investment managers will need to fully justify any other payments made to a broker arranging corporate access as a non-monetary benefit and demonstrate that corporate access is not being subsidized by other payments.

Bundled research

With or without corporate access, investment managers will need to scrutinize unpriced bundled services more carefully.  The FCA expanded the final rules to place more emphasis on the need for investment managers to proactively value bundled services, whether or not investment banks are cooperative.

Investment managers should do “fact-based analysis” of bundled services.  This would include comparisons to priced services such as independent research (which is often an order of magnitude less expensive).  Asset managers are also encouraged to estimate what it would cost them internally to perform the service, or to determine what they would be willing, in good faith, to pay for it.

The FCA seems unsympathetic to the investment manager lament that investment banks are not forthcoming in providing pricing for their services:

Where an investment manager feels they need information from the broker or third party provider to assist them in a valuation process, we would expect a reasonable level of transparency to be present as part of any commercial arrangement. This could include the nature of a good or service provided, and the extent to which it has been used.

Primary research

The FCA singled out corporate access but is less prescriptive about other forms of research, leaving it up to investment managers to justify whether primary research meets the test.  Primary research providers were particularly troubled by the requirement that research must “present” meaningful conclusions.  The FCA stuck to its guns on this point, arguing that the phrase ‘meaningful conclusions’ has been part of the UK’s definition of eligible research since 2006.  Moreover, the FCA “would not expect an investment manager to accept as substantive research a good or service that only has a purely ‘artificial’ conclusion added by a broker or third party.”

On the plus side, the FCA says it does not intend to rule out “research that is used by the investment manager to feed into their own further research or assessment of investment and trading ideas.”  However, it concludes that eligible research must meet its cumulative definition of research, including presenting meaningful conclusions:

In the UK, the ability to use dealing commissions to acquire research is not intended to cover all non-execution related inputs into the investment manager’s decision-making process, but only additional third party research that can meet our cumulative evidential criteria.

Conclusion

Those firms hoping the FCA would moderate its position will be disappointed in the published rules, which, if anything, are tougher in their final form.  PS 14/7 sends the signal that the FCA will err on the side of less rather than more eligible research, and none of the arguments presented to date seem to have softened their position.

Investment banks will be hardest hit by the new rules.  The FCA intends to erase £500 million (US$800 million) in commissions it estimates are spent on corporate access in the UK.  Most of that goes to investment banks.  However, if the FCA succeeds in pressuring investment managers to put a fair value on bundled investment bank research, the damage could be larger.  Priced independent research generally costs significantly less than what investment banks have traditionally been paid for their research.  Or, to use the other metric suggested by the FCA, investment managers could hire a lot of internal staff with the amounts paid for bank research.

The FCA has not only been fielding comments to CP 13/17.  From November 2013 to February 2014, it conducted a thematic supervision review examining both asset managers and sell-side brokers to review current market practices and business models linked to the use of dealing commissions.  Reading between the lines, it does not appear that this review relaxed the FCA’s perspective.   The FCA promises to report on this review later this year.

The Damoclean sword over everyone’s head is MiFID II, which could result in an outright ban on paying for any research with client commissions.  As the FCA puts it, “MiFID II has the potential to impact the ability of portfolio managers to receive third party inducements, which may include research acquired in return for dealing commissions linked to execution services.”

The FCA is a party in the MiFID II discussions, but, judging by PS 14/7, a ban on research commissions would not break its heart.


A Vision for Big Data

May 7th, 2014

Big data is like teenage sex: everyone talks about it, nobody really knows how to do it, everyone thinks everyone else is doing it, so everyone claims they are doing it…

To help those of us innocent to the thrills of big data, we’ve asked Harry Blount, founder of DISCERN, to share his views in this guest article.  Harry Blount was a member of the Committee on Forecasting Disruptive Technologies sponsored by the National Academy of Sciences, and since 2009 has been running an investment research firm dedicated to harnessing big data insights.

As a technologist, Blount has focused not only on gathering big data sources but also on developing analytic tools to help process them.  By comparison, Majestic Research, now part of ITG Research, ended up hiring analysts to interpret the data and provide insights.  Here is Blount’s perspective, which we have excerpted from a DISCERN white paper, “Big Data for Better Decisions”, which can be obtained at http://www.aboutdiscern.com/white-paper.

A vision of the big data future

It is 2015. Acme Commercial Drones has launched its private drone fleet following a 2013 congressional mandate ordering the FAA to integrate drones into commercial U.S. air space. Acme has begun conducting daily flyovers over the major U.S. ports counting the number of cargo ships manifested to carry corn to foreign ports. The average ship is floating 2-3 feet lower in the water than the prior month, suggesting higher than expected corn exports. Another Acme drone flying over the U.S. corn-growing region registers a subtle and non-trendline shift towards rust coloration on the crop suggesting that the late-growth corn season may not be as robust as consensus expectation. Finally, live camera feeds from the newly opened “New Panamax-class” locks at the Panama Canal show that a cargo ship has damaged one of the hydraulic gates, slowing transit to a fraction of capacity.

The purchasing officer of a large soft-drink company uses DISCERN’s Personal Analytics Cloud (PAC) to monitor the global corn transport infrastructure and major corn-growing regions. His PAC generates signals notifying him of these non-trendline events. He immediately executes purchase orders to secure longer-term contracts at current spot prices anticipating that the price of corn is about to spike. His fast action locks in a six-month margin advantage over the competition, allowing the company to report record profits.
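DISCERN has not disclosed how its PAC signals are generated, but a generic way to flag the kind of “non-trendline” shift described in the scenario is to compare each new observation with a rolling baseline and alert when it deviates by more than a chosen number of standard deviations.  A minimal sketch, with entirely hypothetical data:

```python
import statistics

def non_trendline_events(series, window=30, threshold=3.0):
    """Yield (index, value, z_score) for observations that deviate sharply
    from the trailing `window`-period baseline."""
    for i in range(window, len(series)):
        baseline = series[i - window:i]
        mean = statistics.fmean(baseline)
        stdev = statistics.pstdev(baseline)
        if stdev == 0:
            continue
        z = (series[i] - mean) / stdev
        if abs(z) >= threshold:
            yield i, series[i], z

# Hypothetical daily readings (e.g., an index of ship draft or crop coloration).
readings = [10.0 + 0.01 * i for i in range(60)] + [13.5]   # sudden jump at the end
for idx, value, z in non_trendline_events(readings, window=30):
    print(f"Non-trendline event at observation {idx}: value={value:.2f}, z={z:.1f}")
```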

What is Big Data?

The phrase “Big Data” refers to massive bundles of datasets, allowing for powerful aggregation and analysis at unprecedented speeds. Some 2.5 quintillion (2.5×10¹⁸) bytes of data are created every day. Mobile devices, RFID readers, wireless sensing networks, video cameras, medical organizations, government entities – to name just a few – are collecting an ever-growing torrent of data.

(Infographic omitted; source: BigData-Startups.com)

It is our belief that most decision-making tools, analytics and processes commonly applied to business have not kept pace with the rapid explosion of data and the visualization capabilities of big data aggregation engines. Just as there is a growing digital divide between those with access to the internet and those without, we believe there is an equally important divide between organizations with big data strategies in place and those without.

One of the more popular definitions of big data is encompassed by the 4Vs – Volume, Velocity, Variety and Veracity. They sound more like bullet points in a brochure advertising high-end sports cars than a comprehensive description of what your big data package should bring to the table. While the 4Vs may help a vendor sell data feeds and speeds, nowhere does this definition speak to the customer’s need for more insights earlier and more often.

Evaluating Big Data

The real question is – “How can you get more insights more often from big data?” In our opinion, the most critical aspect organizations need to assess when selecting a big data vendor is the vendor’s ability to convert noise to signals. Said differently, most big data technology vendors will be able to bring you “the world of data” via a big data aggregation platform, but only a few will have a process capable of delivering all the relevant data (e.g. structured, unstructured, internal, external) in deep context, in a way that is personalized and persistent for your requirements.

In analyzing a big data vendor, it would be wise to ask a few questions:

1) Can the vendor deliver a solution without selling you hardware and without expensive and extensive IT configuration?
2) Does the vendor aggregate data of any type – internal and external, structured and unstructured, public and commercial?
3) What is the data model for organizing the data?
4) What is the process for providing data in its original and curated context?
5) Can the vendor deliver the data in accordance with the context and unique needs of individual users within the organization?
6) Is the data accessible from any browser-enabled device?
7) Does the solution fit into your existing workflow?

Conclusion

During my involvement with the National Academy of Sciences study “Persistent Forecasting of Disruptive Technologies”, I came to strongly believe that in most cases, all of the information required to make an early and informed decision was available in the public domain or from inexpensive commercial sources. However, there simply was not a tool or service that was optimized to persistently scan and process the data to surface these insights to decision-makers.

We believe human interaction will always be critical to the decision-making process. However, too much time is spent searching, aggregating, cleansing and processing data, and not enough time is spent on value-added activities such as analysis and optimization.

We believe the emergence of big data tools such as DISCERN’s personal analytics clouds (PACs) and other persistent and personalized decision-making frameworks have begun to gain traction because they are able to address the shortcomings of traditional tools and processes. These platforms combine the power of big data platforms with the increasingly sophisticated capabilities of advertising networks and online shopping vendors.

Harry Blount is the CEO and Founder of DISCERN, Inc., a cloud-based, big data analytics company that delivers signals as a service to decision-makers.  The vision for DISCERN was inspired, in part, by the work of Harry and Paul Saffo, DISCERN’s head of Foresight, during their tenures as members of the National Academy of Sciences Committee on Forecasting Disruptive Technologies.  Prior to founding DISCERN, Harry spent more than 20 years on Wall Street, including senior roles at Lehman Brothers, Credit Suisse First Boston, Donaldson Lufkin & Jenrette, and CIBC Oppenheimer. Harry has been named an Institutional Investor All-American in both Information Technology Hardware and Internet Infrastructure Services, and The Wall Street Journal has recognized him as an All-Star covering the Computer Hardware sector.  In addition, he is Chairman of the Futures Committee for The Tech Museum of Innovation in San Jose, California and is on the Silicon Valley Advisory Board of the Commonwealth Club. Harry graduated from the University of Wisconsin – La Crosse in 1986 with a B.S. in Finance.


Layoffs Surge In April, Though Signs of Improvement Emerge

May 5th, 2014

Layoffs in the financial services industry surged in April as Wall Street firms continued to slash jobs last month due to weakness in mortgage applications and overall rightsizing at many of the nation’s largest banks.  However, the data shows some signs of improvement signaling that the employment picture may be turning around in the financial services industry.

April Challenger, Gray & Christmas Report

According to Challenger, Gray & Christmas’ monthly Job Cuts Report released last week, the financial services industry announced a surge in planned layoffs during April with plans to cut 4,124 jobs in the coming months.  This is a 490% increase from the 698 planned job cuts announced in March, and is an almost 300% gain from the number of layoffs announced in April of 2013.
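As a quick back-of-the-envelope check on those figures, (4,124 - 698) / 698 is roughly 4.9, or an increase of about 490%, consistent with the Challenger numbers.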

Despite the April surge, Wall Street firms have announced 19,430 planned layoffs on a year-to-date basis – a 44% decline from the number of layoffs announced during the same period in the previous year.  This suggests that the pace of rightsizing in the financial services industry might finally be slowing.

John Challenger, CEO of Challenger, Gray & Christmas explained this development, “We are seeing some stabilization in the banking industry. We may continue to see cutbacks in the mortgage departments, as banks shed the extra workers hired to handle the flood of foreclosures, but those areas are getting back to normal staffing levels.”

Planned hiring at financial services firms rose 133% in April to 350 jobs from a modest 150 new jobs announced in March.  In addition, the number of new April hiring plans represents a 600% gain from the level seen in April of the prior year.  However, it is important to note that year-to-date hiring in the financial services industry remains anemic as it is 53% below the levels seen during the same period in 2013.

Major Industry Moves

The one major financial services firm that announced layoffs in April was Citigroup, which said it was cutting 200 to 300 jobs (2% of its staff) in its global markets business.  The reduction was primarily a response to a slump in profits due to losses in its bond trading unit.

Another major evolving story on the Wall Street layoff front is Barclays Plc’s planned revamp of its investment bank, which is expected to be presented to shareholders on May 8 and which executives have said will include significant job cuts.  Analysts at Sanford C. Bernstein estimate that Barclays Plc could eliminate 7,500 jobs at its investment bank to improve returns at its securities unit. Bernstein analysts note that the European fixed-income, currencies and commodities business may be the hardest hit, with about 5,000 job losses. Cuts of 6,500 to 7,500 would represent a reduction of between 25% and 30% of the unit’s employees.

Positives for Equity Research

As we have written recently, the outlook for Wall Street’s equities businesses has become more upbeat in the past few months.  For example, Bank of America, Morgan Stanley, Credit Suisse, and JP Morgan have all recently reported growth in their equities revenues (http://www.integrity-research.com/cms/2014/04/23/upbeat-equities-environment/).

In addition, institutional investors committed $431 bln to new equity mandates in 2013, with a significant portion of these asset inflows going into international equities.  Another report shows that as of March 31st, hedge fund assets had hit a new high of $2.7 trillion thanks to healthy net capital inflows (http://www.integrity-research.com/cms/2014/04/30/more-positives-for-equity-research/).

Relevance for the Research Industry

Despite the surge in layoffs in April, it looks like most Wall Street firms are starting to achieve more stable employment levels, eliminating the need for large layoffs in the near future.  In addition, a number of developments are starting to suggest that the sell-side and buy-side equities businesses are starting to turn around.

Consequently, we are becoming more constructive about the employment outlook for research analysts, research sales people, and other research support staff at investment banks and buy-side firms.  In addition, we suspect that a future increase in research staffing levels at sell-side and buy-side firms will also create higher demand for alternative research, unique datasets, and other analytic inputs – a virtuous cycle that has been evident in the past.  Certainly, this would be great news for an industry that has been buffeted over the past few years.

 


More Positives for Equity Research

April 30th, 2014

Increased equity mandates, growing hedge fund assets and humbled momentum players equal more good news for equity research providers.

Increased equity assets

According to a recent article in Pensions & Investments, equities have received growing net inflows from institutional investors.  U.S. and international active equity mandates rebounded in 2013, according to a report released in January by consultant Eager, Davis & Holmes.  In 2013, 373 U.S. active equity hires were made, up from the 305 made in 2012 and the 288 in 2011.

Investment consultant Mercer reported that $431 billion in new assets were committed to equities in 2013.  This was an order of magnitude greater than in previous years, which averaged about $40 billion each year from 2009 to 2012 inclusive.  By comparison, Mercer saw $1.2 billion go into bond funds in 2013.

Of the investors returning to equities, many are turning toward international equities, particularly those within Europe and emerging markets, which are expected to continue to rise. Data from Eager, Davis & Holmes shows that 291 active international equity hires were made in 2013 — a record high — compared to 271 hires made in 2012 and 210 in 2011. Of these hires, the majority of them — 136 — were in emerging markets equity, also a record high.

The flows into equities have tended to be from new money, rather than a dramatic shift from fixed income.  Most money managers and consultants contacted by P&I agreed they hadn’t been seeing flows coming out of fixed income and into equities as much as they have seen investors reallocating within their fixed-income and equity portfolios.

Momentum strategies punished

At the same time, there has been a growing shift away from momentum strategies and growth stocks and into value.  According to a recent article in the Financial Times, trend-following hedge funds have suffered further outflows amid weak investment performance, raising questions about their survival.

Commodity trading advisers (CTAs), hedge funds that tend to follow trends in markets using computers to place bets, suffered their 10th consecutive month of net outflows in March, according to Eurekahedge, the data provider.

Overall, investors pulled $5.3bn from CTAs in the first quarter. This follows continued performance difficulties for these funds, which were down 1.8 per cent in the first three months of the year – the worst return of any hedge fund strategy.

CTA funds have experienced losses of 3.3 per cent over the past five years, a sharp reversal from the 11 per cent a year such funds made between 2000 and 2008, according to Hedge Fund Research, the data provider.   Last year the strategy was down 0.61 per cent, Eurekahedge figures show.

Matthew Beddall, chief investment officer of Winton Capital Management, believes that CTAs will need to embrace research more aggressively to survive: “There is nothing about the CTA business model that I see as broken. For CTAs that have invested less heavily in research, I am maybe inclined to say yes. It is just not easy to make money in financial markets, so our saying is: evolve or die.”

Growth in hedge fund assets

Despite troubles in the CTA segment, overall hedge fund assets are hitting new highs.  Pensions & Investments reported that industry assets hit a new peak of $2.7 trillion thanks to healthy net inflows.

Despite slight dips in aggregate performance in the months of January and March, investment performance and net capital inflows throughout the quarter were sufficient to push hedge fund industry assets to a new peak of $2.7 trillion as of March 31, according to Hedge Fund Research.

The first quarter was the industry’s seventh consecutive quarter of record-breaking net growth, HFR said in a recent research report. The jump in aggregate hedge fund industry assets was the result of the combination of $26.3 billion in net inflows and $47 billion of net investment performance gains.

HFR noted that net inflows in the three-month period ended March 31 were the highest quarterly inflow since the second quarter of 2011 when net inflows totaled $32.5 billion.

If the current pace of new money continues, net inflows for 2014 will top $105 billion, making it the best year since 2007, when net asset growth totaled $195 billion, historical HFR data show.

Equity hedge fund strategies experienced the highest net inflows in the first quarter, $16.3 billion, followed by relative value strategies, $11.2 billion, and event-driven approaches, $4.1 billion. Macro hedge funds had net outflows of $5.3 billion, according to HFR tracking data.

Conclusion

Asset owners’ transfer of new assets into equities in 2013 mirrored the excellent equities markets last year.  Nevertheless, mandates tend to be relatively sticky, suggesting a continued positive for equities market participants.

Even more bullish for equity research providers is the growth in hedge fund assets, particularly inflows to equity hedge fund strategies.  In their search for alpha, hedge funds are the most aggressive and innovative users of investment research.

Nor does it hurt research providers that CTAs are being pressured to modify their pure quantitative momentum-oriented strategies to incorporate more research.

Combined with the positive commission trends we noted in recent earnings announcements of the large investment banks, the environment for equity research is looking very robust indeed.


New Model for Economic Consensus Estimates

April 28th, 2014

News agencies and specialized newsletters have traditionally surveyed a limited number of economists to ascertain the market consensus for various economic indicators.  However, last week crowd-sourcing data provider Estimize added economic indicators to its platform in order to obtain a more comprehensive picture of what the market expects will be reported in regular government economic releases.

Value of Consensus Estimates

Determining the “consensus estimates” for corporate earnings announcements, FDA drug trial results, political elections, or government economic releases has historically been an important part of the investment research process as this has helped investors determine what “the market” has discounted into existing securities prices.

Consequently, once an investor knows what the market consensus is for a particular event (like a corporate earnings release), then he/she has a good idea how investors will respond if the actual earnings release for that stock is better or worse than that market expectation.  A better than expected earnings report will generally lead to an improvement in the price of that particular stock while a worse than expected earnings report will lead to a decline in the price of that stock.

Of course, it is important to remember that different types of assets like stocks, bonds, or commodities will respond differently to these expectations surprises as misses will have different implications for the specific security.

In addition, some investors use consensus estimates to help them position trades before the announcement of a specific event – particularly if they expect an outcome that is markedly different from the consensus view.

Traditional Way to Discover Market Consensus

In the past, various news agencies or newsletter providers have determined the consensus estimate for any formal event or release by periodically surveying a limited number of “experts” like sell-side analysts or economists and calculating the mean or median estimate of that survey.
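A minimal sketch of those mechanics in Python, using made-up survey numbers: the consensus is simply the mean or median of the collected forecasts, and the “surprise” that drives the post-release reaction is the gap between the reported figure and that consensus.

```python
import statistics

# Hypothetical survey of economists' forecasts for monthly nonfarm payrolls (thousands).
forecasts = [195, 210, 200, 215, 190, 205, 220, 198]

consensus_mean = statistics.fmean(forecasts)
consensus_median = statistics.median(forecasts)

actual = 240  # hypothetical reported figure
surprise = actual - consensus_median

print(f"Mean consensus:   {consensus_mean:.1f}k")
print(f"Median consensus: {consensus_median:.1f}k")
print(f"Surprise vs. median consensus: {surprise:+.1f}k")
```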

This practice has been well established for most of the data reported in quarterly corporate earnings announcements.  In fact, over the years a number of consensus earnings databases and calendars have been established enabling investors to get a good handle on current market expectations and how these have changed over time.

However, consensus estimates for monthly and quarterly government economic releases have not been tracked as rigorously.  Of course, most major newswires including the Wall Street Journal, Thomson Reuters, Bloomberg and CBS MarketWatch all conduct regular surveys of the major government economic releases a week or two before the release date.

Typically, these surveys include the estimates of a few dozen well-known Wall Street economists at that point in time.  Unfortunately, none of these surveys track how these forecasters’ estimates have changed over time.  This is important because most economists revise their forecasts as they see evidence from other data releases about actual economic conditions.  For example, many economists might publish a preliminary forecast for monthly retail sales, which they later revise after auto sales, same store sales, and other private sales data are made available.

Of course, there are a few more formal surveys of government economic releases that are commercially available to customers, including Consensus Economics, Focus Economics, and Blue Chip Economic Indicators.  These products typically focus on longer-term estimates like quarterly GDP, but they do collect estimates for monthly time series like retail sales, housing starts and unemployment.  The good news about these consensus surveys is they generally include estimates from a much larger universe of economists.  However, the frequency of these surveys (at best monthly) means the estimates included might be preliminary and not take advantage of the most up-to-date data.
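The revision problem is easy to see in code.  Below is a toy sketch (all forecaster names, dates, and numbers are hypothetical) of a consensus that keeps each forecaster’s revision history and is computed from the latest estimate on file as of a given timestamp, which is roughly what a real-time, crowd-sourced platform makes possible:

```python
from datetime import datetime
from statistics import median

# Revision history: forecaster -> list of (timestamp, estimate), oldest first.
revisions = {
    "economist_a": [(datetime(2014, 4, 1), 0.3), (datetime(2014, 4, 10), 0.5)],
    "economist_b": [(datetime(2014, 4, 2), 0.4)],
    "economist_c": [(datetime(2014, 4, 3), 0.2), (datetime(2014, 4, 12), 0.6)],
}

def consensus_as_of(revisions, as_of):
    """Median of each forecaster's most recent estimate submitted on or before `as_of`."""
    latest = []
    for history in revisions.values():
        usable = [est for ts, est in history if ts <= as_of]
        if usable:
            latest.append(usable[-1])   # histories are oldest-first
    return median(latest) if latest else None

# A point-in-time survey taken early in the month misses later revisions.
print(consensus_as_of(revisions, datetime(2014, 4, 5)))   # 0.3
print(consensus_as_of(revisions, datetime(2014, 4, 14)))  # 0.5
```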

New Estimize Offering

Strangely, very little had changed in the way investors collected economic consensus estimates for decades – that is, until last week, when Estimize announced that it was adding economic indicators to its crowd-sourcing financial data platform.

The new service will include market estimates for 17 different US economic indicators:

  • Nonfarm Payrolls
  • Unemployment Rate
  • Average Weekly Earnings
  • GDP
  • Producer Price Index
  • Consumer Price Index
  • Retail Sales
  • Purchasing Manager’s Index
  • Existing Home Sales
  • New Single Family Home Sales
  • Case-Shiller Home Price Index
  • Housing Starts
  • Durable Goods Orders
  • ISM Non-Manufacturing Index
  • Initial Claims for Unemployment Insurance
  • Manufacturing New Orders
  • Industrial Production

 

Estimize expects to add other US indicators and international economic time series as this service gains traction.

Estimize CEO, Leigh Drogen explained the reason for adding this new service, saying, “The currently available Economic Indicator estimate data sets are extremely poor, to put it lightly. They are literally a poll, a few weeks before the report, from 30 economists, with no revision history, and no forward looking estimates. One has to wonder how this has been the case for so long.  We’re going to change all of that now, by using the philosophies that have made Estimize so successful to this point. We believe a more accurate, and more importantly, more representative data set will be produced, with greater depth of information by collecting data the way we believe it should be collected.”

Summary

Investors have had to put up with getting consensus estimates for economic indicators in the same old way for decades – by conducting a point-in-time survey of a limited number of Wall Street economists.  However, last week’s announcement by Estimize that it was going to start collecting forecasts on US economic indicators by leveraging the “wisdom of the crowds” could completely change this practice.

It will be extremely interesting to see if Estimize can attract the same volume of forecasts that it currently gets for corporate earnings, and whether this consensus can be as accurate in aggregate as its earnings data has been.  However, there is no reason to believe that the consensus estimates for economic indicators will be any less accurate than what Estimize currently finds for corporate earnings.  In fact, given the real-time nature of the Estimize platform, we have every reason to believe that the Estimize economic consensus might be even more accurate than its earnings data, as forecasters will be able to update their estimates to reflect the most recent data releases.

If this proves to be true, then we don’t expect too many serious investors will continue relying on obtaining their consensus economic indicator estimates from traditional sources in the future.  It will be exciting to see how this all pans out.

 


Upbeat Equities Environment

April 23rd, 2014

Earnings reports from major investment banks are showing continued improvement in the equities environment during the first quarter of 2014.  The longer term trend also suggests a more stable context for equity research.

Quarterly equities growth

Bank of America, Morgan Stanley, Credit Suisse, and JP Morgan have reported sequential growth in their equities revenues and all except JP Morgan had higher equities revenues than the first quarter of 2013.  Citigroup was the outlier.  Although its overall earnings report was well received, its equities business is down 12% relative to comparable quarters.

Goldman Sachs’ equity revenues were also down in aggregate, which it explained as the result of the sale of its reinsurance business and lower derivatives revenues.  However, its commission revenues were up 4% compared to the first quarter of last year and 11% sequentially.

The average for the banks reporting so far shows slight growth over the first quarter of 2013 and a robust 21% uptick from the fourth quarter.  Still to report: Barclays, Deutsche Bank and UBS.

Commissions as a component of equities

Unlike its competitors, Goldman provides detail on the components of its equities revenues.  Its report of equity commission revenues excludes the ‘noise’ from other equities products less directly related to investment research.  This is a reminder that although overall cash equities revenues are the best metric we have, they aren’t a perfect proxy for commission revenues.

The negative press surrounding high frequency trading has led to speculation that banks may be under pressure to close their dark pools.  On the first quarter conference call, Goldman’s CFO told analysts that there were “no strategic plans” to close its Sigma X dark pool.  Closure of dark pools would negatively impact overall equities revenues, but not the commission revenues tied to investment research.

Equities trend

Looking at the longer term trend in equities revenues for one of the top equities houses, Credit Suisse, we see a near doubling of equities revenues in the run-up to the financial crisis followed by an equally quick decline.  By 2011, Credit Suisse’s equity revenues were back to 2005 levels.  However, since then equities revenues have stabilized and showed moderate growth (10%) in 2013.  (Note: CHF 1 = US$1.13 and has trended around US$1.)

The quarterly revenue picture is more complex, but has a similarly encouraging picture of an improving near term trend.   The quarterly revenues also show decreasing volatility, which is the trend across all the major investment banks.

The quarterly pattern also suggests some seasonality to the revenues with stronger revenues earlier in the year and weaker results later in the year.

Conclusion

Overall, it is premature to break out the champagne.  Not all banks have reported yet, so the picture could change. More importantly, past experience tells us that equity revenues are volatile and a strong first quarter does not necessarily presage a strong year.  Nevertheless, the general trend points to a more stable equities environment, at least for the leading equities players, which is a positive for the research segment.


Big Data and Investment Research: Part 2

April 21st, 2014

Last week we wrote about why buy-side firms are considering adding “big data” analytic techniques to their research process.  This week, we will investigate the impact that “big data” is now having on the sell-side and alternative research industry.

The “Big Data” Evolution

As we mentioned last week, the term “big data” when applied to investment research is the practice of regularly collecting, storing and analyzing huge amounts of structured and unstructured data sources to generate predictive insights with the sole purpose of generating consistent investment returns.

Despite pronouncements to the contrary, this is not really a new phenomenon in the research business, as most of the best sell-side and alternative research firms have been doing just this for many years, albeit to a more limited extent.  Countless research analysts have conducted formal channel checks, surveys, or shopping bag counts; licensed unique commercial data feeds; warehoused this data; and analyzed these unique data points as part of their research process for many years.  In fact, some research franchises like Midwest Research and its many spinoffs were widely known for this type of data-driven research process.

However, in recent years, as computing power has grown, computer storage costs have fallen, and the internet has exponentially increased the availability of both structured and unstructured data, both the capability to expand the data-driven research model and the interest in doing so have increased among buy-side research consumers and the third-party research producers that serve them.

History of “Big Data” Investment Research

As mentioned in last week’s article, one of the first third-party “big data” firms to serve institutional investors was Majestic Research, founded in 2002, which collected, warehoused, and analyzed a number of unique data sets to identify investable signals for asset managers.  However, Majestic Research found this business model difficult to maintain, as the quantitatively oriented hedge funds that initially used the service felt the predictive edge of this analysis deteriorated as more investors used it.  In other words, Majestic Research could not scale its research business.

In response, the firm decided to hire fundamental securities analysts who could leverage its proprietary data and statistical analysis to produce data-driven fundamental research.  The firm found the market for this type of research was much broader than pure quantitative investors, and these buy-side clients were less worried that the data was too widely distributed.  They valued the insight Majestic provided more than the underlying data which led to that insight.  As discussed last week, Majestic was acquired by ITG in 2010, and this data-driven research product has been the foundation for ITG’s research offering ever since.

However, other firms saw that the Majestic model could enhance the value of the traditional fundamental research product.  The largest single “big data” research initiative was rolled out by Morgan Stanley in 2008 with its AlphaWise initiative.  Initially, AlphaWise conducted customized primary research, including market research, web bots, and expert surveys, for which it charged hedge funds and asset managers.  Eventually, however, AlphaWise morphed into a unique data-driven research product (Morgan Stanley calls it evidence-based) that its clients could access, built on hundreds of terabytes of proprietary survey and web-scraped data.

Then, in 2009, San Francisco-based alternative research firm DISCERN started up to build a “big data” research model to meet the varied needs of different types of institutional investors.  As mentioned last week, DISCERN is an institutional equity research firm which covers a wide range of sectors, including specialty pharmaceuticals, financial services, energy, and real estate investment trusts.  Its research is based on the statistical analysis of huge amounts of structured and unstructured data to identify unique investment signals, and it then overlays this data analysis with contextual insights generated by experienced sell-side analysts to identify key trends, market inflection points, and other potential investable opportunities.  DISCERN provides a wide range of unbundled services, including allowing buy-side clients to license its data, the predictive signals it has developed, or the extensive research it generates.

Consequences for the Research Industry

So, does the adoption of “big data” by these three research firms have any long-term consequence for the research industry?  We think so.  Clearly, these three firms have found extensive market acceptance of their data centric research products.  Based on our own analysis, DISCERN has been one of the faster growing alternative research firms in the industry over the past few years.  In addition, research produced by the AlphaWise team has been some of the most highly demanded research produced by Morgan Stanley.

If investment research firms want to compete with growing research firms like Majestic (now ITG), Morgan Stanley’s AlphaWise, or DISCERN, they are going to have to make significant changes to their traditional research businesses.  They will need to invest in building (or renting) scalable data warehouses and other technical infrastructure.  In addition, they will need to start thinking about what public data they want to collect; what commercial data they want to license; and more importantly, what proprietary data they should create by conducting longitudinal surveys, implementing web scraping technology, or utilizing other primary research techniques.  They will also need to hire data scientists or analysts skilled in building predictive analytics to work alongside their traditional analysts with deep company or industry expertise.
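As a toy illustration of just the web-scraping piece of such an effort, the sketch below fetches a hypothetical listing page, counts the items it displays, and appends the count to a longitudinal CSV file.  The URL, CSS selector, and file name are placeholders we invented; a production pipeline would also need scheduling, permission and terms-of-service checks, and far more robust error handling.  It assumes the third-party requests and BeautifulSoup (bs4) libraries are installed.

```python
import csv
from datetime import date

import requests
from bs4 import BeautifulSoup

# Hypothetical page and selector: a retailer listing page whose item count we track daily.
URL = "https://example.com/category/widgets"        # placeholder URL
ITEM_SELECTOR = "div.product-tile"                   # placeholder CSS class

def scrape_item_count(url: str, selector: str) -> int:
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    return len(soup.select(selector))

def append_observation(path: str, count: int) -> None:
    # Append today's observation so repeated runs build a longitudinal time series.
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([date.today().isoformat(), count])

if __name__ == "__main__":
    append_observation("widget_listings.csv", scrape_item_count(URL, ITEM_SELECTOR))
```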

In addition, they will need to think through their business model and product set.  Do they only want to sell access to their analysts and the research reports they write, as they did in the past?  Or do they want to unbundle their research process and allow customers to cherry-pick exactly what they want, including data, investment signals, full research reports, and custom work?

And of course, how do they want their clients to pay for this research – through hard dollar subscriptions, CSAs, trading, or some combination of the above? It is interesting to note that currently, all three of the companies mentioned earlier in this article, ITG, Morgan Stanley and DISCERN enable their clients to pay for these “big data” research services by trading through their trading desks.

Winners & Losers in “Big Data” Research?

Given the significant financial investment required in people, data, and technology, we suspect the obvious winners in rolling out “big data” research services are likely to be sell-side investment banks.  Clearly, many of these firms have produced data driven research in the past based on the analysts they have hired.  The move to a “big data” focus will really be a commitment on the part of the firm to basing all their investment research on a deep statistical analysis of underlying data sources.

However, it is important to note that many sell-side firms will not choose to make the switch to a “big data” based research process, nor will everyone that tries to do so succeed.  One of the major impediments to success is a bank’s unwillingness to make the financial commitment necessary to succeed.  Certainly, the downturn in equity commissions over the past five years and the struggle to make equity research pay could convince many management teams that an investment in “big data” research is just too risky with no obvious payoff.  Another reason some firms will fail in their big data efforts is an inability to adapt the firm’s culture to this new approach to investment research.

So, can alternative research firms succeed in developing “big data” research services?  We think so, though we do not think it will be a quick road to success.  In the past, both Majestic Research and DISCERN were alternative research firms that became successful in this space, though in each case, the firms developed their coverage and their product offering incrementally rather than trying to roll out broad coverage at the outset.

Similarly, we suspect that other alternative research firms will be successful in the future by initially focusing on a limited number of sectors based on a discrete set of proprietary data they have collected and a few predictive signals they have developed.  These firms will be able to expand their businesses as they gain commercial traction by adding new sectors, more proprietary data, and additional analytic signals.

Another possible winner in the “big data” research space could be existing data aggregators or predictive analytics providers who decide to move up the value chain with their data products by adding context and investment insight to their offerings by hiring investment research analysts.  Unfortunately, we think that very few data aggregators or analytics providers will take the risk of stepping outside their domain expertise to enter the investment research business.  Consequently, we  don’t expect to see too many of these types of firms enter the research business.

Summary

In our view, the acceptance of “big data” techniques in the investment research industry is a foregone conclusion.  Buy-side investors have increasingly exhibited their appetite for unique datasets and for data-driven research over the past decade.  To meet this hunger, a number of sell-side and alternative research firms have well developed efforts under way, and we suspect that a few significant new entrants are likely to announce initiatives in 2014.

The real question remaining is how existing research firms plan to respond to this trend.  As we mentioned last week, the adoption of “big data” techniques enables a firm to develop a proprietary edge – whether through the collection of unique datasets or the development of proprietary signals that have predictive value.  We believe that as more research firms adopt a “big data” oriented research process, it will become increasingly hard for traditional research firms with no discernible proprietary edge to compete.  The days when a research firm could succeed solely on the basis of having a cadre of analysts with deep industry experience might be over.

 


High Frequency Research: How HFT Impacts Investment Research

April 16th, 2014

The hubbub over high frequency trading (HFT) has implications for investment research.  The future direction of HFT will directly shape the research business, and the Michael Lewis phenomenon is a cautionary tale for the research industry.

Low carb trading

It is estimated that HFT accounts for about half of trading in the US (down from two-thirds).  From the perspective of research, HFT is low-carb trading.  In other words, it doesn’t pay the research bills because it is too low margin and HFT firms don’t use Wall Street research.

From the HFT perspective, research is one of the obstacles to HFT growth.  In Michael Lewis’s controversial new book, the founder of IEX, a new exchange designed to counteract the evils of HFT, was challenged by buy-side inertia in directing trades because of the need to pay for research.  According to Lewis, the buy-side traders were outraged by HFT predatory tactics, yet continued to send orders to the HFT-infested dark pools operated by Wall Street banks like lambs to the slaughter.  All because of bank research.

The rise of high frequency research

What Lewis does not mention is that HFT has declined from its heyday, when it accounted for two-thirds of US market trading.  In 2009, high-frequency traders moved about 3.25 billion shares a day. In 2012, it was 1.6 billion a day, according to a Bloomberg BusinessWeek article.  Volumes have fallen because fierce competition among HFT firms has compressed margins: average profits have fallen from about a tenth of a penny per share to a twentieth of a penny.
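A rough back-of-the-envelope illustration of that squeeze, assuming the cited averages applied across all HFT volume: 3.25 billion shares a day at a tenth of a penny is about $3.25 million a day in gross trading profit, while 1.6 billion shares at a twentieth of a penny is roughly $0.8 million a day, a drop of about three quarters.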

Lewis misses the fundamental point with HFT: it is simply automated trading.  Yes, HFT involves predatory practices, but that is not the core.  The core is computerized trading.  An unnerving aspect of HFT given surprisingly short shrift by Lewis is the frequency of flash crashes, which occur regularly but so quickly that humans can’t detect them.  Check out this excellent TED talk on HFT:  http://youtu.be/V43a-KxLFcg

For researchers, it is worth noting the current direction of HFT: high frequency research (HFR).  HFTs are using sophisticated programs to analyze news wires and headlines to instantly interpret implications. Some are scanning Twitter feeds, as evidenced by the Hash Crash, the sudden selloff that followed the Associated Press’s hacked Twitter account reporting explosions at the White House.  We can expect further developments and innovations in HFR as algorithms get more sophisticated.
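The sketch below shows the general flavor of machine-readable headline scoring, using a naive keyword dictionary; it is our own illustration and bears no resemblance to the proprietary natural-language models HFT firms actually deploy.

```python
# Naive illustration of headline scoring; real systems use far more sophisticated
# NLP, entity resolution, and source verification before acting on a headline.
NEGATIVE = {"explosion", "explosions", "crash", "fraud", "bankruptcy", "downgrade", "miss"}
POSITIVE = {"beat", "beats", "record", "upgrade", "surge", "surges", "approval", "rally"}

def headline_score(headline: str) -> int:
    """Positive keywords minus negative keywords; the sign suggests trade direction."""
    words = {word.strip(".,:!?").lower() for word in headline.split()}
    return len(words & POSITIVE) - len(words & NEGATIVE)

headlines = [
    "Breaking: Explosions reported at the White House",  # hacked-AP style headline
    "Acme Corp earnings beat estimates, shares surge",
]
for headline in headlines:
    print(f"{headline_score(headline):+d}  {headline}")
```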

Much has been written about the automation of trading and the declining role of human traders.  The automation of research is yet to be written.

A cautionary tale

One last point.  Flash Boys is a compelling story of an outsider who uncovers HFT abuses and works to counteract them.  While it makes for a great read, it stretches belief that Brad Katsuyama was the first to discover the pernicious effects of HFT.

Like many other aspects of Wall Street, HFT was yet another open secret (although perhaps not understood in the exacting detail that Brad pursues).  Can you think of another generally accepted quirk that applies to research, just waiting for the next Michael Lewis tome?

HFT & Research

While the Wall Street banks have used HFT to augment cash equities revenues, that game is declining.  HFT is fundamentally hostile to traditional bank research.  Its trades don’t pay research bills, and ultimately HFT leads to a very different form of research that sends chills down the spine of every fundamental analyst.

Wall Street offers opportunities for talented writers to prosper by spotlighting commonly accepted idiosyncrasies in the markets.  Or for talented politicians seeking greater fame.  Will Soft Boys be the next Lewis opus?
