Solving the Equity Market Neutral Dilemma

The Equity Market Neutral sector, currently comprising more than 100 hedge funds in the U.S., serves as an important element of diversified portfolios yet has long underperformed for individual investors due to undifferentiated approaches and misaligned incentives.  With tens of billions of dollars invested in market neutral hedge funds (and similar amounts invested in an additional 100 ETFs and mutual funds), an effective solution to this problem is long overdue – and Subset Capital intends to provide it.

Market neutral funds are suffering through a category average net return of less than 1% over the past five years, with a negative Sharpe ratio (as reported by Hedge Fund Research).  It seems few such funds are able to offer the ABCs investors require: meaningful alpha (ideally 10-15%), low beta (within +/- 0.3) and reasonable costs (to limit erosion of gross returns).  This failure occurs because most market neutral offerings are built using the same ideas, methods and resources as long-only funds, an inherently problematic approach.
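
To make the A and B concrete, here is a minimal, illustrative sketch of how a fund's alpha and beta are typically estimated by regressing its excess returns on the market's excess returns, along with the Sharpe ratio mentioned above.  This is generic textbook math, not Subset's methodology, and the function and parameter names are our own.

    import numpy as np

    def alpha_beta_sharpe(fund_returns, market_returns, risk_free=0.0, periods_per_year=12):
        """Annualized alpha, beta and Sharpe ratio from periodic (e.g. monthly) returns."""
        fund_excess = np.asarray(fund_returns) - risk_free
        market_excess = np.asarray(market_returns) - risk_free

        # Beta: covariance of fund vs. market excess returns over market variance.
        beta = np.cov(fund_excess, market_excess)[0, 1] / np.var(market_excess, ddof=1)

        # Alpha: average fund excess return not explained by market exposure, annualized.
        alpha = (fund_excess.mean() - beta * market_excess.mean()) * periods_per_year

        # Sharpe: annualized mean excess return over annualized volatility.
        sharpe = fund_excess.mean() / fund_excess.std(ddof=1) * np.sqrt(periods_per_year)
        return alpha, beta, sharpe

On such a regression, a fund meeting the ABCs above would show an annualized alpha of roughly 0.10-0.15 and a beta within +/- 0.3.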

Subset built its equity market neutral fund with these ABCs in mind.  We have a proprietary methodology for processing emergent data sources in a patented system, resulting in significant alpha, very low correlation to the markets, and a low-cost, highly scalable strategy.  Add to that an investor-friendly performance fee structure, and Subset’s flagship fund significantly outperforms benchmarks.

[Figure: data sources.png, examples of emergent data sources]

The low-cost, high-scale strategy features quantitative models that use newly emerging data sources (including blogs, micro-blogs and social data related to equities; see the figure above for examples), macro market variables and fundamental company data.  These models continue to evolve with new sources of data and with machine learning.  The resulting signals are statistically monitored and systematically traded under the supervision of experienced traders.
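
As a purely illustrative sketch, one simple way to turn such inputs into a market neutral book is to z-score each per-ticker feature cross-sectionally, blend the features, and demean the result so the long and short sides offset.  This is not Subset's patented system; the feature names and blend weights below are hypothetical.

    import pandas as pd

    def neutral_weights(features, blend):
        """Blend per-ticker feature scores into dollar-neutral portfolio weights.

        features: dict of pandas Series indexed by ticker, e.g.
                  {"social_sentiment": ..., "fundamental_score": ...}
        blend:    dict of weights, e.g. {"social_sentiment": 0.6, "fundamental_score": 0.4}
        """
        combined = sum(
            blend[name] * (s - s.mean()) / s.std()   # cross-sectional z-score of each feature
            for name, s in features.items()
        )
        signal = combined - combined.mean()          # demean so longs and shorts offset (dollar neutral)
        return signal / signal.abs().sum()           # scale to unit gross exposure

Demeaning the combined score keeps the long and short sides the same size, which is one simple way to keep market exposure low.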

[Figure: fee comparison.png, comparison of fee structures]

Subset Capital inverts the usual fee structure, putting LPs first (a worked example follows this list):

  • there is a hurdle rate, so Subset doesn’t get paid until partners get reasonable returns
  • there is no management fee, only direct expenses (with a 1% cap), so LPs benefit as the fund grows and Subset does better only when the partners do better
  • in some cases, to provide even greater downside protection, Subset will slope the performance fee and take increased upside instead
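
To illustrate how these terms flow through to LPs, here is a simple, hypothetical calculation; the hurdle rate, performance fee rate and expense figures are placeholders, not Subset's actual terms.

    def lp_net_return(gross_return, hurdle=0.05, perf_fee=0.20, expenses=0.01):
        """Hypothetical LP economics: no management fee, expenses capped at 1%,
        and a performance fee charged only on returns above the hurdle."""
        expenses = min(expenses, 0.01)                    # direct expenses only, capped at 1%
        after_expenses = gross_return - expenses
        excess_over_hurdle = max(after_expenses - hurdle, 0.0)
        performance_fee = perf_fee * excess_over_hurdle   # the manager is paid only above the hurdle
        return after_expenses - performance_fee

    # e.g. a 12% gross year: 12% - 1% expenses - 20% of (11% - 5%) = 9.8% net to LPs
    print(f"{lp_net_return(0.12):.1%}")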

Too many investors have had to choose between low correlation and solid returns.  With an innovative strategy and client-focused fund terms, Subset aims to solve that dilemma.

The Growth of Big Data

Much has been said about “big data” in recent years, but conversations often get murky and unproductive because people think it is all about math and analysis.  It isn’t.  It’s about politics and the economy and people and technology (and yes, a little math and analysis).  This is particularly important to understand where the rapidly growing quantities and varieties of “social data” are involved, since it helps put the relevance of the data in context, context that can be especially valuable in the investment arena.

Data related to retail trading, whether it takes the form of order flows, personal finance blogs, Facebook “likes” or tweets, has undergone exponential growth over the past two decades.  Regulatory revision and industry innovation lowered the cost of trading.  Workplace changes and demographic shifts increased demand.  Many formerly private topics, such as personal finance, have become comfortably public.  In addition, technological advances massively increased the ability to capture, store and track all of this opinion and activity.

While some data can be validated (for example, E*Trade’s reporting of which stocks customers are looking at the most), much of it cannot (such as the fake AP tweet that caused a mini “flash crash”).  Knowing how to acquire, clean and check the data is essential.  Once that process is in place, sources that have the potential to be unreliable can turn out to be helpful; for example, there is a strong correlation between tweet volumes for each stock ticker and actual dollar trade volumes.
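
As a rough illustration of how such a relationship can be checked, the sketch below computes the per-ticker correlation between daily tweet counts and dollar trade volume.  The column names and data layout are assumptions for the example, not a description of Subset's pipeline.

    import pandas as pd

    def tweet_trade_correlation(df):
        """Per-ticker Pearson correlation between daily tweet counts and dollar volume.

        df is assumed to have one row per (ticker, day) with columns:
        'ticker', 'tweet_count', 'dollar_volume'.
        """
        return (
            df.groupby("ticker")
              .apply(lambda g: g["tweet_count"].corr(g["dollar_volume"]))
              .sort_values(ascending=False)   # most tweet-correlated tickers first
        )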

[Figure: correlation.png, tweet volume vs. dollar trade volume by ticker]

When data is cleaned and filed, it becomes information.  That can, in turn, generate ideas and lead to hypotheses for testing, which can lead to knowledge. But how is knowledge then best collected into wisdom?  Many models have emerged over time: consultation, collaboration, cooperation, conglomeration.  Given the strengths of each approach and looking at the current environment, we believe the most effective collective knowledge will arise from a coordination-based model.

[Figure: models.png, models of collective knowledge]

There are hundreds of potential data sources with varying degrees of relevance to financial markets, and more are born (or buried) every day.  While black-box, quant-heavy firms may seem better equipped to make use of it all, there are equally large gains to be made by firms that take a less quantitative approach, as long as they are familiar with the data, understand their own theories and know what is required to constantly validate those theories.  One of their biggest moves can be to automate strategies, not to replace humans but to free them up for more idea generation.